Wiley, Jeffrey B.
2006-01-01
Five time periods between 1930 and 2002 are identified as having distinct patterns of annual minimum daily mean flows (minimum flows). Average minimum flows increased around 1970 at many streamflow-gaging stations in West Virginia. Before 1930, however, there might have been a period of minimum flows greater than any period identified between 1930 and 2002. The effects of climate variability are probably the principal causes of the differences among the five time periods. Comparisons of selected streamflow statistics are made between values computed for the five identified time periods and values computed for the 1930-2002 interval for 15 streamflow-gaging stations. The average difference between statistics computed for the five time periods and the 1930-2002 interval decreases with increasing magnitude of the low-flow statistic. The greatest individual-station absolute difference was 582.5 percent, for the 7-day 10-year low flow computed for 1970-1979 compared to the value computed for 1930-2002. The hydrologically based low flows show approximately equal or smaller absolute differences than the biologically based low flows. The average 1-day 3-year biologically based low flow (1B3) and 4-day 3-year biologically based low flow (4B3) differ from the average 1-day 10-year hydrologically based low flow (1Q10) and 7-day 10-year hydrologically based low flow (7Q10), respectively, ranging between 28.5 percent less and 13.6 percent greater. Seasonally, the average difference between low-flow statistics computed for the five time periods and 1930-2002 is not consistent across magnitudes of low-flow statistics, and the greatest difference is for the summer (July 1-September 30) and fall (October 1-December 31) of the same time period as the greatest difference determined in the annual analysis. The greatest average difference between 1B3 and 4B3 compared to 1Q10 and 7Q10, respectively, is in the spring (April 1-June 30), ranging between 11.6 and 102.3 percent greater. Statistics computed for an individual station's record period may not represent the statistics computed for the period 1930 to 2002 because (1) station records are available predominantly after about 1970, when minimum flows were greater than the average between 1930 and 2002, and (2) some short-term station records fall mostly within dry periods, whereas others fall mostly within wet periods. A criterion-based sampling of the individual stations' record periods was taken to reduce the effects of statistics computed for the entire record periods not representing the statistics computed for 1930-2002. The criterion used to sample the entire record periods is based on a comparison between the regional minimum flows and the minimum flows at the stations. Criterion-based sampling of the available record periods was superior to record-extension techniques for this study because more stations were selected and the areal distribution of stations was more widespread. Principal component and correlation analyses of the minimum flows at 20 stations in or near West Virginia identify three regions of the State encompassing stations with similar patterns of minimum flows: the Lower Appalachian Plateaus, the Upper Appalachian Plateaus, and the Eastern Panhandle. All record periods of 10 years or greater between 1930 and 2002 for which the average of the regional minimum flows is nearly equal to the average for 1930-2002 are determined to be representative of 1930-2002.
Selected statistics are presented for the longest representative record period that matches the record period for 77 stations in West Virginia and 40 stations near West Virginia. These statistics can be used to develop equations for estimating flow in ungaged stream locations.
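The low-flow statistics named above, such as the 7-day 10-year low flow (7Q10), are quantiles of fitted annual-minimum distributions. The following is a minimal Python sketch under assumed choices (a log-Pearson Type III fit by method of moments; all function names are invented here), not the report's exact procedure:

    import numpy as np
    from scipy.stats import pearson3

    def n_day_annual_minima(daily_flows_by_year, n=7):
        # Annual minimum of the n-day moving-average flow, one value per year.
        kernel = np.ones(n) / n
        return np.array([np.convolve(f, kernel, mode="valid").min()
                         for f in daily_flows_by_year])

    def nQT(minima, T=10.0):
        # n-day, T-year low flow: log-Pearson III by method of moments on
        # log10 flows; a T-year LOW flow is the 1/T non-exceedance quantile.
        x = np.log10(minima[minima > 0])
        n_, m, s = len(x), x.mean(), x.std(ddof=1)
        g = n_ * np.sum((x - m) ** 3) / ((n_ - 1) * (n_ - 2) * s ** 3)
        return 10 ** pearson3.ppf(1.0 / T, g, loc=m, scale=s)

Given 73 arrays of daily flows (one per year of 1930-2002), nQT(n_day_annual_minima(years, 7), 10) would return a 7Q10 estimate for the station.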
Space Station UCS antenna pattern computation and measurement [UHF Communication Subsystem]
NASA Technical Reports Server (NTRS)
Hwu, Shian U.; Lu, Ba P.; Johnson, Larry A.; Fournet, Jon S.; Panneton, Robert J.; Ngo, John D.; Eggers, Donald S.; Arndt, G. D.
1993-01-01
The purpose of this paper is to analyze the interference to the Space Station Ultrahigh Frequency (UHF) Communication Subsystem (UCS) antenna radiation pattern due to its environment, the Space Station. A hybrid Computational Electromagnetics (CEM) technique was applied in this study. The antenna was modeled using the Method of Moments (MOM), and the radiation patterns were computed using the Uniform Geometrical Theory of Diffraction (GTD), in which the effects of the reflected and diffracted fields from surfaces, edges, and vertices of the Space Station structures were included. In order to validate the CEM techniques and to provide confidence in the computer-generated results, a comparison with experimental measurements was made for a 1/15-scale Space Station mockup. Good agreement between experimental and computed results was obtained. The computed results using the CEM techniques for the Space Station UCS antenna pattern predictions have thus been validated.
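The MoM/GTD hybrid itself is far beyond a short example, but the basic mechanism — reflections from nearby structure interfering with the direct ray — can be illustrated with elementary image theory for an antenna above a flat, perfectly conducting surface. This Python sketch is a two-ray illustration only; the frequency, height, and perfect-conductor assumption are invented for the example and are not the paper's model:

    import numpy as np

    # Two-ray (image-theory) vertical pattern of a small antenna at height h
    # above a flat, perfectly conducting surface: the reflected ray behaves
    # as an out-of-phase image source and interferes with the direct ray.
    c, f, h = 3e8, 400e6, 1.5                        # UHF carrier and height, assumed
    k = 2 * np.pi * f / c                            # wavenumber [rad/m]
    for a_deg in (10, 30, 60, 90):                   # elevation above the surface
        af = 2.0 * abs(np.sin(k * h * np.sin(np.radians(a_deg))))
        db = 20 * np.log10(max(af, 0.02) / 2.0)      # clip deep nulls at -40 dB
        print(f"elevation {a_deg:2d} deg: {db:6.1f} dB")

Even this crude model shows deep interference nulls appearing at specific elevations, which is the kind of pattern perturbation the full CEM analysis quantifies.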
NASA Technical Reports Server (NTRS)
Kennedy, J. R.; Fitzpatrick, W. S.
1971-01-01
The computer executive functional system design concepts derived from study of the Space Station/Base are presented. The Information Management System hardware configuration, as it directly influences the executive design, is reviewed. The hardware configuration and generic executive design requirements are considered in detail in a previous report (System Configuration and Executive Requirements Specifications for Reusable Shuttle and Space Station/Base, 9/25/70). This report defines basic system primitives and delineates processes and process control. Supervisor states are considered for describing basic multiprogramming and multiprocessing systems. A high-level computer executive including control of scheduling, allocation of resources, system interactions, and real-time supervisory functions is defined. The description is oriented to provide a baseline for a functional simulation of the computer executive system.
Evolving technologies for Space Station Freedom computer-based workstations
NASA Technical Reports Server (NTRS)
Jensen, Dean G.; Rudisill, Marianne
1990-01-01
Viewgraphs on evolving technologies for Space Station Freedom computer-based workstations are presented. The human-computer computer software environment modules are described. The following topics are addressed: command and control workstation concept; cupola workstation concept; Japanese experiment module RMS workstation concept; remote devices controlled from workstations; orbital maneuvering vehicle free flyer; remote manipulator system; Japanese experiment module exposed facility; Japanese experiment module small fine arm; flight telerobotic servicer; human-computer interaction; and workstation/robotics related activities.
Biolik, A; Heide, S; Lessig, R; Hachmann, V; Stoevesandt, D; Kellner, J; Jäschke, C; Watzke, S
2018-04-01
One option for improving the quality of medical post mortem examinations is through intensified training of medical students, especially in countries where such a requirement exists regardless of the area of specialisation. For this reason, new teaching and learning methods on this topic have recently been introduced. These new approaches include e-learning modules or SkillsLab stations; one way to objectify the resultant learning outcomes is by means of the OSCE process. However, despite offering several advantages, this examination format also requires considerable resources, in particular with regard to medical examiners. For this reason, many clinical disciplines have already implemented computer-based OSCE examination formats. This study investigates whether the conventional exam format for the OSCE forensic "Death Certificate" station could be replaced with a computer-based approach in future. For this study, 123 students completed the OSCE "Death Certificate" station using both a computer-based and a conventional format, half starting with the computer-based format and the other half with the conventional approach in their OSCE rotation. Assignment of examination cases was random. The examination results for the two stations were compared, and both overall results and the individual items of the exam checklist were analysed by means of inferential statistics. Following statistical analysis of examination cases of varying difficulty levels and correction of the repeated-measures effect, the results of both examination formats appear to be comparable. Thus, in the descriptive item analysis, while there were some significant differences between the computer-based and conventional OSCE stations, these differences were not reflected in the overall results after a correction factor was applied (e.g. point deductions for assistance from the medical examiner were possible only at the conventional station). Thus, we demonstrate that the computer-based OSCE "Death Certificate" station is a cost-efficient and standardised examination format that yields results comparable to those from a conventional format exam. Moreover, the examination results also indicate the need to optimise both the test itself (adjusting the degree of difficulty of the case vignettes) and the corresponding instructional and learning methods (including, for example, the use of computer programmes to complete the death certificate in small group formats in the SkillsLab). Copyright © 2018 Elsevier Ltd and Faculty of Forensic and Legal Medicine. All rights reserved.
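The comparison of the two exam formats on the same 123 students is a repeated-measures design. A hedged Python sketch of the core significance test (a paired t-test on invented scores; the study's actual analysis additionally corrected for case difficulty and rotation order):

    import numpy as np
    from scipy import stats

    # Hypothetical paired scores: each student completed both the
    # computer-based and the conventional station (values are made up).
    rng = np.random.default_rng(1)
    conventional = rng.normal(75, 8, size=123)
    computer_based = conventional + rng.normal(0, 4, size=123)

    t, p = stats.ttest_rel(computer_based, conventional)
    print(f"paired t = {t:.2f}, p = {p:.3f}")  # large p: no detectable format effect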
View northeast of a microchip based computer control system installed ...
View northeast of a microchip-based computer control system installed in the early 1980s to replace Lamokin Tower, at center of photograph; panels 1 and 2 at right of photograph are part of main supervisory board; panel 1 controlled Allen Lane sub-station #7; responsibility for this portion of the system was transferred to the Southeastern Pennsylvania Transportation Authority (SEPTA) in 1985; panel 2 at extreme right controls catenary switches in a coach storage yard adjacent to the station - Thirtieth Street Station, Power Director Center, Thirtieth & Market Streets in Amtrak Railroad Station, Philadelphia, Philadelphia County, PA
Hydrologic Observatory Data Telemetry Network in an Extreme Environment
NASA Astrophysics Data System (ADS)
Irving, K.; Kane, D.
2007-12-01
A network of hydrological research data stations on the North Slope of Alaska using radio telemetry to gather data in "near real time" will be described. The network consists of approximately 25 research stations, 10 repeater stations, and 3 Internet-connected base stations (though data is also collected at repeater stations and research stations may also function as repeaters). With this operational network, radio link redundancy is sufficient to reach any research station from any base station. The data network is driven in "pull" mode using software running on computers in Fairbanks, and emphasis is placed on reliably collecting and storing data as found on the remote data loggers. Work is underway to deploy dynamic routing software on the controlling computers, at which point the network will be capable of automatically working around problems which may include icing on antennas, satellite sun outages, animal damage, and many others.
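Dynamic routing of the kind described — automatically working around failed links — is in essence a shortest-path recomputation over the surviving radio links. A small Python sketch with an invented topology (station names and link costs are illustrative, not the North Slope network's):

    import heapq

    def shortest_path(links, source, target):
        # Dijkstra over a dict {node: {neighbour: cost}}; returns (cost, path).
        queue, seen = [(0.0, source, [source])], set()
        while queue:
            cost, node, path = heapq.heappop(queue)
            if node == target:
                return cost, path
            if node in seen:
                continue
            seen.add(node)
            for nxt, w in links.get(node, {}).items():
                if nxt not in seen:
                    heapq.heappush(queue, (cost + w, nxt, path + [nxt]))
        return float("inf"), []

    # Toy topology: base -> two repeaters -> research station (names invented).
    links = {"base": {"rpt1": 1, "rpt2": 1},
             "rpt1": {"stn7": 1}, "rpt2": {"stn7": 3}}
    print(shortest_path(links, "base", "stn7"))   # prefers rpt1
    del links["rpt1"]["stn7"]                     # e.g. antenna icing on rpt1
    print(shortest_path(links, "base", "stn7"))   # reroutes via rpt2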
A computer-based specification methodology
NASA Technical Reports Server (NTRS)
Munck, Robert G.
1986-01-01
Standard practices for creating and using system specifications are inadequate for large, advanced-technology systems. A need exists to break away from paper documents in favor of documents that are stored in computers and which are read and otherwise used with the help of computers. An SADT-based system, running on the proposed Space Station data management network, could be a powerful tool for doing much of the required technical work of the Station, including creating and operating the network itself.
The development of the Canadian Mobile Servicing System Kinematic Simulation Facility
NASA Technical Reports Server (NTRS)
Beyer, G.; Diebold, B.; Brimley, W.; Kleinberg, H.
1989-01-01
Canada will develop a Mobile Servicing System (MSS) as its contribution to the U.S./International Space Station Freedom. Components of the MSS will include a remote manipulator (SSRMS), a Special Purpose Dexterous Manipulator (SPDM), and a mobile base (MRS). In order to support requirements analysis and the evaluation of operational concepts related to the use of the MSS, a graphics-based kinematic simulation/human-computer interface facility has been created. The facility consists of the following elements: (1) a two-dimensional graphics editor allowing the rapid development of virtual control stations; (2) kinematic simulations of the space station remote manipulators (SSRMS and SPDM) and mobile base; and (3) a three-dimensional graphics model of the space station, MSS, orbiter, and payloads. These software elements combined with state-of-the-art computer graphics hardware provide the capability to prototype MSS workstations, evaluate MSS operational capabilities, and investigate the human-computer interface in an interactive simulation environment. The graphics technology involved in the development and use of this facility is described.
NASA Technical Reports Server (NTRS)
Hall, J. B., Jr.; Pickett, S. J.; Sage, K. H.
1984-01-01
A computer program for assessing manned space station environmental control and life support systems technology is described. The methodology, mission model parameters, evaluation criteria, and data base for 17 candidate technologies for providing metabolic oxygen and water to the crew are discussed. Examples are presented which demonstrate the capability of the program to evaluate candidate technology options for evolving space station requirements.
Low-flow characteristics of Indiana streams
Fowler, K.K.; Wilson, J.T.
1996-01-01
Knowledge of low-flow characteristics of streams is essential for management of water resources. Low-flow characteristics are presented for 229 continuous-record, streamflow-gaging stations and 285 partial-record stations in Indiana. Low-flow-frequency characteristics were computed for 210 continuous-record stations that had at least 10 years of record, and flow-duration curves were computed for all continuous-record stations. Low-flow-frequency and flow-duration analyses are based on available streamflow records through September 1993. Selected low-flow-frequency curves were computed for annual low flows and seasonal low flows. The four seasons are represented by the 3-month groups of March-May, June-August, September-November, and December-February. The 7-day, 10-year and the 7-day, 2-year low flows were estimated for 285 partial-record stations, which are ungaged sites where streamflow measurements were made at base flow. The same low-flow characteristics were estimated for 19 continuous-record stations where less than 10 years of record were available. Precipitation and geology directly influence the streams in Indiana. Streams in the northern, glaciated part of the State tend to have higher sustained base flows than those in the nonglaciated southern part. Flow at several of the continuous-record gaging stations is affected by some form of regulation or diversion. Low-flow characteristics for continuous-record stations at which flow is affected by regulation are determined using the period of record affected by regulation; natural flows prior to regulation are not used.
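Flow-duration curves like those computed for the continuous-record stations are exceedance percentiles of the daily-flow record. A minimal Python sketch with synthetic data (the percentile set and flow values are illustrative only):

    import numpy as np

    def flow_duration_curve(daily_flows, percents=(1, 5, 10, 25, 50, 75, 90, 95, 99)):
        # Flow exceeded p percent of the time over the period of record.
        flows = np.asarray(daily_flows, dtype=float)
        return {p: np.percentile(flows, 100 - p) for p in percents}

    rng = np.random.default_rng(0)
    record = rng.lognormal(mean=3.0, sigma=1.0, size=365 * 20)  # synthetic record
    for p, q in flow_duration_curve(record).items():
        print(f"flow exceeded {p:2d}% of the time: {q:8.1f} cfs")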
Shuttle mission simulator baseline definition report, volume 2
NASA Technical Reports Server (NTRS)
Dahlberg, A. W.; Small, D. E.
1973-01-01
The baseline definition report for the space shuttle mission simulator is presented. The subjects discussed are: (1) the general configurations, (2) motion base crew station, (3) instructor operator station complex, (4) display devices, (5) electromagnetic compatibility, (6) external interface equipment, (7) data conversion equipment, (8) fixed base crew station equipment, and (9) computer complex. Block diagrams of the supporting subsystems are provided.
Interior view to the south of computer work stations in ...
Interior view to the south of computer work stations in front of elevated work area 1570 on left and elevated glassed-in work area 1870 on right - Over-the-Horizon Backscatter Radar Network, Mountain Home Air Force Operations Building, On Desert Street at 9th Avenue Mountain Home Air Force Base, Mountain Home, Elmore County, ID
List of Publications of the U.S. Army Engineer Waterways Experiment Station. Volume 2
1993-09-01
List of Publications of the U.S. Army Engineer Waterways Experiment Station, Volume II, compiled by the Research Library, Information Management Division... Waterways Experiment Station for Other Agencies; Air Base Survivability; Systems Management Office Headquarters... Airport... manages, conducts, and coordinates research and development in the Information Management (IM) technology areas that include computer science
Geoid undulation computations at laser tracking stations
NASA Technical Reports Server (NTRS)
Despotakis, Vasilios K.
1987-01-01
Geoid undulation computations were performed at 29 laser stations distributed around the world using a combination of terrestrial gravity data within a cap of radius 2 deg and a potential coefficient set up to degree 180. The traditional methods of Stokes' and Meissl's modification, together with the Molodenskii method and the modified Sjoberg method, were applied. Performing numerical tests based on global error assumptions regarding the terrestrial data and the geopotential set, it was concluded that the modified Sjoberg method is the most accurate and promising technique for geoid undulation computations. The numerical computations for the geoid undulations using all four methods resulted in agreement with the ellipsoidal minus orthometric value of the undulations on the order of 60 cm or better for most of the laser stations in the eastern United States, Australia, Japan, Bermuda, and Europe. A systematic discrepancy of about 2 meters for most of the western United States stations was detected and verified by using two relatively independent data sets. For oceanic laser stations in the western Atlantic and Pacific oceans that have no terrestrial data available, the adjusted GEOS-3 and SEASAT altimeter data were used for the computation of the geoid undulation in a collocation method.
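The Stokes-type methods compared here all integrate gravity anomalies against a kernel over the cap. As a worked illustration, this Python sketch evaluates the classical Stokes function and a crude ring-quadrature cap contribution; the uniform 20 mGal anomaly and the simple quadrature are invented for the example and correspond to none of the four methods' refinements:

    import numpy as np

    def stokes(psi):
        # Stokes' kernel S(psi), psi in radians (Heiskanen & Moritz, eq. 2-164).
        s = np.sin(psi / 2.0)
        return (1.0 / s + 1.0 - 6.0 * s - 5.0 * np.cos(psi)
                - 3.0 * np.cos(psi) * np.log(s + s * s))

    # N = R / (4 pi gamma) * integral( dg * S(psi) dsigma ) over the cap
    R, gamma = 6371000.0, 9.81                       # mean radius [m], gravity [m/s^2]
    psi = np.radians(np.linspace(0.1, 2.0, 200))     # 2-degree cap, singularity skipped
    dg = np.full_like(psi, 20e-5)                    # uniform 20 mGal anomaly [m/s^2]
    dsigma = 2 * np.pi * np.sin(psi) * np.gradient(psi)  # ring areas on unit sphere
    N = R / (4 * np.pi * gamma) * np.sum(dg * stokes(psi) * dsigma)
    print(f"cap contribution to N: {N:.2f} m")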
NASA Astrophysics Data System (ADS)
Jidin, Razali; Othman, Bahari
2013-06-01
The lower Sg. Piah hydro-electric station is a run-of-river hydro scheme with generators capable of producing 55 MW of electricity. It is located 30 km from Sg. Siput, a small town in the state of Perak, Malaysia. The station has two Pelton turbines to harness energy from water that flows through a 7 km tunnel from a small intake dam. A trait of a run-of-river hydro station is a small reservoir that cannot store water for a long duration; potential energy carried by spillage is therefore wasted if the dam level is not appropriately regulated. To improve the station's annual energy output, a new controller based on the computed river flow has been installed. The controller regulates the dam level with an algorithm based on the river flow, derived indirectly from the intake-dam water level and other plant parameters. The controller has been able to maintain the dam at the optimum water level and regulate the turbines to maximize the total generation output.
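A controller that derives river flow indirectly from the intake-dam water level suggests a reservoir mass balance: inflow equals storage change plus turbine and spillway outflows. A hedged Python sketch of that balance (all symbols and numbers are invented, not plant data):

    # River inflow inferred from the intake-dam water balance: the level
    # change plus the flows drawn by the turbines and spillway gives the
    # flow arriving from the river.
    def river_inflow(h_now, h_prev, dt, q_turbine, q_spill, area):
        # Inflow [m^3/s] from reservoir storage change plus outflows.
        dh_dt = (h_now - h_prev) / dt          # level rate [m/s]
        return area * dh_dt + q_turbine + q_spill

    q_in = river_inflow(h_now=102.34, h_prev=102.30, dt=600.0,
                        q_turbine=18.0, q_spill=0.0, area=50000.0)
    print(f"estimated river flow: {q_in:.1f} m^3/s")  # 50000*0.04/600 + 18 = 21.3

A controller can then open or close the turbines so that the drawn flow tracks this estimate, holding the dam near its optimum level instead of spilling.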
Inman, Ernest J.
1997-01-01
Flood-frequency relations were computed for 28 urban stations for 2-, 25-, and 100-year recurrence-interval floods, and the computations were compared to corresponding recurrence-interval floods computed from the estimating equations from a 1995 investigation. Two stations were excluded from further comparisons or analyses because neither station had a significant flood during the period of observed record. The comparisons, based on Student's t-test statistics at the 0.05 level of significance, indicate that the mean residuals of the 25- and 100-year floods were negatively biased by 26.2 percent and 31.6 percent, respectively, at the 26 stations. However, the mean residuals of the 2-year floods were 2.5 percent lower than the mean of the 2-year floods computed from the equations, and were not significantly biased. The reason for this negative bias is that the period of observed record at the 26 stations was a relatively dry period. At 25 of the 26 stations, the two highest simulated peaks used to develop the estimating equations occurred many years before the observed record began. However, no attempt was made to adjust the estimating equations because higher peaks could occur after the period of observed record, and an adjustment to the equations would cause an underestimation of design floods.
Lin, Yunyue; Wu, Qishi; Cai, Xiaoshan; ...
2010-01-01
Data transmission from sensor nodes to a base station or a sink node often incurs significant energy consumption, which critically affects network lifetime. We generalize and solve the problem of deploying multiple base stations to maximize network lifetime in terms of two different metrics under one-hop and multihop communication models. In the one-hop communication model, the sensors far away from base stations always deplete their energy much faster than others. We propose an optimal solution and a heuristic approach based on the minimal enclosing circle algorithm to deploy a base station at the geometric center of each cluster. In the multihop communication model, both base station location and data routing mechanism need to be considered in maximizing network lifetime. We propose an iterative algorithm based on rigorous mathematical derivations and use linear programming to compute the optimal routing paths for data transmission. Simulation results show the distinguished performance of the proposed deployment algorithms in maximizing network lifetime.
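For the one-hop model, placing each base station at the center of its cluster's minimal enclosing circle minimizes the worst sensor-to-base distance and hence the worst-case transmission energy. A short Python sketch using the Badoiu-Clarkson approximation (a stand-in for whichever exact minimal-enclosing-circle algorithm the paper employs):

    import numpy as np

    def enclosing_circle_center(points, iters=200):
        # Badoiu-Clarkson approximation of the minimal enclosing circle:
        # repeatedly step the candidate center toward the farthest point,
        # with a shrinking step size.
        pts = np.asarray(points, dtype=float)
        c = pts.mean(axis=0)
        for i in range(1, iters + 1):
            far = pts[np.argmax(np.linalg.norm(pts - c, axis=1))]
            c += (far - c) / (i + 1)
        return c

    cluster = [(0, 0), (4, 0), (4, 3), (1, 3)]   # sensor positions, invented
    print(enclosing_circle_center(cluster))      # base station placement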
Flow characteristics at U.S. Geological Survey streamgages in the conterminous United States
Wolock, David
2003-01-01
This dataset represents point locations and flow characteristics for current (as of November 20, 2001) and historical U.S. Geological Survey (USGS) streamgages in the conterminous United States. The flow characteristics were computed from the daily streamflow data recorded at each streamgage for the period of record. The attributes associated with each streamgage include:
- Station number
- Station name
- Station latitude (decimal degrees in North American Datum of 1983, NAD 83)
- Station longitude (decimal degrees in NAD 83)
- First date (year, month, day) of streamflow data
- Last date (year, month, day) of streamflow data
- Number of days of streamflow data
- Minimum and maximum daily flow for the period of record (cubic feet per second)
- Percentiles (1, 5, 10, 20, 25, 50, 75, 80, 90, 95, 99) of daily flow for the period of record (cubic feet per second)
- Average and standard deviation of daily flow for the period of record (cubic feet per second)
- Mean annual base-flow index (BFI; see supplemental information) computed for the period of record (fraction, ranging from 0 to 1)
- Year-to-year standard deviation of the annual base-flow index computed for the period of record (fraction)
- Number of years of data used to compute the base-flow index (years)
- Reported drainage area (square miles)
- Reported contributing drainage area (square miles)
- National Water Information System (NWIS)-Web page URL for streamgage
- Hydrologic Unit Code (HUC, 8 digit)
- Hydrologic landscape region (HLR)
- River Reach File 1 (RF1) segment identification number (E2RF1##)
Station numbers, names, locations, and drainage areas were acquired through the National Water Information System (NWIS)-Web (http://water.usgs.gov/nwis) on November 20, 2001. The streamflow data used to compute flow characteristics were copied from the Water server (water.usgs.gov:/www/htdocs/nwisweb/data1/discharge/) on November 2, 2001. The missing value indicator for all attributes is -99. Some streamflow characteristics are missing for: (1) streamgages measuring flow subject to tidal effects, which cause flow to reverse directions, (2) streamgages with site information but no streamflow data at the time the data were retrieved, and (3) streamgages with record length too short to compute the base-flow index.
Fast multi-core based multimodal registration of 2D cross-sections and 3D datasets.
Scharfe, Michael; Pielot, Rainer; Schreiber, Falk
2010-01-11
Solving bioinformatics tasks often requires extensive computational power. Recent trends in processor architecture combine multiple cores into a single chip to improve overall performance. The Cell Broadband Engine (CBE), a heterogeneous multi-core processor, provides power-efficient and cost-effective high-performance computing. One application area is image analysis and visualisation, in particular registration of 2D cross-sections into 3D image datasets. Such techniques can be used to put different image modalities into spatial correspondence, for example, 2D images of histological cuts into morphological 3D frameworks. We evaluate the CBE-driven PlayStation 3 as a high performance, cost-effective computing platform by adapting a multimodal alignment procedure to several characteristic hardware properties. The optimisations are based on partitioning, vectorisation, branch reducing and loop unrolling techniques with special attention to 32-bit multiplies and limited local storage on the computing units. We show how a typical image analysis and visualisation problem, the multimodal registration of 2D cross-sections and 3D datasets, benefits from the multi-core based implementation of the alignment algorithm. We discuss several CBE-based optimisation methods and compare our results to standard solutions. More information and the source code are available from http://cbe.ipk-gatersleben.de. The results demonstrate that the CBE processor in a PlayStation 3 accelerates computational intensive multimodal registration, which is of great importance in biological/medical image processing. The PlayStation 3 as a low cost CBE-based platform offers an efficient option to conventional hardware to solve computational problems in image processing and bioinformatics.
Lynch, Rod; Pitson, Graham; Ball, David; Claude, Line; Sarrut, David
2013-01-01
The purpose of this study was to develop a reproducible definition for each mediastinal lymph node station based on the new TNM classification for lung cancer. This paper proposes an atlas based on the new international lymph node map adopted in the seventh edition of the TNM classification for lung cancer. Four radiation oncologists and 1 diagnostic radiologist were involved in the project to put forward a reproducible radiologic description for the lung lymph node stations. The International Association for the Study of Lung Cancer lymph node definitions for stations 1 to 11 have been described and illustrated on axial computed tomographic scan images using a certified radiotherapy planning system. This atlas will assist both diagnostic radiologists and radiation oncologists in accurately defining the lymph node stations on computed tomographic scan in patients diagnosed with lung cancer. Copyright © 2013 American Society for Radiation Oncology. Published by Elsevier Inc. All rights reserved.
Quantum key distribution using card, base station and trusted authority
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nordholt, Jane E.; Hughes, Richard John; Newell, Raymond Thorson
Techniques and tools for quantum key distribution ("QKD") between a quantum communication ("QC") card, base station and trusted authority are described herein. In example implementations, a QC card contains a miniaturized QC transmitter and couples with a base station. The base station provides a network connection with the trusted authority and can also provide electric power to the QC card. When coupled to the base station, after authentication by the trusted authority, the QC card acquires keys through QKD with a trusted authority. The keys can be used to set up secure communication, for authentication, for access control, or for other purposes. The QC card can be implemented as part of a smart phone or other mobile computing device, or the QC card can be used as a fillgun for distribution of the keys.
Quantum key distribution using card, base station and trusted authority
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nordholt, Jane Elizabeth; Hughes, Richard John; Newell, Raymond Thorson
Techniques and tools for quantum key distribution ("QKD") between a quantum communication ("QC") card, base station and trusted authority are described herein. In example implementations, a QC card contains a miniaturized QC transmitter and couples with a base station. The base station provides a network connection with the trusted authority and can also provide electric power to the QC card. When coupled to the base station, after authentication by the trusted authority, the QC card acquires keys through QKD with a trusted authority. The keys can be used to set up secure communication, for authentication, for access control, or for other purposes. The QC card can be implemented as part of a smart phone or other mobile computing device, or the QC card can be used as a fillgun for distribution of the keys.
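The patent records above do not specify the QKD protocol, but the sifting step common to BB84-style schemes gives a feel for how the QC card and trusted authority end up sharing a key. A toy Python sketch under the assumption of an ideal, eavesdropper-free channel (purely illustrative, not the patented method):

    import secrets

    def bb84_sift(n_bits=1024):
        # Toy BB84 sifting between a sender (QC card) and receiver (trusted
        # authority): bits survive only where measurement bases match.
        send_bits = [secrets.randbelow(2) for _ in range(n_bits)]
        send_basis = [secrets.randbelow(2) for _ in range(n_bits)]  # 0=rect, 1=diag
        recv_basis = [secrets.randbelow(2) for _ in range(n_bits)]
        return [b for b, sb, rb in zip(send_bits, send_basis, recv_basis)
                if sb == rb]

    key = bb84_sift()
    print(f"sifted key length (about n/2): {len(key)} bits")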
Xu, Shuhang; Feng, Lingling; Chen, Yongming; Sun, Ying; Lu, Yao; Huang, Shaomin; Fu, Yang; Zheng, Rongqin; Zhang, Yujing; Zhang, Rong
2017-06-20
In order to refine the location and metastasis-risk density of 16 lymph node stations of gastric cancer for neoadjuvant radiotherapy, we retrospectively reviewed the initial images and pathological reports of 255 gastric cancer patients with lymphatic metastasis. Metastatic lymph nodes identified in the initial computed tomography images were investigated by two radiologists with gastrointestinal specialty. A circle with a diameter of 5 mm was used to identify the central position of each metastatic lymph node, defined as the LNc (the central position of the lymph node). The LNc was drawn at the equivalent location on the reference images of a standard patient based on the relative distances to the same reference vessels and the gastric wall using a Monaco® version 5.0 workstation. The image manipulation software Medi-capture was programmed for image analysis to produce a contour and density atlas of 16 lymph node stations. Based on a total of 2846 LNcs contoured (31-599 per lymph node station), we created a density distribution map of 16 lymph node drainage stations of the stomach on computed tomography images, showing the detailed radiographic delineation of each lymph node station as well as high-risk areas for lymph node metastasis. Our mapping can serve as a template for the delineation of gastric lymph node stations when defining clinical target volume in pre-operative radiotherapy for gastric cancer.
Computers in Public Broadcasting: Who, What, Where.
ERIC Educational Resources Information Center
Yousuf, M. Osman
This handbook offers guidance to public broadcasting managers on computer acquisition and development activities. Based on a 1981 survey of planned and current computer uses conducted by the Corporation for Public Broadcasting (CPB) Information Clearinghouse, computer systems in public radio and television broadcasting stations are listed by…
Wireless Headset Communication System
NASA Technical Reports Server (NTRS)
Lau, Wilfred K.; Swanson, Richard; Christensen, Kurt K.
1995-01-01
System combines features of pagers, walkie-talkies, and cordless telephones. Wireless headset communication system uses digital modulation on spread spectrum to avoid interference among units. Consists of base station, 4 radio/antenna modules, and as many as 16 remote units with headsets. Base station serves as network controller, audio-mixing network, and interface to such outside services as computers, telephone networks, and other base stations. Developed for use at Kennedy Space Center, system also useful in industrial maintenance, emergency operations, construction, and airport operations. Digital capabilities can also be exploited; by adding bar-code readers, the system can be used in taking inventories.
Computer-assisted engineering data base
NASA Technical Reports Server (NTRS)
Dube, R. P.; Johnson, H. R.
1983-01-01
General capabilities of data base management technology are described. Information requirements posed by the space station life cycle are discussed, and it is asserted that data base management technology supporting engineering/manufacturing in a heterogeneous hardware/data base management system environment should be applied to meeting these requirements. Today's commercial systems do not satisfy all of these requirements. The features of an R&D data base management system being developed to investigate data base management in the engineering/manufacturing environment are discussed. Features of this system represent only a partial solution to space station requirements. Areas where this system should be extended to meet full space station information management requirements are discussed.
NASA Technical Reports Server (NTRS)
Yanosy, James L.
1988-01-01
This manual describes how to use the Emulation Simulation Computer Model (ESCM). Based on G189A, ESCM computes the transient performance of a Space Station atmospheric revitalization subsystem (ARS) with CO2 removal provided by a solid amine water desorbed subsystem called SAWD. Many performance parameters are computed, some of which are cabin CO2 partial pressure, relative humidity, temperature, O2 partial pressure, and dew point. The program allows the user to simulate various possible combinations of man loading, metabolic profiles, cabin volumes, and certain hypothesized failures that could occur.
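Transient quantities such as cabin CO2 partial pressure come from integrating cabin mass balances over time. A deliberately simplified single-volume sketch in Python (the constants and the proportional removal model are invented, and far cruder than G189A's component models):

    # Single-volume cabin CO2 balance with Euler steps: crew generation in,
    # removal proportional to CO2 partial pressure (a crude stand-in for the
    # SAWD bed). All constants are illustrative assumptions.
    V = 100.0                        # cabin free volume [m^3]
    gen = 4 * 1.0 / 86400            # ~1 kg CO2 per person per day, 4 crew [kg/s]
    k_removal = 1.2e-4               # removal rate [kg/s per kPa], assumed
    kpa_per_kg = 101.0 / (1.8 * V)   # kPa of CO2 per kg added (1.8 kg/m^3 at 101 kPa)

    P = 0.10                         # initial CO2 partial pressure [kPa]
    dt = 60.0
    for _ in range(24 * 60):         # one day in one-minute steps
        P += (gen - k_removal * P) * dt * kpa_per_kg
    print(f"ppCO2 after 24 h: {P:.3f} kPa (steady state {gen / k_removal:.3f} kPa)")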
VIEW-Station software and its graphical user interface
NASA Astrophysics Data System (ADS)
Kawai, Tomoaki; Okazaki, Hiroshi; Tanaka, Koichiro; Tamura, Hideyuki
1992-04-01
VIEW-Station is a workstation-based image processing system which merges the state-of-the-art software environment of Unix with the computing power of a fast image processor. VIEW-Station has a hierarchical software architecture, which facilitates device independence when porting across various hardware configurations, and provides extensibility in the development of application systems. The core image computing language is V-Sugar. V-Sugar provides a set of image-processing datatypes and allows image processing algorithms to be simply expressed, using a functional notation. VIEW-Station provides a hardware-independent window system extension called VIEW-Windows. In terms of GUI (Graphical User Interface), VIEW-Station has two notable aspects. One is to provide various types of GUI as visual environments for image processing execution. Three types of interpreters, called µV-Sugar, VS-Shell and VPL, are provided. Users may choose whichever they prefer based on their experience and tasks. The other notable aspect is to provide facilities to create GUI for new applications on the VIEW-Station system. A set of widgets are available for construction of task-oriented GUI. A GUI builder called VIEW-Kid is developed for WYSIWYG interactive interface design.
Mapping snow depth return levels: smooth spatial modeling versus station interpolation
NASA Astrophysics Data System (ADS)
Blanchet, J.; Lehning, M.
2010-12-01
For adequate risk management in mountainous countries, hazard maps for extreme snow events are needed. This requires the computation of spatial estimates of return levels. In this article we use recent developments in extreme value theory and compare two main approaches for mapping snow depth return levels from in situ measurements. The first one is based on the spatial interpolation of pointwise extremal distributions (the so-called Generalized Extreme Value distribution, GEV henceforth) computed at station locations. The second one is new and based on the direct estimation of a spatially smooth GEV distribution with the joint use of all stations. We compare and validate the different approaches for modeling annual maximum snow depth measured at 100 sites in Switzerland during winters 1965-1966 to 2007-2008. The results show a better performance of the smooth GEV distribution fitting, in particular where the station network is sparser. Smooth return level maps can be computed from the fitted model without any further interpolation. Their regional variability can be revealed by removing the altitudinal dependent covariates in the model. We show how return levels and their regional variability are linked to the main climatological patterns of Switzerland.
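Pointwise GEV fitting at a station, the building block of the first approach, can be sketched in a few lines of Python with SciPy; the synthetic maxima and parameters below are invented, and the paper's smooth alternative instead ties the GEV parameters to spatial covariates across all stations jointly:

    import numpy as np
    from scipy.stats import genextreme

    # Fit a GEV to annual maximum snow depths at one station and compute
    # return levels (values and shape parameter are made up).
    rng = np.random.default_rng(42)
    annual_maxima = genextreme.rvs(c=-0.1, loc=120, scale=30, size=43,
                                   random_state=rng)

    c, loc, scale = genextreme.fit(annual_maxima)
    for T in (10, 50, 100):
        level = genextreme.ppf(1 - 1 / T, c, loc=loc, scale=scale)
        print(f"{T:3d}-year snow depth return level: {level:6.1f} cm")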
Itazawa, Tomoko; Tamaki, Yukihisa; Komiyama, Takafumi; Nishimura, Yasumasa; Nakayama, Yuko; Ito, Hiroyuki; Ohde, Yasuhisa; Kusumoto, Masahiko; Sakai, Shuji; Suzuki, Kenji; Watanabe, Hirokazu; Asamura, Hisao
2017-01-01
The purpose of this study was to develop a consensus-based computed tomographic (CT) atlas that defines lymph node stations in radiotherapy for lung cancer based on the lymph node map of the International Association for the Study of Lung Cancer (IASLC). A project group in the Japanese Radiation Oncology Study Group (JROSG) initially prepared a draft of the atlas in which lymph node Stations 1–11 were illustrated on axial CT images. Subsequently, a joint committee of the Japan Lung Cancer Society (JLCS) and the Japanese Society for Radiation Oncology (JASTRO) was formulated to revise this draft. The committee consisted of four radiation oncologists, four thoracic surgeons and three thoracic radiologists. The draft prepared by the JROSG project group was intensively reviewed and discussed at four meetings of the committee over several months. Finally, we proposed definitions for the regional lymph node stations and the consensus-based CT atlas. This atlas was approved by the Board of Directors of JLCS and JASTRO. This resulted in the first official CT atlas for defining regional lymph node stations in radiotherapy for lung cancer authorized by the JLCS and JASTRO. In conclusion, the JLCS–JASTRO consensus-based CT atlas, which conforms to the IASLC lymph node map, was established. PMID:27609192
Computation of Estonian CORS data using Bernese 5.2 and Gipsy 6.4 softwares
NASA Astrophysics Data System (ADS)
Kollo, Karin; Kall, Tarmo; Liibusk, Aive
2017-04-01
The GNSS permanent station network in Estonia (ESTREF) was established in 2007. In 2014-15 an extensive reconstruction of ESTREF was carried out, including the establishment of 18 new stations, a change of the hardware in the CORS stations, and the establishment of a GNSS-RTK service for the whole of Estonia. For the GNSS-RTK service one needs precise coordinates in a well-defined reference frame, i.e., ETRS89. For long-term stability of the stations and time-series analysis, re-processing of the Estonian CORS data is ongoing. We re-process data from 2007 until 2015 with the Bernese GNSS 5.2 program (Dach, 2015). For the set of ESTREF stations established in 2007, we also perform computations with the GIPSY 6.4 software (Ries et al., 2015). A daily GPS-only solution was used in the computations. For precise orbits, final products from CODE (the CODE analysis centre at the Astronomical Institute of the University of Bern) and JPL (Jet Propulsion Laboratory) were used for the Bernese and GIPSY solutions, respectively. The cut-off angle was set to 10 degrees in order to avoid near-field multipath influence. In GIPSY, the precise point positioning method with ambiguity fixing was used. Bernese calculations were performed based on double-difference processing. Antenna phase centers were modelled based on the igs08.atx and epnc_08.atx files. The Vienna mapping function was used for mapping tropospheric delays. For the GIPSY solution, the higher-order ionospheric term was modelled based on the IRI-2012b model; for the Bernese solution the higher-order ionospheric term was neglected. The FES2004 ocean tide loading model was used for both computation strategies. As a result, two solutions using different scientific GNSS computation programs were obtained. The results from the Bernese and GIPSY solutions were compared using station repeatability values, RMS, and coordinate differences. KEYWORDS: GNSS reference station network, Bernese GNSS 5.2, Gipsy 6.4, Estonia. References: Dach, R., S. Lutz, P. Walser, P. Fridez (Eds.); 2015: Bernese GNSS Software Version 5.2. User manual, Astronomical Institute, University of Bern, Bern Open Publishing. DOI: 10.7892/boris.72297; ISBN: 978-3-906813-05-9. Ries, P., W. Bertiger, S. Desai, & K. Miller (2015). GIPSY 6.4 Release Notes. Jet Propulsion Laboratory, California Institute of Technology. Retrieved from https://gipsy-oasis.jpl.nasa.gov/docs/index.php
Slade, R.M.; Asquith, W.H.
1996-01-01
About 23,000 annual peak streamflows and about 400 historical peak streamflows exist for about 950 stations in the surface-water data-collection network of Texas. These data are presented on a computer diskette along with the corresponding dates, gage heights, and information concerning the basin and the nature or cause of the flood. Also on the computer diskette is a U.S. Geological Survey computer program that estimates peak-streamflow frequency based on annual and historical peak streamflows. The program estimates peak streamflow for 2-, 5-, 10-, 25-, 50-, and 100-year recurrence intervals and is based on guidelines established by the Interagency Advisory Committee on Water Data. Instructions for installing the program are presented, along with an example and a discussion of its options.
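The Interagency Advisory Committee guidelines fit a log-Pearson Type III distribution to the logarithms of annual peaks. A bare-bones Python sketch without the guidelines' historical-peak weighting, outlier tests, or regional skew (the peak values are invented):

    import numpy as np
    from scipy.stats import pearson3

    def peak_flow_frequency(annual_peaks, T):
        # T-year peak discharge from a log-Pearson Type III fit to annual
        # peaks (method of moments on log10 flows).
        x = np.log10(np.asarray(annual_peaks, dtype=float))
        n, m, s = len(x), x.mean(), x.std(ddof=1)
        g = n * np.sum((x - m) ** 3) / ((n - 1) * (n - 2) * s ** 3)
        return 10 ** pearson3.ppf(1 - 1 / T, g, loc=m, scale=s)

    peaks = [3200, 1800, 5400, 2500, 7100, 2200, 4100, 3900, 2800, 6100,
             1500, 3300, 4800, 2600, 3700]                    # cfs, made up
    for T in (2, 5, 10, 25, 50, 100):
        print(f"{T:3d}-yr flood: {peak_flow_frequency(peaks, T):8.0f} cfs")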
NASA Technical Reports Server (NTRS)
Montag, Bruce C.; Bishop, Alfred M.; Redfield, Joe B.
1989-01-01
The findings of a preliminary investigation by Southwest Research Institute (SwRI) into simulation host computer concepts are presented. The investigation is designed to aid NASA in evaluating simulation technologies for use in spaceflight training. Its focus is on the next generation of space simulation systems that will be utilized in training personnel for Space Station Freedom operations. SwRI concludes that NASA should pursue a distributed simulation host computer system architecture for the Space Station Training Facility (SSTF) rather than a centralized mainframe-based arrangement. A distributed system offers many advantages and is seen by SwRI as the only architecture that will allow NASA to achieve established functional goals and operational objectives over the life of the Space Station Freedom program. Several distributed, parallel computing systems are available today that offer real-time capabilities for time-critical, man-in-the-loop simulation. These systems are flexible in terms of connectivity and configurability, and are easily scaled to meet increasing demands for more computing power.
Space station Simulation Computer System (SCS) study for NASA/MSFC. Volume 1: Overview and summary
NASA Technical Reports Server (NTRS)
1989-01-01
NASA's Space Station Freedom Program (SSFP) planning efforts have identified a need for a payload training simulator system to serve as both a training facility and as a demonstrator to validate operational concepts. The envisioned Marshall Space Flight Center (MSFC) Payload Training Complex (PTC) required to meet this need will train the space station payload scientists, station scientists, and ground controllers to operate the wide variety of experiments that will be onboard the Space Station Freedom. The Simulation Computer System (SCS) is the computer hardware, software, and workstations that will support the Payload Training Complex at MSFC. The purpose of this SCS study is to investigate issues related to the SCS, alternative requirements, simulator approaches, and state-of-the-art technologies to develop candidate concepts and designs. This study was performed August 1988 to October 1989. Thus, the results are based on the SSFP August 1989 baseline, i.e., pre-Langley configuration/budget review (C/BR) baseline. Some terms, e.g., combined trainer, are being redefined. An overview of the study activities and a summary of study results are given here.
Fast multi-core based multimodal registration of 2D cross-sections and 3D datasets
2010-01-01
Background Solving bioinformatics tasks often requires extensive computational power. Recent trends in processor architecture combine multiple cores into a single chip to improve overall performance. The Cell Broadband Engine (CBE), a heterogeneous multi-core processor, provides power-efficient and cost-effective high-performance computing. One application area is image analysis and visualisation, in particular registration of 2D cross-sections into 3D image datasets. Such techniques can be used to put different image modalities into spatial correspondence, for example, 2D images of histological cuts into morphological 3D frameworks. Results We evaluate the CBE-driven PlayStation 3 as a high performance, cost-effective computing platform by adapting a multimodal alignment procedure to several characteristic hardware properties. The optimisations are based on partitioning, vectorisation, branch reducing and loop unrolling techniques with special attention to 32-bit multiplies and limited local storage on the computing units. We show how a typical image analysis and visualisation problem, the multimodal registration of 2D cross-sections and 3D datasets, benefits from the multi-core based implementation of the alignment algorithm. We discuss several CBE-based optimisation methods and compare our results to standard solutions. More information and the source code are available from http://cbe.ipk-gatersleben.de. Conclusions The results demonstrate that the CBE processor in a PlayStation 3 accelerates computational intensive multimodal registration, which is of great importance in biological/medical image processing. The PlayStation 3 as a low cost CBE-based platform offers an efficient option to conventional hardware to solve computational problems in image processing and bioinformatics. PMID:20064262
Characterizing Multiple Wireless Sensor Networks for Large-Scale Radio Tomography
2015-03-01
with other transceivers over a wireless frequency. A base station transceiver collects the information and processes the information into something... or most other obstructions in between the two links [4]. A base station transceiver is connected to a processing computer to collect the RSS of each... transceivers at four different heights to create a Three-Dimensional (3-D) RTI network. Using shadowing-based RTI, this research demonstrated that RTI
Hanford meteorological station computer codes: Volume 9, The quality assurance computer codes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Burk, K.W.; Andrews, G.L.
1989-02-01
The Hanford Meteorological Station (HMS) was established in 1944 on the Hanford Site to collect and archive meteorological data and provide weather forecasts and related services for the Hanford Site. The HMS is located approximately 1/2 mile east of the 200 West Area and is operated by PNL for the US Department of Energy. Meteorological data are collected from various sensors and equipment located on and off the Hanford Site. These data are stored in data bases on the Digital Equipment Corporation (DEC) VAX 11/750 at the HMS (hereafter referred to as the HMS computer). Files from those data bases are routinely transferred to the Emergency Management System (EMS) computer at the Unified Dose Assessment Center (UDAC). To ensure the quality and integrity of the HMS data, a set of Quality Assurance (QA) computer codes has been written. The codes will be routinely used by the HMS system manager or the data base custodian. The QA codes provide detailed output files that will be used in correcting erroneous data. The following sections in this volume describe the implementation and operation of the QA computer codes. The appendices contain detailed descriptions, flow charts, and source code listings of each computer code. 2 refs.
NASA Technical Reports Server (NTRS)
Gregg, Hugh; Healey, Kathleen; Hack, Edmund; Wong, Carla
1987-01-01
Expert systems that require access to data bases, complex simulations, and real-time instrumentation have both symbolic and algorithmic computing needs. These needs could be met either by a general computing workstation running both symbolic and algorithmic code, or by separate, specialized computers networked together. The latter approach was chosen to implement TEXSYS, the thermal expert system, developed to demonstrate the ability of an expert system to autonomously control the thermal control system of the space station. TEXSYS has been implemented on a Symbolics workstation, and will be linked to a microVAX computer that will control a thermal test bed. Integration options are explored and several possible solutions are presented.
Onboard Short Term Plan Viewer
NASA Technical Reports Server (NTRS)
Hall, Tim; LeBlanc, Troy; Ulman, Brian; McDonald, Aaron; Gramm, Paul; Chang, Li-Min; Keerthi, Suman; Kivlovitz, Dov; Hadlock, Jason
2011-01-01
Onboard Short Term Plan Viewer (OSTPV) is a computer program for electronic display of mission plans and timelines, both aboard the International Space Station (ISS) and in ISS ground control stations located in several countries. OSTPV was specifically designed both (1) for use within the limited ISS computing environment and (2) to be compatible with computers used in ground control stations. OSTPV supplants a prior system in which, aboard the ISS, timelines were printed on paper and incorporated into files that also contained other paper documents. Hence, the introduction of OSTPV has both reduced the consumption of resources and saved time in updating plans and timelines. OSTPV accepts, as input, the mission timeline output of a legacy, print-oriented, UNIX-based program called "Consolidated Planning System" and converts the timeline information for display in an interactive, dynamic, Windows Web-based graphical user interface that is used by both the ISS crew and ground control teams in real time. OSTPV enables the ISS crew to electronically indicate execution of timeline steps, launch electronic procedures, and efficiently report to ground control teams on the statuses of ISS activities, all by use of laptop computers aboard the ISS.
Wireless Augmented Reality Prototype (WARP)
NASA Technical Reports Server (NTRS)
Devereaux, A. S.
1999-01-01
Initiated in January, 1997, under NASA's Office of Life and Microgravity Sciences and Applications, the Wireless Augmented Reality Prototype (WARP) is a means to leverage recent advances in communications, displays, imaging sensors, biosensors, voice recognition and microelectronics to develop a hands-free, tetherless system capable of real-time personal display and control of computer system resources. Using WARP, an astronaut may efficiently operate and monitor any computer-controllable activity inside or outside the vehicle or station. The WARP concept is a lightweight, unobtrusive heads-up display with a wireless wearable control unit. Connectivity to the external system is achieved through a high-rate radio link from the WARP personal unit to a base station unit installed into any system PC. The radio link has been specially engineered to operate within the high-interference, high-multipath environment of a space shuttle or space station module. Through this virtual terminal, the astronaut will be able to view and manipulate imagery, text or video, using voice commands to control the terminal operations. WARP's hands-free access to computer-based instruction texts, diagrams and checklists replaces juggling manuals and clipboards, and tetherless computer system access allows free motion throughout a cabin while monitoring and operating equipment.
A 3D inversion for all-space magnetotelluric data with static shift correction
NASA Astrophysics Data System (ADS)
Zhang, Kun
2017-04-01
Based on previous studies of static shift correction and 3D inversion algorithms, we improve the NLCG 3D inversion method and propose a new static shift correction method that works within the inversion. The static shift correction method is based on 3D theory and real data. Static shift can be detected by quantitative analysis of the apparent parameters (apparent resistivity and impedance phase) of MT in the high-frequency range, and the correction is completed within the inversion. The method is an automatic computer processing technique with no added cost, and it avoids additional field work and indoor processing while giving good results. The 3D inversion algorithm is improved (Zhang et al., 2013) based on the NLCG method of Newman & Alumbaugh (2000) and Rodi & Mackie (2001). We added a parallel structure to the algorithm, improved its computational efficiency, reduced its computer memory requirements, and added topographic and marine factors. As a result, the 3D inversion can run on a general PC with high efficiency and accuracy. All MT data from surface stations, seabed stations, and underground stations can be used in the inversion algorithm.
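At its core, NLCG inversion iterates nonlinear conjugate-gradient steps using only gradients of the misfit. A minimal Python sketch on a toy quadratic, with a fixed step length instead of the line search a production MT code would use (everything here is illustrative):

    import numpy as np

    def nlcg(grad, m0, n_iter=50, step=0.1):
        # Polak-Ribiere nonlinear conjugate gradient with a fixed step
        # (real codes use a line search); minimizes given the gradient.
        m, g = m0.copy(), grad(m0)
        d = -g
        for _ in range(n_iter):
            m = m + step * d
            g_new = grad(m)
            beta = max(0.0, g_new @ (g_new - g) / (g @ g))
            d = -g_new + beta * d
            g = g_new
        return m

    # Quadratic test problem standing in for the regularized MT misfit.
    A = np.array([[3.0, 1.0], [1.0, 2.0]])
    b = np.array([1.0, 1.0])
    grad = lambda m: A @ m - b          # gradient of 0.5 m^T A m - b^T m
    print(nlcg(grad, np.zeros(2)))      # approaches A^{-1} b = [0.2, 0.4]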
Itazawa, Tomoko; Tamaki, Yukihisa; Komiyama, Takafumi; Nishimura, Yasumasa; Nakayama, Yuko; Ito, Hiroyuki; Ohde, Yasuhisa; Kusumoto, Masahiko; Sakai, Shuji; Suzuki, Kenji; Watanabe, Hirokazu; Asamura, Hisao
2017-01-01
The purpose of this study was to develop a consensus-based computed tomographic (CT) atlas that defines lymph node stations in radiotherapy for lung cancer based on the lymph node map of the International Association for the Study of Lung Cancer (IASLC). A project group in the Japanese Radiation Oncology Study Group (JROSG) initially prepared a draft of the atlas in which lymph node Stations 1-11 were illustrated on axial CT images. Subsequently, a joint committee of the Japan Lung Cancer Society (JLCS) and the Japanese Society for Radiation Oncology (JASTRO) was formulated to revise this draft. The committee consisted of four radiation oncologists, four thoracic surgeons and three thoracic radiologists. The draft prepared by the JROSG project group was intensively reviewed and discussed at four meetings of the committee over several months. Finally, we proposed definitions for the regional lymph node stations and the consensus-based CT atlas. This atlas was approved by the Board of Directors of JLCS and JASTRO. This resulted in the first official CT atlas for defining regional lymph node stations in radiotherapy for lung cancer authorized by the JLCS and JASTRO. In conclusion, the JLCS-JASTRO consensus-based CT atlas, which conforms to the IASLC lymph node map, was established. © The Author 2016. Published by Oxford University Press on behalf of The Japan Radiation Research Society and Japanese Society for Radiation Oncology.
Finite-fault source inversion using adjoint methods in 3D heterogeneous media
NASA Astrophysics Data System (ADS)
Somala, Surendra Nadh; Ampuero, Jean-Paul; Lapusta, Nadia
2018-04-01
Accounting for lateral heterogeneities in the 3D velocity structure of the crust is known to improve earthquake source inversion, compared to results based on 1D velocity models which are routinely assumed to derive finite-fault slip models. The conventional approach to include known 3D heterogeneity in source inversion involves pre-computing 3D Green's functions, which requires a number of 3D wave propagation simulations proportional to the number of stations or to the number of fault cells. The computational cost of such an approach is prohibitive for the dense datasets that could be provided by future earthquake observation systems. Here, we propose an adjoint-based optimization technique to invert for the spatio-temporal evolution of slip velocity. The approach does not require pre-computed Green's functions. The adjoint method provides the gradient of the cost function, which is used to improve the model iteratively employing an iterative gradient-based minimization method. The adjoint approach is shown to be computationally more efficient than the conventional approach based on pre-computed Green's functions in a broad range of situations. We consider data up to 1 Hz from a Haskell source scenario (a steady pulse-like rupture) on a vertical strike-slip fault embedded in an elastic 3D heterogeneous velocity model. The velocity model comprises a uniform background and a 3D stochastic perturbation with the von Karman correlation function. Source inversions based on the 3D velocity model are performed for two different station configurations, a dense and a sparse network with 1 km and 20 km station spacing, respectively. These reference inversions show that our inversion scheme adequately retrieves the rise time when the velocity model is exactly known, and illustrates how dense coverage improves the inference of peak slip velocities. We investigate the effects of uncertainties in the velocity model by performing source inversions based on an incorrect, homogeneous velocity model. We find that, for velocity uncertainties that have standard deviation and correlation length typical of available 3D crustal models, the inverted sources can be severely contaminated by spurious features even if the station density is high. When data from thousand or more receivers is used in source inversions in 3D heterogeneous media, the computational cost of the method proposed in this work is at least two orders of magnitude lower than source inversion based on pre-computed Green's functions.
Finite-fault source inversion using adjoint methods in 3-D heterogeneous media
NASA Astrophysics Data System (ADS)
Somala, Surendra Nadh; Ampuero, Jean-Paul; Lapusta, Nadia
2018-07-01
Accounting for lateral heterogeneities in the 3-D velocity structure of the crust is known to improve earthquake source inversion, compared to results based on 1-D velocity models, which are routinely assumed when deriving finite-fault slip models. The conventional approach to including known 3-D heterogeneity in source inversion involves pre-computing 3-D Green's functions, which requires a number of 3-D wave propagation simulations proportional to the number of stations or to the number of fault cells. The computational cost of such an approach is prohibitive for the dense data sets that could be provided by future earthquake observation systems. Here, we propose an adjoint-based optimization technique to invert for the spatio-temporal evolution of slip velocity. The approach does not require pre-computed Green's functions. The adjoint method provides the gradient of the cost function, which is used to improve the model iteratively with a gradient-based minimization method. The adjoint approach is shown to be computationally more efficient than the conventional approach based on pre-computed Green's functions in a broad range of situations. We consider data up to 1 Hz from a Haskell source scenario (a steady pulse-like rupture) on a vertical strike-slip fault embedded in an elastic 3-D heterogeneous velocity model. The velocity model comprises a uniform background and a 3-D stochastic perturbation with the von Karman correlation function. Source inversions based on the 3-D velocity model are performed for two different station configurations, a dense and a sparse network with 1 and 20 km station spacing, respectively. These reference inversions show that our inversion scheme adequately retrieves the rise time when the velocity model is exactly known, and illustrate how dense coverage improves the inference of peak slip velocities. We investigate the effects of uncertainties in the velocity model by performing source inversions based on an incorrect, homogeneous velocity model. We find that, for velocity uncertainties that have standard deviation and correlation length typical of available 3-D crustal models, the inverted sources can be severely contaminated by spurious features even if the station density is high. When data from a thousand or more receivers are used in source inversions in 3-D heterogeneous media, the computational cost of the method proposed in this work is at least two orders of magnitude lower than that of source inversion based on pre-computed Green's functions.
NASA Astrophysics Data System (ADS)
Morlot, Thomas; Perret, Christian; Favre, Anne-Catherine; Jalbert, Jonathan
2014-09-01
A rating curve is used to indirectly estimate the discharge in rivers based on water level measurements. The discharge values obtained from a rating curve include uncertainties related to the direct stage-discharge measurements (gaugings) used to build the curves, the quality of fit of the curve to these measurements, and the constant changes in river bed morphology. Moreover, the uncertainty of discharges estimated from a rating curve increases with the “age” of the rating curve. The level of uncertainty at a given point in time is therefore particularly difficult to assess. A “dynamic” method has been developed to compute rating curves while calculating the associated uncertainties, thus making it possible to regenerate streamflow data with uncertainty estimates. The method is based on historical gaugings at hydrometric stations. A rating curve is computed for each gauging and a model of the uncertainty is fitted for each of them. The model of uncertainty takes into account the uncertainties in the measurement of the water level, the quality of fit of the curve, the uncertainty of the gaugings, and the increase of the uncertainty of discharge estimates with the age of the rating curve, computed with a variographic analysis (Jalbert et al., 2011). The presented dynamic method can answer important questions in the field of hydrometry such as “How many gaugings a year are required to produce streamflow data with an average uncertainty of X%?” and “When and in what range of water flow rates should these gaugings be carried out?”. The Rocherousse hydrometric station (France, Haute-Durance watershed, 946 km2) is used as an example throughout the paper. Other stations are used to illustrate certain points.
GPS Monitor Station Upgrade Program at the Naval Research Laboratory
NASA Technical Reports Server (NTRS)
Galysh, Ivan J.; Craig, Dwin M.
1996-01-01
The Global Positioning System (GPS) monitor stations continuously measure the pseudo-ranges of all passing GPS satellites. The pseudo-range contains GPS and monitor station clock errors as well as GPS satellite navigation errors. Currently, the time at a GPS monitor station is obtained from the GPS constellation and has an inherent inaccuracy as a result. Improved timing accuracy at the GPS monitoring stations will improve GPS performance. The US Naval Research Laboratory (NRL) is developing hardware and software for the GPS monitor station upgrade program to improve the monitor station clock accuracy. This upgrade will allow monitor station time to be measured and corrected to US Naval Observatory (USNO) time by a method independent of the GPS satellite constellation. The hardware consists of a high-performance atomic cesium frequency standard (CFS) and a computer, which is used to ensemble the CFS with the two CFSs currently located at the monitor station by use of a dual-mixer system. The dual-mixer system achieves phase measurements between the high-performance CFS and the existing monitor station CFSs to within 400 femtoseconds. Time transfer between USNO and a given monitor station is achieved via a two-way satellite time transfer modem. The computer at the monitor station disciplines the CFS based on a comparison with a one-pulse-per-second signal sent from the master site at USNO. The monitor station computer is also used to perform housekeeping functions, as well as recording the health status of all three CFSs. This information is sent to the USNO through the time transfer modem. Laboratory time-synchronization results in the sub-nanosecond range have been observed, along with the ability to maintain the monitor station CFS frequency to within 3.0 x 10^-14 of the master site at USNO.
Bayesian Estimation of the Spatially Varying Completeness Magnitude of Earthquake Catalogs
NASA Astrophysics Data System (ADS)
Mignan, A.; Werner, M.; Wiemer, S.; Chen, C.; Wu, Y.
2010-12-01
Assessing the completeness magnitude Mc of earthquake catalogs is an essential prerequisite for any seismicity analysis. We employ a simple model to compute Mc in space, based on the proximity to seismic stations in a network. We show that a relationship of the form Mc_pred(d) = a*d^b + c, with d the distance to the 5th nearest seismic station, fits the observations well. We then propose a new Mc mapping approach, the Bayesian Magnitude of Completeness (BMC) method, based on a 2-step procedure: (1) a spatial resolution optimization to minimize spatial heterogeneities and uncertainties in Mc estimates and (2) a Bayesian approach that merges prior information about Mc based on the proximity to seismic stations with locally observed values weighted by their respective uncertainties. This new methodology eliminates most weaknesses associated with current Mc mapping procedures: the radius that defines which earthquakes to include in the local magnitude distribution is chosen according to an objective criterion and there are no gaps in the spatial estimation of Mc. The method solely requires the coordinates of seismic stations. Here, we investigate the Taiwan Central Weather Bureau (CWB) earthquake catalog by computing a Mc map for the period 1994-2010.
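As a rough illustration of the distance-based prior in the BMC method, the sketch below evaluates Mc_pred(d) = a*d^b + c at grid nodes using the distance to the 5th nearest station. The coefficients and station coordinates are invented for illustration; the paper fits a, b, c to observed completeness data.

```python
import numpy as np

# Illustrative coefficients only; the BMC method fits a, b, c to data.
A, B, C = 0.1, 0.8, 1.0

def mc_prior(grid_points, station_coords):
    """Prior completeness magnitude Mc_pred(d) = A*d**B + C at each grid
    node, with d the distance (km) to the 5th nearest seismic station."""
    mc = np.empty(len(grid_points))
    for i, p in enumerate(grid_points):
        d5 = np.sort(np.linalg.norm(station_coords - p, axis=1))[4]
        mc[i] = A * d5**B + C
    return mc

rng = np.random.default_rng(0)
stations = rng.uniform(0.0, 100.0, size=(30, 2))   # station x, y in km
grid = np.array([(x, y) for x in range(0, 101, 25) for y in range(0, 101, 25)])
print(mc_prior(grid, stations).round(2))
```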
NASA Technical Reports Server (NTRS)
Biernacki, John; Juhasz, John; Sadler, Gerald
1991-01-01
A team of Space Station Freedom (SSF) system engineers is in the process of an extensive analysis of the SSF requirements, particularly those pertaining to the electrical power system (EPS). The objective of this analysis is the development of a comprehensive, computer-based requirements model, using an enhanced modern structured analysis methodology (EMSA). Such a model provides a detailed and consistent representation of the system's requirements. The process outlined in the EMSA methodology is unique in that it allows the graphical modeling of real-time system state transitions, as well as functional requirements and data relationships, to be implemented using modern computer-based tools. These tools permit flexible updating and continuous maintenance of the models. Initial findings resulting from the application of EMSA to the EPS have benefited the space station program by linking requirements to design, providing traceability of requirements, identifying discrepancies, and fostering an understanding of the EPS.
NASA Technical Reports Server (NTRS)
1990-01-01
Lunar base projects, including a reconfigurable lunar cargo launcher, a thermal and micrometeorite protection system, a versatile lifting machine with robotic capabilities, a cargo transport system, the design of a road construction system for a lunar base, and the design of a device for removing lunar dust from material surfaces, are discussed. The emphasis of the Gulf of Mexico project was on the development of a computer simulation model for predicting vessel station-keeping requirements. An existing code, used in predicting station-keeping requirements for oil drilling platforms operating in North Shore (Alaska) waters, was used as a basis for the computer simulation. Modifications were made to the existing code. The input to the model consists of satellite altimeter readings and water velocity readings from buoys stationed in the Gulf of Mexico. The satellite data consist of altimeter readings (wave height) taken during the spring of 1989. The simulation model predicts water velocity and direction, and wind velocity.
A framework for building real-time expert systems
NASA Technical Reports Server (NTRS)
Lee, S. Daniel
1991-01-01
The Space Station Freedom is an example of complex systems that require both traditional and artificial intelligence (AI) real-time methodologies. It was mandated that Ada should be used for all new software development projects. The station also requires distributed processing. Catastrophic failures on the station can cause the transmission system to malfunction for a long period of time, during which ground-based expert systems cannot provide any assistance to the crisis situation on the station. This is even more critical for other NASA projects that would have longer transmission delays (e.g., the lunar base, Mars missions, etc.). To address these issues, a distributed agent architecture (DAA) is proposed that can support a variety of paradigms based on both traditional real-time computing and AI. The proposed testbed for DAA is an autonomous power expert (APEX) which is a real-time monitoring and diagnosis expert system for the electrical power distribution system of the space station.
Shope, William G.; ,
1991-01-01
The U.S. Geological Survey is acquiring a new generation of field computers and communications software to support hydrologic data-collection at field locations. The new computer hardware and software mark the beginning of the Survey's transition from the use of electromechanical devices and paper tapes to electronic microprocessor-based instrumentation. Software is being developed for these microprocessors to facilitate the collection, conversion, and entry of data into the Survey's National Water Information System. The new automated data-collection process features several microprocessor-controlled sensors connected to a serial digital multidrop line operated by an electronic data recorder. Data are acquired from the sensors in response to instructions programmed into the data recorder by the user through small portable lap-top or hand-held computers. The portable computers, called personal field computers, also are used to extract data from the electronic recorders for transport by courier to the office computers. The Survey's alternative to manual or courier retrieval is the use of microprocessor-based remote telemetry stations. Plans have been developed to enhance the Survey's use of the Geostationary Operational Environmental Satellite telemetry by replacing the present network of direct-readout ground stations with less expensive units. Plans also provide for computer software that will support other forms of telemetry such as telephone or land-based radio.
ISS Expedition 18 Robotics Work Station (RWS) in the US Laboratory
2008-12-05
ISS018-E-010564 (5 Dec. 2008) --- Astronaut Michael Fincke, Expedition 18 commander, uses a computer at the robotics work station in the Destiny laboratory of the International Space Station. Using the station's robotic arm, Fincke and astronaut Sandra Magnus (out of frame), flight engineer, relocated the ESP-3 from the Mobile Base System back to the Cargo Carrier Attachment System on the P3 truss. The ESP-3 spare parts platform was temporarily parked on the MBS to clear the path for the spacewalks during STS-126.
NASA Astrophysics Data System (ADS)
Li, J.; Liu, L. Q.; Liu, T.; Xu, X. D.; Dong, B.; Lu, W. H.; Pan, W.; Wu, J. H.; Xiong, L. Y.
2017-02-01
A 10 kW@20 K refrigerator has been established by the Technical Institute of Physics and Chemistry, Chinese Academy of Sciences. A measurement and control system based on a Siemens PLC S7-300 has been developed for this 10 kW@20 K refrigerator. According to the detailed measurement requirements, proper sensors and transmitters are adopted. A Siemens S7-300 PLC CPU315-2 PN/DP operates as the master station. Two sets of ET200M DP remote expansion I/O, one power meter, two compressors, and one vacuum gauge operate as slave stations. Profibus-DP field communication and Modbus communication are used between the master station and the slave stations in this control system. The upper-computer HMI (Human Machine Interface) is developed using the Siemens configuration software WinCC V7.0. The upper computer communicates with the PLC by means of industrial Ethernet. After commissioning, this refrigerator has been operating with 10 kW of cooling power at 20 K for more than 72 hours.
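The abstract names Profibus-DP and Modbus links between the master PLC and the slave stations. As a rough illustration of the Modbus side only, here is a minimal polling sketch that assumes the pymodbus 3.x API and an invented host address and register map; the real system runs on Siemens hardware and WinCC, not Python.

```python
from pymodbus.client import ModbusTcpClient

# Hypothetical sketch: read two holding registers from a power meter
# acting as Modbus slave 1 (address and register layout invented).
client = ModbusTcpClient("192.168.0.10", port=502)
client.connect()
result = client.read_holding_registers(address=0, count=2, slave=1)
if not result.isError():
    print("power meter registers:", result.registers)
client.close()
```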
SOCRATES, a Computer-Based Instructional System in Theory and Research. Technical Report.
ERIC Educational Resources Information Center
Stolurow, Lawrence M.
The paper describes a cybernetic computer-based instructional system, SOCRATES, the teaching model which led to its development, and some of the research accomplished with it. The acronym, SOCRATES, is System for Organizing Content to Review and Teach Educational Subject. It consists of a group of student input-output (I/O) stations wired to a…
Omega flight-test data reduction sequence. [computer programs for reduction of navigation data
NASA Technical Reports Server (NTRS)
Lilley, R. W.
1974-01-01
Computer programs for Omega data conversion, summary, and preparation for distribution are presented. Program logic and sample data formats are included, along with operational instructions for each program. Flight data (or data collected in flight format in the laboratory) is provided by the Ohio University Omega receiver base in the form of 6-bit binary words representing the phase of an Omega station with respect to the receiver's local clock. All eight Omega stations are measured in each 10-second Omega time frame. In addition, an event-marker bit and a time-slot D synchronizing bit are recorded. Program FDCON is used to remove data from the flight recorder tape and place it on data-processing cards for later use. Program FDSUM provides for computer plotting of selected LOP's, for single-station phase plots, and for printout of basic signal statistics for each Omega channel. Mean phase and standard deviation are printed, along with data from which a phase distribution can be plotted for each Omega station. Program DACOP simply copies the Omega data deck a controlled number of times, for distribution to users.
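A small sketch of FDSUM-style per-station statistics, under the assumption (not stated in the abstract) that each 6-bit word maps linearly onto 0-100 centicycles of phase; the frame layout is simplified and the data are simulated.

```python
import numpy as np

# One simulated hour of Omega frames: one 6-bit word (0-63) per station,
# eight stations per 10-second frame.
rng = np.random.default_rng(0)
frames = rng.integers(0, 64, size=(360, 8))

phase = frames * (100.0 / 64.0)          # assumed scaling: counts -> centicycles
mean_phase = phase.mean(axis=0)          # per-station mean phase
std_phase = phase.std(axis=0, ddof=1)    # per-station standard deviation

for sta in range(8):
    print(f"station {sta + 1}: mean {mean_phase[sta]:5.1f} cec, "
          f"std {std_phase[sta]:4.1f} cec")
```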
Robust Real-Time Wide-Area Differential GPS Navigation
NASA Technical Reports Server (NTRS)
Yunck, Thomas P. (Inventor); Bertiger, William I. (Inventor); Lichten, Stephen M. (Inventor); Mannucci, Anthony J. (Inventor); Muellerschoen, Ronald J. (Inventor); Wu, Sien-Chong (Inventor)
1998-01-01
The present invention provides a method and a device for providing superior differential GPS positioning data. The system includes a group of GPS receiving ground stations covering a wide area of the Earth's surface. Unlike other differential GPS systems, wherein the known position of each ground station is used to geometrically compute an ephemeris for each GPS satellite, the present system utilizes real-time computation of satellite orbits based on GPS data received from fixed ground stations through a Kalman-type filter/smoother whose output adjusts a real-time orbital model. The orbital model produces and outputs orbital corrections allowing satellite ephemerides to be known with considerably greater accuracy than from the GPS system broadcasts. The modeled orbits are propagated ahead in time and differenced with actual pseudorange data to compute clock offsets at rapid intervals to compensate for SA clock dither. The orbital and clock calculations are based on dual-frequency GPS data, which allow computation of estimated signal delay at each ionospheric point. These delay data are used in real time to construct and update an ionospheric shell map of total electron content, which is output as part of the orbital correction data, thereby allowing single-frequency users to estimate ionospheric delay with an accuracy approaching that of dual-frequency users.
Meyer, Frans J C; Davidson, David B; Jakobus, Ulrich; Stuchly, Maria A
2003-02-01
A hybrid finite-element method (FEM)/method of moments (MoM) technique is employed for specific absorption rate (SAR) calculations in a human phantom in the near field of a typical Global System for Mobile Communications (GSM) base-station antenna. The MoM is used to model the metallic surfaces and wires of the base-station antenna, and the FEM is used to model the heterogeneous human phantom. The advantages of each of these frequency-domain techniques are thus exploited, leading to a highly efficient and robust numerical method for addressing this type of bioelectromagnetic problem. The basic mathematical formulation of the hybrid technique is presented. This is followed by a discussion of important implementation details, in particular the linear algebra routines for sparse, complex FEM matrices combined with dense MoM matrices. The implementation is validated by comparing results to MoM (surface equivalence principle implementation) and finite-difference time-domain (FDTD) solutions of human exposure problems. A comparison of the computational efficiency of the different techniques is presented. The FEM/MoM implementation is then used for whole-body and critical-organ SAR calculations in a phantom at different positions in the near field of a base-station antenna. This problem cannot, in general, be solved using the MoM or FDTD due to computational limitations. This paper shows that the specific hybrid FEM/MoM implementation is an efficient numerical tool for accurate assessment of human exposure in the near field of base-station antennas.
Arya, Rahul; Morrison, Trevor; Zumwalt, Ann; Shaffer, Kitt
2013-10-01
A hands-on stations-based approach to teaching anatomy to third-year medical students is used at Boston University. The goal of our study was to demonstrate that such an interactive, team-based approach to teaching anatomy would be well received and would help with recall, comprehension, and reinforcement of anatomy learned in the first year of medical school. Each radiology-anatomy correlation lab focused on one particular anatomic region, such as the skull base, pelvis, or coronary anatomy. Four stations, including a three-dimensional model, computer, ultrasound, and posters, were created for each lab. Informed consent was obtained before online survey dissemination to assess the effectiveness and quality of the radiology-anatomy correlation lab. This study was approved by our institutional review board, and data were analyzed using a χ2 test. Survey data were collected from February 2010 through March 2012. The response rate was 33.5%. Overall, the highest percentage of students (46%) found the three-dimensional model station to be the most valuable. The computer station was most helpful for recall of the anatomic principles from the first year of medical school. Regarding the quality of the anatomy lab, less than 2% of the students thought that the images were of poor quality or that the material presented was not clinically relevant. Our results indicate that an interactive, team-based approach to teaching anatomy was well received by the medical students. It was engaging, and students were able to benefit from it in multiple ways.
Streamflow characteristics at hydrologic bench-mark stations
Lawrence, C.L.
1987-01-01
The Hydrologic Bench-Mark Network was established in the 1960's. Its objectives were to document the hydrologic characteristics of representative undeveloped watersheds nationwide and to provide a comparative base for studying the effects of man on the hydrologic environment. The network, which consists of 57 streamflow gaging stations and one lake-stage station in 39 States, is planned for permanent operation. This interim report describes streamflow characteristics at each bench-mark site and identifies time trends in annual streamflow that have occurred during the data-collection period. The streamflow characteristics presented for each streamflow station are (1) flood and low-flow frequencies, (2) flow duration, (3) annual mean flow, and (4) the serial correlation coefficient for annual mean discharge. In addition, Kendall's tau is computed as an indicator of time trend in annual discharges. The period of record for most stations was 13 to 17 years, although several stations had longer periods of record. The longest period was 65 years for Merced River near Yosemite, Calif. Records of flow at 6 of 57 streamflow sites in the network showed a statistically significant change in annual mean discharge over the period of record, based on computations of Kendall's tau. The values of Kendall's tau ranged from -0.533 to 0.648. An examination of climatological records showed that changes in precipitation were most likely the cause for the change in annual mean discharge.
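For readers unfamiliar with the trend test, a minimal example of computing Kendall's tau for annual mean discharge follows; the values are synthetic, not bench-mark network data.

```python
import numpy as np
from scipy.stats import kendalltau

# Synthetic 15-year record of annual mean discharge (cfs), for illustration.
years = np.arange(1970, 1985)
annual_mean = np.array([112, 95, 130, 121, 98, 140, 133, 150, 128, 160,
                        145, 170, 155, 180, 165], dtype=float)

tau, p_value = kendalltau(years, annual_mean)
print(f"Kendall's tau = {tau:.3f}, p = {p_value:.3f}")
# A statistically significant positive tau indicates an upward time trend
# in annual mean discharge over the period of record.
```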
A Study of the Behavior of Children in a Preschool Equipped with Computers.
ERIC Educational Resources Information Center
Klinzing, Dene G.
A study was conducted: (1) to compare the popularity of computer stations with nine other activity stations; (2) to determine the differences in the type of play displayed by the children in preschool and note the type of play displayed at the computer stations versus the other activity stations; (3) to determine whether the preschool activities,…
ERIC Educational Resources Information Center
Kupiainen, Jari
2006-01-01
In the Western Pacific, the People First Network project has since 2001 been building a growing network of rural email stations across the conflict-ridden Solomon Islands. These stations are based on robust technology and consist of solar panels, short-wave radios and compatible modems, laptop computers and printers to provide email communication…
Zborowsky, Terri; Bunker-Hellmich, Lou; Morelli, Agneta; O'Neill, Mike
2010-01-01
Evidence-based findings of the effects of nursing station design on nurses' work environment and work behavior are essential to improve conditions and increase retention among these fundamental members of the healthcare delivery team. The purpose of this exploratory study was to investigate how nursing station design (i.e., centralized and decentralized nursing station layouts) affected nurses' use of space, patient visibility, noise levels, and perceptions of the work environment. Advances in information technology have enabled nurses to move away from traditional centralized paper-charting stations to smaller decentralized work stations and charting substations located closer to, or inside of, patient rooms. Improved understanding of the trade-offs presented by centralized and decentralized nursing station design has the potential to provide useful information for future nursing station layouts. This information will be critical for understanding the nurse environment "fit." The study used an exploratory design with both qualitative and quantitative methods. Qualitative data regarding the effects of nursing station design on nurses' health and work environment were gathered by means of focus group interviews. Quantitative data-gathering techniques included place- and person-centered space use observations, patient visibility assessments, sound level measurements, and an online questionnaire regarding perceptions of the work environment. Nurses on all units were observed most frequently performing telephone, computer, and administrative duties. Time spent using telephones, computers, and performing other administrative duties was significantly higher in the centralized nursing stations. Consultations with medical staff and social interactions were significantly less frequent in decentralized nursing stations. There were no indications that either centralized or decentralized nursing station designs resulted in superior visibility. Sound levels measured in all nursing stations exceeded recommended levels during all shifts. No significant differences were identified in nurses' perceptions of work control-demand-support in centralized and decentralized nursing station designs. The "hybrid" nursing design model in which decentralized nursing stations are coupled with centralized meeting rooms for consultation between staff members may strike a balance between the increase in computer duties and the ongoing need for communication and consultation that addresses the conflicting demands of technology and direct patient care.
Ahearn, Elizabeth A.
2008-01-01
Flow durations, low-flow frequencies, and monthly median streamflows were computed for 91 continuous-record, streamflow-gaging stations in Connecticut with 10 or more years of record. Flow durations include the 99-, 98-, 97-, 95-, 90-, 85-, 80-, 75-, 70-, 60-, 50-, 40-, 30-, 25-, 20-, 10-, 5-, and 1-percent exceedances. Low-flow frequencies include the 7-day, 10-year (7Q10) low flow; 7-day, 2-year (7Q2) low flow; and 30-day, 2-year (30Q2) low flow. Streamflow estimates were computed for each station using data for the period of record through water year 2005. Estimates of low-flow statistics for 7 short-term (operated between 3 and 10 years) streamflow-gaging stations and 31 partial-record sites were computed. Low-flow estimates were made on the basis of the relation between base flows at a short-term station or partial-record site and concurrent daily mean streamflows at a nearby index station. The relation is defined by the Maintenance of Variance Extension, type 3 (MOVE.3) method. Several short-term stations and partial-record sites had poorly defined relations with nearby index stations; therefore, no low-flow statistics were derived for these sites. The estimated low-flow statistics for the short-term stations and partial-record sites include the 99-, 98-, 97-, 95-, 90-, and 85-percent flow durations; the 7-day, 10-year (7Q10) low flow; 7-day, 2-year (7Q2) low flow; and 30-day, 2-year (30Q2) low-flow frequencies; and the August median flow. Descriptive information on location and record length, measured basin characteristics, index stations correlated to the short-term station and partial-record sites, and estimated flow statistics are provided in this report for each station. Streamflow estimates from this study are stored on USGS's World Wide Web application 'StreamStats' (http://water.usgs.gov/osw/streamstats/connecticut.html).
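The report uses MOVE.3; as a simpler stand-in that conveys the idea of record extension from an index station, the sketch below applies the classic MOVE.1 line of organic correlation in log space to transfer flows to a partial-record site. All values are invented.

```python
import numpy as np

def move1(y_short, x_concurrent, x_full):
    """Maintenance Of Variance Extension, type 1 (MOVE.1), shown here as a
    simpler stand-in for the MOVE.3 method used in the report. Extension
    is done on log-transformed flows, as is conventional."""
    ly, lx = np.log10(y_short), np.log10(x_concurrent)
    r = np.corrcoef(lx, ly)[0, 1]                    # sign carried by the slope
    slope = np.sign(r) * ly.std(ddof=1) / lx.std(ddof=1)
    lyhat = ly.mean() + slope * (np.log10(x_full) - lx.mean())
    return 10 ** lyhat

# y: base flows measured at the partial-record site; x: concurrent daily
# mean flows at the index station (synthetic values).
y = np.array([2.1, 3.5, 1.8, 4.2, 2.9])
x = np.array([20.0, 33.0, 17.5, 41.0, 28.0])
x_full_record = np.array([15.0, 22.0, 37.0, 50.0, 19.0, 26.0])
print(move1(y, x, x_full_record).round(2))
```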
Methodology of automated ionosphere front velocity estimation for ground-based augmentation of GNSS
NASA Astrophysics Data System (ADS)
Bang, Eugene; Lee, Jiyun
2013-11-01
Ionospheric anomalies occurring during severe ionospheric storms can pose integrity threats to Global Navigation Satellite System (GNSS) Ground-Based Augmentation Systems (GBAS). Ionospheric anomaly threat models for each region of operation need to be developed to analyze the potential impact of these anomalies on GBAS users and to develop mitigation strategies. Along with the magnitude of ionospheric gradients, the speed of the ionosphere "fronts" in which these gradients are embedded is an important parameter for simulation-based GBAS integrity analysis. This paper presents a methodology for automated ionosphere front velocity estimation, which will be used to analyze a vast amount of ionospheric data, build ionospheric anomaly threat models for different regions, and monitor ionospheric anomalies continuously going forward. The procedure automatically selects stations that show a similar trend of ionospheric delays, computes the orientation of detected fronts using a three-station-based trigonometric method, and estimates front speeds using a two-station-based method. It also includes fine-tuning methods to make the estimation robust against faulty measurements and modeling errors. We demonstrate the performance of the algorithm by comparing the results of automated speed estimation to those computed manually in earlier work. All speed estimates from the automated algorithm fall within error bars of ±30% of the manually computed speeds. In addition, the algorithm is used to populate the current threat space with newly generated threat points. A larger number of velocity estimates helps us to better understand the behavior of ionospheric gradients under geomagnetic storm conditions.
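A minimal sketch of the planar-front estimate implied by the station-based methods above: fit a slowness vector to onset-time differences across three stations, then convert it to speed and azimuth. Coordinates and onset times are invented; the actual algorithm adds station selection and fine-tuning for robustness.

```python
import numpy as np

def front_velocity(xy_km, t_onset_s):
    """Estimate speed and direction of a planar ionosphere front from the
    onset times of the delay disturbance at three (or more) stations,
    assuming t_i = t_0 + s . (x_i - x_0) with slowness vector s."""
    dx = xy_km[1:] - xy_km[0]
    dt = t_onset_s[1:] - t_onset_s[0]
    s, *_ = np.linalg.lstsq(dx, dt, rcond=None)    # slowness (s/km)
    speed = 1.0 / np.linalg.norm(s)                # km/s
    azimuth = np.degrees(np.arctan2(s[0], s[1]))   # propagation direction
    return speed * 1000.0, azimuth                 # m/s, degrees from north

stations = np.array([[0.0, 0.0], [45.0, 5.0], [10.0, 40.0]])  # east, north (km)
onsets = np.array([0.0, 300.0, 150.0])                        # seconds
speed, az = front_velocity(stations, onsets)
print(f"front speed ~ {speed:.0f} m/s, azimuth ~ {az:.0f} deg")
```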
Knowledge-based machine vision systems for space station automation
NASA Technical Reports Server (NTRS)
Ranganath, Heggere S.; Chipman, Laure J.
1989-01-01
Computer vision techniques which have the potential for use on the space station and related applications are assessed. A knowledge-based vision system (expert vision system) and the development of a demonstration system for it are described. This system implements some of the capabilities that would be necessary in a machine vision system for the robot arm of the laboratory module in the space station. A Perceptics 9200e image processor, on a host VAXstation, was used to develop the demonstration system. In order to use realistic test images, photographs of actual space shuttle simulator panels were used. The system's capabilities of scene identification and scene matching are discussed.
Engineering and Design: Control Stations and Control Systems for Navigation Locks and Dams
1997-05-30
of human intelligence ... hypothetical lock and dam configurations. Finally, ... b. Terminology. (1) PLC system. The computer-based systems utilize special ... electrical industry for industrial use. Therefore, for purposes of this document, a computer-based system is referred to as a PLC system. (2) Relay-based ... be custom made; because most of today's control systems of any complexity are PLC-based, the standard size of a given motor starter cubicle is not ...
Network modeling of PM10 concentration in Malaysia
NASA Astrophysics Data System (ADS)
Supian, Muhammad Nazirul Aiman Abu; Bakar, Sakhinah Abu; Razak, Fatimah Abdul
2017-08-01
Air pollution is not a new phenomenon in Malaysia. The Department of Environment (DOE) monitors the country's ambient air quality through a network of 51 stations. The air quality is measured using the Air Pollution Index (API), which is mainly recorded based on the concentration of particulate matter (PM10). The Continuous Air Quality Monitoring (CAQM) stations are located in various places across the country. In this study, a network model of air quality based on PM10 concentration for selected CAQM stations in Malaysia has been developed. The model is built using a graph formulation G = (V, E), where the vertex set V contains the CAQM stations and the edge set E holds the correlation values for each pair of vertices. Network measurements such as degree distributions, closeness centrality, and betweenness centrality are computed to analyse the behaviour of the network. As a result, a ranking of CAQM stations has been produced based on their centrality characteristics.
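A compact sketch of the correlation-network construction and the three centrality measures, using networkx on synthetic PM10 series; the station count, threshold, and data are invented.

```python
import numpy as np
import networkx as nx

# Synthetic PM10 series for six hypothetical CAQM stations: a shared
# regional signal plus local noise, so station pairs correlate.
rng = np.random.default_rng(0)
regional = rng.normal(0.0, 10.0, size=(365, 1))
pm10 = 50.0 + regional + rng.normal(0.0, 10.0, size=(365, 6))
corr = np.corrcoef(pm10.T)

G = nx.Graph()
G.add_nodes_from(range(6))
for i in range(6):
    for j in range(i + 1, 6):
        if corr[i, j] > 0.4:              # arbitrary correlation threshold
            G.add_edge(i, j, weight=corr[i, j])

print("degree:     ", dict(G.degree()))
print("closeness:  ", nx.closeness_centrality(G))
print("betweenness:", nx.betweenness_centrality(G))
```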
Technologies for space station autonomy
NASA Technical Reports Server (NTRS)
Staehle, R. L.
1984-01-01
This report presents an informal survey of experts in the field of spacecraft automation, with recommendations for which technologies should be given the greatest development attention for implementation on the initial 1990's NASA Space Station. The recommendations implemented an autonomy philosophy that was developed by the Concept Development Group's Autonomy Working Group during 1983. They were based on assessments of the technologies' likely maturity by 1987, and of their impact on recurring costs, non-recurring costs, and productivity. The three technology areas recommended for programmatic emphasis were: (1) artificial intelligence expert (knowledge based) systems and processors; (2) fault tolerant computing; and (3) high order (procedure oriented) computer languages. This report also describes other elements required for Station autonomy, including technologies for later implementation, system evolvability, and management attitudes and goals. The cost impact of various technologies is treated qualitatively, and some cases in which both the recurring and nonrecurring costs might be reduced while the crew productivity is increased, are also considered. Strong programmatic emphasis on life cycle cost and productivity is recommended.
Department of Defense Base Closure and Realignment Report
1993-03-01
European NATO allies will continue to grapple with shaping an evolving regional security framework capable of crisis management and conflict prevention, as ... Personnel, Arlington, Virginia (including the Office of Military Manpower Management, Arlington, Virginia) ... Naval Air Systems Command, Arlington, Virginia ... Hawaii ... Enlisted Personnel Management Center, New Orleans, Louisiana ... Naval Computer & Telecommunications Station, New Orleans, Louisiana ... Naval Air Station
CIELO - a GIS-integrated model for climatic and water balance simulation in island environments
NASA Astrophysics Data System (ADS)
Azevedo, E. B.; Pereira, L. S.
2003-04-01
The model CIELO (acronym for "Clima Insular à Escala Local") is a physically based model that simulates the climatic variables of an island using data from a single synoptic reference meteorological station. The reference station "knows" its position in the orographic and dynamic regime context. The computation domain is a GIS raster grid parameterised with a digital elevation model (DEM). The grid is oriented following the direction of air mass circulation through a specific algorithm named the rotational terrain model (RTM). The model consists of two main sub-models. The first simulates the advective component, assuming the Foehn effect to reproduce the dynamic and thermodynamic processes occurring when an air mass moves over the island's orographic obstacle. This makes it possible to simulate the air temperature, air humidity, cloudiness and precipitation as influenced by the orography along the air displacement. The second concerns the radiative component as affected by clouds of orographic origin and by the shadow produced by the relief. The initial state parameters are computed starting from the reference meteorological station, across the DEM transect, down to sea level on the windward side. Then, starting from sea level, the model computes the local-scale meteorological parameters according to the direction of the air displacement, which is adjusted with the RTM. The air pressure, temperature and humidity are directly calculated for each cell of the computational grid, while several algorithms are used to compute the cloudiness, net radiation, evapotranspiration, and precipitation. The model presented in this paper has been calibrated and validated using data from several meteorological stations and a larger number of rainfall stations located at various elevations in the Azores Islands.
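To make the Foehn mechanism concrete, a textbook-style sketch (not the CIELO parameterization itself): air cools at the dry adiabatic rate up to the condensation level, at a moist rate above it, and rewarms at the dry rate while descending the lee slope, arriving warmer and drier than it started.

```python
# Lapse rates in degC per km; textbook values, not CIELO's calibration.
DRY, MOIST = 9.8, 6.0

def leeward_temperature(t0, lcl_km, summit_km):
    """Sea-level temperature on the lee side for air that starts at
    temperature t0 (degC) at sea level on the windward side."""
    t_summit = t0 - DRY * lcl_km - MOIST * (summit_km - lcl_km)
    return t_summit + DRY * summit_km        # dry descent to sea level

print(leeward_temperature(t0=20.0, lcl_km=0.8, summit_km=1.5))  # ~22.7 degC
```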
Space station operating system study
NASA Technical Reports Server (NTRS)
Horn, Albert E.; Harwell, Morris C.
1988-01-01
The current phase of the Space Station Operating System study is based on the analysis, evaluation, and comparison of the operating systems implemented on the computer systems and workstations in the software development laboratory. Primary emphasis has been placed on the DEC MicroVMS operating system as implemented on the MicroVax II computer, with comparative analysis of the SUN UNIX system on the SUN 3/260 workstation computer, and to a limited extent, the IBM PC/AT microcomputer running PC-DOS. Some benchmark development and testing was also done for the Motorola MC68010 (VM03 system) before the system was taken from the laboratory. These systems were studied with the objective of determining their capability to support Space Station software development requirements, specifically for multi-tasking and real-time applications. The methodology utilized consisted of development, execution, and analysis of benchmark programs and test software, and the experimentation and analysis of specific features of the system or compilers in the study.
Forest fire autonomous decision system based on fuzzy logic
NASA Astrophysics Data System (ADS)
Lei, Z.; Lu, Jianhua
2010-11-01
The proposed system integrates GPS/pseudolite/IMU sensors and a thermal camera in order to autonomously process imagery for the identification, extraction, and tracking of forest fires or hot spots. The airborne detection platform, the graph-based algorithms, and the signal processing framework are analyzed in detail; in particular, the rules of the decision function are expressed in terms of fuzzy logic, which is an appropriate method for expressing imprecise knowledge. The membership functions and the weights of the rules are fixed through a supervised learning process. The perception system in this paper is based on a network of sensorial stations and central stations. The sensorial stations collect data including infrared and visual images and meteorological information. The central stations exchange data to perform distributed analysis. The experimental results show that the working procedure of the detection system is reasonable and that it can accurately output detection alarms and the computation of infrared oscillations.
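A toy sketch of the kind of fuzzy rule evaluation described: triangular membership functions and learned rule weights combined into an alarm confidence. All shapes, inputs, and weights are invented.

```python
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function with support [a, c] and peak at b."""
    return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

# Hypothetical inputs and rule weights (the paper fixes weights by
# supervised learning).
ir_osc = 0.7       # normalized infrared oscillation measure
humidity = 0.25    # normalized relative humidity
w_fire, w_dry = 0.6, 0.4
mu_fire = tri(ir_osc, 0.3, 0.8, 1.0)            # "oscillation looks like flame"
mu_dry = tri(1.0 - humidity, 0.5, 0.9, 1.0)     # "air is dry"

alarm = w_fire * mu_fire + w_dry * mu_dry       # weighted rule aggregation
print(f"alarm confidence: {alarm:.2f}")
```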
Bisese, James A.
1995-01-01
Methods are presented for estimating the peak discharges of rural, unregulated streams in Virginia. A Pearson Type III distribution is fitted to the logarithms of the unregulated annual peak-discharge records from 363 stream-gaging stations in Virginia to estimate the peak discharge at these stations for recurrence intervals of 2 to 500 years. Peak-discharge characteristics for 284 unregulated stations are divided into eight regions based on physiographic province, and regressed on basin characteristics, including drainage area, main channel length, main channel slope, mean basin elevation, percentage of forest cover, mean annual precipitation, and maximum rainfall intensity. Regression equations for each region are computed by use of the generalized least-squares method, which accounts for spatial and temporal correlation between nearby gaging stations. This regression technique weights the significance of each station to the regional equation based on the length of record collected at each station, the correlation between annual peak discharges among the stations, and the standard deviation of the annual peak discharge for each station. Drainage area proved to be the only significant explanatory variable in four regions, while other regions have as many as three significant variables. Standard errors of the regression equations range from 30 to 80 percent. Alternate equations using drainage area only are provided for the five regions with more than one significant explanatory variable. Methods and sample computations are provided to estimate peak discharges at gaged and ungaged sites in Virginia for recurrence intervals of 2, 5, 10, 25, 50, 100, 200, and 500 years, and to adjust the regression estimates for sites on gaged streams where nearby gaging-station records are available.
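As background on the at-station frequency analysis, a minimal log-Pearson Type III sketch: fit the distribution to the logarithms of the annual peaks and read off quantiles for the report's recurrence intervals. The record is synthetic, and the actual study additionally applies generalized least-squares regionalization.

```python
import numpy as np
from scipy.stats import pearson3, skew

# Synthetic annual peak discharges (cfs) for one gaging station.
rng = np.random.default_rng(1)
peaks = rng.lognormal(mean=7.0, sigma=0.5, size=40)

logq = np.log10(peaks)
g = skew(logq, bias=False)            # station skew of the log peaks

for T in (2, 5, 10, 25, 50, 100, 200, 500):
    q = 10 ** pearson3.ppf(1 - 1.0 / T, g, loc=logq.mean(),
                           scale=logq.std(ddof=1))
    print(f"{T:3d}-year peak: {q:9.0f} cfs")
```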
NASA Astrophysics Data System (ADS)
Snyder, R. L.; Mancosu, N.; Spano, D.
2014-12-01
This study derived the summer (June-August) reference evapotranspiration distribution map for Sardinia (Italy) based on weather station data and use of the geographic information system (GIS). A modified daily Penman-Monteith equation from the Food and Agriculture Organization of the United Nations (UN-FAO) and the American Society of Civil Engineers Environmental and Water Resources Institute (ASCE-EWRI) was used to calculate the Standardized Reference Evapotranspiration (ETos) for all weather stations having a "full" set of required data for the calculations. For stations having only temperature data (partial stations), the Hargreaves-Samani equation was used to estimate the reference evapotranspiration for a grass surface (ETo). The ETos and ETo results were different depending on the local climate, so two methods to estimate ETos from the ETo were tested. Substitution of missing solar radiation, wind speed, and humidity data from a nearby station within a similar microclimate was found to give better results than using a calibration factor that related ETos and ETo. Therefore, the substitution method was used to estimate ETos at "partial" stations having only temperature data. The combination of 63 full and partial stations was sufficient to use GIS to map ETos for Sardinia. Three interpolation methods were studied, and the ordinary kriging model fitted the observed data better than a radial basis function or the inverse distance weighting method. Using station data points to create a regional map simplified the zonation of ETos when large scale computations were needed. Making a distinction based on ETos classes allows the simulation of crop water requirements for large areas and it can potentially lead to improved irrigation management and water savings. It also provides a baseline to investigate possible impact of climate change.
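For the temperature-only ("partial") stations, the abstract applies the Hargreaves-Samani equation; a minimal sketch follows, with an Ra value invented for the example.

```python
import numpy as np

def eto_hargreaves(tmax, tmin, ra):
    """Hargreaves-Samani reference evapotranspiration for grass (mm/day).
    tmax, tmin are daily extremes in degC; ra is extraterrestrial
    radiation expressed as equivalent evaporation (mm/day)."""
    tmean = (tmax + tmin) / 2.0
    return 0.0023 * ra * (tmean + 17.8) * np.sqrt(tmax - tmin)

# Example: a summer day at a temperature-only station (Ra invented).
print(round(eto_hargreaves(tmax=32.0, tmin=18.0, ra=16.5), 2), "mm/day")
```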
The expanded role of computers in Space Station Freedom real-time operations
NASA Technical Reports Server (NTRS)
Crawford, R. Paul; Cannon, Kathleen V.
1990-01-01
The challenges that NASA and its international partners face in their real-time operation of the Space Station Freedom necessitate an increased role on the part of computers. In building the operational concepts concerning the role of the computer, the Space Station program is using lessons learned experience from past programs, knowledge of the needs of future space programs, and technical advances in the computer industry. The computer is expected to contribute most significantly in real-time operations by forming a versatile operating architecture, a responsive operations tool set, and an environment that promotes effective and efficient utilization of Space Station Freedom resources.
Computer integration of engineering design and production: A national opportunity
NASA Astrophysics Data System (ADS)
1984-10-01
The National Aeronautics and Space Administration (NASA), as a purchaser of a variety of manufactured products, including complex space vehicles and systems, clearly has a stake in the advantages of computer-integrated manufacturing (CIM). Two major NASA objectives are to launch a Manned Space Station by 1992 with a budget of $8 billion, and to be a leader in the development and application of productivity-enhancing technology. At the request of NASA, a National Research Council committee visited five companies that have been leaders in using CIM. Based on these case studies, technical, organizational, and financial issues that influence computer integration are described, guidelines for its implementation in industry are offered, and the use of CIM to manage the space station program is recommended.
On the long-term stability of terrestrial reference frame solutions based on Kalman filtering
NASA Astrophysics Data System (ADS)
Soja, Benedikt; Gross, Richard S.; Abbondanza, Claudio; Chin, Toshio M.; Heflin, Michael B.; Parker, Jay W.; Wu, Xiaoping; Nilsson, Tobias; Glaser, Susanne; Balidakis, Kyriakos; Heinkelmann, Robert; Schuh, Harald
2018-06-01
The Global Geodetic Observing System requirement for the long-term stability of the International Terrestrial Reference Frame is 0.1 mm/year, motivated by rigorous sea level studies. Furthermore, high-quality station velocities are of great importance for the prediction of future station coordinates, which are fundamental for several geodetic applications. In this study, we investigate the performance of predictions from very long baseline interferometry (VLBI) terrestrial reference frames (TRFs) based on Kalman filtering. The predictions are computed by extrapolating the deterministic part of the coordinate model. As observational data, we used over 4000 VLBI sessions between 1980 and the middle of 2016. In order to study the predictions, we computed VLBI TRF solutions only from the data until the end of 2013. The period of 2014 until 2016.5 was used to validate the predictions of the TRF solutions against the measured VLBI station coordinates. To assess the quality, we computed average WRMS values from the coordinate differences as well as from estimated Helmert transformation parameters, in particular, the scale. We found that the results significantly depend on the level of process noise used in the filter. While larger values of process noise allow the TRF station coordinates to more closely follow the input data (decrease in WRMS of about 45%), the TRF predictions exhibit larger deviations from the VLBI station coordinates after 2014 (WRMS increase of about 15%). On the other hand, lower levels of process noise improve the predictions, making them more similar to those of solutions without process noise. Furthermore, our investigations show that additionally estimating annual signals in the coordinates does not significantly impact the results. Finally, we computed TRF solutions mimicking a potential real-time TRF and found significant improvements over the other investigated solutions, all of which rely on extrapolating the coordinate model for their predictions, with WRMS reductions of almost 50%.
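A toy one-coordinate sketch of the filtering setup discussed above: a position-plus-velocity model whose process noise level q controls how closely the coordinate tracks the input data, with predictions obtained by extrapolating the deterministic part of the state. All numbers are invented, and the actual TRF filter is far richer.

```python
import numpy as np

def run_filter(coords, dt, sigma_obs, q):
    """Kalman filter for one station coordinate (meters) with a linear
    position + velocity model; q sets the process noise level."""
    F = np.array([[1.0, dt], [0.0, 1.0]])
    Q = q * np.array([[dt**3 / 3, dt**2 / 2], [dt**2 / 2, dt]])
    H = np.array([[1.0, 0.0]])
    x = np.array([coords[0], 0.0])
    P = np.eye(2) * 1e4
    for z in coords[1:]:
        x = F @ x                                    # predict
        P = F @ P @ F.T + Q
        K = P @ H.T / (H @ P @ H.T + sigma_obs**2)   # gain (scalar innovation)
        x = x + (K * (z - H @ x)).ravel()            # update
        P = (np.eye(2) - K @ H) @ P
    return x   # prediction beyond the data: x[0] + x[1] * elapsed time

rng = np.random.default_rng(0)
obs = 10.0 + 0.002 * np.arange(100) + rng.normal(0, 0.003, 100)
print("position, velocity:", run_filter(obs, 1.0, 0.003, 1e-9).round(4))
```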
Continuation of research into language concepts for the mission support environment
NASA Technical Reports Server (NTRS)
1991-01-01
A concept for a more intuitive and graphically based Computation (Comp) Builder was developed. The Graphical Comp Builder Prototype was developed, which is an X Window based graphical tool that allows the user to build Comps using graphical symbols. Investigation was conducted to determine the availability and suitability of the Ada programming language for the development of future control center type software. The Space Station Freedom Project identified Ada as the desirable programming language for the development of Space Station Control Center software systems.
A Fast Method for Embattling Optimization of Ground-Based Radar Surveillance Network
NASA Astrophysics Data System (ADS)
Jiang, H.; Cheng, H.; Zhang, Y.; Liu, J.
A growing number of space activities have created an orbital debris environment that poses increasing impact risks to existing space systems and human space flight. For the safety of in-orbit spacecraft, many observation facilities are needed to catalog space objects, especially in low Earth orbit. Surveillance of low-Earth-orbit objects relies mainly on ground-based radar, and because of the limited capability of existing radar facilities, a large number of ground-based radars will need to be built in the next few years to meet current space surveillance demands. How to optimize the embattling (layout) of a ground-based radar surveillance network is therefore a problem that needs to be solved. The traditional method runs detection simulations of all possible stations against cataloged data, compares the simulation results combinatorially, and selects the best result as the station layout scheme. A single simulation is time consuming and the combinatorial analysis is computationally complex; as the number of stations increases, the complexity of the optimization problem grows exponentially and it cannot be solved with the traditional method. No better way to solve this problem has been available until now. In this paper, the target detection procedure is simplified. First, the space coverage of a ground-based radar is simplified, and a projection model of the radar's coverage at different orbit altitudes is built; then a simplified model of objects crossing the radar coverage is established according to the characteristics of orbital motion. After these two simplifications, the computational complexity of target detection is greatly reduced, and simulation results confirm the correctness of the simplified model. In addition, the detection areas of a ground-based radar network can be easily computed with the simplified model, and the layout of the network can then be optimized with an artificial-intelligence algorithm, which greatly reduces the computational complexity. Compared with the traditional method, the proposed method greatly improves computational efficiency.
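As a toy stand-in for the optimization step (the paper applies an artificial-intelligence algorithm to its simplified coverage model), a greedy selection over precomputed coverage sets; the sites, passes, and budget are invented.

```python
import random

# Toy stand-in: each candidate radar site covers a precomputed set of
# object passes (invented at random); pick sites greedily until the
# station budget is spent.
random.seed(0)
passes = set(range(1000))
candidates = {f"site{i:02d}": set(random.sample(sorted(passes), 150))
              for i in range(40)}

chosen, covered = [], set()
while len(chosen) < 8 and candidates:          # budget: 8 stations
    best = max(candidates, key=lambda s: len(candidates[s] - covered))
    chosen.append(best)
    covered |= candidates.pop(best)

print(chosen)
print(f"coverage: {len(covered) / len(passes):.0%}")
```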
Ishii, Audrey L.; Soong, David T.; Sharpe, Jennifer B.
2010-01-01
Illinois StreamStats (ILSS) is a Web-based application for computing selected basin characteristics and flood-peak quantiles, based on the most recently published (as of 2010) regional flood-frequency equations (Soong and others, 2004), at any rural stream location in Illinois. Limited streamflow statistics including general statistics, flow durations, and base flows also are available for U.S. Geological Survey (USGS) streamflow-gaging stations. ILSS can be accessed on the Web at http://streamstats.usgs.gov/ by selecting the State Applications hyperlink and choosing Illinois from the pull-down menu. ILSS was implemented for Illinois by obtaining and projecting ancillary geographic information system (GIS) coverages; populating the StreamStats database with streamflow-gaging station data; hydroprocessing the 30-meter digital elevation model (DEM) for Illinois to conform to streams represented in the National Hydrographic Dataset 1:100,000 stream coverage; and customizing the Web-based Extensible Markup Language (XML) programs for computing basin characteristics for Illinois. The basin characteristics computed by ILSS were then compared to the basin characteristics used in the published study, and adjustments were applied to the XML algorithms for slope and basin length. Testing of ILSS was accomplished by comparing flood quantiles computed by ILSS at an approximately random sample of 170 streamflow-gaging stations with the published flood quantile estimates. Differences between the log-transformed flood quantiles were not statistically significant at the 95-percent confidence level for the State as a whole, nor by the regions determined by each equation, except for region 1, in the northwest corner of the State. In region 1, the average difference in flood quantile estimates ranged from 3.76 percent for the 2-year flood quantile to 4.27 percent for the 500-year flood quantile. The total number of stations in region 1 was small (21), and the mean difference was not large (less than one-tenth of the average prediction error for the regression-equation estimates). The sensitivity of the flood-quantile estimates to differences in the computed basin characteristics is determined and presented in tables. A test of usage consistency was conducted by having at least 7 new users compute flood quantile estimates at 27 locations. The average maximum deviation of the estimate from the mode value at each site was 1.31 percent after four mislocated sites were removed. A comparison of manual 100-year flood-quantile computations with ILSS at 34 sites indicated no statistically significant difference. ILSS appears to be an accurate, reliable, and effective tool for flood-quantile estimates.
A user view of office automation or the integrated workstation
NASA Technical Reports Server (NTRS)
Schmerling, E. R.
1984-01-01
Central data bases are useful only if they are kept up to date and easily accessible in an interactive (query) mode rather than in monthly reports that may be out of date and must be searched by hand. The concepts of automatic data capture, data base management and query languages require good communications and readily available work stations to be useful. The minimal necessary work station is a personal computer, which can be an important office tool if connected to other office machines and properly integrated into an office system. It has a great deal of flexibility and can often be tailored to suit the tastes, work habits and requirements of the user. Unlike a dumb terminal, a PC is less likely to saturate a central computer, since its free-standing capabilities are available after downloading a selection of data. The PC also permits the sharing of many other facilities, like larger computing power, sophisticated graphics programs, laser printers and communications. It can provide rapid access to common data bases able to provide more up-to-date information than printed reports. Portable computers can access the same familiar office facilities from anywhere in the world where a telephone connection can be made.
Three-Dimensional Displays In The Future Flight Station
NASA Astrophysics Data System (ADS)
Bridges, Alan L.
1984-10-01
This review paper summarizes the development and applications of computer techniques for the representation of three-dimensional data in the future flight station. It covers the development of the Lockheed-NASA Advanced Concepts Flight Station (ACFS) research simulators. These simulators contain: a Pilot's Desk Flight Station (PDFS) with five 13-inch diagonal color cathode ray tubes on the main instrument panel; a computer-generated day and night visual system; a six-degree-of-freedom motion base; and a computer complex. This paper reviews current research, development, and evaluation of easily modifiable display systems and software requirements for three-dimensional displays that may be developed for the PDFS. This includes the analysis and development of a 3-D representation of the entire flight profile. This 3-D flight path, or "Highway-in-the-Sky", will utilize motion and perspective cues to tightly couple the human responses of the pilot to the aircraft control systems. The use of custom logic, e.g., graphics engines, may provide the processing power and architecture required for 3-D computer-generated imagery (CGI) or visual scene simulation (VSS). Diffraction or holographic head-up displays (HUDs) will also be integrated into the ACFS simulator to permit research on the requirements and use of these "out-the-window" projection systems. Future research may include the retrieval of high-resolution, perspective-view terrain maps which could then be overlaid with current weather information or other selectable cultural features.
Open solutions to distributed control in ground tracking stations
NASA Technical Reports Server (NTRS)
Heuser, William Randy
1994-01-01
The advent of high speed local area networks has made it possible to interconnect small, powerful computers to function together as a single large computer. Today, distributed computer systems are the new paradigm for large scale computing systems. However, the communications provided by the local area network are only one part of the solution. The services and protocols used by the application programs to communicate across the network are as indispensable as the local area network, and a selection of services and protocols that does not match the system requirements will limit the capabilities, performance, and expansion of the system. Proprietary solutions are available but are usually limited to a select set of equipment. There are, however, two solutions based on 'open' standards. The question that must be answered is 'which one is the best one for my job?' This paper examines a model for tracking stations and their requirements for interprocessor communications in the next century. The model and requirements are matched with the model and services provided by the five different software architectures and supporting protocol solutions. Several key services are examined in detail to determine which services and protocols most closely match the requirements for the tracking station environment. The study reveals that the protocols are tailored to the problem domains for which they were originally designed. Further, the study reveals that the process control model is the closest match to the tracking station model.
EVA worksite analysis--use of computer analysis for EVA operations development and execution.
Anderson, D
1999-01-01
To sustain the rate of extravehicular activity (EVA) required to assemble and maintain the International Space Station, we must enhance our ability to plan, train for, and execute EVAs. An underlying analysis capability has been developed to ensure EVA access to all external worksites as a starting point for ground training, to generate information needed for on-orbit training, and to react quickly to develop contingency EVA plans, techniques, and procedures. This paper describes the use of computer-based EVA worksite analysis techniques for EVA worksite design. EVA worksite analysis has been used to design 80% of EVA worksites on the U.S. portion of the International Space Station. With the launch of the first U.S. element of the station, EVA worksite analysis is being developed further to support real-time analysis of unplanned EVA operations. This paper describes this development and deployment of EVA worksite analysis for International Space Station (ISS) mission support.
Operator Station Design System - A computer aided design approach to work station layout
NASA Technical Reports Server (NTRS)
Lewis, J. L.
1979-01-01
The Operator Station Design System is resident in NASA's Johnson Space Center Spacecraft Design Division Performance Laboratory. It includes stand-alone minicomputer hardware and Panel Layout Automated Interactive Design and Crew Station Assessment of Reach software. The data base consists of the Shuttle Transportation System Orbiter Crew Compartment (in part), the Orbiter payload bay and remote manipulator (in part), and various anthropometric populations. The system is utilized to provide panel layouts, assess reach and vision, determine interference and fit problems early in the design phase, study design applications as a function of anthropometric and mission requirements, and to accomplish conceptual design to support advanced study efforts.
Device 2F112 (F-14A WST (Weapon System Trainers)) Instructor Console Review.
1983-12-01
Cockpit Section-Trainee Station; b. Instructor Operator Station (IOS); c. Computer System; d. Wide-Angle Visual System (WAVS); e. Auxiliary Systems. The relationship of the three stations can be seen in Figure 1; the stations are reviewed in greater detail in following sections. [Figure 1. Device 2F112 general layout: trainee area; hydraulic power room; electrical power/air compressors; computer/peripheral area.]
DOE Office of Scientific and Technical Information (OSTI.GOV)
Peterson, D.L.; Kaufmann, H.E.
1978-01-01
During July 1977, fifty-one gravity stations were obtained in the Gerlach Extension Known Geothermal Resource Area and vicinity, northwestern Nevada. The gravity observations were made with a Worden gravimeter having a scale factor of about 0.5 milligal per division. No terrain corrections have been applied to these data. The earth tide correction was not used in drift reduction. The Geodetic Reference System 1967 formula (International Association of Geodesy, 1967) was used to compute theoretical gravity. Observed gravity is referenced to a base station in Gerlach, Nevada, having a value based on the Potsdam System of 1930. A density of 2.67 g per cm³ was used in computing the Bouguer anomaly.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Peterson, D.L.; Kaufmann, H.E.
1978-01-01
During July 1977, forty-four gravity stations were obtained in the Fly Ranch Extension Known Geothermal Resource Area and vicinity, northwestern Nevada. The gravity observations were made with a Worden gravimeter having a scale factor of about 0.5 milligal per division. No terrain corrections have been applied to these data. The earth tide correction was not used in drift reduction. The Geodetic Reference System 1967 formula (International Association of Geodesy, 1967) was used to compute theoretical gravity. Observed gravity is referenced to a base station in Gerlach, Nevada, having a value based on the Potsdam System of 1930 (fig. 1). A density of 2.67 g per cm³ was used in computing the Bouguer anomaly.
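Both surveys apply the same reduction: theoretical gravity from the Geodetic Reference System 1967 formula, a free-air correction, and a Bouguer slab correction at a density of 2.67 g/cm³, with no terrain correction. The sketch below assumes one standard closed form of the 1967 formula and the commonly cited correction gradients (0.3086 mGal/m free-air, 0.04193·ρ mGal/m slab); the station values are hypothetical.

```python
import math

def theoretical_gravity_grs67(lat_deg):
    """Geodetic Reference System 1967 theoretical gravity, in mGal."""
    s = math.sin(math.radians(lat_deg))
    return 978031.85 * (1 + 0.005278895 * s**2 + 0.000023462 * s**4)

def simple_bouguer_anomaly(g_obs_mgal, lat_deg, elev_m, density=2.67):
    """Simple Bouguer anomaly in mGal (no terrain correction, as in the surveys)."""
    free_air = 0.3086 * elev_m            # free-air correction, mGal
    slab = 0.04193 * density * elev_m     # Bouguer slab correction, mGal
    return g_obs_mgal - theoretical_gravity_grs67(lat_deg) + free_air - slab

# Hypothetical station near Gerlach, NV (~40.65 N latitude, 1200 m elevation)
print(round(simple_bouguer_anomaly(979810.0, 40.65, 1200.0), 1))
```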
Spacecraft crew procedures from paper to computers
NASA Technical Reports Server (NTRS)
Oneal, Michael; Manahan, Meera
1993-01-01
Large volumes of paper are launched with each Space Shuttle Mission that contain step-by-step instructions for various activities that are to be performed by the crew during the mission. These instructions include normal operational procedures and malfunction or contingency procedures and are collectively known as the Flight Data File (FDF). An example of nominal procedures would be those used in the deployment of a satellite from the Space Shuttle; a malfunction procedure would describe actions to be taken if a specific problem developed during the deployment. A new FDF and associated system is being created for Space Station Freedom. The system will be called the Space Station Flight Data File (SFDF). NASA has determined that the SFDF will be computer-based rather than paper-based. Various aspects of the SFDF are discussed.
Mobility-Aware Caching and Computation Offloading in 5G Ultra-Dense Cellular Networks
Chen, Min; Hao, Yixue; Qiu, Meikang; Song, Jeungeun; Wu, Di; Humar, Iztok
2016-01-01
Recent trends show that Internet traffic is increasingly dominated by content, which is accompanied by the exponential growth of traffic. To cope with this phenomenon, network caching is introduced to utilize the storage capacity of diverse network devices. In this paper, we first summarize four basic caching placement strategies, i.e., local caching, Device-to-Device (D2D) caching, Small cell Base Station (SBS) caching, and Macrocell Base Station (MBS) caching. However, studies show that so far, much of the research has ignored the impact of user mobility. Therefore, taking the effect of user mobility into consideration, we propose a joint mobility-aware caching and SBS density placement scheme (MS caching). In addition, differences and relationships between caching and computation offloading are discussed. We present the design of a hybrid computation offloading scheme and support it with experimental results, which demonstrate improved performance in terms of energy cost. Finally, we discuss the design of an incentive mechanism by considering network dynamics, differentiated user quality of experience (QoE), and the heterogeneity of mobile terminals in terms of caching and computing capabilities. PMID:27347975
NASA Technical Reports Server (NTRS)
Gerber, C. R.
1972-01-01
The development of uniform computer program standards and conventions for the modular space station is discussed. The accomplishments analyzed are: (1) development of a computer program specification hierarchy, (2) definition of a computer program development plan, and (3) recommendations for utilization of all operating on-board space station related data processing facilities.
Geiger, Linda H.
1983-01-01
The report is an update of U.S. Geological Survey Open-File Report 77-703, which described a retrieval program for an administrative index of active data-collection sites in Florida. Extensive changes to the Findex system have been made since 1977, making the previous report obsolete. The data base and computer programs available in the Findex system are documented in this report. This system serves a vital need in the administration of the many and diverse water-data collection activities. District offices with extensive data-collection activities will benefit from the documentation of the system. Largely descriptive, the report tells how a file of computer card images has been established which contains entries for all sites in Florida at which there is currently a water-data collection activity. Entries include information such as identification number, station name, location, type of site, county, frequency of data collection, funding, and other pertinent details. The computer program FINDEX selectively retrieves entries and lists them in a format suitable for publication. The index is updated routinely. (USGS)
NASA Astrophysics Data System (ADS)
Perotti, Jose M.; Lucena, Angel R.; Mullenix, Pamela A.; Mata, Carlos T.
2006-05-01
Current and future requirements of aerospace sensors and transducers demand the design and development of a new family of sensing devices, with emphasis on reduced weight, power consumption, and physical size. This new generation of sensors and transducers will possess a certain degree of intelligence in order to provide the end user with critical data in a more efficient manner. Communication between networks of traditional or next-generation sensors can be accomplished by a Wireless Sensor Network (WSN) developed by NASA's Instrumentation Branch and ASRC Aerospace Corporation at Kennedy Space Center (KSC), consisting of at least one central station and several remote stations and their associated software. The central station is application-dependent and can be implemented on different computer hardware, including industrial, handheld, or PC-104 single-board computers, on a variety of operating systems: embedded Windows, Linux, VxWorks, etc. The central stations and remote stations share a similar radio frequency (RF) core module hardware that is modular in design. The main components of the remote stations are an RF core module, a sensor interface module, batteries, and a power management module. These modules are stackable, and a common bus provides the flexibility to stack other modules for additional memory, increased processing, etc. WSN can automatically reconfigure to an alternate frequency if interference is encountered during operation. In addition, the base station will autonomously search for a remote station that was perceived to be lost, using relay stations and alternate frequencies. Several wireless remote-station types were developed and tested in the laboratory to support different sensing technologies, such as resistive temperature devices, silicon diodes, strain gauges, pressure transducers, and hydrogen leak detectors.
Distributed computer taxonomy based on O/S structure
NASA Technical Reports Server (NTRS)
Foudriat, Edwin C.
1985-01-01
The taxonomy considers the resource structure at the operating system level. It compares a communication-based taxonomy with the new taxonomy to illustrate how the latter does a better job when related to the client's view of the distributed computer. The results illustrate the fundamental features and what is required to construct fully distributed processing systems. The problem of using network computers on the space station is addressed. A detailed discussion of the taxonomy is not given here. Information is given in the form of charts and diagrams that were used to illustrate a talk.
47 CFR 80.771 - Method of computing coverage.
Code of Federal Regulations, 2010 CFR
2010-10-01
47 CFR Part 80, Stations in the Maritime Services; Standards for Computing Public Coast Station VHF Coverage. § 80.771 Method of computing coverage: Compute the +17 dBu contour as follows: (a) Determine the effective antenna...
Device 2E6 (ACMS) Air Combat Maneuvering Simulator Instructor Console Review.
1983-12-01
While the device provides some new features which support training, such as a debrief facility and a computer-based instructor training module, the... Equipment Center, Orlando, FL (in printing). [Figure 1. General arrangement (2E6): projectors, computer systems.] ...d. instructor stations; e. computer systems; f. target model subsystem; g. debrief subsystem; h...
NASA Technical Reports Server (NTRS)
Mckay, C. W.; Bown, R. L.
1985-01-01
The space station data management system involves networks of computing resources that must work cooperatively and reliably over an indefinite life span. This program requires a long schedule of modular growth and an even longer period of maintenance and operation. The development and operation of space station computing resources will involve a spectrum of systems and software life cycle activities distributed across a variety of hosts, an integration, verification, and validation host with test bed, and distributed targets. The requirement for the early establishment and use of an appropriate Computer Systems and Software Engineering Support Environment is identified. This environment will support the Research and Development Productivity challenges presented by the space station computing system.
NASA Astrophysics Data System (ADS)
Hubert, G.; Federico, C. A.; Pazianotto, M. T.; Gonzales, O. L.
2016-02-01
This paper describes the ACROPOL and OPD high-altitude stations devoted to characterizing the atmospheric radiation fields. The ACROPOL platform, located at the summit of the Pic du Midi in the French Pyrenees at 2885 m above sea level, has operated scientific equipment since May 2011, including a BSS neutron spectrometer and detectors based on semiconductors and scintillators. In the framework of an IEAv and ONERA collaboration, a second neutron spectrometer has been operated simultaneously since February 2015 at the summit of the Pico dos Dias in Brazil at 1864 m above sea level. The two high-altitude platforms allow investigation of long-period dynamics, to analyze the spectral variation of cosmic-ray-induced neutrons and the effects of local and seasonal changes, as well as short-term dynamics during solar flare events. This paper presents long- and short-term analyses, including measurement and modeling investigations using data from both high-altitude stations. The modeling approach, based on the ATMORAD computational platform, was used to link the measurements from the two stations.
A general-purpose development environment for intelligent computer-aided training systems
NASA Technical Reports Server (NTRS)
Savely, Robert T.
1990-01-01
Space station training will be a major task, requiring the creation of large numbers of simulation-based training systems for crew, flight controllers, and ground-based support personnel. Given the long duration of space station missions and the large number of activities supported by the space station, the extension of space shuttle training methods to space station training may prove to be impractical. The application of artificial intelligence technology to simulation training can provide the ability to deliver individualized training to large numbers of personnel in a distributed workstation environment. The principal objective of this project is the creation of a software development environment which can be used to build intelligent training systems for procedural tasks associated with the operation of the space station. Current NASA Johnson Space Center projects and joint projects with other NASA operational centers will result in specific training systems for existing space shuttle crew, ground support personnel, and flight controller tasks. Concurrently with the creation of these systems, a general-purpose development environment for intelligent computer-aided training systems will be built. Such an environment would permit the rapid production, delivery, and evolution of training systems for space station crew, flight controllers, and other support personnel. The widespread use of such systems will serve to preserve task and training expertise, support the training of many personnel in a distributed manner, and ensure the uniformity and verifiability of training experiences. As a result, significant reductions in training costs can be realized while safety and the probability of mission success can be enhanced.
Modular Manufacturing Simulator: Users Manual
NASA Technical Reports Server (NTRS)
1997-01-01
The Modular Manufacturing Simulator (MMS) has been developed for the beginning user of computer simulations. Consequently, the MMS cannot model complex systems that require branching and convergence logic. Once a user becomes more proficient in computer simulation and wants to add more complexity, the user is encouraged to use one of the many available commercial simulation systems. The MMS is based on the SSE5, which was developed in the early 1990's by the University of Alabama in Huntsville (UAH). A recent survey by MSFC indicated that the simulator has been a major contributor to the economic impact of the MSFC technology transfer program. Many manufacturers have requested additional features for the SSE5. Consequently, the following features have been added to the MMS that are not available in the SSE5: runs under Windows; print option for both input parameters and output statistics; operator can be fixed at a station or assigned to a group of stations; operator movement based on a time limit, part limit, or work-in-process (WIP) limit at the next station. The movement options for a movable operator are: go to the station with the largest WIP; rabbit chase, where the operator moves in circular sequence between stations; and push/pull, where the operator moves back and forth between stations. This user's manual contains the necessary information for installing the MMS on a PC, a description of the various MMS commands, and the solutions to a number of sample problems using the MMS. Also included in the beginning of this report is a brief discussion of technology transfer.
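The operator-movement options enumerated in the abstract reduce to a small dispatch rule. The sketch below is illustrative only and is not MMS source code; all names are invented.

```python
def next_station(rule, current, stations, wip):
    """Pick the next station for a movable operator.

    rule     -- 'largest_wip', 'rabbit_chase', or 'push_pull'
    current  -- the operator's current station
    stations -- ordered list of stations in the cell
    wip      -- dict mapping station -> work-in-process count
    """
    if rule == 'largest_wip':                  # go to the station with the most WIP
        return max(stations, key=lambda s: wip[s])
    if rule == 'rabbit_chase':                 # circular sequence through the stations
        return stations[(stations.index(current) + 1) % len(stations)]
    if rule == 'push_pull':                    # alternate between a pair of stations
        pair = stations[:2]
        return pair[1] if current == pair[0] else pair[0]
    raise ValueError(rule)

print(next_station('largest_wip', 0, [0, 1, 2], {0: 1, 1: 4, 2: 2}))  # -> 1
```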
NASA Technical Reports Server (NTRS)
Seneca, V. I.; Mlynarczyk, R. H.
1974-01-01
Tables of data are provided to show the availability of Skylab data to selected ground stations during the phases of Skylab preflight, Skylab unmanned condition, and Skylab manned condition. The mean time between failure (MTBF) of the same Skylab functions is tabulated for the selected ground stations. All reliability data are based on a 90 percent confidence interval.
NASA Astrophysics Data System (ADS)
Li, J.; Liu, L. Q.; Xu, X. D.; Liu, T.; Li, Q.; Hu, Z. J.; Wang, B. M.; Xiong, L. Y.; Dong, B.; Yan, T.
A 40 l/h helium liquefier has been commissioned by the Technical Institute of Physics and Chemistry, Chinese Academy of Sciences. A measurement and control system based on the Siemens PLC S7-300 has been developed for this helium liquefier. Proper sensors are selected; for example, three types of transmitters are adopted according to the detailed temperature-measurement requirements. A Siemens S7-300 PLC CPU315-2PN/DP operates as the master station, and three sets of ET200M DP remote expansion I/O operate as slave stations. Profibus-DP field communication is used between the master station and the slave stations. The upper-computer HMI (Human Machine Interface) is built using the Siemens configuration software WinCC V7.0. The upper computer communicates with the PLC by means of industrial Ethernet. A specific control logic for this helium liquefier has been developed. The control of the suction and discharge pressures of the compressor and the control of the turbo-expander loop are discussed in this paper. Following the commissioning phase, the outlet temperature of the second-stage turbine has reached 8.6 K and the temperature before the throttle valve has reached 13.1 K.
NASA Astrophysics Data System (ADS)
Li, J.; Xiong, L. Y.; Peng, N.; Dong, B.; Wang, P.; Liu, L. Q.
2014-01-01
An experimental platform for cryogenic helium gas-bearing turbo-expanders has been established at the Technical Institute of Physics and Chemistry, Chinese Academy of Sciences. This turbo-expander experimental platform is designed for performance testing and experimental research on helium turbo-expanders of different sizes, from the liquid hydrogen temperature region to the room temperature region. A measurement and control system based on the Siemens PLC S7-300 has been developed for this platform. Proper sensors are selected to measure such parameters as temperature, pressure, rotation speed, and air flow rate. All the collected data to be processed are transformed and transmitted to the S7-300 CPU. A Siemens S7-300 series PLC CPU315-2PN/DP serves as the master station, and two sets of ET200M DP remote expansion I/O serve as slave stations. Profibus-DP field communication is established between the master station and the slave stations. The upper-computer Human Machine Interface (HMI) is built using the Siemens configuration software WinCC V6.2. The upper computer communicates with the PLC by means of industrial Ethernet. Centralized monitoring and distributed control are achieved. Experimental results show that this measurement and control system has fulfilled the test requirements for the turbo-expander experimental platform.
Extraction and visualization of the central chest lymph-node stations
NASA Astrophysics Data System (ADS)
Lu, Kongkuo; Merritt, Scott A.; Higgins, William E.
2008-03-01
Lung cancer remains the leading cause of cancer death in the United States and is expected to account for nearly 30% of all cancer deaths in 2007. Central to the lung-cancer diagnosis and staging process is the assessment of the central chest lymph nodes. This assessment typically requires two major stages: (1) location of the lymph nodes in a three-dimensional (3D) high-resolution volumetric multi-detector computed-tomography (MDCT) image of the chest; (2) subsequent nodal sampling using transbronchial needle aspiration (TBNA). We describe a computer-based system for automatically locating the central chest lymph-node stations in a 3D MDCT image. Automated analysis methods are first run that extract the airway tree, airway-tree centerlines, aorta, pulmonary artery, lungs, key skeletal structures, and major-airway labels. This information provides geometrical and anatomical cues for localizing the major nodal stations. Our system demarcates these stations, conforming to criteria outlined for the Mountain and Wang standard classification systems. Visualization tools within the system then enable the user to interact with these stations to locate visible lymph nodes. Results derived from a set of human 3D MDCT chest images illustrate the usage and efficacy of the system.
Multi-Station Broad Regional Event Detection Using Waveform Correlation
NASA Astrophysics Data System (ADS)
Slinkard, M.; Stephen, H.; Young, C. J.; Eckert, R.; Schaff, D. P.; Richards, P. G.
2013-12-01
Previous waveform correlation studies have established the occurrence of repeating seismic events in various regions, and the utility of waveform-correlation event-detection on broad regional or even global scales to find events currently not included in traditionally-prepared bulletins. The computational burden, however, is high, limiting previous experiments to relatively modest template libraries and/or processing time periods. We have developed a distributed computing waveform correlation event detection utility that allows us to process years of continuous waveform data with template libraries numbering in the thousands. We have used this system to process several years of waveform data from IRIS stations in East Asia, using libraries of template events taken from global and regional bulletins. Detections at a given station are confirmed by 1) comparison with independent bulletins of seismicity, and 2) consistent detections at other stations. We find that many of the detected events are not in traditional catalogs, hence the multi-station comparison is essential. In addition to detecting the similar events, we also estimate magnitudes very precisely based on comparison with the template events (when magnitudes are available). We have investigated magnitude variation within detected families of similar events, false alarm rates, and the temporal and spatial reach of templates.
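At its core, such a detector slides each template over the continuous trace and declares a detection where the normalized cross-correlation exceeds a threshold, with magnitudes estimated from amplitude ratios against the template event. A simplified single-station, single-template sketch (not the authors' distributed implementation) follows; the threshold value is an assumption.

```python
import numpy as np

def correlation_detections(trace, template, threshold=0.7):
    """Return (sample offset, correlation) pairs where the normalized
    cross-correlation of template against trace exceeds the threshold."""
    n = len(template)
    t = (template - template.mean()) / (template.std() * n)  # pre-normalized template
    detections = []
    for i in range(len(trace) - n + 1):
        w = trace[i:i + n]
        sd = w.std()
        if sd == 0:
            continue                                   # skip dead (constant) windows
        cc = np.sum(t * (w - w.mean())) / sd           # normalized correlation in [-1, 1]
        if cc >= threshold:
            detections.append((i, cc))
    return detections

def relative_magnitude(template_mag, template_amp, detected_amp):
    """Magnitude from the peak-amplitude ratio relative to the template event."""
    return template_mag + np.log10(detected_amp / template_amp)
```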
Top 10 Uses for ClarisWorks in the One-Computer Classroom.
ERIC Educational Resources Information Center
Robinette, Michelle
1996-01-01
Suggests ways to use ClarisWorks to motivate students when only one computer is accessible: (1) class database; (2) grade book; (3) classroom journal; (4) ongoing story center; (5) skill-and-draw review station; (6) monthly class magazine/newspaper; (7) research base/project planner; (8) lecture and presentation enhancement; (9) database of ideas…
Stream-temperature characteristics in Georgia
Dyar, T.R.; Alhadeff, S. Jack
1997-01-01
Stream-temperature measurements for 198 periodic and 22 daily-record stations were analyzed using a harmonic curve-fitting procedure. Statistics of data from 78 selected stations were used to compute a statewide stream-temperature harmonic equation, derived using latitude, drainage area, and altitude for natural streams having drainage areas greater than about 40 square miles. Based on the 1955-84 reference period, the equation may be used to compute long-term natural harmonic stream-temperature coefficients to within an average of about 0.4 °C. Basin-by-basin summaries of observed long-term stream-temperature characteristics are included for selected stations and river reaches, particularly along Georgia's mainstem streams. Changes in the stream-temperature regimen caused by the effects of development, principally impoundments and thermal power plants, are shown by comparing harmonic curves and coefficients from the estimated natural values to the observed modified-condition values.
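A harmonic curve-fitting procedure of this kind typically estimates a mean level, an annual amplitude, and a phase. The sketch below shows one common parameterization solved by linear least squares; the exact functional form used in the report is an assumption.

```python
import numpy as np

def fit_harmonic(day_of_year, temps):
    """Least-squares fit of T(d) = M + A*cos(2*pi*(d - d0)/365).

    Returns harmonic mean level M (deg C), amplitude A, and phase day d0."""
    w = 2.0 * np.pi / 365.0
    d = np.asarray(day_of_year, dtype=float)
    # Linear design: M + a*cos(w d) + b*sin(w d), then convert (a, b) to (A, d0)
    X = np.column_stack([np.ones_like(d), np.cos(w * d), np.sin(w * d)])
    m, a, b = np.linalg.lstsq(X, np.asarray(temps, dtype=float), rcond=None)[0]
    amplitude = np.hypot(a, b)
    phase_day = np.arctan2(b, a) / w % 365.0
    return m, amplitude, phase_day
```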
NASA Technical Reports Server (NTRS)
1985-01-01
The primary purpose of the Aerospace Computer Security Conference was to bring together people and organizations which have a common interest in protecting intellectual property generated in space. Operational concerns are discussed, taking into account security implications of the space station information system, Space Shuttle security policies and programs, potential uses of probabilistic risk assessment techniques for space station development, key considerations in contingency planning for secure space flight ground control centers, a systematic method for evaluating security requirements compliance, and security engineering of secure ground stations. Subjects related to security technologies are also explored, giving attention to processing requirements of secure C3/I and battle management systems and the development of the Gemini trusted multiple microcomputer base, the Restricted Access Processor system as a security guard designed to protect classified information, and observations on local area network security.
Citardi, Martin J.; Herrmann, Brian; Hollenbeak, Chris S.; Stack, Brendan C.; Cooper, Margaret; Bucholz, Richard D.
2001-01-01
Traditionally, cadaveric studies and plain-film cephalometrics provided information about craniomaxillofacial proportions and measurements; however, advances in computer technology now permit software-based review of computed tomography (CT)-based models. Distances between standardized anatomic points were measured on five dried human skulls with standard scientific calipers (Geneva Gauge, Albany, NY) and through computer workstation (StealthStation 2.6.4, Medtronic Surgical Navigation Technology, Louisville, CO) review of corresponding CT scans. Differences in measurements between the caliper and CT model were not statistically significant for each parameter. Measurements obtained by computer workstation CT review of the cranial skull base are an accurate representation of actual bony anatomy. Such information has important implications for surgical planning and clinical research. PMID:17167599
Knowledge-based vision for space station object motion detection, recognition, and tracking
NASA Technical Reports Server (NTRS)
Symosek, P.; Panda, D.; Yalamanchili, S.; Wehner, W., III
1987-01-01
Computer vision, especially color image analysis and understanding, has much to offer in the area of the automation of Space Station tasks such as construction, satellite servicing, rendezvous and proximity operations, inspection, experiment monitoring, data management and training. Knowledge-based techniques improve the performance of vision algorithms for unstructured environments because of their ability to deal with imprecise a priori information or inaccurately estimated feature data and still produce useful results. Conventional techniques using statistical and purely model-based approaches lack flexibility in dealing with the variabilities anticipated in the unstructured viewing environment of space. Algorithms developed under NASA sponsorship for Space Station applications to demonstrate the value of a hypothesized architecture for a Video Image Processor (VIP) are presented. Approaches to the enhancement of the performance of these algorithms with knowledge-based techniques and the potential for deployment of highly-parallel multi-processor systems for these algorithms are discussed.
Propagation Velocity of Solid Earth Tides
NASA Astrophysics Data System (ADS)
Pathak, S.
2017-12-01
One of the significant considerations in most geodetic investigations is to take into account the effect of Solid Earth tides on station positions and the consequent impact on coordinate time series. In this research work, the propagation velocity of Solid Earth tides between Indian stations is computed. Mean daily coordinates for the stations have been computed by applying the static precise point positioning technique for a day. The computed coordinates are used as input for computing the tidal displacements at the stations by the Gravity method along three directions at 1-minute intervals for 24 hours. Further, the baseline distances are computed between four Indian stations. The propagation velocity of Solid Earth tides can be computed by studying their concurrent effect at stations separated by a known baseline distance, together with the time the tidal signal takes to travel from one station to another. The propagation velocity makes it possible to estimate the effect at any station when the effect at a known station for a specific time period is known; thus, with knowledge of the propagation velocity, the spatial and temporal effects of Solid Earth tides can be estimated with respect to a known station. Theoretically, the tides are generated by the positions of celestial bodies relative to the rotating Earth, so a motivation of this study is to examine the correlation of the propagation velocity with the rotation speed of the Earth. The propagation velocity of Solid Earth tides comes out to be in the range of 440-470 m/s, in good agreement with the Earth's rotation speed.
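Given synchronized 1-minute displacement series at two stations and their baseline distance, the velocity estimate reduces to finding the lag that best aligns the two series. A minimal sketch, with the alignment done by cross-correlation (an assumed implementation detail):

```python
import numpy as np

def tide_propagation_velocity(disp_a, disp_b, baseline_m, dt_s=60.0):
    """Estimate propagation velocity (m/s) between two stations from the lag
    that maximizes the cross-correlation of their tidal displacement series.

    disp_a, disp_b -- displacement series sampled every dt_s seconds
    baseline_m     -- baseline distance between the stations in meters
    """
    a = np.asarray(disp_a, float) - np.mean(disp_a)
    b = np.asarray(disp_b, float) - np.mean(disp_b)
    cc = np.correlate(a, b, mode='full')
    lag = np.argmax(cc) - (len(b) - 1)     # lag in samples; sign gives direction
    if lag == 0:
        raise ValueError("no resolvable lag at this sampling interval")
    return baseline_m / abs(lag * dt_s)
```

At a 1-minute sampling interval the lag resolution is coarse, so baselines must be long enough for the travel time to span several samples.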
NetMOD Version 2.0 Mathematical Framework
DOE Office of Scientific and Technical Information (OSTI.GOV)
Merchant, Bion J.; Young, Christopher J.; Chael, Eric P.
2015-08-01
NetMOD (Network Monitoring for Optimal Detection) is a Java-based software package for conducting simulation of seismic, hydroacoustic and infrasonic networks. Network simulations have long been used to study network resilience to station outages and to determine where additional stations are needed to reduce monitoring thresholds. NetMOD makes use of geophysical models to determine the source characteristics, signal attenuation along the path between the source and station, and the performance and noise properties of the station. These geophysical models are combined to simulate the relative amplitudes of signal and noise that are observed at each of the stations. From these signal-to-noise ratios (SNR), the probabilities of signal detection at each station and event detection across the network of stations can be computed given a detection threshold. The purpose of this document is to clearly and comprehensively present the mathematical framework used by NetMOD, the software package developed by Sandia National Laboratories to assess the monitoring capability of ground-based sensor networks. Many of the NetMOD equations used for simulations are inherited from the NetSim network capability assessment package developed in the late 1980s by SAIC (Sereno et al., 1990).
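The chain from station SNR to network detection probability can be illustrated compactly. The sketch below assumes a Gaussian exceedance model in the dB domain and a three-station detection requirement; NetMOD's actual formulation is given in the report itself, so treat both as assumptions.

```python
import math
from itertools import combinations

def detection_probability(snr_db, threshold_db, sigma_db=2.0):
    """Probability a station detects a signal, modeled as Gaussian
    exceedance of the SNR detection threshold in the dB domain."""
    z = (snr_db - threshold_db) / sigma_db
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def network_detection_probability(station_probs, n_required=3):
    """Probability that at least n_required stations detect the event
    (exact enumeration over station subsets; fine for small networks)."""
    n = len(station_probs)
    total = 0.0
    for k in range(n_required, n + 1):
        for subset in combinations(range(n), k):
            p = 1.0
            for i in range(n):
                p *= station_probs[i] if i in subset else (1.0 - station_probs[i])
            total += p
    return total

# Hypothetical 4-station network with a 10 dB detection threshold
probs = [detection_probability(s, 10.0) for s in (14.0, 9.0, 12.0, 7.5)]
print(round(network_detection_probability(probs, 3), 3))
```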
Computer-Assisted Laboratory Stations.
ERIC Educational Resources Information Center
Snyder, William J.; Hanyak, Michael E.
1985-01-01
Describes the advantages and features of computer-assisted laboratory stations for use in a chemical engineering program. Also describes a typical experiment at such a station: determining the response times of a solid state humidity sensor at various humidity conditions and developing an empirical model for the sensor. (JN)
Oberg, Kevin A.; Mades, Dean M.
1987-01-01
Four techniques for estimating generalized skew in Illinois were evaluated: (1) a generalized skew map of the US; (2) an isoline map; (3) a prediction equation; and (4) a regional-mean skew. Peak-flow records at 730 gaging stations having 10 or more annual peaks were selected for computing station skews. Station skew values ranged from -3.55 to 2.95, with a mean of -0.11. Frequency curves computed for 30 gaging stations in Illinois using the variations of the regional-mean skew technique are similar to frequency curves computed using a skew map developed by the US Water Resources Council (WRC). Estimates of the 50-, 100-, and 500-yr floods computed for 29 of these gaging stations using the regional-mean skew techniques are within the 50% confidence limits of frequency curves computed using the WRC skew map. Although the three variations of the regional-mean skew technique were slightly more accurate than the WRC map, there is no appreciable difference between flood estimates computed using the variations of the regional-mean technique and flood estimates computed using the WRC skew map. (Peters-PTT)
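A station skew is the skew coefficient of the logarithms of the annual peak series. A minimal sketch with hypothetical peak discharges:

```python
import numpy as np
from scipy import stats

def station_skew(annual_peaks_cfs):
    """Bias-corrected skew coefficient of the log10 annual peak series,
    as used in Bulletin 17-type flood-frequency analysis."""
    logs = np.log10(np.asarray(annual_peaks_cfs, dtype=float))
    return stats.skew(logs, bias=False)

peaks = [4300, 2100, 6800, 3900, 1500, 5200, 2800, 7400, 3100, 4600]
print(round(station_skew(peaks), 2))
```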
Selected Streamflow Statistics for Streamgaging Stations in Delaware, 2003
Ries, Kernell G.
2004-01-01
Flow-duration and low-flow frequency statistics were calculated for 15 streamgaging stations in Delaware, in cooperation with the Delaware Geological Survey. The flow-duration statistics include the 1-, 2-, 5-, 10-, 20-, 30-, 40-, 50-, 60-, 70-, 80-, 90-, 95-, 98-, and 99-percent duration discharges. The low-flow frequency statistics include the average discharges for 1, 7, 14, 30, 60, 90, and 120 days that recur, on average, once in 1.01, 2, 5, 10, 20, 50, and 100 years. The statistics were computed using U.S. Geological Survey computer programs that can be downloaded from the World Wide Web at no cost. The computer programs automate standard U.S. Geological Survey methods for computing the statistics. Documentation is provided at the Web sites for the individual programs. The computed statistics are presented in tabular format on a separate page for each station, along with the station name, station number, the location, the period of record, and remarks.
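Both families of statistics reduce to percentiles of the daily record and to annual minima of n-day moving averages, to which a frequency distribution is then fitted. The simplified sketch below ignores the climate-year conventions and missing-record handling that the USGS programs implement.

```python
import numpy as np

def flow_duration(daily_q, percents=(1, 10, 50, 90, 99)):
    """Discharge equaled or exceeded the given percent of the time."""
    q = np.asarray(daily_q, dtype=float)
    return {p: np.percentile(q, 100 - p) for p in percents}

def annual_n_day_minima(daily_q, n=7, days_per_year=365):
    """Lowest n-day moving average in each (simplified calendar) year;
    a frequency distribution fitted to these gives, e.g., the 7Q10."""
    q = np.asarray(daily_q, dtype=float)
    kernel = np.ones(n) / n
    minima = []
    for start in range(0, len(q) - days_per_year + 1, days_per_year):
        year = q[start:start + days_per_year]
        minima.append(np.convolve(year, kernel, mode='valid').min())
    return np.array(minima)
```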
NASA Astrophysics Data System (ADS)
Zhao, L.; Chen, P.; Jordan, T. H.; Olsen, K. B.; Maechling, P.; Faerman, M.
2004-12-01
The Southern California Earthquake Center (SCEC) is developing a Community Modeling Environment (CME) to facilitate the computational pathways of physics-based seismic hazard analysis (Maechling et al., this meeting). Major goals are to facilitate the forward modeling of seismic wavefields in complex geologic environments, including the strong ground motions that cause earthquake damage, and the inversion of observed waveform data for improved models of Earth structure and fault rupture. Here we report on a unified approach to these coupled inverse problems that is based on the ability to generate and manipulate wavefields in densely gridded 3D Earth models. A main element of this approach is a database of receiver Green tensors (RGT) for the seismic stations, which comprises all of the spatial-temporal displacement fields produced by the three orthogonal unit impulsive point forces acting at each of the station locations. Once the RGT database is established, synthetic seismograms for any earthquake can be simply calculated by extracting a small, source-centered volume of the RGT from the database and applying the reciprocity principle. The partial derivatives needed for point- and finite-source inversions can be generated in the same way. Moreover, the RGT database can be employed in full-wave tomographic inversions launched from a 3D starting model, because the sensitivity (Fréchet) kernels for travel-time and amplitude anomalies observed at seismic stations in the database can be computed by convolving the earthquake-induced displacement field with the station RGTs. We illustrate all elements of this unified analysis with an RGT database for 33 stations of the California Integrated Seismic Network in and around the Los Angeles Basin, which we computed for the 3D SCEC Community Velocity Model (SCEC CVM3.0) using a fourth-order staggered-grid finite-difference code. For a spatial grid spacing of 200 m and a time resolution of 10 ms, the calculations took ~19,000 node-hours on the Linux cluster at USC's High-Performance Computing Center. The 33-station database with a volume of ~23.5 TB was archived in the SCEC digital library at the San Diego Supercomputer Center using the Storage Resource Broker (SRB). From a laptop, anyone with access to this SRB collection can compute synthetic seismograms for an arbitrary source in the CVM in a matter of minutes. Efficient approaches have been implemented to use this RGT database in the inversions of waveforms for centroid and finite moment tensors and tomographic inversions to improve the CVM. Our experience with these large problems suggests areas where the cyberinfrastructure currently available for geoscience computation needs to be improved.
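Once a source-centered RGT volume has been extracted, reciprocity turns synthetic seismogram generation into a lookup and a weighted sum. The sketch below illustrates the principle for a point-force source (a moment-tensor source additionally requires spatial derivatives of the RGT); the array layout is an assumption.

```python
import numpy as np

def synthetic_from_rgt(rgt, force, source_index):
    """Synthetic 3-component seismogram at a station via reciprocity.

    rgt          -- array shaped (3 station force directions,
                    3 displacement components, n_gridpoints, n_time):
                    displacement fields from unit forces at the station
    force        -- length-3 point-force vector applied at the source
    source_index -- grid index of the source inside the extracted volume
    """
    # Reciprocity: component n of displacement at the station from force f
    # at the source equals sum_i f_i times the source-point displacement
    # component i produced by a unit force in direction n at the station.
    seis = np.zeros((3, rgt.shape[-1]))
    for n in range(3):          # station component
        for i in range(3):      # force component at the source
            seis[n] += force[i] * rgt[n, i, source_index]
    return seis
```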
STS-42 Commander Grabe works with MWPE at IML-1 Rack 8 aboard OV-103
NASA Technical Reports Server (NTRS)
1992-01-01
STS-42 Commander Ronald J. Grabe works with the Mental Workload and Performance Evaluation Experiment (MWPE) (portable laptop computer, keyboard cursor keys, a two-axis joystick, and a track ball) at Rack 8 in the International Microgravity Laboratory 1 (IML-1) module. The test was designed as a result of difficulty experienced by crewmembers working at a computer station on a previous Space Shuttle mission. The problem was due to the workstation's design being based on Earth-bound conditions with the operator in a typical one-G standing position. For STS-42, the workstation was redesigned to evaluate the effects of microgravity on the ability of crewmembers to interact with a computer workstation. Information gained from this experiment will be used to design workstations for future Spacelab missions and Space Station Freedom (SSF).
Looking at Earth from space: Direct readout from environmental satellites
NASA Technical Reports Server (NTRS)
1994-01-01
Direct readout is the capability to acquire information directly from meteorological satellites. Data can be acquired from NASA-developed, National Oceanic and Atmospheric Administration (NOAA)-operated satellites, as well as from other nations' meteorological satellites. By setting up a personal computer-based ground (Earth) station to receive satellite signals, direct readout may be obtained. The electronic satellite signals are displayed as images on the computer screen. The images can display gradients of the Earth's topography and temperature, cloud formations, the flow and direction of winds and water currents, the formation of hurricanes, the occurrence of an eclipse, and a view of Earth's geography. Both visible and infrared images can be obtained. This booklet introduces the satellite systems, ground station configuration, and computer requirements involved in direct readout. Also included are lists of associated resources and vendors.
View southeast of computer controlled energy monitoring system. System replaced ...
View southeast of computer controlled energy monitoring system. System replaced strip chart recorders and other instruments under the direct observation of the load dispatcher. - Thirtieth Street Station, Load Dispatch Center, Thirtieth & Market Streets, Railroad Station, Amtrak (formerly Pennsylvania Railroad Station), Philadelphia, Philadelphia County, PA
NASA Astrophysics Data System (ADS)
Alothman, Abdulaziz; Ayhan, Mehmet
In the 21st century, sea level rise is expected to be about 30 cm or even more (up to 60 cm). Saudi Arabia has very long coasts of about 3400 km and hundreds of islands; therefore, sea level monitoring may be important, in particular along coastal lowlands on the Red Sea and Arabian Gulf coasts. The Arabian Gulf is connected to the Indian Ocean and lies along a course parallel to the Zagros Thrust Belt, to its south-west. We expect vertical land motion within the area due to both the tectonic structures of the Arabian Peninsula and oil production activities. Continuous Global Navigation Satellite System (GNSS) observations were used to estimate the vertical crustal motion. The Bahrain International GPS Service (IGS-GPS) station is the only continuous GPS station accessible in the region, and it is close to the Mina Sulman tide gauge station in Bahrain. The weekly GPS time series of the vertical component at the Bahrain IGS-GPS station, referred to the ITRF97, from 1999.2 to 2008.6 are used in the computation. We fitted a linear trend with an annual signal and a break to the GPS vertical time series and found a vertical land motion rate of 0.46 ± 0.11 mm/yr. To investigate sea level variation within the west of the Arabian Gulf, monthly means of sea level at 13 tide gauges along the coast of Saudi Arabia and Bahrain, available in the database of the Permanent Service for Mean Sea Level (PSMSL), are studied. We analyzed the monthly mean sea level measurements at each station separately and estimated the secular sea level rate by robust linear trend fitting. We computed an average relative sea level rise rate of 1.96 ± 0.21 mm/yr within the west of the Arabian Gulf based on 4 stations spanning longer than 19 years. Sea level rates at the stations are first corrected for vertical land motion contamination using the ICE-5G v1.2 VM4 Glacial Isostatic Adjustment (GIA) model, and the average sea level rate is found to be 2.27 ± 0.21 mm/yr. Assuming the vertical rate at the Bahrain IGS-GPS station represents the vertical rate at each of the other tide gauge stations studied here, we computed an average sea level rise rate of 2.42 ± 0.21 mm/yr within the west of the Arabian Gulf.
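The GPS trend model described (linear rate, annual signal, and a break) is a standard least-squares design. A minimal sketch, with the break modeled as a step offset (an assumption about how the break term was handled):

```python
import numpy as np

def fit_trend_annual_break(t_yr, h_mm, t_break):
    """Least-squares fit of h(t) = a + b*t + c*cos(2*pi*t) + d*sin(2*pi*t)
    + e*H(t - t_break), with H a unit step at the break epoch.
    Returns the secular rate b (mm/yr) and its formal 1-sigma error."""
    t = np.asarray(t_yr, dtype=float)
    X = np.column_stack([
        np.ones_like(t), t,
        np.cos(2 * np.pi * t), np.sin(2 * np.pi * t),
        (t >= t_break).astype(float),          # step offset at the break
    ])
    coef, res, *_ = np.linalg.lstsq(X, np.asarray(h_mm, float), rcond=None)
    dof = len(t) - X.shape[1]
    sigma2 = res[0] / dof if res.size else np.nan
    cov = sigma2 * np.linalg.inv(X.T @ X)      # formal covariance of the coefficients
    return coef[1], np.sqrt(cov[1, 1])
```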
Naval Computer & Telecommunications Area Master Station, Eastern Pacific, Radio ...
Naval Computer & Telecommunications Area Master Station, Eastern Pacific, Radio Transmitter Facility Lualualei, Marine Barracks, Intersection of Tower Drive & Morse Street, Makaha, Honolulu County, HI
NASA Astrophysics Data System (ADS)
Xu, Guoping; Udupa, Jayaram K.; Tong, Yubing; Cao, Hanqiang; Odhner, Dewey; Torigian, Drew A.; Wu, Xingyu
2018-03-01
Many papers have been published on the detection and segmentation of lymph nodes in medical images. However, this remains a challenging problem owing to low contrast with surrounding soft tissues and variations in lymph node size and shape on computed tomography (CT) images. It is particularly difficult on the low-dose CT of PET/CT acquisitions. In this study, we utilize our previous automatic anatomy recognition (AAR) framework to recognize the thoracic lymph node stations defined by the International Association for the Study of Lung Cancer (IASLC) lymph node map. The lymph node stations themselves are viewed as anatomic objects and are localized by using a one-shot method in the AAR framework. Two strategies are taken in this paper for integration into the AAR framework. The first is to combine some lymph node stations into composite lymph node stations according to their geometrical nearness. The other is to find the optimal parent (organ or union of organs) as an anchor for each lymph node station based on the recognition error, and thereby find an overall optimal hierarchy in which to arrange anchor organs and lymph node stations. Based on 28 contrast-enhanced thoracic CT image data sets for model building and 12 independent data sets for testing, our results show that thoracic lymph node stations can be localized to within 2-3 voxels of the ground truth.
NASA Astrophysics Data System (ADS)
Mehrvand, Masoud; Baghanam, Aida Hosseini; Razzaghzadeh, Zahra; Nourani, Vahid
2017-04-01
Since statistical downscaling methods are the most widely used models in hydrologic impact studies under climate change scenarios, nonlinear regression models known as Artificial Intelligence (AI)-based models, such as the Artificial Neural Network (ANN) and Support Vector Machine (SVM), have been used to spatially downscale the precipitation outputs of Global Climate Models (GCMs). The study has been carried out using GCM and station data over GCM grid points located around the Peace-Tampa Bay watershed weather stations. Before downscaling with the AI-based model, correlation coefficient values were computed between a few selected large-scale predictor variables and local-scale predictands to select the most effective predictors. The selected predictors are then assessed considering the grid location for the site in question. To increase the accuracy of the AI-based downscaling model, preprocessing was applied to the precipitation time series. In this way, the precipitation data derived from various GCMs were analyzed thoroughly to find the highest correlation coefficient between GCM-based historical data and station precipitation data. Both GCM and station precipitation time series were assessed by comparing means and variances over specific intervals. Results indicated that there is a similar trend between GCM and station precipitation data; however, the station data form a non-stationary time series while the GCM data do not. Finally, the AI-based downscaling model was applied to several GCMs with the selected predictors, targeting the local precipitation time series as the predictand.
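The predictor screening step reduces to ranking candidate large-scale series by their correlation with the station predictand. A minimal sketch with invented names:

```python
import numpy as np

def screen_predictors(predictors, predictand, top_k=5):
    """Rank candidate large-scale GCM predictor series by absolute
    correlation with the station-scale predictand.

    predictors -- dict of name -> 1-D series (aligned in time)
    predictand -- 1-D station series (e.g., precipitation)
    """
    y = np.asarray(predictand, dtype=float)
    scores = {
        name: abs(np.corrcoef(np.asarray(x, float), y)[0, 1])
        for name, x in predictors.items()
    }
    return sorted(scores.items(), key=lambda kv: -kv[1])[:top_k]
```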
Roland, Mark A.; Stuckey, Marla H.
2007-01-01
The Delaware and North Branch Susquehanna River Basins in Pennsylvania experienced severe flooding as a result of intense rainfall during June 2006. The height of the flood waters on the rivers and tributaries approached or exceeded the peak of record at many locations. Updated flood-magnitude and flood-frequency data for streamflow-gaging stations on tributaries in the Delaware and North Branch Susquehanna River Basins were analyzed using data through the 2006 water year to determine if there were any major differences in the flood-discharge data. Flood frequencies for return intervals of 2, 5, 10, 50, 100, and 500 years (Q2, Q5, Q10, Q50, Q100, and Q500) were determined from annual maximum series (AMS) data from continuous-record gaging stations (stations) and were compared to flood discharges obtained from previously published Flood Insurance Studies (FIS) and to flood frequencies using partial-duration series (PDS) data. A Wilcoxon signed-rank test was performed to determine any statistically significant differences between flood frequencies computed from updated AMS station data and those obtained from FIS. Percentage differences between flood frequencies computed from updated AMS station data and those obtained from FIS also were determined for the 10-, 50-, 100-, and 500-year return intervals. A Mann-Kendall trend test was performed to determine statistically significant trends in the updated AMS peak-flow data for the period of record at the 41 stations. In addition to AMS station data, PDS data were used to determine flood-frequency discharges. The AMS and PDS flood-frequency data were compared to determine any differences between the two data sets. An analysis also was performed on AMS-derived flood frequencies for four stations to evaluate the possible effects of flood-control reservoirs on peak flows. Additionally, flood frequencies for three stations were evaluated to determine possible effects of urbanization on peak flows. The results of the Wilcoxon signed-rank test showed a significant difference at the 95-percent confidence level between the Q100 computed from AMS station data and the Q100 determined from previously published FIS for 97 sites. The flood-frequency discharges computed from AMS station data were consistently larger than the flood discharges from the FIS; the mean percentage difference between the two data sets ranged from 14 percent for the Q100 to 20 percent for the Q50. The results of the Mann-Kendall test showed that 8 stations exhibited a positive trend (i.e., increasing annual maximum peaks over time) over their respective periods of record at the 95-percent confidence level, and an additional 7 stations indicated a positive trend, for a total of 15 stations, at a confidence level of greater than or equal to 90 percent. The Q2, Q5, Q10, Q50, and Q100 determined from AMS and PDS data for each station were compared by percentage. The flood magnitudes for the 2-year return period were 16 percent higher when partial-duration peaks were incorporated into the analyses, as opposed to using only the annual maximum peaks. The discharges then tended to converge around the 5-year return period, with a mean collective difference of only 1 percent. At the 10-, 50-, and 100-year return periods, the flood magnitudes based on annual maximum peaks were, on average, 6 percent higher compared to corresponding flood magnitudes based on partial-duration peaks. Possible effects on flood peaks from flood-control reservoirs and urban development within the basin also were examined.
Annual maximum peak-flow data from four stations were divided into pre- and post-regulation periods. Comparisons were made between the Q100 determined from AMS station data for the periods of record pre- and post-regulation. Two stations showed a nearly 60- and 20-percent reduction in the 100-year discharges; the other two stations showed negligible differences in discharges. Three stations within urban basins were compared to 38 stations
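The paired station-versus-FIS comparison described above can be reproduced with SciPy's Wilcoxon signed-rank test; the discharge values below are hypothetical.

```python
from scipy.stats import wilcoxon

def compare_quantile_sets(q_updated, q_fis, alpha=0.05):
    """Wilcoxon signed-rank test of paired 100-year discharges (updated AMS
    vs. published FIS), plus the mean percentage difference."""
    stat, p = wilcoxon(q_updated, q_fis)
    pct = [100.0 * (a - b) / b for a, b in zip(q_updated, q_fis)]
    return p < alpha, sum(pct) / len(pct)   # (significant?, mean % difference)

# Hypothetical Q100 values (cfs) at five sites
print(compare_quantile_sets([8200, 15100, 6600, 22400, 9800],
                            [7100, 13800, 6500, 19000, 8900]))
```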
United States data collection activities and requirements, volume 1
NASA Technical Reports Server (NTRS)
Hrin, S.; Mcgregor, D.
1977-01-01
The potential market for a data collection system was investigated to determine whether user needs would be sufficient to support a satellite relay data collection system design. The activities of 107,407 data-collection stations were studied to determine user needs in agriculture, climatology, environmental monitoring, forestry, geology, hydrology, meteorology, and oceanography. Fifty distinct data-collection networks are described and used to form the user data base. The computer program used to analyze the station data base is discussed, and results of the analysis are presented in maps and graphs. Information format and coding is described in the appendix.
A space transportation system operations model
NASA Technical Reports Server (NTRS)
Morris, W. Douglas; White, Nancy H.
1987-01-01
Presented is a description of a computer program which permits assessment of the operational support requirements of space transportation systems functioning in both a ground- and space-based environment. The scenario depicted provides for the delivery of payloads from Earth to a space station and beyond using upper stages based at the station. Model results are scenario dependent and rely on the input definitions of delivery requirements, task times, and available resources. Output is in terms of flight rate capabilities, resource requirements, and facility utilization. A general program description, program listing, input requirements, and sample output are included.
INTERIOR; DETAIL OF ANTENNA TRUNK OPENING, LOOKING EAST. Naval ...
INTERIOR; DETAIL OF ANTENNA TRUNK OPENING, LOOKING EAST. - Naval Computer & Telecommunications Area Master Station, Eastern Pacific, Radio Transmitter Facility Lualualei, Helix House No. 2, Base of Radio Antenna Structure No. 427, Makaha, Honolulu County, HI
INTERIOR; DETAIL OF ROOF FRAMING STRUCTURE, LOOKING SOUTHWEST. Naval ...
INTERIOR; DETAIL OF ROOF FRAMING STRUCTURE, LOOKING SOUTHWEST. - Naval Computer & Telecommunications Area Master Station, Eastern Pacific, Radio Transmitter Facility Lualualei, Helix House No. 2, Base of Radio Antenna Structure No. 427, Makaha, Honolulu County, HI
NASA Technical Reports Server (NTRS)
1974-01-01
Observations and research progress of the Smithsonian Astrophysical Observatory are reported. Satellite tracking networks (ground stations) are discussed, and equipment (Baker-Nunn cameras) used to observe the satellites is described. The improvement of the accuracy of a laser ranging system of the ground stations is discussed. Research efforts in satellite geodesy (tides, gravity anomalies, plate tectonics) are also discussed. The use of data processing for geophysical data is examined, and a data base for the Earth and Ocean Physics Applications Program is proposed. Analytical models of the earth's motion (computerized simulation) are described, and the computation (numerical integration and algorithms) of satellite orbits affected by the earth's albedo, using computer techniques, is also considered. Research efforts in the study of the atmosphere are examined (the effect of drag on satellite motion), and models of the atmosphere based on satellite data are described.
Overview of computational control research at UT Austin
NASA Technical Reports Server (NTRS)
Wie, Bong
1989-01-01
An overview of current research activities at UT Austin is presented to discuss certain technical issues in the following areas: (1) Computer-Aided Nonlinear Control Design: In this project, the describing function method is employed for the nonlinear control analysis and design of a flexible spacecraft equipped with pulse-modulated reaction jets. The INCA program has been enhanced to allow the numerical calculation of describing functions as well as nonlinear limit-cycle analysis in the frequency domain; (2) Robust Linear Quadratic Gaussian (LQG) Compensator Synthesis: Robust control design techniques and software tools are developed for flexible space structures with parameter uncertainty. In particular, an interactive, robust multivariable control design capability is being developed for the INCA program; and (3) LQR-Based Autonomous Control System for the Space Station: In this project, real-time implementation of an LQR-based autonomous control system is investigated for the space station with time-varying inertias and significant multibody dynamic interactions.
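For readers unfamiliar with the LQR design step mentioned in item (3), the following sketch computes a discrete-time LQR gain with SciPy; the plant matrices and weights are stand-ins, not the Space Station model.

    # Minimal sketch of a discrete-time LQR gain computation (SciPy).
    import numpy as np
    from scipy.linalg import solve_discrete_are

    A = np.array([[1.0, 0.1], [0.0, 1.0]])   # hypothetical double-integrator plant
    B = np.array([[0.005], [0.1]])
    Q = np.diag([1.0, 0.1])                   # state weighting
    R = np.array([[0.01]])                    # control weighting

    P = solve_discrete_are(A, B, Q, R)        # Riccati solution
    K = np.linalg.solve(R + B.T @ P @ B, B.T @ P @ A)  # optimal feedback gain
    # Closed-loop state update: x[k+1] = (A - B K) x[k]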
Interactive computation of coverage regions for indoor wireless communication
NASA Astrophysics Data System (ADS)
Abbott, A. Lynn; Bhat, Nitin; Rappaport, Theodore S.
1995-12-01
This paper describes a system which assists in the strategic placement of RF base stations within buildings. Known as the site modeling tool (SMT), this system allows the user to display graphical floor plans and to select base station transceiver parameters, including location and orientation, interactively. The system then computes and highlights estimated coverage regions for each transceiver, enabling the user to assess the total coverage within the building. For single-floor operation, the user can choose between distance-dependent and partition-dependent path-loss models. Similar path-loss models are also available for the case of multiple floors. This paper describes the method used by the system to estimate coverage for both directional and omnidirectional antennas. The site modeling tool is intended to be simple to use by individuals who are not experts at wireless communication system design, and is expected to be very useful in the specification of indoor wireless systems.
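A minimal sketch of the distance-dependent path-loss model the paper mentions, assuming the familiar 10*n*log10(d/d0) form with an illustrative reference loss and receiver sensitivity (the SMT's actual model parameters are not given here):

    # Distance-dependent path-loss coverage check; all constants are assumptions.
    import numpy as np

    def path_loss_db(d_m, n=3.0, pl_d0=31.7, d0=1.0):
        """Mean path loss at distance d (meters), 10*n*log10 law."""
        return pl_d0 + 10.0 * n * np.log10(np.maximum(d_m, d0) / d0)

    def in_coverage(tx_dbm, d_m, sensitivity_dbm=-90.0, n=3.0):
        """A point is covered if received power stays above the sensitivity."""
        return tx_dbm - path_loss_db(d_m, n) >= sensitivity_dbm

    print(in_coverage(10.0, d_m=40.0))  # e.g., a 10 dBm transmitter at 40 m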
NASA Technical Reports Server (NTRS)
Appleby, M. H.; Golightly, M. J.; Hardy, A. C.
1993-01-01
Major improvements have been completed in the approach to analyses and simulation of spacecraft radiation shielding and exposure. A computer-aided design (CAD)-based system has been developed for determining the amount of shielding provided by a spacecraft and simulating transmission of an incident radiation environment to any point within or external to the vehicle. Shielding analysis is performed using a customized ray-tracing subroutine contained within a standard engineering modeling software package. This improved shielding analysis technique has been used in several vehicle design programs such as a Mars transfer habitat, pressurized lunar rover, and the redesigned International Space Station. Results of analysis performed for the Space Station astronaut exposure assessment are provided to demonstrate the applicability and versatility of the system.
NASDA knowledge-based network planning system
NASA Technical Reports Server (NTRS)
Yamaya, K.; Fujiwara, M.; Kosugi, S.; Yambe, M.; Ohmori, M.
1993-01-01
One of the SODS (space operation and data system) subsystems, NP (network planning), was the first expert system used by NASDA (National Space Development Agency of Japan) for tracking and control of satellites. The major responsibilities of the NP system are: first, the allocation of network and satellite control resources and, second, the generation of the network operation plan data (NOP) used in automated control of the stations and control center facilities. Until now, the first task, network resource scheduling, was done by network operators. The NP system automatically generates schedules using its knowledge base, which contains information on satellite orbits, station availability, which computer is dedicated to which satellite, and how many stations must be available for a particular satellite pass or a certain time period. The NP system is introduced.
High data rate modem simulation for the space station multiple-access communications system
NASA Technical Reports Server (NTRS)
Horan, Stephen
1987-01-01
The communications system for the space station will require a space-based multiple-access component to provide communications between the space-based program elements and the station. A study was undertaken to investigate two of the concerns of this multiple-access system, namely, the issues related to frequency spectrum utilization and the possibilities for higher-order (than QPSK) modulation schemes for use in possible modulators and demodulators (modems). As a result of the investigation, many key questions about frequency spectrum utilization were raised. At this point, frequency spectrum utilization is seen as an area requiring further work. Simulations were conducted using a computer-aided communications system design package to provide a straw-man modem structure to be used for both QPSK and 8-PSK channels.
The multi-disciplinary design study: A life cycle cost algorithm
NASA Technical Reports Server (NTRS)
Harding, R. R.; Pichi, F. J.
1988-01-01
The approach and results of a Life Cycle Cost (LCC) analysis of the Space Station Solar Dynamic Power Subsystem (SDPS), including gimbal pointing and power output performance, are documented. The Multi-Discipline Design Tool (MDDT) computer program developed during the 1986 study has been modified to include the design, performance, and cost algorithms for the SDPS. As with the Space Station structural and control subsystems, the LCC of the SDPS can be computed within the MDDT program as a function of the engineering design variables. Two simple examples of MDDT's capability to evaluate cost sensitivity and design based on LCC are included. MDDT was designed to accept NASA's IMAT computer program data as input so that IMAT's detailed structural and controls design capability can be assessed with expected system LCC as computed by MDDT. No changes to IMAT were required. Detailed knowledge of IMAT is not required to perform the LCC analyses, as the interface with IMAT is noninteractive.
A new method for computing the reliability of consecutive k-out-of-n:F systems
NASA Astrophysics Data System (ADS)
Gökdere, Gökhan; Gürcan, Mehmet; Kılıç, Muhammet Burak
2016-01-01
Consecutive k-out-of-n system models have been applied to reliability evaluation in many physical systems, such as those encountered in telecommunications, the design of integrated circuits, microwave relay stations, oil pipeline systems, vacuum systems in accelerators, computer ring networks, and spacecraft relay stations. These systems are characterized as logical connections among the components of the systems placed in lines or circles. In the literature, a great deal of attention has been paid to the study of the reliability evaluation of consecutive k-out-of-n systems. In this paper, we propose a new method to compute the reliability of consecutive k-out-of-n:F systems, with n linearly and circularly arranged components. The proposed method provides a simple way of determining the system failure probability. We also provide R code, based on the proposed method, to compute the reliability of linear and circular systems with a great number of components.
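The paper provides R code; as a language-neutral illustration (not the authors' method), a short dynamic program over the length of the trailing failure run computes the linear-case reliability:

    # DP sketch for a linear consecutive k-out-of-n:F system with i.i.d. components.
    def linear_reliability(n, k, p):
        """P(no run of k consecutive failures among n components), each works w.p. p."""
        q = 1.0 - p
        # state[j] = probability the system still works and the current run of
        # trailing failed components has length j (0 <= j < k)
        state = [1.0] + [0.0] * (k - 1)
        for _ in range(n):
            nxt = [0.0] * k
            nxt[0] = p * sum(state)          # a working component resets the run
            for j in range(k - 1):
                nxt[j + 1] = q * state[j]    # a failure extends the run; a run of k is system failure
            state = nxt
        return sum(state)

    print(linear_reliability(10, 3, 0.9))    # e.g., 10 components, fails on 3 in a row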
Platform Architecture for Decentralized Positioning Systems.
Kasmi, Zakaria; Norrdine, Abdelmoumen; Blankenbach, Jörg
2017-04-26
A platform architecture for positioning systems is essential for the realization of a flexible localization system which interacts with other systems and supports various positioning technologies and algorithms. The decentralized processing of a position enables pushing the application-level knowledge into a mobile station and avoids communication with a central unit such as a server or a base station. In addition, the calculation of the position on low-cost and resource-constrained devices presents a challenge due to limited computing power, storage capacity, and power supply. Therefore, we propose a platform architecture that enables the design of a system with reusability of the components, extensibility (e.g., with other positioning technologies), and interoperability. Furthermore, the position is computed on a low-cost device such as a microcontroller, which simultaneously performs additional tasks such as data collection or preprocessing based on an operating system. The platform architecture is designed, implemented, and evaluated on the basis of two positioning systems: a field-strength system and a time of arrival-based positioning system.
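As a small illustration of the time-of-arrival case, a Gauss-Newton least-squares solve recovers a 2-D position from ranges to fixed anchors; the geometry and measurements below are invented, and the platform's actual algorithms may differ:

    # Gauss-Newton multilateration from ranges to known anchors (illustrative).
    import numpy as np

    anchors = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0], [10.0, 10.0]])
    ranges = np.array([7.07, 7.07, 7.07, 7.07])  # measured distances, meters

    x = np.array([1.0, 1.0])                     # initial position guess
    for _ in range(10):                          # Gauss-Newton iterations
        diff = x - anchors                       # (4, 2)
        dist = np.linalg.norm(diff, axis=1)
        J = diff / dist[:, None]                 # Jacobian of range w.r.t. position
        r = dist - ranges                        # residuals
        x -= np.linalg.lstsq(J, r, rcond=None)[0]
    print(x)                                     # converges near (5, 5)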
Stuck on Screens: Patterns of Computer and Gaming Station Use in Youth Seen in a Psychiatric Clinic
Baer, Susan; Bogusz, Elliot; Green, David A.
2011-01-01
Objective: Computer and gaming-station use has become entrenched in the culture of our youth. Parents of children with psychiatric disorders report concerns about overuse, but research in this area is limited. The goal of this study is to evaluate computer/gaming-station use in adolescents in a psychiatric clinic population and to examine the relationship between use and functional impairment. Method: 102 adolescents, ages 11–17, from out-patient psychiatric clinics participated. Amount of computer/gaming-station use, type of use (gaming or non-gaming), and presence of addictive features were ascertained along with emotional/functional impairment. Multivariate linear regression was used to examine correlations between patterns of use and impairment. Results: Mean screen time was 6.7±4.2 hrs/day. Presence of addictive features was positively correlated with emotional/functional impairment. Time spent on computer/gaming-station use was not correlated overall with impairment after controlling for addictive features, but non-gaming time was positively correlated with risky behavior in boys. Conclusions: Youth with psychiatric disorders are spending much of their leisure time on the computer/gaming-station and a substantial subset show addictive features of use which is associated with impairment. Further research to develop measures and to evaluate risk is needed to identify the impact of this problem. PMID:21541096
Using computer graphics to enhance astronaut and systems safety
NASA Technical Reports Server (NTRS)
Brown, J. W.
1985-01-01
Computer graphics is being employed at the NASA Johnson Space Center as a tool to perform rapid, efficient and economical analyses for man-machine integration, flight operations development and systems engineering. The Operator Station Design System (OSDS), a computer-based facility featuring a highly flexible and versatile interactive software package, PLAID, is described. This unique evaluation tool, with its expanding data base of Space Shuttle elements, various payloads, experiments, crew equipment and man models, supports a multitude of technical evaluations, including spacecraft and workstation layout, definition of astronaut visual access, flight techniques development, cargo integration and crew training. As OSDS is being applied to the Space Shuttle, Orbiter payloads (including the European Space Agency's Spacelab) and future space vehicles and stations, astronaut and systems safety are being enhanced. Typical OSDS examples are presented. By performing physical and operational evaluations during early conceptual phases, supporting systems verification for flight readiness, and applying its capabilities to real-time mission support, the OSDS provides the wherewithal to satisfy a growing need of current and future space programs for efficient, economical analyses.
A PC based time domain reflectometer for space station cable fault isolation
NASA Technical Reports Server (NTRS)
Pham, Michael; McClean, Marty; Hossain, Sabbir; Vo, Peter; Kouns, Ken
1994-01-01
Significant problems are faced by astronauts on orbit in the Space Station when trying to locate electrical faults in multi-segment avionics and communication cables. These problems necessitate the development of an automated portable device that will detect and locate cable faults using the pulse-echo technique known as Time Domain Reflectometry. A breadboard time domain reflectometer (TDR) circuit board was designed and developed at the NASA-JSC. The TDR board works in conjunction with a GRiD laptop computer to automate the fault detection and isolation process. A software program was written to automatically display the nature and location of any possible faults. The breadboard system can isolate open circuit and short circuit faults within two feet in a typical space station cable configuration. Follow-on efforts planned for 1994 will produce a compact, portable prototype Space Station TDR capable of automated switching in multi-conductor cables for high fidelity evaluation. This device has many possible commercial applications, including commercial and military aircraft avionics, cable TV, telephone, communication, information and computer network systems. This paper describes the principle of time domain reflectometry and the methodology for on-orbit avionics utility distribution system repair, utilizing the newly developed device called the Space Station Time Domain Reflectometer (SSTDR).
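The pulse-echo arithmetic behind the device is compact enough to show directly; the velocity factor below is an assumption for a typical cable, not the SSTDR's calibration:

    # TDR fault location from the round-trip echo delay (illustrative values).
    C = 299_792_458.0               # speed of light, m/s

    def fault_distance_m(echo_delay_s, velocity_factor=0.7):
        """Round-trip echo delay -> one-way distance along the cable."""
        return velocity_factor * C * echo_delay_s / 2.0

    # A 100 ns round-trip echo on a cable with VF ~0.7 puts the fault ~10.5 m out.
    print(fault_distance_m(100e-9))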
Trans-African Hydro-Meteorological Observatory
NASA Astrophysics Data System (ADS)
van de Giesen, N.; Andreini, M.; Selker, J.
2009-04-01
Our computing capacity to model hydrological processes is such that we can readily model every hectare of the globe's surface in real time. Satellites provide us with important state observations that allow us to calibrate our models and estimate model errors. Still, ground observations will remain necessary to obtain data that cannot readily be observed from space. Hydro-meteorological data availability is particularly scarce in Africa. This presentation launches a simple idea by which Africa can leapfrog into a new era of closely knit environmental observation networks. The basic idea is the design of a robust measurement station based on the smart use of new sensors without moving parts. For example, instead of using a Eu 5000 long-wave pyrgeometer, a factory-calibrated IR sensor of the kind used in microwave ovens is used that costs less than Eu 10. In total, each station should cost Eu 200 or less. Every 30 km, one station will be installed, equivalent to 20,000 stations for all of sub-Saharan Africa. The roll-out will follow the XO project (the "$100 computer") and focus on high schools. The stations will be accompanied by an educational package that allows high school children to learn about their environment, measurements, electronics, and mathematical modeling. Total program costs lie around MEu 18.
NASA Technical Reports Server (NTRS)
Gregg, Hugh; Healey, Kathleen; Hack, Edmund; Wong, Carla
1987-01-01
Traditional expert systems, such as diagnostic and training systems, interact with users only through a keyboard and screen, and are usually symbolic in nature. Expert systems that require access to data bases, complex simulations, and real-time instrumentation have both symbolic and algorithmic computing needs. These needs could both be met using a general-purpose workstation running both symbolic and algorithmic code, or separate, specialized computers networked together. The latter approach was chosen to implement TEXSYS, the thermal expert system developed by NASA Ames Research Center in conjunction with Johnson Space Center to demonstrate the ability of an expert system to autonomously monitor the thermal control system of the space station. TEXSYS has been implemented on a Symbolics workstation, and will be linked to a microVAX computer that will control a thermal test bed. This paper explores the integration options and presents several possible solutions.
Goldstone (GDSCC) administrative computing
NASA Technical Reports Server (NTRS)
Martin, H.
1981-01-01
The GDSCC Data Processing Unit provides various administrative computing services for Goldstone. These activities, including finance, manpower and station utilization, deep-space station scheduling, and engineering change order (ECO) control, are discussed.
NASA Technical Reports Server (NTRS)
Hwu, Shian U.; Kelley, James S.; Panneton, Robert B.; Arndt, G. Dickey
1995-01-01
In order to estimate the RF radiation hazards to astronauts and electronics equipment due to various Space Station transmitters, the electric fields around the various Space Station antennas are computed using rigorous Computational Electromagnetics (CEM) techniques. The Method of Moments (MoM) was applied to the UHF and S-band low-gain antennas. The Aperture Integration (AI) method and the Geometrical Theory of Diffraction (GTD) method were used to compute the electric field intensities for the S- and Ku-band high-gain antennas. As a result of this study, the regions in which the electric fields exceed the specified exposure levels for the Extravehicular Mobility Unit (EMU) electronics equipment and Extravehicular Activity (EVA) astronauts are identified for various Space Station transmitters.
Morlock, Scott E.; Nguyen, Hieu T.; Ross, Jerry H.
2002-01-01
It is feasible to use acoustic Doppler velocity meters (ADVMs) installed at U.S. Geological Survey (USGS) streamflow-gaging stations to compute records of river discharge. ADVMs are small acoustic current meters that use the Doppler principle to measure water velocities in a two-dimensional plane. Records of river discharge can be computed from stage and ADVM velocity data using the 'index velocity' method. The ADVM-measured velocities are used as an estimator, or 'index', of the mean velocity in the channel. To evaluate ADVMs for the computation of records of river discharge, the USGS installed ADVMs at three streamflow-gaging stations in Indiana: Kankakee River at Davis, Fall Creek at Millersville, and Iroquois River near Foresman. The ADVM evaluation study period was from June 1999 to February 2001. Discharge records were computed using ADVM data from each station. Discharge records also were computed using conventional stage-discharge methods of the USGS. The records produced from ADVM and conventional methods were compared with discharge-record hydrographs and statistics. Overall, the records from the Kankakee River and Fall Creek stations compared closely. For the Iroquois River station, variable backwater was present and affected the comparison; because the ADVM record compensates for backwater, the ADVM record may be superior to the conventional record. For the three stations, the ADVM records were judged to be of a quality acceptable to USGS standards for publication, and near-real-time ADVM-computed discharges are served on USGS real-time data World Wide Web pages.
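A toy version of the index-velocity computation, with invented rating coefficients and a rectangular-channel stage-area rating standing in for the station ratings:

    # 'Index velocity' discharge: Q = area(stage) * mean_velocity(index velocity).
    def mean_velocity(v_index, a=0.05, b=0.95):
        """Index-velocity rating: channel mean velocity from the ADVM velocity."""
        return a + b * v_index

    def channel_area(stage_m, width_m=30.0):
        """Stage-area rating; a plain rectangular channel is assumed here."""
        return width_m * stage_m

    def discharge(stage_m, v_index):
        return channel_area(stage_m) * mean_velocity(v_index)   # m^3/s

    print(discharge(2.0, 0.6))  # 60 m^2 * 0.62 m/s = 37.2 m^3/s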
A Collection of Technical Papers
NASA Technical Reports Server (NTRS)
1995-01-01
Papers presented at the 6th Space Logistics Symposium covered such areas as: The International Space Station; The Hubble Space Telescope; Launch site computer simulation; Integrated logistics support; The Baikonur Cosmodrome; Probabilistic tools for high confidence repair; A simple space station rescue vehicle; Integrated Traffic Model for the International Space Station; Packaging the maintenance shop; Leading edge software support; Storage information management system; Consolidated maintenance inventory logistics planning; Operation concepts for a single stage to orbit vehicle; Mission architecture for human lunar exploration; Logistics of a lunar based solar power satellite scenario; Just in time in space; NASA acquisitions/logistics; Effective transition management; Shuttle logistics; and Revitalized space operations through total quality control management.
Ground-based search for the brightest transiting planets with the Multi-site All-Sky CAmeRA: MASCARA
NASA Astrophysics Data System (ADS)
Snellen, Ignas A. G.; Stuik, Remko; Navarro, Ramon; Bettonvil, Felix; Kenworthy, Matthew; de Mooij, Ernst; Otten, Gilles; ter Horst, Rik; le Poole, Rudolf
2012-09-01
The Multi-site All-sky CAmeRA MASCARA is an instrument concept consisting of several stations across the globe, with each station containing a battery of low-cost cameras to monitor the near-entire sky at each location. Once all stations have been installed, MASCARA will be able to provide a nearly 24-hr coverage of the complete dark sky, down to magnitude 8, at sub-minute cadence. Its purpose is to find the brightest transiting exoplanet systems, expected in the V=4-8 magnitude range - currently not probed by space- or ground-based surveys. The bright/nearby transiting planet systems, which MASCARA will discover, will be the key targets for detailed planet atmosphere observations. We present studies on the initial design of a MASCARA station, including the camera housing, domes, and computer equipment, and on the photometric stability of low-cost cameras showing that a precision of 0.3-1% per hour can be readily achieved. We plan to roll out the first MASCARA station before the end of 2013. A 5-station MASCARA can within two years discover up to a dozen of the brightest transiting planet systems in the sky.
Low-flow characteristics for selected streams in Indiana
Fowler, Kathleen K.; Wilson, John T.
2015-01-01
The management and availability of Indiana’s water resources increase in importance every year. Specifically, information on low-flow characteristics of streams is essential to State water-management agencies. These agencies need low-flow information when working with issues related to irrigation, municipal and industrial water supplies, fish and wildlife protection, and the dilution of waste. Industrial, municipal, and other facilities must obtain National Pollutant Discharge Elimination System (NPDES) permits if their discharges go directly to surface waters. The Indiana Department of Environmental Management (IDEM) requires low-flow statistics in order to administer the NPDES permit program. Low-flow-frequency characteristics were computed for 272 continuous-record stations. The information includes low-flow-frequency analysis, flow-duration analysis, and harmonic mean for the continuous-record stations. For those stations affected by some form of regulation, low-flow frequency curves are based on the longest period of homogeneous record under current conditions. Low-flow-frequency values and harmonic mean flow (if sufficient data were available) were estimated for the 166 partial-record stations. Partial-record stations are ungaged sites where streamflow measurements were made at base flow.
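For readers unfamiliar with the statistics named above, the following sketch computes a 7Q10-style low flow and the harmonic mean flow; a simple lognormal quantile stands in for the log-Pearson Type III fit of USGS practice, and zero-flow handling is elided:

    # Low-flow statistics sketch: 7-day 10-year low flow and harmonic mean flow.
    import numpy as np
    from scipy import stats

    def seven_q_ten(daily_flows_by_year):
        """Annual minima of 7-day moving-average flow -> 10-year low-flow quantile."""
        mins = []
        for q in daily_flows_by_year:                 # q: one year of daily flows
            seven_day = np.convolve(q, np.ones(7) / 7.0, mode="valid")
            mins.append(seven_day.min())
        logs = np.log(mins)
        # nonexceedance probability 0.1 == 10-year recurrence for low flows
        return float(np.exp(stats.norm.ppf(0.1, loc=logs.mean(), scale=logs.std(ddof=1))))

    def harmonic_mean_flow(daily_flows):
        q = np.asarray(daily_flows, dtype=float)
        q = q[q > 0]                                  # zero flows need separate handling in practice
        return q.size / np.sum(1.0 / q)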
NASA Technical Reports Server (NTRS)
1990-01-01
NASA's Space Station Freedom Program (SSFP) planning efforts have identified a need for a payload training simulator system to serve as both a training facility and as a demonstrator to validate operational concepts. The envisioned MSFC Payload Training Complex (PTC) required to meet this need will train the Space Station payload scientists, station scientists, and ground controllers to operate the wide variety of experiments that will be onboard the Space Station Freedom. The Simulation Computer System (SCS) is the computer hardware, software, and workstations that will support the Payload Training Complex at MSFC. The purpose of this SCS Study is to investigate issues related to the SCS, alternative requirements, simulator approaches, and state-of-the-art technologies to develop candidate concepts and designs.
Space Station Simulation Computer System (SCS) study for NASA/MSFC. Phased development plan
NASA Technical Reports Server (NTRS)
1990-01-01
NASA's Space Station Freedom Program (SSFP) planning efforts have identified a need for a payload training simulator system to serve as both a training facility and as a demonstrator to validate operational concepts. The envisioned MSFC Payload Training Complex (PTC) required to meet this need will train the Space Station payload scientists, station scientists and ground controllers to operate the wide variety of experiments that will be onboard the Space Station Freedom. The Simulation Computer System (SCS) is made up of computer hardware, software, and workstations that will support the Payload Training Complex at MSFC. The purpose of this SCS Study is to investigate issues related to the SCS, alternative requirements, simulator approaches, and state-of-the-art technologies to develop candidate concepts and designs.
NASA Technical Reports Server (NTRS)
1990-01-01
NASA's Space Station Freedom Program (SSFP) planning efforts have identified a need for a payload training simulator system to serve as both a training facility and as a demonstrator to validate operational concepts. The envisioned MSFC Payload Training Complex (PTC) required to meet this need will train the Space Station payload scientists, station scientists, and ground controllers to operate the wide variety of experiments that will be onboard the Space Station Freedom. The Simulation Computer System (SCS) is made up of the computer hardware, software, and workstations that will support the Payload Training Complex at MSFC. The purpose of this SCS Study is to investigate issues related to the SCS, alternative requirements, simulator approaches, and state-of-the-art technologies to develop candidate concepts and designs.
Space Station Simulation Computer System (SCS) study for NASA/MSFC. Operations concept report
NASA Technical Reports Server (NTRS)
1990-01-01
NASA's Space Station Freedom Program (SSFP) planning efforts have identified a need for a payload training simulator system to serve as both a training facility and as a demonstrator to validate operational concepts. The envisioned MSFC Payload Training Complex (PTC) required to meet this need will train the Space Station payload scientists, station scientists, and ground controllers to operate the wide variety of experiments that will be onboard the Space Station Freedom. The Simulation Computer System (SCS) is made up of computer hardware, software, and workstations that will support the Payload Training Complex at MSFC. The purpose of this SCS Study is to investigate issues related to the SCS, alternative requirements, simulator approaches, and state-of-the-art technologies to develop candidate concepts and designs.
47 CFR 80.771 - Method of computing coverage.
Code of Federal Regulations, 2012 CFR
2012-10-01
Title 47, Telecommunication; Federal Communications Commission (continued); Safety and Special Radio Services; Stations in the Maritime Services; Standards for Computing Public Coast Station VHF Coverage; § 80.771 Method of computing coverage.
Raingauge-Based Rainfall Nowcasting with Artificial Neural Network
NASA Astrophysics Data System (ADS)
Liong, Shie-Yui; He, Shan
2010-05-01
Rainfall forecasting and nowcasting are of great importance, for instance, in real-time flood early warning systems. Long-term rainfall forecasting demands global climate, land, and sea data; thus, large computing power and storage capacity are required. Rainfall nowcasting's computing requirement, on the other hand, is much smaller. Rainfall nowcasting may use data captured by radar and/or weather stations. This paper presents the application of an Artificial Neural Network (ANN) to rainfall nowcasting using data observed at weather and/or rainfall stations. The study focuses on the North-East monsoon period (December, January, and February) in Singapore. Rainfall and weather data from ten stations, between 2000 and 2006, were selected and divided into three groups for training, over-fitting testing, and validation of the ANN. Several neural network architectures were tried in the study. Two architectures, a backpropagation ANN and a Group Method of Data Handling ANN, yielded better rainfall nowcasts, up to two hours ahead, than the other architectures. The obtained rainfall nowcasts were then used by a catchment model to forecast catchment runoff. The results of the runoff forecasts are encouraging and promising. With the ANN's high computational speed, the proposed approach may be suitable for real-time flood early warning systems.
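A minimal stand-in for the backpropagation-ANN nowcast described above, using scikit-learn; the ten-station inputs and one-hour-ahead target are synthetic placeholders:

    # Backpropagation-ANN nowcast sketch (scikit-learn); data are synthetic.
    import numpy as np
    from sklearn.neural_network import MLPRegressor

    rng = np.random.default_rng(0)
    X = rng.random((500, 10))          # last-hour rainfall at 10 stations (normalized)
    y = X.mean(axis=1) + 0.1 * rng.standard_normal(500)   # stand-in target: rain 1 h ahead

    model = MLPRegressor(hidden_layer_sizes=(20,), max_iter=2000, random_state=0)
    model.fit(X[:400], y[:400])        # train on one block, hold out the rest
    print(model.score(X[400:], y[400:]))   # R^2 on the held-out block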
Regional ionospheric model for improvement of navigation position with EGNOS
NASA Astrophysics Data System (ADS)
Swiatek, Anna; Tomasik, Lukasz; Jaworski, Leszek
The problem of insufficient accuracy of the EGNOS correction for the territory of Poland, located at the edge of the EGNOS range, is well known. The EEI PECS project (EGNOS EUPOS Integration) aimed to improve the EGNOS correction by using GPS observations from Polish ASG-EUPOS stations. An ionospheric delay parameter is part of the EGNOS correction. A comparative analysis of TEC values obtained from EGNOS and from regional permanent GNSS stations showed a systematic shift: the TEC from the EGNOS correction is underestimated relative to the computed regional TEC value. New 'improved' corrections computed from the regional model were substituted for the EGNOS corrections in the appropriate messages. Dynamic measurements made using the Mobile GPS Laboratory (MGL) showed improved navigation positions with the regional TEC model.
Thors, Björn; Thielens, Arno; Fridén, Jonas; Colombi, Davide; Törnevik, Christer; Vermeeren, Günter; Martens, Luc; Joseph, Wout
2014-05-01
In this paper, different methods for practical numerical radio frequency exposure compliance assessments of radio base station products were investigated. Both multi-band base station antennas and antennas designed for multiple input multiple output (MIMO) transmission schemes were considered. For the multi-band case, various standardized assessment methods were evaluated in terms of resulting compliance distance with respect to the reference levels and basic restrictions of the International Commission on Non-Ionizing Radiation Protection. Both single frequency and multiple frequency (cumulative) compliance distances were determined using numerical simulations for a mobile communication base station antenna transmitting in four frequency bands between 800 and 2600 MHz. The assessments were conducted in terms of root-mean-squared electromagnetic fields, whole-body averaged specific absorption rate (SAR) and peak 10 g averaged SAR. In general, assessments based on peak field strengths were found to be less computationally intensive, but lead to larger compliance distances than spatial averaging of electromagnetic fields used in combination with localized SAR assessments. For adult exposure, the results indicated that even shorter compliance distances were obtained by using assessments based on localized and whole-body SAR. Numerical simulations, using base station products employing MIMO transmission schemes, were performed as well and were in agreement with reference measurements. The applicability of various field combination methods for correlated exposure was investigated, and best estimate methods were proposed. Our results showed that field combining methods generally considered as conservative could be used to efficiently assess compliance boundary dimensions of single- and dual-polarized multicolumn base station antennas with only minor increases in compliance distances. © 2014 Wiley Periodicals, Inc.
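The simplest far-field estimate behind such compliance distances can be written in a few lines; this free-space, single-frequency sketch ignores the spatial averaging and SAR-based refinements the paper evaluates:

    # Free-space compliance distance from power density P*G/(4*pi*d^2) (illustrative).
    import math

    def compliance_distance_m(p_tx_w, gain_lin, s_limit_w_m2):
        """Distance where the far-field power density falls to the exposure limit."""
        return math.sqrt(p_tx_w * gain_lin / (4.0 * math.pi * s_limit_w_m2))

    # 40 W into an 18 dBi panel against the ICNIRP general-public level at
    # 900 MHz (f/200 ~ 4.5 W/m^2) gives roughly 6.7 m.
    print(compliance_distance_m(40.0, 10**(18 / 10), 4.5))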
Ward-Garrison, Christian; Markstrom, Steven L.; Hay, Lauren E.
2009-01-01
The U.S. Geological Survey Downsizer is a computer application that selects, downloads, verifies, and formats station-based time-series data for environmental-resource models, particularly the Precipitation-Runoff Modeling System. Downsizer implements the client-server software architecture. The client presents a map-based, graphical user interface that is intuitive to modelers; the server provides streamflow and climate time-series data from over 40,000 measurement stations across the United States. This report is the Downsizer user's manual and provides (1) an overview of the software design, (2) installation instructions, (3) a description of the graphical user interface, (4) a description of selected output files, and (5) troubleshooting information.
47 CFR 73.185 - Computation of interfering signal.
Code of Federal Regulations, 2010 CFR
2010-10-01
Title 47, Telecommunication; Federal Communications Commission; Radio Broadcast Services; AM Broadcast Stations; § 73.185 Computation of interfering signal. (a) Measured ... paragraphs (a)(1) or (2) of this section. (b) For skywave signals from stations operating on all channels ...
47 CFR 73.185 - Computation of interfering signal.
Code of Federal Regulations, 2011 CFR
2011-10-01
Title 47, Telecommunication; Federal Communications Commission; Radio Broadcast Services; AM Broadcast Stations; § 73.185 Computation of interfering signal. (a) Measured ... paragraphs (a)(1) or (2) of this section. (b) For skywave signals from stations operating on all channels ...
47 CFR 73.185 - Computation of interfering signal.
Code of Federal Regulations, 2014 CFR
2014-10-01
Title 47, Telecommunication; Federal Communications Commission; Radio Broadcast Services; AM Broadcast Stations; § 73.185 Computation of interfering signal. (a) Measured ... paragraphs (a)(1) or (2) of this section. (b) For skywave signals from stations operating on all channels ...
47 CFR 73.185 - Computation of interfering signal.
Code of Federal Regulations, 2013 CFR
2013-10-01
Title 47, Telecommunication; Federal Communications Commission; Radio Broadcast Services; AM Broadcast Stations; § 73.185 Computation of interfering signal. (a) Measured ... paragraphs (a)(1) or (2) of this section. (b) For skywave signals from stations operating on all channels ...
47 CFR 73.185 - Computation of interfering signal.
Code of Federal Regulations, 2012 CFR
2012-10-01
Title 47, Telecommunication; Federal Communications Commission; Radio Broadcast Services; AM Broadcast Stations; § 73.185 Computation of interfering signal. (a) Measured ... paragraphs (a)(1) or (2) of this section. (b) For skywave signals from stations operating on all channels ...
Isostatic gravity map of the Nevada Test Site and vicinity, Nevada
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ponce, D.A.; Harris, R.N.; Oliver, H.W.
1988-12-31
The isostatic gravity map of the Nevada Test Site (NTS) and vicinity is based on about 16,000 gravity stations. Principal facts of the gravity data were listed by Harris and others (1989), and their report included descriptions of base stations, high-precision and absolute gravity stations, and data accuracy. Observed gravity values were referenced to the International Gravity Standardization Net 1971 gravity datum described by Morelli (1974) and reduced using the Geodetic Reference System 1967 formula for the normal gravity on the ellipsoid (International Union of Geodesy and Geophysics, 1971). Free-air, Bouguer, curvature, and terrain corrections for a standard reduction density of 2.67 g/cm³ were made to compute complete Bouguer anomalies. Terrain corrections were made to a radial distance of 166.7 km from each station using a digital elevation model and a computer procedure by Plouff (1977) and, in general, include manually estimated inner-zone terrain corrections. Finally, isostatic corrections were made using a procedure by Simpson and others (1983) based on an Airy-Heiskanen model with local compensation (Heiskanen and Moritz, 1967) with an upper-crustal density of 2.67 g/cm³, a crustal thickness of 25 km, and a density contrast between the lower crust and upper mantle of 0.4 g/cm³. Isostatic corrections help remove the effects of long-wavelength anomalies related to topography and their compensating masses and, thus, enhance short- to moderate-wavelength anomalies caused by near-surface geologic features. 6 refs.
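The reduction chain summarized above, in compact form with standard constants (curvature and isostatic corrections omitted for brevity); the station values are invented:

    # Complete Bouguer anomaly sketch (mGal, meters); simplified reduction chain.
    def complete_bouguer_anomaly(g_obs, g_normal, elev_m, terrain_corr, density=2.67):
        free_air = 0.3086 * elev_m               # free-air correction, mGal
        slab = 0.04193 * density * elev_m        # Bouguer slab (2*pi*G*rho*h), mGal
        return g_obs - g_normal + free_air - slab + terrain_corr

    # Hypothetical station: observed and normal gravity in mGal, 1200 m elevation.
    print(complete_bouguer_anomaly(979000.0, 979150.0, 1200.0, 2.5))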
Predicting Sets and Lists: Theory and Practice
2015-01-01
Fragmentary text only; the recoverable hardware description mentions an Ardupilot unit aided by a Microstrain 3DM-GX3-25 IMU for real-time pose estimation, PlayStation Eye cameras (640x480 @ 30 Hz), a Bumblebee stereo camera, and an onboard ARM-based Linux computer.
2014-07-25
ISS040-E-079083 (25 July 2014) --- In the International Space Station's Kibo laboratory, NASA astronaut Steve Swanson, Expedition 40 commander, enters data in a computer in preparation for a session with a trio of soccer-ball-sized robots known as the Synchronized Position Hold, Engage, Reorient, Experimental Satellites, or SPHERES. The free-flying robots were equipped with stereoscopic goggles called the Visual Estimation and Relative Tracking for Inspection of Generic Objects, or VERTIGO, to enable the SPHERES to perform relative navigation based on a 3D model of a target object.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Merchant, Bion J.
2015-08-01
NetMOD (Network Monitoring for Optimal Detection) is a Java-based software package for conducting simulation of seismic, hydroacoustic, and infrasonic networks. Network simulations have long been used to study network resilience to station outages and to determine where additional stations are needed to reduce monitoring thresholds. NetMOD makes use of geophysical models to determine the source characteristics, signal attenuation along the path between the source and station, and the performance and noise properties of the station. These geophysical models are combined to simulate the relative amplitudes of signal and noise that are observed at each of the stations. From these signal-to-noise ratios (SNR), the probability of detection can be computed given a detection threshold. This document describes the parameters that are used to configure the NetMOD tool and the input and output parameters that make up the simulation definitions.
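One common way to turn a simulated SNR into a detection probability, shown as a hedged sketch (NetMOD's actual statistical model may differ):

    # Gaussian detection-probability model: station noise scatters the observed
    # SNR around its predicted value; threshold and sigma are assumptions.
    import math

    def detection_probability(snr_db, threshold_db=10.0, sigma_db=3.0):
        z = (snr_db - threshold_db) / sigma_db
        return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

    print(detection_probability(13.0))   # ~84% when one sigma above threshold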
Sparse Matrix Motivated Reconstruction of Far-Field Radiation Patterns
2015-03-01
Fragmentary text only; the recoverable content describes a reconstruction algorithm based on sparse representations of far-field radiation patterns using the inverse Discrete Fourier Transform (DFT) and the inverse Discrete Cosine Transform, and cites a Model-Based Parameter Estimation (MBPE) technique that reduces the computational time required to model radiation patterns, as well as prior work on pattern reconstruction for base-station antennas.
NASA Astrophysics Data System (ADS)
Park, J. H.; Chi, H. C.; Lim, I. S.; Seong, Y. J.; Pak, J.
2017-12-01
During the first phase of EEW (Earthquake Early Warning) service to the public by KMA (Korea Meteorological Administration), beginning in 2015 in Korea, KIGAM (Korea Institute of Geoscience and Mineral Resources) adopted ElarmS2 of UC Berkeley BSL and modified the local magnitude relation, travel-time curves, and association procedures with a so-called TrigDB back-filling method. The TrigDB back-filling method uses a database of lists of stations sorted by epicentral distance for pre-defined events located on a grid of 1,401 × 1,601 = 2,243,001 points around the Korean Peninsula at a spacing of 0.05 degrees. When the version of an event is updated, the TrigDB back-filling method is invoked. First, the grid point closest to the epicenter of the event is chosen from the database, and candidate stations, which are stations corresponding to the chosen grid point and also adjacent to the already-associated stations, are selected. Second, the directions from the chosen grid point to the associated stations are averaged to represent the direction of wave propagation, which is used as a reference for computing apparent travel times. The apparent travel times for the associated stations are computed using a P-wave velocity of 5.5 km/s from the grid point to the projected points along the reference direction. The travel times for the triggered candidate stations are also computed and used to obtain the difference between the apparent travel times of the associated stations and the triggered candidates. Finally, if the difference in the apparent travel times is less than that of the arrival times, the method forces the triggered candidate station to be associated with the event and updates the event location. This method is useful for reducing false locations of events that could be generated by deep (>500 km) and regional-distance earthquakes occurring on the Pacific plate subduction boundaries. In a case-study comparison between the system with TrigDB back-filling and the others, more reliable results were obtained in the early stage of version updating through forced association of the neighboring stations.
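A condensed sketch of the back-filling decision described above, in Python with flat-earth kilometer coordinates; the acceptance rule and geometry are simplified from the text:

    # TrigDB back-filling check (simplified): compare apparent-travel-time
    # differences against arrival-time differences for a triggered candidate.
    import numpy as np

    VP = 5.5  # km/s, P-wave speed used for apparent travel times

    def should_force_association(grid_xy, assoc_xy, assoc_arrivals, cand_xy, cand_arrival):
        """Force-associate a candidate if apparent-travel-time differences
        are explained by the observed arrival-time differences."""
        dirs = assoc_xy - grid_xy
        ref = dirs.mean(axis=0)
        ref /= np.linalg.norm(ref)                  # mean propagation direction
        # apparent travel time = distance projected onto the reference direction / Vp
        t_assoc = (assoc_xy - grid_xy) @ ref / VP
        t_cand = (cand_xy - grid_xy) @ ref / VP
        dt_apparent = np.abs(t_assoc - t_cand)
        dt_arrival = np.abs(np.asarray(assoc_arrivals) - cand_arrival)
        return bool(np.all(dt_apparent <= dt_arrival))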
Computer/gaming station use in youth: Correlations among use, addiction and functional impairment
Baer, Susan; Saran, Kelly; Green, David A
2012-01-01
OBJECTIVE: Computer/gaming station use is ubiquitous in the lives of youth today. Overuse is a concern, but it remains unclear whether problems arise from addictive patterns of use or simply excessive time spent on use. The goal of the present study was to evaluate computer/gaming station use in youth and to examine the relationship between amounts of use, addictive features of use and functional impairment. METHOD: A total of 110 subjects (11 to 17 years of age) from local schools participated. Time spent on television, video gaming and non-gaming recreational computer activities was measured. Addictive features of computer/gaming station use were ascertained, along with emotional/behavioural functioning. Multiple linear regressions were used to understand how youth functioning varied with time of use and addictive features of use. RESULTS: Mean (± SD) total screen time was 4.5±2.4 h/day. Addictive features of use were consistently correlated with functional impairment across multiple measures and informants, whereas time of use, after controlling for addiction, was not. CONCLUSIONS: Youth are spending many hours each day in front of screens. In the absence of addictive features of computer/gaming station use, time spent is not correlated with problems; however, youth with addictive features of use show evidence of poor emotional/ behavioural functioning. PMID:24082802
Cloud Compute for Global Climate Station Summaries
NASA Astrophysics Data System (ADS)
Baldwin, R.; May, B.; Cogbill, P.
2017-12-01
Global Climate Station Summaries are simple indicators of observational normals which include climatic data summarizations and frequency distributions. These typically are statistical analyses of station data over 5-, 10-, 20-, 30-year or longer time periods. The summaries are computed from the global surface hourly dataset. This dataset, totaling over 500 gigabytes, comprises 40 different types of weather observations from 20,000 stations worldwide. NCEI and the U.S. Navy developed these value-added products in the form of hourly summaries from many of these observations. Enabling this compute functionality in the cloud is the focus of the project. An overview of the approach and challenges associated with application transition to the cloud will be presented.
INTERIOR; VIEW OF ANTENNA TRUNK OPENING AND ENTRY DOOR, LOOKING ...
INTERIOR; VIEW OF ANTENNA TRUNK OPENING AND ENTRY DOOR, LOOKING EAST SOUTHEAST. - Naval Computer & Telecommunications Area Master Station, Eastern Pacific, Radio Transmitter Facility Lualualei, Helix House No. 2, Base of Radio Antenna Structure No. 427, Makaha, Honolulu County, HI
A modern control theory based algorithm for control of the NASA/JPL 70-meter antenna axis servos
NASA Technical Reports Server (NTRS)
Hill, R. E.
1987-01-01
A digital computer-based state variable controller was designed and applied to the 70-m antenna axis servos. The general equations and structure of the algorithm and provisions for alternate position-error feedback modes to accommodate intertarget slew, encoder-referenced tracking, and precision tracking modes are described. Development of the discrete time domain control model and computation of estimator and control gain parameters based on closed-loop pole-placement criteria are discussed. The new algorithm was successfully implemented and tested in the 70-m antenna at Deep Space Network station 63 in Spain.
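The pole-placement step described above can be illustrated with SciPy; the second-order plant and pole locations below are stand-ins, not the 70-m servo model:

    # Pole placement for state-feedback and estimator gains (SciPy).
    import numpy as np
    from scipy.signal import place_poles

    A = np.array([[0.0, 1.0], [0.0, -0.5]])   # hypothetical position/rate dynamics
    B = np.array([[0.0], [2.0]])
    C = np.array([[1.0, 0.0]])                # position encoder measurement

    K = place_poles(A, B, [-2.0, -3.0]).gain_matrix        # state-feedback gains
    L = place_poles(A.T, C.T, [-8.0, -9.0]).gain_matrix.T  # estimator gains (by duality)
    # Estimator: xhat' = A xhat + B u + L (y - C xhat); control law: u = -K xhat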
Independent calculation of monitor units for VMAT and SPORT
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chen, Xin; Bush, Karl; Ding, Aiping
Purpose: Dose and monitor units (MUs) represent two important facets of a radiation therapy treatment. In current practice, verification of a treatment plan is commonly done in the dose domain, in which a phantom measurement or forward dose calculation is performed to examine the dosimetric accuracy and the MU settings of a given treatment plan. While it is desirable to verify the MU settings directly, a computational framework for obtaining the MU values from a known dose distribution has yet to be developed. This work presents a strategy to independently calculate the MUs from a given dose distribution of volumetric modulated arc therapy (VMAT) and station parameter optimized radiation therapy (SPORT). Methods: The dose at a point can be expressed as a sum of contributions from all the station points (or control points). This relationship forms the basis of the proposed MU verification technique. To proceed, the authors first obtain the matrix elements which characterize the dosimetric contribution of the involved station points by computing the doses at a series of voxels, typically on the prescription surface of the VMAT/SPORT treatment plan, with unit MU settings for all the station points. An in-house Monte Carlo (MC) software is used for the dose matrix calculation. The MUs of the station points are then derived by minimizing the least-squares difference between doses computed by the treatment planning system (TPS) and those computed by MC for the selected set of voxels on the prescription surface. The technique is applied to 16 clinical cases with a variety of energies, disease sites, and TPS dose calculation algorithms. Results: For all plans except the lung cases with large tissue density inhomogeneity, the independently computed MUs agree with those of the TPS to within 2.7% for all the station points. In the dose domain, no significant difference between the MC and Eclipse Anisotropic Analytical Algorithm (AAA) dose distributions is found in terms of isodose contours, dose profiles, gamma index, and dose volume histogram (DVH) for these cases. For the lung cases, the MC-calculated MUs differ significantly from those of the treatment plan computed using AAA. However, the discrepancies are reduced to within 3% when the TPS dose calculation algorithm is switched to a transport equation-based technique (Acuros™). Comparison in the dose domain between the MC and Eclipse AAA/Acuros calculations yields conclusions consistent with the MU calculation. Conclusions: A computational framework relating the MU and dose domains has been established. The framework not only enables verification of the MU values of the involved station points of a VMAT plan directly in the MU domain, but also provides a much needed mechanism to adaptively modify the MU values of the station points in accordance with a specific change in the dose domain.
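The least-squares relationship at the heart of the method fits in a few lines; here a non-negative least-squares solve (an assumption on my part; the paper does not state its solver) recovers MUs from an invented dose matrix:

    # Recover station-point MUs from doses: doses are linear in the MUs.
    import numpy as np
    from scipy.optimize import nnls

    # dose_matrix[i, j]: Monte Carlo dose at voxel i from station point j at 1 MU
    dose_matrix = np.array([[0.8, 0.1, 0.0],
                            [0.3, 0.6, 0.2],
                            [0.0, 0.2, 0.9],
                            [0.1, 0.4, 0.5]])
    d_tps = np.array([95.0, 120.0, 110.0, 98.0])  # TPS doses at the same voxels, cGy

    mu, residual = nnls(dose_matrix, d_tps)       # min ||A mu - d||, with mu >= 0
    print(mu, residual)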
NASA Technical Reports Server (NTRS)
Clancey, William J.; Lee, Pascal; Sierhuis, Maarten; Norvig, Peter (Technical Monitor)
2001-01-01
Living and working on Mars will require model-based computer systems for maintaining and controlling complex life support, communication, transportation, and power systems. This technology must work properly on the first three-year mission, augmenting human autonomy without adding yet more complexity to be diagnosed and repaired. One design method is to work with scientists in analog (Mars-like) settings to understand how they prefer to work, what constraints will be imposed by the Mars environment, and how to ameliorate difficulties. We describe how we are using empirical requirements analysis to prototype model-based tools at a research station in the High Canadian Arctic.
Comparison of liquid rocket engine base region heat flux computations using three turbulence models
NASA Technical Reports Server (NTRS)
Kumar, Ganesh N.; Griffith, Dwaine O., II; Prendergast, Maurice J.; Seaford, C. M.
1993-01-01
The flow in the base region of launch vehicles is characterized by flow separation, flow reversals, and reattachment. Computation of the convective heat flux in the base region and on the nozzle external surface of Space Shuttle Main Engine and Space Transportation Main Engine (STME) is an important part of defining base region thermal environments. Several turbulence models were incorporated in a CFD code and validated for flow and heat transfer computations in the separated and reattaching regions associated with subsonic and supersonic flows over backward facing steps. Heat flux computations in the base region of a single STME engine and a single S1C engine were performed using three different wall functions as well as a renormalization-group based k-epsilon model. With the very limited data available, the computed values are seen to be of the right order of magnitude. Based on the validation comparisons, it is concluded that all the turbulence models studied have predicted the reattachment location and the velocity profiles at various axial stations downstream of the step very well.
Interesting viewpoints to those who will put Ada into practice
NASA Technical Reports Server (NTRS)
Carlsson, Arne
1986-01-01
Ada will most probably be used as the programming language for computers in the NASA Space Station. It is reasonable to suppose that Ada will be used for at least the embedded computers, because the high software costs for these embedded computers were the reason why Ada activities were initiated about ten years ago. The on-board computers are designed for use in space applications, where maintenance by man is impossible. All manipulation of such computers has to be performed autonomously or remotely with commands from the ground. In a manned Space Station some maintenance work can be performed by service people on board, but there are still many applications that require autonomous computers, for example, vital Space Station functions and unmanned orbital transfer vehicles. Aspects arising from the analysis of Ada characteristics, together with experience of the requirements for embedded on-board computers in space applications, are examined.
NASA Technical Reports Server (NTRS)
1989-01-01
The Simulation Computer System (SCS) is the computer hardware, software, and workstations that will support the Payload Training Complex (PTC) at Marshall Space Flight Center (MSFC). The PTC will train the space station payload scientists, station scientists, and ground controllers to operate the wide variety of experiments that will be onboard the Space Station Freedom. In the first step of this task, a methodology was developed to ensure that all relevant design dimensions were addressed and that all feasible designs could be considered. The development effort yielded the following method for generating and comparing designs in task 4: (1) extract SCS system requirements (functions) from the system specification; (2) develop design evaluation criteria; (3) identify system architectural dimensions relevant to SCS system designs; (4) develop conceptual designs based on the system requirements and architectural dimensions identified in steps 1 and 3 above; (5) evaluate the designs with respect to the design evaluation criteria developed in step 2 above. The results of the method detailed in these five steps are discussed. The task 4 work provides the set of designs from which two or three candidate designs are to be selected by MSFC as input to task 5, refine SCS conceptual designs. The designs selected for refinement will be developed to a lower level of detail, and further analyses will be done to begin to determine the size and speed of the components required to implement these designs.
Statistical models for estimating daily streamflow in Michigan
Holtschlag, D.J.; Salehi, Habib
1992-01-01
Statistical models for estimating daily streamflow were analyzed for 25 pairs of streamflow-gaging stations in Michigan. Stations were paired by randomly choosing a station operated in 1989 at which 10 or more years of continuous flow data had been collected and at which flow is virtually unregulated; a nearby station was chosen where flow characteristics are similar. Streamflow data from the 25 randomly selected stations were used as the response variables; streamflow data at the nearby stations were used to generate a set of explanatory variables. Ordinary least-squares regression (OLSR) equations, autoregressive integrated moving-average (ARIMA) equations, and transfer function-noise (TFN) equations were developed to estimate the log transform of flow for the 25 randomly selected stations. The precision of each type of equation was evaluated on the basis of the standard deviation of the estimation errors. OLSR equations produce one set of estimation errors; ARIMA and TFN models each produce l sets of estimation errors corresponding to the forecast lead. The lead-l forecast is the estimate of flow l days ahead of the most recent streamflow used as a response variable in the estimation. In this analysis, the standard deviations of lead-l ARIMA and TFN forecast errors were generally lower than the standard deviation of OLSR errors for l < 2 days and l < 9 days, respectively. Composite estimates were computed as a weighted average of forecasts based on TFN equations and backcasts (forecasts of the reverse-ordered series) based on ARIMA equations. The standard deviation of composite errors varied throughout the length of the estimation interval and generally was at a maximum near the center of the interval. For comparison with OLSR errors, the mean standard deviations of composite errors were computed for intervals of length 1 to 40 days. The mean standard deviation of length-l composite errors was generally less than the standard deviation of the OLSR errors for l < 32 days. In addition, the composite estimates ensure a gradual transition between periods of estimated and measured flows. Model performance among stations of differing model error magnitudes was compared by computing ratios of the mean standard deviation of the length-l composite errors to the standard deviation of OLSR errors. The mean error ratio for the set of 25 selected stations was less than 1 for intervals l < 32 days. Considering the frequency characteristics of the lengths of intervals of estimated record in Michigan, the effective mean error ratio for intervals < 30 days was 0.52. Thus, for intervals of estimation of 1 month or less, the error of the composite estimate is substantially lower than the error of the OLSR estimate.
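A small sketch of the composite-estimate idea, assuming linear weights across the gap (the study derives its weights from the TFN and ARIMA error structures, so this is illustrative only):

    # Composite estimate: blend a forward forecast with a reverse-series backcast.
    import numpy as np

    def composite(forecast, backcast):
        """Shift weight linearly from the forecast end to the backcast end of a gap."""
        forecast, backcast = np.asarray(forecast), np.asarray(backcast)
        L = forecast.size
        w = (np.arange(L) + 1.0) / (L + 1.0)    # 0 -> forecast side, 1 -> backcast side
        return (1.0 - w) * forecast + w * backcast

    print(composite([10.0, 9.5, 9.0], [9.8, 9.4, 8.8]))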
Development of a PC-based ground support system for a small satellite instrument
NASA Astrophysics Data System (ADS)
Deschambault, Robert L.; Gregory, Philip R.; Spenler, Stephen; Whalen, Brian A.
1993-11-01
The importance of effective ground support for the remote control and data retrieval of a satellite instrument cannot be overstated. Problems with ground support may include the need to base personnel at a ground tracking station for extended periods, and the delay between the instrument observation and the processing of the data by the science team. Flexible solutions to such problems in the case of small satellite systems are provided by using low-cost, powerful personal computers and off-the-shelf software for data acquisition and processing, and by using the Internet as a communication pathway to enable scientists to view and manipulate satellite data in real time at any ground location. The personal-computer-based ground support system is illustrated for the case of the cold plasma analyzer flown on the Freja satellite. Commercial software was used as building blocks for writing the ground support equipment software. Several levels of hardware support, including unit tests and development, functional tests, and integration, were provided by portable and desktop personal computers. Satellite stations in Saskatchewan and Sweden were linked to the science team via phone lines and the Internet, which provided remote control through a central point. These successful strategies will be used on future small satellite space programs.
Williams works on computer in the U.S. Laboratory during Expedition 13
2006-04-15
ISS013-E-07975 (15 April 2006) --- Astronaut Jeffrey N. Williams, Expedition 13 NASA space station science officer and flight engineer, uses a computer in the Destiny laboratory of the International Space Station.
Williams uses computer in the U.S. Laboratory during Expedition 13
2006-04-11
ISS013-E-05853 (11 April 2006) --- Astronaut Jeffrey N. Williams, Expedition 13 NASA space station science officer and flight engineer, uses a computer in the Destiny laboratory of the International Space Station.
NASA Technical Reports Server (NTRS)
Gregg, Hugh; Healey, Kathleen; Hack, Edmund; Wong, Carla
1988-01-01
Expert systems that require access to data bases, complex simulations and real time instrumentation have both symbolic and algorithmic needs. Both of these needs could be met using a general purpose workstation running both symbolic and algorithmic codes, or separate, specialized computers networked together. The latter approach was chosen to implement TEXSYS, the thermal expert system, developed by the NASA Ames Research Center in conjunction with the Johnson Space Center to demonstrate the ability of an expert system to autonomously monitor the thermal control system of the space station. TEXSYS has been implemented on a Symbolics workstation, and will be linked to a microVAX computer that will control a thermal test bed. The integration options and several possible solutions are presented.
Synthetic Flight Training System Study
1983-12-23
Report documentation page fragments only; recoverable keywords: visual systems, computer platforms, instructional features, computer hardware and software, student stations, computational systems, visual processing systems, and instructor stations.
Williams uses laptop computer in the U.S. Laboratory taken during Expedition 13
2006-06-22
ISS013-E-40000 (22 June 2006) --- Astronaut Jeffrey N. Williams, Expedition 13 NASA space station science officer and flight engineer, uses a computer in the Destiny laboratory of the International Space Station.
Griffin, Eleanor R.; Wiele, Stephen M.
1996-01-01
A one-dimensional model of unsteady discharge waves was applied to research flows that were released from Glen Canyon Dam in support of the Glen Canyon Environmental Studies. These research flows extended over periods of 11 days during which the discharge followed specific, regular patterns repeated on a daily cycle that were similar to the daily releases for power generation. The model was used to produce discharge hydrographs at 38 selected sites in Marble and Grand Canyons for each of nine unsteady flows released from the dam in 1990 and 1991. In each case, the discharge computed from stage measurements and the associated stage-discharge relation at the streamflow-gaging station just below the dam (09379910 Colorado River below Glen Canyon Dam) was routed to Diamond Creek, which is 386 kilometers downstream. Steady and unsteady tributary inflows downstream from the dam were included in the model calculations. Steady inflow to the river from tributaries downstream from the dam was determined for each case by comparing the steady base flow preceding and following the unsteady flow measured at six streamflow-gaging stations between Glen Canyon Dam and Diamond Creek. During three flow periods, significant unsteady inflow was received from the Paria River, or the Little Colorado River, or both. The amount and timing of unsteady inflow were determined using the discharge computed from records of streamflow-gaging stations on the tributaries. Unsteady flow then was added to the flow calculated by the model at the appropriate location. Hydrographs were calculated using the model at 5 streamflow-gaging stations downstream from the dam and at 33 beach study sites. Accuracy of model results was evaluated by comparing the results to discharge hydrographs computed from the records of the five streamflow-gaging stations between Lees Ferry and Lake Mead. Results show that model predictions of wave speed and shape agree well with data from the five streamflow-gaging stations.
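The superposition of tributary inflow onto the routed mainstem hydrograph can be sketched as below. The hourly series, the Gaussian tributary pulse, and the steady ungaged gain are invented stand-ins; the actual study routed the waves with a one-dimensional unsteady-flow model.

```python
import numpy as np

# Routed mainstem hydrograph at a node below a confluence (48 hourly
# ordinates), plus gaged unsteady tributary inflow and a steady gain.
hours = np.arange(48)
mainstem = 220.0 + 80.0 * np.sin(2 * np.pi * hours / 24) ** 2   # m3/s
tributary = 30.0 * np.exp(-0.5 * ((hours - 15.0) / 2.0) ** 2)   # m3/s pulse
steady_gain = 5.0    # m3/s, steady inflow from ungaged tributaries

downstream = mainstem + tributary + steady_gain
print(f"peak downstream discharge: {downstream.max():.1f} m3/s")
```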
Human computer interface guide, revision A
NASA Technical Reports Server (NTRS)
1993-01-01
The Human Computer Interface Guide, SSP 30540, is a reference document for the information systems within the Space Station Freedom Program (SSFP). The Human Computer Interface Guide (HCIG) provides guidelines for the design of computer software that affects human performance, specifically, the human-computer interface. This document contains an introduction and subparagraphs on SSFP computer systems, users, and tasks; guidelines for interactions between users and the SSFP computer systems; human factors evaluation and testing of the user interface system; and example specifications. The contents of this document are intended to be consistent with the tasks and products to be prepared by NASA Work Package Centers and SSFP participants as defined in SSP 30000, Space Station Program Definition and Requirements Document. The Human Computer Interface Guide shall be implemented on all new SSFP contractual and internal activities and shall be included in any existing contracts through contract changes. This document is under the control of the Space Station Control Board, and any changes or revisions will be approved by the deputy director.
A Distributed Signature Detection Method for Detecting Intrusions in Sensor Systems
Kim, Ilkyu; Oh, Doohwan; Yoon, Myung Kuk; Yi, Kyueun; Ro, Won Woo
2013-01-01
Sensor nodes in wireless sensor networks are easily exposed to open and unprotected regions. A security solution is strongly recommended to protect networks against malicious attacks. Although many intrusion detection systems have been developed, most systems are difficult to implement for the sensor nodes owing to limited computation resources. To address this problem, we develop a novel distributed network intrusion detection system based on the Wu–Manber algorithm. In the proposed system, the algorithm is divided into two steps; the first step is dedicated to a sensor node, and the second step is assigned to a base station. In addition, the first step is modified to achieve efficient performance under limited computation resources. We conduct evaluations with random string sets and actual intrusion signatures to show the performance improvement of the proposed method. The proposed method achieves a speedup factor of 25.96 and reduces packet transmissions to the base station by 43.94% compared with the previously proposed method. The system achieves efficient utilization of the sensor nodes and provides a structural basis of cooperative systems among the sensors. PMID:23529146
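A minimal sketch of the two-step split described in the abstract: the sensor node runs a cheap Wu-Manber-style block-shift scan and forwards only candidate packets, and the base station performs exact signature matching. The block size, the toy signature set, and the function names are illustrative assumptions, not the authors' implementation.

```python
B = 2  # block size for the shift table (assumed)

def build_shift_table(signatures):
    """Wu-Manber-style bad-block shift table over the
    shortest-signature-length prefix of each signature."""
    m = min(len(s) for s in signatures)
    shift = {}
    for s in signatures:
        for i in range(m - B + 1):
            # Shift is the distance from this block to the end of the
            # m-character prefix; 0 marks a candidate match position.
            shift[s[i:i + B]] = min(shift.get(s[i:i + B], m - B + 1),
                                    m - B - i)
    return shift, m

def node_prefilter(packet, shift, m):
    """Step 1 (sensor node): skip-based scan; True means 'possible
    match, forward the packet to the base station'."""
    i = m - B
    while i <= len(packet) - B:
        s = shift.get(packet[i:i + B], m - B + 1)
        if s == 0:
            return True
        i += s
    return False

def base_station_verify(packet, signatures):
    """Step 2 (base station): exact matching on forwarded packets."""
    return [sig for sig in signatures if sig in packet]

signatures = ["attack", "badcmd"]
shift, m = build_shift_table(signatures)
packet = "xxbadcmdyy"
if node_prefilter(packet, shift, m):
    print(base_station_verify(packet, signatures))   # ['badcmd']
```

Only packets passing the cheap first stage cost a radio transmission, which is the mechanism behind the reported reduction in packets sent to the base station.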
Data Recorders Speed Forest Surveys
George C. Keith; Roy C. Beltz
1980-01-01
The Renewable Resources Evaluation Research project of the Southern Forest Experiment Station has adapted computer and communications concepts to conduct and analyze forest surveys. This study describes how data are relayed from remote forest sites to the unit's base without a single handwritten note.
Reaching Out with Business-Related Information.
ERIC Educational Resources Information Center
Erbes, Bill
2000-01-01
Discusses how the Bensenville Community Public Library (Illinois) met the growing needs of the business community by subscribing to Infotrac's Web-based General BusinessFile. Topics include meeting the information needs of business; installing a computer station at the Chamber of Commerce; and grant funding. (LRW)
Antenna pattern control using impedance surfaces
NASA Technical Reports Server (NTRS)
Balanis, Constantine A.; Liu, Kefeng
1992-01-01
During this research period, we have effectively transferred existing computer codes from the CRAY supercomputer to workstation-based systems. The workstation-based version of our code preserved the accuracy of the numerical computations while giving a much better turn-around time than the CRAY supercomputer. This task relieved us of heavy dependence on the supercomputer account budget and made the codes developed in this research project more feasible for applications. The analysis of pyramidal horns with impedance surfaces was our major focus during this research period. Three different modeling algorithms for analyzing lossy impedance surfaces were investigated and compared with measured data. Through this investigation, we discovered that a hybrid Fourier transform technique, which uses the eigenmodes in the stepped waveguide section and the Fourier-transformed field distributions across the stepped discontinuities for lossy impedance coatings, gives better accuracy in analyzing lossy coatings. After further refinement of the present technique, we will perform an accurate radiation pattern synthesis in the coming reporting period.
NASA Astrophysics Data System (ADS)
Sebastian, Nita; Kim, Seongryong; Tkalčić, Hrvoje; Sippl, Christian
2017-04-01
The purpose of this study is to develop an integrated inference on the lithospheric structure of NE China using three passive seismic networks comprising 92 stations. The NE China plain consists of complex lithospheric domains characterised by the co-existence of complex geodynamic processes such as crustal thinning, active intraplate Cenozoic volcanism and low velocity anomalies. To estimate lithospheric structures in greater detail, we chose to perform the joint inversion of independent data sets such as receiver functions and surface wave dispersion curves (group and phase velocity). We perform a joint inversion based on principles of Bayesian transdimensional optimisation techniques (Kim et al., 2016). Unlike in previous studies of NE China, the complexity of the model is determined from the data in the first stage of the inversion, and the data uncertainty is computed based on Bayesian statistics in the second stage of the inversion. The computed crustal properties are retrieved from an ensemble of probable models. We obtain major structural inferences with well constrained absolute velocity estimates, which are vital for inferring properties of the lithosphere and bulk crustal Vp/Vs ratio. The Vp/Vs estimate obtained from joint inversions confirms the high Vp/Vs ratio (~1.98) obtained using the H-Kappa method beneath some stations. Moreover, we confirm the existence of a lower crustal velocity beneath several stations (e.g., station SHS) within the NE China plain. Based on these findings we attempt to identify a plausible origin for the structural complexity. We compile a high-resolution 3D image of the lithospheric architecture of the NE China plain.
Improving Aircraft Refueling Procedures at Naval Air Station Oceana
2012-06-01
We examine aircraft refueling procedures at Naval Air Station (NAS) Oceana, VA, using aircraft waiting time for fuel as a measure of performance, and develop a computer-assisted discrete-event simulation of the refueling process. Abbreviations recoverable from the report fragments: a single-server queue with general interarrival and service time distributions (G/G/1); gpm, gallons per minute; JDK, Java development kit; M/M/1, single-server queue.
JPRS Report, Soviet Union, Foreign Military Review, No. 8, August 1987
1988-01-28
Fragmentary OCR excerpts only: references to the Hinkley Point (1.5 million) and Hartlepool (1.3 million) power stations and recent construction of large pumped-storage hydroelectric plants, and a figure caption listing radio-relay station equipment (antenna, interface equipment, data transmission line terminal, computer, power supply plant control station, and power distribution unit).
Kononenko uses laptop computer in the SM Transfer Compartment
2012-03-21
ISS030-E-161167 (21 March 2012) --- Russian cosmonaut Oleg Kononenko, Expedition 30 flight engineer, uses a computer in the transfer compartment of the International Space Station's Zvezda Service Module. Russia's Zarya module is visible in the background.
Weighted triangulation adjustment
Anderson, Walter L.
1969-01-01
The variation of coordinates method is employed to perform a weighted least squares adjustment of horizontal survey networks. Geodetic coordinates are required for each fixed and adjustable station. A preliminary inverse geodetic position computation is made for each observed line. Weights associated with each observed equation for direction, azimuth, and distance are applied in the formation of the normal equations in the least squares adjustment. The number of normal equations that may be solved is twice the number of new stations and less than 150. When the normal equations are solved, shifts are produced at adjustable stations. Previously computed correction factors are applied to the shifts and a most probable geodetic position is found for each adjustable station. Final azimuths and distances are computed. These may be written onto magnetic tape for subsequent computation of state plane or grid coordinates. Input consists of punch cards containing project identification, program options, and position and observation information. Results listed include preliminary and final positions, residuals, observation equations, solution of the normal equations showing magnitudes of shifts, and a plot of each adjusted and fixed station. During processing, data sets containing irrecoverable errors are rejected and the type of error is listed. The computer then resumes processing of additional data sets. Other conditions cause warning errors to be issued, and processing continues with the current data set.
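A minimal sketch of a variation-of-coordinates adjustment for one new station observed by distances from three fixed stations. The coordinates, simulated observations, and weights are invented, and direction/azimuth observation equations are omitted for brevity.

```python
import numpy as np

# Fixed control stations and an approximate position for one new station.
fixed = np.array([[0.0, 0.0], [100.0, 0.0], [0.0, 100.0]])
approx = np.array([40.0, 30.0])        # approximate new-station coordinates
true = np.array([41.2, 29.1])          # "truth" used only to simulate data

# Simulated distance observations from each fixed station.
obs = np.linalg.norm(true - fixed, axis=1)

# Linearized observation equations: the partials of a distance with
# respect to the new station's coordinates are the direction cosines.
d0 = np.linalg.norm(approx - fixed, axis=1)       # computed distances
A = (approx - fixed) / d0[:, None]                # design matrix
l = obs - d0                                      # misclosure vector
W = np.diag([1.0, 1.0, 0.25])                     # observation weights

# Weighted normal equations (A^T W A) dx = A^T W l, then apply shifts.
dx = np.linalg.solve(A.T @ W @ A, A.T @ W @ l)
print("shifts:", dx, "adjusted position:", approx + dx)
```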
Towards System Calibration of Panoramic Laser Scanners from a Single Station
Medić, Tomislav; Holst, Christoph; Kuhlmann, Heiner
2017-01-01
Terrestrial laser scanner measurements suffer from systematic errors due to internal misalignments. The magnitude of the resulting errors in the point cloud in many cases exceeds the magnitude of random errors. Hence, the task of calibrating a laser scanner is important for applications with high accuracy demands. This paper primarily addresses the case of panoramic terrestrial laser scanners. Herein, it is proven that most of the calibration parameters can be estimated from a single scanner station without a need for any reference information. This hypothesis is confirmed through an empirical experiment, which was conducted in a large machine hall using a Leica Scan Station P20 panoramic laser scanner. The calibration approach is based on the widely used target-based self-calibration approach, with small modifications. A new angular parameterization is used in order to implicitly introduce measurements in two faces of the instrument and for the implementation of calibration parameters describing genuine mechanical misalignments. Additionally, a computationally preferable calibration algorithm based on the two-face measurements is introduced. In the end, the calibration results are discussed, highlighting all necessary prerequisites for the scanner calibration from a single scanner station. PMID:28513548
Floods in Central Texas, September 7-14, 2010
Winters, Karl E.
2012-01-01
Severe flooding occurred near the Austin metropolitan area in central Texas September 7–14, 2010, because of heavy rainfall associated with Tropical Storm Hermine. The U.S. Geological Survey, in cooperation with the Upper Brushy Creek Water Control and Improvement District, determined rainfall amounts and annual exceedance probabilities for rainfall resulting in flooding in Bell, Williamson, and Travis counties in central Texas during September 2010. We documented peak streamflows and the annual exceedance probabilities for peak streamflows recorded at several streamflow-gaging stations in the study area. The 24-hour rainfall total exceeded 12 inches at some locations, with one report of 14.57 inches at Lake Georgetown. Rainfall probabilities were estimated using previously published depth-duration frequency maps for Texas. At 4 sites in Williamson County, the 24-hour rainfall had an annual exceedance probability of 0.002. Streamflow measurement data and flood-peak data from U.S. Geological Survey surface-water monitoring stations (streamflow and reservoir gaging stations) are presented, along with a comparison of September 2010 flood peaks to previous known maximums in the periods of record. Annual exceedance probabilities for peak streamflow were computed for 20 streamflow-gaging stations based on an analysis of streamflow-gaging station records. The annual exceedance probability was 0.03 for the September 2010 peak streamflow at the Geological Survey's streamflow-gaging stations 08104700 North Fork San Gabriel River near Georgetown, Texas, and 08154700 Bull Creek at Loop 360 near Austin, Texas. The annual exceedance probability was 0.02 for the peak streamflow for Geological Survey's streamflow-gaging station 08104500 Little River near Little River, Texas. The lack of similarity in the annual exceedance probabilities computed for precipitation and streamflow might be attributed to the small areal extent of the heaviest rainfall over these and the other gaged watersheds.
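The abstract does not spell out the frequency-analysis procedure; USGS peak-flow studies of this era generally follow Bulletin 17B's log-Pearson Type III method. The sketch below instead uses a simple Weibull plotting position on an invented annual-peak series, just to make the notion of annual exceedance probability concrete.

```python
import numpy as np

# Invented annual-peak series for a hypothetical station (cfs).
peaks = np.array([1200., 950., 3400., 780., 2100., 1500., 6100.,
                  880., 1300., 4200., 990., 2600., 720., 1800.])

def empirical_aep(peaks, q):
    """Annual exceedance probability of a peak of magnitude q by the
    Weibull plotting position: rank / (n + 1), largest peak rank 1."""
    ranked = np.sort(peaks)[::-1]
    rank = int(np.searchsorted(-ranked, -q, side="left")) + 1
    return rank / (len(ranked) + 1)

print(f"AEP of the 6100 cfs peak of record: {empirical_aep(peaks, 6100.):.3f}")
```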
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liu, Jiamin; Hoffman, Joanne; Zhao, Jocelyn
2016-07-15
Purpose: To develop an automated system for mediastinal lymph node detection and station mapping for chest CT. Methods: The contextual organs, trachea, lungs, and spine are first automatically identified to locate the region of interest (ROI) (mediastinum). The authors employ shape features derived from Hessian analysis, local object scale, and circular transformation that are computed per voxel in the ROI. Eight more anatomical structures are simultaneously segmented by multiatlas label fusion. Spatial priors are defined as the relative multidimensional distance vectors corresponding to each structure. Intensity, shape, and spatial prior features are integrated and parsed by a random forest classifier for lymph node detection. The detected candidates are then segmented by the following curve evolution process. Texture features are computed on the segmented lymph nodes and a support vector machine committee is used for final classification. For lymph node station labeling, based on the segmentation results of the above anatomical structures, the textual definitions of the mediastinal lymph node map according to the International Association for the Study of Lung Cancer are converted into a patient-specific color-coded CT image, where the lymph node station can be automatically assigned for each detected node. Results: The chest CT volumes from 70 patients with 316 enlarged mediastinal lymph nodes are used for validation. For lymph node detection, their system achieves 88% sensitivity at eight false positives per patient. For lymph node station labeling, 84.5% of lymph nodes are correctly assigned to their stations. Conclusions: Multiple-channel shape, intensity, and spatial prior features aggregated by a random forest classifier improve mediastinal lymph node detection on chest CT. Using the location information of segmented anatomic structures from the multiatlas formulation enables accurate identification of lymph node stations.
Sea level rise within the west of Arabian Gulf using tide gauge and continuous GPS measurements
NASA Astrophysics Data System (ADS)
Ayhan, M. E.; Alothman, A.
2009-04-01
The Arabian Gulf is connected to the Indian Ocean and located to the south-west of the Zagros Thrust Belt. To investigate sea level variations within the west of the Arabian Gulf, monthly means of sea level at 13 tide gauges along the coast of Saudi Arabia and Bahrain, available in the database of the Permanent Service for Mean Sea Level (PSMSL), are studied. We analyzed the monthly means at each station individually, and estimated the secular sea level rate by robust linear trend fitting. We computed an average relative sea level rise rate of 1.96 ± 0.21 mm/yr within the west of the Arabian Gulf based on 4 stations spanning longer than 19 years. Vertical land motions are included in the relative sea level measurements at the tide gauges. Therefore, sea level rates at the stations are corrected for vertical land motions using the ICE-5G v1.2 VM4 Glacial Isostatic Adjustment (GIA) model, and we then found an average sea level rise rate of 2.27 mm/yr. The Bahrain International GPS Service (IGS) station, which is close to the Mina Sulman tide gauge station in Bahrain, is the only continuous GPS station accessible in the region. The weekly GPS time series of the vertical component at the Bahrain IGS-GPS station, referring to the ITRF97 from 1999.2 to 2008.6, were downloaded from http://www-gps.mit.edu/~tah/. We fitted a linear trend with an annual signal and one break to the GPS vertical time series and found a vertical land motion rate of 0.48 ± 0.11 mm/yr. Assuming the vertical rate at the Bahrain IGS-GPS station represents the vertical rate at each of the other tide gauge stations studied here, we computed an average sea level rise rate of 2.44 ± 0.21 mm/yr within the west of the Arabian Gulf.
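A minimal sketch of the trend estimation described above: a least-squares fit of a linear rate plus an annual cycle to monthly mean sea levels, followed by addition of a vertical-land-motion rate to convert the relative rate to a geocentric one. The synthetic series is invented, and the 0.31 mm/yr correction is simply the difference implied by the abstract (2.27 - 1.96); the study derives these quantities from PSMSL records and the ICE-5G GIA model.

```python
import numpy as np

t = np.arange(240) / 12.0                   # 20 years of monthly epochs, yr
rng = np.random.default_rng(1)
sl = 2.0 * t + 40 * np.sin(2 * np.pi * t) + rng.normal(0, 15, t.size)  # mm

# Design matrix: intercept, trend, and annual sine/cosine terms.
A = np.column_stack([np.ones_like(t), t,
                     np.sin(2 * np.pi * t), np.cos(2 * np.pi * t)])
coef, *_ = np.linalg.lstsq(A, sl, rcond=None)
relative_rate = coef[1]                     # mm/yr relative to the land

gia_vertical_rate = 0.31                    # mm/yr, assumed land-motion rate
print(f"relative: {relative_rate:.2f} mm/yr, "
      f"GIA-corrected: {relative_rate + gia_vertical_rate:.2f} mm/yr")
```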
Satellite freeze forecast system: Executive summary
NASA Technical Reports Server (NTRS)
Martsolf, J. D. (Principal Investigator)
1983-01-01
A satellite-based temperature monitoring and prediction system, consisting of a computer-controlled acquisition, processing, and display system and the ten automated weather stations called by that computer, was developed and transferred to the National Weather Service. This satellite freeze forecasting system (SFFS) acquires satellite data from either of two sources and surface data from 10 sites, displays the observed data in the form of color-coded thermal maps and in tables of automated weather station temperatures, computes predicted thermal maps when requested and displays such maps either automatically or manually, archives the data acquired, and makes comparisons with historical data. Except for the last function, SFFS handles these tasks in a highly automated fashion if the user so directs. The predicted thermal maps are the result of two models: one a physical energy budget of the soil-atmosphere interface, and the other a statistical relationship between the sites at which the physical model predicts temperatures and each of the pixels of the satellite thermal map.
A detailed numerical simulation of a liquid-propellant rocket engine ground test experiment
NASA Astrophysics Data System (ADS)
Lankford, D. W.; Simmons, M. A.; Heikkinen, B. D.
1992-07-01
A computational simulation of a Liquid Rocket Engine (LRE) ground test experiment was performed using two modeling approaches. The results of the models were compared with selected data to assess the validity of state-of-the-art computational tools for predicting the flowfield and radiative transfer in complex flow environments. The data used for comparison consisted of in-band station radiation measurements obtained in the near-field portion of the plume exhaust. The test article was a subscale LRE with an afterbody, resulting in a large base region. The flight conditions were such that afterburning regions were observed in the plume flowfield. A conventional standard modeling approach underpredicted the extent of afterburning and the associated radiation levels. These results were attributed to the absence of the base flow region which is not accounted for in this model. To assess the effects of the base region a Navier-Stokes model was applied. The results of this calculation indicate that the base recirculation effects are dominant features in the immediate expansion region and resulted in a much improved comparison. However, the downstream in-band station radiation data remained underpredicted by this model.
A computer-based maintenance reminder and record-keeping system for clinical laboratories.
Roberts, B I; Mathews, C L; Walton, C J; Frazier, G
1982-09-01
"Maintenance" is all the activity an organization devotes to keeping instruments within performance specifications to assure accurate and precise operation. The increasing use of complex analytical instruments as "workhorses" in clinical laboratories requires more maintenance awareness by laboratory personnel. Record-keeping systems that document maintenance completion and that should prompt the continued performance of maintenance tasks have not kept up with instrumentation development. We report here a computer-based record-keeping and reminder system that lists weekly the maintenance items due for each work station in the laboratory, including the time required to complete each item. Written in BASIC, the system uses a DATABOSS data base management system running on a time-shared Digital Equipment Corporation PDP 11/60 computer with a RSTS V 7.0 operating system.
VIEW OF EAST ELEVATION OF HELIX HOUSE NO. 2 (S87), ...
VIEW OF EAST ELEVATION OF HELIX HOUSE NO. 2 (S-87), LOOKING WEST (without scale stick). - Naval Computer & Telecommunications Area Master Station, Eastern Pacific, Radio Transmitter Facility Lualualei, Helix House No. 2, Base of Radio Antenna Structure No. 427, Makaha, Honolulu County, HI
VIEW OF EAST ELEVATION OF HELIX HOUSE NO. 2 (S87), ...
VIEW OF EAST ELEVATION OF HELIX HOUSE NO. 2 (S-87), LOOKING WEST (with scale stick). - Naval Computer & Telecommunications Area Master Station, Eastern Pacific, Radio Transmitter Facility Lualualei, Helix House No. 2, Base of Radio Antenna Structure No. 427, Makaha, Honolulu County, HI
VIEW OF HELIX HOUSE NO. 2 (S87), WITH ANTENNA TOWER ...
VIEW OF HELIX HOUSE NO. 2 (S-87), WITH ANTENNA TOWER CABLE SUPPORT IN FOREGROUND, LOOKING SOUTHEAST. - Naval Computer & Telecommunications Area Master Station, Eastern Pacific, Radio Transmitter Facility Lualualei, Helix House No. 2, Base of Radio Antenna Structure No. 427, Makaha, Honolulu County, HI
NASA Technical Reports Server (NTRS)
Tomayko, James E.
1986-01-01
Twenty-five years of spacecraft onboard computer development have resulted in a better understanding of the requirements for effective, efficient, and fault tolerant flight computer systems. Lessons from eight flight programs (Gemini, Apollo, Skylab, Shuttle, Mariner, Voyager, and Galileo) and three research programs (digital fly-by-wire, STAR, and the Unified Data System) are useful in projecting the computer hardware configuration of the Space Station and the ways in which the Ada programming language will enhance the development of the necessary software. The evolution of hardware technology, fault protection methods, and software architectures used in space flight is reviewed to provide insight into the pending development of such items for the Space Station.
PRIVACYGRID: Supporting Anonymous Location Queries in Mobile Environments
2007-01-01
With the continued price reduction of location tracking devices, location-based services (LBSs) are widely recognized as an important feature of future computing. Some location-based services can operate completely anonymously, such as "when I pass a gas station, alert me with the unit price of the gas"; others cannot. Cited reference fragment: "Anonymous Usage of Location-Based Services Through Spatial and Temporal Cloaking," in Proceedings of the International Conference on Mobile
NASA Technical Reports Server (NTRS)
Bedrossian, Nazareth; Jang, Jiann-Woei; McCants, Edward; Omohundro, Zachary; Ring, Tom; Templeton, Jeremy; Zoss, Jeremy; Wallace, Jonathan; Ziegler, Philip
2011-01-01
Draper Station Analysis Tool (DSAT) is a computer program, built on commercially available software, for simulating and analyzing complex dynamic systems. Heretofore used in designing and verifying guidance, navigation, and control systems of the International Space Station, DSAT has a modular architecture that lends itself to modification for application to spacecraft or terrestrial systems. DSAT consists of user-interface, data-structures, simulation-generation, analysis, plotting, documentation, and help components. DSAT automates the construction of simulations and the process of analysis. DSAT provides a graphical user interface (GUI), plus a Web-enabled interface, similar to the GUI, that enables a remotely located user to gain access to the full capabilities of DSAT via the Internet and Web-browser software. Data structures are used to define the GUI, the Web-enabled interface, simulations, and analyses. Three data structures define the type of analysis to be performed: closed-loop simulation, frequency response, and/or stability margins. DSAT can be executed on almost any workstation, desktop, or laptop computer. DSAT provides better than an order of magnitude improvement in cost, schedule, and risk assessment for simulation-based design and verification of complex dynamic systems.
An XML-based method for astronomy software designing
NASA Astrophysics Data System (ADS)
Liao, Mingxue; Aili, Yusupu; Zhang, Jin
An XML-based method for standardization of software designing is introduced and analyzed, and successfully applied to renovating the hardware and software of the digital clock at Urumqi Astronomical Station. The basic strategy for eliciting time information from the new digital clock, FT206, in the antenna control program is introduced. With FT206, there is no need to compute how many centuries have passed since a certain day using sophisticated formulas, and it is no longer necessary to set the correct UT time on the computer holding control over the antenna, because the year, month, and day are all deduced from the Julian day maintained in FT206 rather than from the computer time. With an XML-based method and standard for software designing, various existing designing methods are unified, communications and collaborations between developers are facilitated, and an Internet-based mode of developing software thus becomes possible. The trend of development of the XML-based designing method is predicted.
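The date derivation the abstract alludes to, obtaining a calendar date from a Julian day number without century arithmetic, can be done with the standard Fliegel and Van Flandern (1968) integer algorithm. This generic sketch is illustrative, not FT206's actual logic.

```python
def jd_to_gregorian(jd):
    """Convert an integer Julian day number to (year, month, day)
    using the Fliegel & Van Flandern (1968) integer algorithm."""
    l = jd + 68569
    n = 4 * l // 146097
    l = l - (146097 * n + 3) // 4
    i = 4000 * (l + 1) // 1461001
    l = l - 1461 * i // 4 + 31
    j = 80 * l // 2447
    day = l - 2447 * j // 80
    l = j // 11
    month = j + 2 - 12 * l
    year = 100 * (n - 49) + i + l
    return year, month, day

print(jd_to_gregorian(2451545))   # (2000, 1, 1), i.e., J2000.0
```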
Computer simulation of space station computer steered high gain antenna
NASA Technical Reports Server (NTRS)
Beach, S. W.
1973-01-01
The mathematical modeling and programming of a complete simulation program for a space station computer-steered high gain antenna are described. The program provides for reading input data cards, numerically integrating up to 50 first order differential equations, and monitoring up to 48 variables on printed output and on plots. The program system consists of a high gain antenna, an antenna gimbal control system, an onboard computer, and the environment in which all are to operate.
Martínez-Búrdalo, M; Martín, A; Anguiano, M; Villar, R
2005-09-07
In this work, the procedures for safety assessment in close proximity to cellular communications base-station antennas at three different frequencies (900, 1800 and 2170 MHz) are analysed. For each operating frequency, we have obtained and compared the distances to the antenna from the exposure places where electromagnetic fields are below reference levels and the distances where the specific absorption rate (SAR) values in an exposed person are below the basic restrictions, according to the European safety guidelines. A high-resolution human body model has been located in front of each base-station antenna, as a worst case, at different distances, to compute whole-body averaged SAR and maximum 10 g averaged SAR inside the exposed body. The finite-difference time-domain method has been used for both electromagnetic field and SAR calculations. This paper shows that, for antenna-body distances in the near zone of the antenna, the fact that averaged field values are below the reference levels does not, at certain frequencies, guarantee compliance with the basic restrictions of the guidelines.
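Whole-body averaged SAR, one of the two dosimetric quantities compared above, is the mass-weighted mean of local SAR over the body model (total absorbed power divided by total mass). The voxel grid and masses below are invented; in the study the local SAR comes from FDTD fields via SAR = sigma|E|^2/rho, and the 10 g averaged SAR additionally requires a maximum over contiguous 10 g cubes, which is omitted here.

```python
import numpy as np

rng = np.random.default_rng(3)
sar = rng.exponential(0.02, size=(40, 20, 100))   # local SAR, W/kg (invented)
voxel_mass = np.full(sar.shape, 0.9e-3)           # kg per voxel (invented)

# Whole-body averaged SAR = total absorbed power / total body mass.
whole_body_sar = (sar * voxel_mass).sum() / voxel_mass.sum()
print(f"whole-body averaged SAR: {whole_body_sar:.4f} W/kg")
```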
Link, Brenda L.; Cary, L.E.
1986-01-01
Meteorological data were located, acquired, and stored from selected stations in Montana and North Dakota coal regions and adjacent areas including South Dakota and Wyoming. Data that were acquired have potential use in small watershed modeling studies. Emphasis was placed on acquiring data that was collected during the period 1970 to the present (1984). A map shows the location and type of stations selected. A narration summarizing conventions used in acquiring and storing the meteorological data is provided along with the various retrieval options available. Individual station descriptions are followed by tables listing the meteorological variables collected, period of obtained record, percentage of data recovery, and the instruments used and their description. (USGS)
NASA Technical Reports Server (NTRS)
Rule, William Keith
1991-01-01
A computer program called BALLIST that is intended to be a design tool for engineers is described. BALLIST empirically predicts the bumper thickness required to prevent perforation of the Space Station pressure wall by a projectile (such as orbital debris) as a function of the projectile's velocity. 'Ballistic' limit curves (bumper thickness vs. projectile velocity) are calculated and are displayed on the screen as well as being stored in an ASCII file. A Whipple style of spacecraft wall configuration is assumed. The predictions are based on a database of impact test results. NASA/Marshall Space Flight Center currently has the capability to generate such test results. Numerical simulation results of impact conditions that cannot be tested (high velocities or large particles) can also be used for predictions.
Rayleigh wave ellipticity across the Iberian Peninsula and Morocco
NASA Astrophysics Data System (ADS)
Gómez García, Clara; Villaseñor, Antonio
2015-04-01
Spectral amplitude ratios between horizontal and vertical components (H/V ratios) from seismic records are useful to evaluate site effects, predict ground motion and invert for S velocity in the top several hundred meters. These spectral ratios can be obtained from both ambient noise and earthquakes. H/V ratios from ambient noise depend on the content and predominant wave types: body waves, Rayleigh waves, a mixture of different waves, etc. The H/V ratio computed in this way is assumed to measure Rayleigh wave ellipticity since ambient vibrations are dominated by Rayleigh waves. H/V ratios from earthquakes are able to determine the local crustal structure in the vicinity of the recording station. These ratios obtained from earthquakes are based on surface wave ellipticity measurements. Although the long period (>20 seconds) Rayleigh H/V ratio is not currently used, because large scatter has been reported and there is uncertainty about whether these measurements are compatible with traditional phase and group velocity measurements, we will investigate whether it is possible to obtain stable estimates after collecting statistics for many earthquakes. We will use teleseismic events from shallow earthquakes (depth ≤ 40 km) between 2007 January 1 and 2012 December 31 with M ≥ 6, and we will compute H/V ratios for more than 400 stations from several seismic networks across the Iberian Peninsula and Morocco for periods between 20 and 100 seconds. H/V ratios from cross-correlations of ambient noise in different components for each station pair will also be computed. Shorter period H/V ratio measurements based on ambient noise cross-correlations are strongly sensitive to near-surface structure, rather than to the deeper structure sensed by longer period earthquake Rayleigh waves. The combination of ellipticity measurements based on earthquakes and ambient noise will allow us to perform a joint inversion with Rayleigh wave phase velocity. Upper crustal structure is better constrained by the joint inversion compared to inversions based on phase velocities alone.
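The basic H/V measurement is a ratio of spectral amplitudes of the horizontal components to the vertical component in the band of interest. The sketch below uses random stand-in traces and skips the windowing, smoothing, and instrument-response removal that a real ellipticity measurement would require.

```python
import numpy as np

fs, npts = 1.0, 4096                      # 1 Hz long-period data (assumed)
rng = np.random.default_rng(0)
tr_n, tr_e, tr_z = (rng.normal(size=npts) for _ in range(3))  # stand-ins

freqs = np.fft.rfftfreq(npts, d=1 / fs)
amp_n, amp_e, amp_z = (np.abs(np.fft.rfft(x)) for x in (tr_n, tr_e, tr_z))

band = (freqs > 1 / 100) & (freqs < 1 / 20)        # 20-100 s period band
h = np.sqrt((amp_n[band] ** 2 + amp_e[band] ** 2) / 2)  # mean horizontal
print(f"median H/V in the 20-100 s band: {np.median(h / amp_z[band]):.2f}")
```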
2011-12-29
ISS030-E-017776 (29 Dec. 2011) --- Working in chorus with the International Space Station team in Houston's Mission Control Center, this astronaut and his Expedition 30 crewmates on the station install a set of Enhanced Processor and Integrated Communications (EPIC) computer cards in one of seven primary computers onboard. The upgrade will allow more experiments to operate simultaneously, and prepare for the arrival of commercial cargo ships later this year.
Work-related health disorders among Saudi computer users.
Jomoah, Ibrahim M
2014-01-01
The present study was conducted to investigate the prevalence of musculoskeletal disorders and eye and vision complaints among the computer users of King Abdulaziz University (KAU), Saudi Arabian Airlines (SAUDIA), and Saudi Telecom Company (STC). Stratified random samples of the work stations and operators at each of the studied institutions were selected; the ergonomics of the work stations were assessed and the operators' health complaints were investigated. The average ergonomic score of the studied work stations at STC, KAU, and SAUDIA was 81.5%, 73.3%, and 70.3%, respectively. Most of the examined operators use computers daily for ≤ 7 hours, yet they reported moderate incidences of general complaints (e.g., headache, body fatigue, and lack of concentration) and relatively high incidences of eye and vision complaints and musculoskeletal complaints. The incidences of the complaints have been found to increase with the (a) decrease in work station ergonomic score, (b) progress of age and duration of employment, (c) smoking, (d) use of computers, (e) lack of work satisfaction, and (f) history of operators' previous ailments. It has been recommended to improve the ergonomics of the work stations, set up training programs, and conduct preplacement and periodical examinations for operators.
An intelligent control and virtual display system for evolutionary space station workstation design
NASA Technical Reports Server (NTRS)
Feng, Xin; Niederjohn, Russell J.; Mcgreevy, Michael W.
1992-01-01
Research and development of the Advanced Display and Computer Augmented Control System (ADCACS) for the space station Body-Ported Cupola Virtual Workstation (BP/VCWS) were pursued. The potential applications of body-ported virtual display and intelligent control technology for human-system interfacing in the space station environment were explored. The new system is designed to enable crew members to control and monitor a variety of space operations with greater flexibility and efficiency than existing fixed consoles. The technologies being studied include helmet-mounted virtual displays, voice and special command input devices, and microprocessor-based intelligent controllers. Several research topics, such as human factors, decision support expert systems, and wide field-of-view color displays, are being addressed. The study showed the significant advantages of this uniquely integrated display and control system, and its feasibility for human-system interfacing applications in the space station command and control environment.
Scattering Effects of Solar Panels on Space Station Antenna Performance
NASA Technical Reports Server (NTRS)
Panneton, Robert J.; Ngo, John C.; Hwu, Shian U.; Johnson, Larry A.; Elmore, James D.; Lu, Ba P.; Kelley, James S.
1994-01-01
Characterizing the scattering properties of the solar array panels is important in predicting Space Station antenna performance. A series of far-field, near-field, and radar cross section (RCS) scattering measurements were performed at S-band and Ku-band microwave frequencies on Space Station solar array panels. Based on investigation of the measured scattering patterns, the solar array panels exhibit scattering properties similar to those of an aluminum or copper panel mockup of the same size. As a first order approximation, and for worst-case interference simulation, the solar array panels may be modeled as perfectly reflecting plates. Numerical results obtained using the Geometrical Theory of Diffraction (GTD) modeling technique are presented for Space Station antenna pattern degradation due to solar panel interference. The computational and experimental techniques presented in this paper are applicable to antennas mounted on other platforms such as ships, aircraft, satellites, and space or land vehicles.
Space Station Freedom Data Assessment Study
NASA Technical Reports Server (NTRS)
Johnson, Anngienetta R.; Deskevich, Joseph
1990-01-01
The SSF Data Assessment Study was initiated to identify payload and operations data requirements to be supported in the Space Station era. To initiate the study, payload requirements from the projected SSF user community were obtained utilizing an electronic questionnaire. The results of the questionnaire were incorporated in a personal-computer-compatible database used for mission scheduling and end-to-end communications analyses. This paper discusses data flow paths and associated latencies, communications bottlenecks, resource needs versus availability, payload scheduling 'warning flags' and payload data loading requirements for each major milestone in the Space Station buildup sequence. This paper also presents the statistical and analytical assessments produced using the database, an experiment scheduling program, and a Space Station unique end-to-end simulation model. The modeling concepts and simulation methodologies presented in this paper provide a foundation for forecasting communication requirements and identifying modeling tools to be used in the SSF Tactical Operations Planning (TOP) process.
Study of the GPS inter-frequency calibration of timing receivers
NASA Astrophysics Data System (ADS)
Defraigne, P.; Huang, W.; Bertrand, B.; Rovera, D.
2018-02-01
When calibrating Global Positioning System (GPS) stations dedicated to timing, the hardware delays of P1 and P2, the P(Y)-codes on frequencies L1 and L2, are determined separately. In the international atomic time (TAI) network the GPS stations of the time laboratories are calibrated relatively against reference stations. This paper aims at determining the consistency between the P1 and P2 hardware delays (called dP1 and dP2) of these reference stations, and to look at the stability of the inter-signal hardware delays dP1-dP2 of all the stations in the network. The method consists of determining the dP1-dP2 directly from the GPS pseudorange measurements corrected for the frequency-dependent antenna phase center and the frequency-dependent ionosphere corrections, and then to compare these computed dP1-dP2 to the calibrated values. Our results show that the differences between the computed and calibrated dP1-dP2 are well inside the expected combined uncertainty of the two quantities. Furthermore, the consistency between the calibrated time transfer solution obtained from either single-frequency P1 or dual-frequency P3 for reference laboratories is shown to be about 1.0 ns, well inside the 2.1 ns uB uncertainty of a time transfer link based on GPS P3 or Precise Point Positioning. This demonstrates the good consistency between the P1 and P2 hardware delays of the reference stations used for calibration in the TAI network. The long-term stability of the inter-signal hardware delays is also analysed from the computed dP1-dP2. It is shown that only variations larger than 2 ns can be detected for a particular station, while variations of 200 ps can be detected when differentiating the results between two stations. Finally, we also show that in the differential calibration process as used in the TAI network, using the same antenna phase center or using different positions for L1 and L2 signals gives maximum differences of 200 ps on the hardware delays of the separate codes P1 and P2; however, the final impact on the P3 combination is less than 10 ps.
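The geometry-free pseudorange combination underlying the computed dP1-dP2 can be sketched as follows: P1 - P2 cancels geometry, clocks, and troposphere, leaving the inter-frequency ionospheric term plus the hardware-delay difference, so an external slant-ionosphere estimate yields the bias as an epoch average. All series here are synthetic stand-ins, and satellite differential code biases are ignored for brevity.

```python
import numpy as np

f1, f2 = 1575.42e6, 1227.60e6              # GPS L1/L2 frequencies, Hz
gamma = (f1 / f2) ** 2                     # ionosphere scale factor L2/L1

rng = np.random.default_rng(2)
iono_l1 = 5.0 + rng.normal(0, 0.5, 2880)   # slant iono delay on L1, m (toy)
true_bias = 1.2                            # m, the "unknown" dP1 - dP2

# P1 = rho + I1 + dP1 and P2 = rho + gamma*I1 + dP2, so
# P1 - P2 = I1*(1 - gamma) + (dP1 - dP2) + noise.
p1_minus_p2 = iono_l1 * (1 - gamma) + true_bias + rng.normal(0, 0.3, 2880)

# Remove the modeled ionospheric term and average over epochs.
est = np.mean(p1_minus_p2 - iono_l1 * (1 - gamma))
print(f"estimated dP1-dP2: {est:.2f} m ({est / 0.299792458:.2f} ns)")
```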
VIEW OF SOUTH ELEVATION OF HELIX HOUSE NO. 2 (S87) ...
VIEW OF SOUTH ELEVATION OF HELIX HOUSE NO. 2 (S-87) SHOWING MAIN ENTRY DOOR, LOOKING NORTH NORTHWEST. - Naval Computer & Telecommunications Area Master Station, Eastern Pacific, Radio Transmitter Facility Lualualei, Helix House No. 2, Base of Radio Antenna Structure No. 427, Makaha, Honolulu County, HI
Computer Assisted Communication within the Classroom: Interactive Lecturing.
ERIC Educational Resources Information Center
Herr, Richard B.
At the University of Delaware student-teacher communication within the classroom was enhanced through the implementation of a versatile, yet cost efficient, application of computer technology. A single microcomputer at a teacher's station controls a network of student keypad/display stations to provide individual channels of continuous…
Malenchenko uses a computer in the SM during Joint Operations
2008-03-21
S123-E-008370 (21 March 2008) --- Cosmonaut Yuri I. Malenchenko, Expedition 16 flight engineer representing Russia's Federal Space Agency, uses a computer in the Zvezda Service Module of the International Space Station while Space Shuttle Endeavour (STS-123) is docked with the station.
Streamflow characteristics and trends in New Jersey, water years 1897-2003
Watson, Kara M.; Reiser, Robert G.; Nieswand, Steven P.; Schopp, Robert D.
2005-01-01
Streamflow statistics were computed for 111 continuous-record streamflow-gaging stations with 20 or more years of continuous record and for 500 low-flow partial-record stations, including 66 gaging stations with less than 20 years of continuous record. Daily mean streamflow data from water year 1897 through water year 2001 were used for the computations at the gaging stations. (The water year is the 12-month period, October 1 through September 30, designated by the calendar year in which it ends.) The characteristics presented for the long-term continuous-record stations are daily streamflow, harmonic mean flow, flow frequency, daily flow durations, trend analysis, and streamflow variability. Low-flow statistics for gaging stations with less than 20 years of record and for partial-record stations were estimated by correlating base-flow measurements with daily mean flows at long-term (more than 20 years) continuous-record stations. Instantaneous streamflow measurements through water year 2003 were used to estimate low-flow statistics at the partial-record stations. The characteristics presented for partial-record stations are mean annual flow; harmonic mean flow; and annual and winter low-flow frequency. The annual 1-, 7-, and 30-day low- and high-flow data sets were tested for trends. The results of trend tests for high flows indicate relations between upward high-flow trends and both stream regulation and development in the basin. The relation between development and low-flow trends does not appear to be as strong as that between development and high-flow trends. Monthly, seasonal, and annual precipitation data for selected long-term meteorological stations also were tested for trends to analyze the effects of climate. A significant upward trend in precipitation in northern New Jersey (Climate Division 1) was identified. For Climate Division 2, no general increase in average precipitation was observed. Trend test results indicate that high flows at undeveloped, unregulated sites have not been affected by the increase in average precipitation. The ratio of instantaneous peak flow to 3-day mean flow, ratios of flow duration, ratios of high-flow/low-flow frequency, and the coefficient of variation were used to define streamflow variability. Streamflow variability was significantly greater among the group of gaging stations located outside the Coastal Plain than among the group of gaging stations located in the Coastal Plain.
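The correlation step described for partial-record stations can be sketched as a log-space regression of concurrent base-flow measurements on index-station daily flows, with the fitted relation then applied to the index station's low-flow statistic. The measurement pairs, the simple ordinary-least-squares fit, and the index 7Q10 value are illustrative assumptions; USGS practice often uses variants such as MOVE record extension.

```python
import numpy as np

# Concurrent flows: index-station daily means paired with base-flow
# measurements at the partial-record station (invented values, cfs).
index_q = np.array([12.0, 18.0, 25.0, 40.0, 9.0, 15.0, 30.0])
partial_q = np.array([3.1, 4.6, 6.8, 11.0, 2.3, 4.0, 8.2])

# Fit log(partial) = intercept + slope * log(index).
slope, intercept = np.polyfit(np.log10(index_q), np.log10(partial_q), 1)

# Apply the relation to the index station's long-record statistic.
index_7q10 = 10.0                          # cfs, assumed from the long record
est_7q10 = 10 ** (intercept + slope * np.log10(index_7q10))
print(f"estimated 7Q10 at the partial-record station: {est_7q10:.2f} cfs")
```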
Wireless Acoustic Measurement System
NASA Technical Reports Server (NTRS)
Anderson, Paul D.; Dorland, Wade D.; Jolly, Ronald L.
2007-01-01
A prototype wireless acoustic measurement system (WAMS) is one of two main subsystems of the Acoustic Prediction/Measurement Tool, which comprises software, acoustic instrumentation, and electronic hardware combined to afford integrated capabilities for predicting and measuring noise emitted by rocket and jet engines. The other main subsystem is described in the article on page 8. The WAMS includes analog acoustic measurement instrumentation and analog and digital electronic circuitry combined with computer wireless local-area networking to enable (1) measurement of sound-pressure levels at multiple locations in the sound field of an engine under test and (2) recording and processing of the measurement data. At each field location, the measurements are taken by a portable unit, denoted a field station. There are ten field stations, each of which can take two channels of measurements. Each field station is equipped with two instrumentation microphones, a micro-ATX computer, a wireless network adapter, an environmental enclosure, a directional radio antenna, and a battery power supply. The environmental enclosure shields the computer from weather and from extreme acoustically induced vibrations. The power supply is based on a marine-service lead-acid storage battery that has enough capacity to support operation for as long as 10 hours. A desktop computer serves as a control server for the WAMS. The server is connected to a wireless router for communication with the field stations via a wireless local-area network that complies with wireless-network standard 802.11b of the Institute of Electrical and Electronics Engineers. The router and the wireless network adapters are controlled by use of Linux-compatible driver software. The server runs custom Linux software for synchronizing the recording of measurement data in the field stations. The software includes a module that provides an intuitive graphical user interface through which an operator at the control server can control the operations of the field stations for calibration and for recording of measurement data. A test engineer positions and activates the WAMS. The WAMS automatically establishes the wireless network. Next, the engineer performs pretest calibrations. Then the engineer executes the test and measurement procedures. After the test, the raw measurement files are copied and transferred, through the wireless network, to a hard disk in the control server. Subsequently, the data are processed into 1/3-octave spectrograms.
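The final processing step, reduction to 1/3-octave spectrograms, can be sketched for one block of data as below. The sample rate, test tone, and direct FFT band-summing are stand-ins for the WAMS data path.

```python
import numpy as np

fs = 51200                                    # sample rate, Hz (assumed)
n = fs                                        # one second of data
t = np.arange(n) / fs
p = 2.0 * np.sin(2 * np.pi * 1000 * t)        # 1 kHz tone, 2 Pa peak

X = np.fft.rfft(p)
ms = np.abs(X) ** 2 / n ** 2                  # per-bin mean-square pressure
ms[1:-1] *= 2                                 # fold in negative frequencies
freqs = np.fft.rfftfreq(n, 1 / fs)

p_ref = 20e-6                                 # Pa, SPL reference pressure
for fc in 1000.0 * 2.0 ** (np.arange(-3, 4) / 3.0):   # 500-2000 Hz bands
    lo, hi = fc / 2 ** (1 / 6), fc * 2 ** (1 / 6)     # 1/3-octave edges
    band_ms = ms[(freqs >= lo) & (freqs < hi)].sum()
    spl = 10 * np.log10(max(band_ms, 1e-30) / p_ref ** 2)
    print(f"{fc:7.1f} Hz band: {spl:6.1f} dB SPL")
```

For this test tone the 1000 Hz band comes out near 97 dB SPL (a 2 Pa peak sine has a mean-square pressure of 2 Pa^2), and the other bands fall to the numerical floor.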
The Mount Rainier Lahar Detection System
NASA Astrophysics Data System (ADS)
Lockhart, A. B.; Murray, T. L.
2003-12-01
To mitigate the risk of unheralded lahars from Mount Rainier, the U.S. Geological Survey, in cooperation with Pierce County, Washington, installed a lahar-detection system on the Puyallup and Carbon rivers that originate on Mount Rainier's western slopes. The system, installed in 1998, is designed to automatically detect the passage of lahars large enough to potentially affect populated areas downstream (approximate volume threshold 40 million cubic meters), while ignoring small lahars, earthquakes, extreme weather and floods. Along each river valley upstream, arrays of independent lahar-monitoring stations equipped with geophones and short tripwires telemeter data to a pair of redundant computer base stations located in and near Tacoma at existing public safety facilities that are staffed around the clock. Monitored data consist of ground-vibration levels, tripwire status, and transmissions at regular intervals. The base stations automatically evaluate these data to determine if a dangerous lahar is passing through the station array. The detection algorithm requires significant ground vibration to occur at those stations in the array that are above the anticipated level of inundation, while lower level "deadman" stations, inundated by the flow, experience tripwire breakage or are destroyed. Once a base station detects a lahar, it alerts staff who execute a call-down of public-safety officials and schools, initiating evacuation of areas potentially at risk. Because the system's risk-mitigation task imposes high standards of reliability on all components, it has been under test for several years. To date, the system has operated reliably and without false alarms, including during the nearby M6.8 Nisqually Earthquake on February 28, 2001. The system is being turned over to Pierce County, and activated as part of their lahar warning system.
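The two-condition detection logic described above is easy to sketch. The following Python fragment is an illustrative reconstruction, not the USGS implementation; the vibration threshold, the required tripwire count, and the station layout are all placeholder assumptions.

    # Hypothetical sketch of the lahar detection rule: sustained ground
    # vibration at stations above the expected inundation level, combined
    # with tripwire breakage at low-level "deadman" stations.
    def lahar_detected(high_stations, deadman_stations,
                       vibration_threshold=5000, min_broken=2):
        """high_stations: dicts with 'vibration'; deadman_stations:
        dicts with 'tripwire_intact' (False once inundated)."""
        strong_shaking = all(s["vibration"] >= vibration_threshold
                             for s in high_stations)
        broken = sum(not s["tripwire_intact"] for s in deadman_stations)
        return strong_shaking and broken >= min_broken

    # Example: strong shaking at both upper stations, two broken tripwires
    print(lahar_detected([{"vibration": 6200}, {"vibration": 5900}],
                         [{"tripwire_intact": False},
                          {"tripwire_intact": False}]))  # True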
Burbank works on the EPIC in the Node 2
2012-02-28
ISS030-E-114433 (29 Feb. 2012) --- In the International Space Station's Destiny laboratory, NASA astronaut Dan Burbank, Expedition 30 commander, upgrades Multiplexer/Demultiplexer (MDM) computers and Portable Computer System (PCS) laptops and installs the Enhanced Processor & Integrated Communications (EPIC) hardware in the Payload 1 (PL-1) MDM.
Program For Generating Interactive Displays
NASA Technical Reports Server (NTRS)
Costenbader, Jay; Moleski, Walt; Szczur, Martha; Howell, David; Engelberg, Norm; Li, Tin P.; Misra, Dharitri; Miller, Philip; Neve, Leif; Wolf, Karl;
1991-01-01
Sun/Unix version of Transportable Applications Environment Plus (TAE+) computer program provides integrated, portable software environment for developing and running interactive window, text, and graphical-object-based application software systems. Enables programmer or nonprogrammer to construct easily custom software interface between user and application program and to move resulting interface program and its application program to different computers. TAE+ viewed as productivity tool for application developers and application end users, who benefit from resultant consistent and well-designed user interface sheltering them from intricacies of computer. Available in form suitable for following six different groups of computers: DEC VAXstation and other VMS VAX computers, Macintosh II computers running A/UX, Apollo Domain Series 3000, DEC VAX and reduced-instruction-set-computer workstations running Ultrix, Sun 3- and 4-series workstations running Sun OS, and IBM RT/PC and PS/2 computers running AIX.
Another Program For Generating Interactive Graphics
NASA Technical Reports Server (NTRS)
Costenbader, Jay; Moleski, Walt; Szczur, Martha; Howell, David; Engelberg, Norm; Li, Tin P.; Misra, Dharitri; Miller, Philip; Neve, Leif; Wolf, Karl;
1991-01-01
VAX/Ultrix version of Transportable Applications Environment Plus (TAE+) computer program provides integrated, portable software environment for developing and running interactive window, text, and graphical-object-based application software systems. Enables programmer or nonprogrammer to construct easily custom software interface between user and application program and to move resulting interface program and its application program to different computers. When used throughout company for wide range of applications, makes both application program and computer seem transparent, with noticeable improvements in learning curve. Available in form suitable for following groups of computers: DEC VAXstation and other VMS VAX computers, Macintosh II computers running A/UX, Apollo Domain Series 3000, DEC VAX and reduced-instruction-set-computer workstations running Ultrix, Sun 3- and 4-series workstations running Sun OS, IBM RT/PC's and PS/2 computers running AIX, and HP 9000 Series computers.
NASA Technical Reports Server (NTRS)
Yanosy, James L.
1988-01-01
Over the years, computer modeling has been used extensively in many disciplines to solve engineering problems. A set of computer program tools is proposed to assist the engineer in the various phases of the Space Station program from technology selection through flight operations. The development and application of emulation and simulation transient performance modeling tools for life support systems are examined. The results of the development and the demonstration of the utility of three computer models are presented. The first model is a detailed computer model (emulation) of a solid amine water desorption (SAWD) CO2 removal subsystem combined with much less detailed models (simulations) of a cabin, crew, and heat exchangers. This model was used in parallel with the hardware design and test of this CO2 removal subsystem. The second model is a simulation of an air revitalization system combined with a wastewater processing system to demonstrate the capabilities to study subsystem integration. The third model is that of a Space Station total air revitalization system. The station configuration consists of a habitat module, a lab module, two crews, and four connecting nodes.
CERN's Common Unix and X Terminal Environment
NASA Astrophysics Data System (ADS)
Cass, Tony
The Desktop Infrastructure Group of CERN's Computing and Networks Division has developed a Common Unix and X Terminal Environment to ease the migration to Unix based Interactive Computing. The CUTE architecture relies on a distributed filesystem—currently Transarc's AFS—to enable essentially interchangeable client workstations to access both "home directory" and program files transparently. Additionally, we provide a suite of programs to configure workstations for CUTE and to ensure continued compatibility. This paper describes the different components and the development of the CUTE architecture.
User Centered System Design: Papers for the CHI Conference on Human Factors in Computer Systems.
1983-11-01
Command languages versus menu-based systems, choices of names, and handheld computers versus workstations are examined briefly.
Telemetry Data Collection from Oscar Satellite
NASA Technical Reports Server (NTRS)
Haddock, Paul C.; Horan, Stephen
1998-01-01
This paper discusses the design, configuration, and operation of a satellite station built for the Center for Space Telemetering and Telecommunications Laboratory in the Klipsch School of Electrical and Computer Engineering at New Mexico State University (NMSU). This satellite station consists of a computer-controlled antenna tracking system, a 2m/70cm transceiver, satellite tracking software, and a demodulator. The station receives satellite telemetry, allows for voice communications, and will be used in future classes. Currently the station is receiving telemetry from an amateur radio satellite, UoSAT-OSCAR-11. Amateur radio satellites are referred to as Orbiting Satellite Carrying Amateur Radio (OSCAR) satellites, as discussed in the next section.
Implementation of Biogas Stations into Smart Heating and Cooling Network
NASA Astrophysics Data System (ADS)
Milčák, P.; Konvička, J.; Jasenská, M.
2016-10-01
The paper describes the integration of a biogas station into the software environment for the "Smart Heating and Cooling Networks" project. The aim of this project is the creation of a software tool for planning the operation and optimizing the supply of heating and cooling in small regions. In this context, the biogas station represents a renewable energy source that has operational specifics of its own, which need to be taken into account when creating an implementation project. For a specific biogas station, a detailed computational model was elaborated and parameterized, in particular to optimize the total computational time.
Daily values flow comparison and estimates using program HYCOMP, version 1.0
Sanders, Curtis L.
2002-01-01
A method used by the U.S. Geological Survey for quality control in computing daily value flow records is to compare hydrographs of computed flows at a station under review to hydrographs of computed flows at a selected index station. The hydrographs are placed on top of each other (as hydrograph overlays) on a light table, compared, and missing daily flow data estimated. This method, however, is subjective and can produce inconsistent results, because hydrographers can differ when calculating acceptable limits of deviation between observed and estimated flows. Selection of appropriate index stations also is judgmental, giving no consideration to the mathematical correlation between the review station and the index station(s). To address the limitation of the hydrograph overlay method, a set of software programs, written in the SAS macro language, was developed and designated Program HYDCOMP. The program automatically selects statistically comparable index stations by correlation and regression, and performs hydrographic comparisons and estimates of missing data by regressing daily mean flows at the review station against -8 to +8 lagged flows at one or two index stations and day-of-week. Another advantage that HYDCOMP has over the graphical method is that estimated flows, the criteria for determining the quality of the data, and the selection of index stations are determined statistically, and are reproducible from one user to another. HYDCOMP will load the most-correlated index stations into another file containing the "best index stations," but will not overwrite stations already in the file. A knowledgeable user should delete unsuitable index stations from this file based on standard error of estimate, hydrologic similarity of candidate index stations to the review station, and knowledge of the individual station characteristics. Also, the user can add index stations not selected by HYDCOMP, if desired. Once the file of best index stations is created, a user may do hydrographic comparison and data estimates by entering the number of the review station, selecting an index station, and specifying the periods to be used for regression and plotting. For example, the user can restrict the regression to ice-free periods of the year to exclude flows estimated during iced conditions. However, the regression could still be used to estimate flow during iced conditions. HYDCOMP produces the standard error of estimate as a measure of the central scatter of the regression and R-square (coefficient of determination) for evaluating the accuracy of the regression. Output from HYDCOMP includes plots of percent residuals against (1) time within the regression and plot periods, (2) month and day of the year for evaluating seasonal bias in the regression, and (3) the magnitude of flow. For hydrographic comparisons, it plots 2-month segments of hydrographs over the selected plot period showing the observed flows, the regressed flows, the 95 percent confidence limit flows, flow measurements, and regression limits. If the observed flows at the review station remain outside the 95 percent confidence limits for a prolonged period, there may be some error in the flows at the review station or at the index station(s). In addition, daily minimum and maximum temperatures and daily rainfall are shown on the hydrographs, if available, to help indicate whether an apparent change in flow may result from rainfall or from changes in backwater from melting ice or freezing water.
HYDCOMP statistically smooths the estimates from the nonmissing flows at the edges of gaps in the data into the regressed flows at the centers of the gaps, using the Kalman smoothing algorithm. Missing flows are estimated automatically by HYDCOMP, but the user also can specify that periods of erroneous, but nonmissing, flows be estimated by the program.
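The core of the approach, regressing review-station flows on lagged index-station flows and using the fit to fill gaps, can be sketched briefly. The snippet below is a minimal illustration with synthetic data and a -2 to +2 lag range rather than HYDCOMP's -8 to +8; it omits the day-of-week term, the confidence limits, and the Kalman smoothing step.

    import numpy as np

    def lagged_design(index_flows, lags):
        # column j holds index-station flows shifted by lags[j] days
        cols = [np.roll(index_flows, -lag) for lag in lags]
        return np.column_stack(cols)

    rng = np.random.default_rng(0)
    days = np.arange(100)
    index_flows = 50 + 10 * np.sin(days / 10) + rng.normal(0, 1, 100)
    review_flows = 0.8 * index_flows + rng.normal(0, 1, 100)

    lags = range(-2, 3)      # HYDCOMP uses -8 to +8 at up to two stations
    X = lagged_design(index_flows, lags)
    valid = slice(2, 98)     # drop edges affected by np.roll wraparound
    coef, *_ = np.linalg.lstsq(X[valid], review_flows[valid], rcond=None)
    estimated = X @ coef     # candidate fill-in values for missing days
    print(coef)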
Telescience testbed pilot program, volume 1: Executive summary
NASA Technical Reports Server (NTRS)
Leiner, Barry M.
1989-01-01
Space Station Freedom and its associated labs, coupled with the availability of new computing and communications technologies, have the potential for significantly enhancing scientific research. A Telescience Testbed Pilot Program (TTPP) was conducted, aimed at developing the experience base to deal with issues in the design of the future information system of the Space Station era. The testbeds represented four scientific disciplines (astronomy and astrophysics, earth sciences, life sciences, and microgravity sciences) and studied issues in payload design, operation, and data analysis. This volume, the first of a three-volume set containing the results of the TTPP, is the executive summary.
A linear quadratic tracker for Control Moment Gyro based attitude control of the Space Station
NASA Technical Reports Server (NTRS)
Kaidy, J. T.
1986-01-01
The paper discusses a design for an attitude control system for the Space Station that produces fast response with minimal overshoot and cross-coupling through the use of Control Moment Gyros (CMGs). The rigid body equations of motion are linearized and discretized, and a Linear Quadratic Regulator (LQR) design and analysis study is performed. The resulting design is then modified such that integral and differential terms are added to the state equations to enhance response characteristics. Methods for reducing computation time through channelization are discussed, as well as the reduction of initial torque requirements.
Energy Efficiency Challenges of 5G Small Cell Networks.
Ge, Xiaohu; Yang, Jing; Gharavi, Hamid; Sun, Yang
2017-05-01
The deployment of a large number of small cells poses new challenges to energy efficiency, which has often been ignored in fifth generation (5G) cellular networks. While massive multiple-input multiple-output (MIMO) will reduce the transmission power at the expense of higher computational cost, the question remains as to whether computation or transmission power is more important to the energy efficiency of 5G small cell networks. Thus, the main objective of this paper is to investigate the computation power based on the Landauer principle. Simulation results reveal that more than 50% of the energy is consumed by the computation power at 5G small cell base stations (BSs). Moreover, the computation power of a 5G small cell BS can approach 800 watts when massive MIMO (e.g., 128 antennas) is deployed to transmit high-volume traffic. This clearly indicates that computation power optimization can play a major role in the energy efficiency of small cell networks.
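For readers unfamiliar with the Landauer principle invoked above, the bound states that erasing one bit of information dissipates at least kT ln 2 of energy. The sketch below computes that floor for an assumed operation count; it is a back-of-envelope illustration, not the base-station power model used in the paper.

    import math

    K_BOLTZMANN = 1.380649e-23  # J/K

    def landauer_energy_joules(bit_erasures, temperature_k=300.0):
        # Minimum dissipation: k * T * ln(2) per bit erased
        return bit_erasures * K_BOLTZMANN * temperature_k * math.log(2)

    # e.g. 1e20 bit erasures per second at room temperature
    print(landauer_energy_joules(1e20))  # ~0.29 J, i.e. a ~0.29 W floor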
Modeling the data management system of Space Station Freedom with DEPEND
NASA Technical Reports Server (NTRS)
Olson, Daniel P.; Iyer, Ravishankar K.; Boyd, Mark A.
1993-01-01
Some of the features and capabilities of the DEPEND simulation-based modeling tool are described. A study of a 1553B local bus subsystem of the Space Station Freedom Data Management System (SSF DMS) is used to illustrate some types of system behavior that can be important to reliability and performance evaluations of this type of spacecraft. A DEPEND model of the subsystem is used to illustrate how these types of system behavior can be modeled, and shows what kinds of engineering and design questions can be answered through the use of these modeling techniques. DEPEND's process-based simulation environment is shown to provide a flexible method for modeling complex interactions between hardware and software elements of a fault-tolerant computing system.
A computer program to determine the possible daily release window for sky target experiments
NASA Technical Reports Server (NTRS)
Michaud, N. H.
1973-01-01
A computer program is presented which is designed to determine the daily release window for sky target experiments. Factors considered in the program include: (1) target illumination by the sun at release time and during the tracking period; (2) look angle elevation above local horizon from each tracking station to the target; (3) solar depression angle from the local horizon of each tracking station during the experimental period after target release; (4) lunar depression angle from the local horizon of each tracking station during the experimental period after target release; and (5) total sky background brightness as seen from each tracking station while viewing the target. Program output is produced in both graphic and data form. Output data can be plotted for a single calendar month or year. The numerical values used to generate the plots are furnished to permit a more detailed review of the computed daily release windows.
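A release time falls within the window only when every constraint above holds at every tracking station. The following fragment sketches that per-time-step test; the threshold values and field names are illustrative placeholders, not those of the original program, and the geometry (look angles, depression angles, sky brightness) is assumed to be computed elsewhere.

    # Hypothetical feasibility test for one candidate release time.
    def release_ok(stations, min_look_elev_deg=10.0,
                   min_solar_depression_deg=12.0,
                   min_lunar_depression_deg=0.0,
                   max_sky_brightness=1.0e-12):
        """stations: per-station geometry dicts for one time step."""
        return all(s["look_elev_deg"] >= min_look_elev_deg
                   and s["solar_depression_deg"] >= min_solar_depression_deg
                   and s["lunar_depression_deg"] >= min_lunar_depression_deg
                   and s["sky_brightness"] <= max_sky_brightness
                   for s in stations)

    print(release_ok([{"look_elev_deg": 25.0, "solar_depression_deg": 15.0,
                       "lunar_depression_deg": 5.0,
                       "sky_brightness": 5.0e-13}]))  # True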
Composite measures of watershed health from a water quality perspective.
Mallya, Ganeshchandra; Hantush, Mohamed; Govindaraju, Rao S
2018-05-15
Water quality data at gaging stations are typically compared with established federal, state, or local water quality standards to determine if violations (concentrations of specific constituents falling outside acceptable limits) have occurred. Based on the frequency and severity of water quality violations, risk metrics such as reliability, resilience, and vulnerability (R-R-V) are computed for assessing water quality-based watershed health. In this study, a modified methodology for computing R-R-V measures is presented, and a new composite watershed health index is proposed. Risk-based assessments for different water quality parameters are carried out using identified national sampling stations within the Upper Mississippi River Basin, the Maumee River Basin, and the Ohio River Basin. The distributional properties of risk measures with respect to water quality parameters are reported. Scaling behaviors of risk measures with stream order, specifically for the watershed health (WH) index, suggest that WH values increased with stream order for suspended sediment concentration, nitrogen, and orthophosphate in the Upper Mississippi River Basin. The spatial distribution of risk measures enables identification of locations exhibiting poor watershed health with respect to the chosen numerical standard, and of the role of land use characteristics within the watershed.
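As a point of reference, the classical (Hashimoto-style) R-R-V definitions can be computed from a boolean series of violations as sketched below. The paper presents a modified methodology, so this is only the baseline form, with illustrative data.

    import numpy as np

    def rrv(violation, severity=None):
        """violation: boolean series (True = standard violated);
        severity: optional per-sample violation severity."""
        violation = np.asarray(violation, dtype=bool)
        reliability = 1.0 - violation.mean()
        # resilience: probability that a violation is followed by recovery
        recoveries = np.sum(violation[:-1] & ~violation[1:])
        in_violation = violation[:-1].sum()
        resilience = recoveries / in_violation if in_violation else 1.0
        # vulnerability: mean severity over violating samples
        if severity is None:
            vulnerability = violation.mean()
        else:
            sev = np.asarray(severity, dtype=float)
            vulnerability = sev[violation].mean() if violation.any() else 0.0
        return reliability, resilience, vulnerability

    print(rrv([0, 1, 1, 0, 0, 1, 0, 0],
              severity=[0, .2, .4, 0, 0, .1, 0, 0]))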
Space Station Furnace Facility. Volume 3: Program cost estimate
NASA Technical Reports Server (NTRS)
1992-01-01
The approach used to estimate costs for the Space Station Furnace Facility (SSFF) is based on a computer program developed internally at Teledyne Brown Engineering (TBE). The program produces time-phased estimates of cost elements for each hardware component, based on experience with similar components. Engineering estimates of the degree of similarity or difference between the current project and the historical data are then used to adjust the computer-produced cost estimate and to fit it to the current project Work Breakdown Structure (WBS). The SSFF concept as presented at the Requirements Definition Review (RDR) was used as the base configuration for the cost estimate. The program incorporates data on costs of previous projects and the allocation of those costs to the components of one of three time-phased generic WBSs. Input consists of a list of similar components for which cost data exist; the number of interfaces with their type and complexity; identification of the extent to which previous designs are applicable; and programmatic data concerning schedules and miscellaneous items (travel, off-site assignments). Output is program cost in labor hours and material dollars for each component, broken down by generic WBS task and program schedule phase.
19. YAZOO BACKWATER PUMPING STATION MODEL, YAZOO RIVER BASIN. ELECTRONICS ...
19. YAZOO BACKWATER PUMPING STATION MODEL, YAZOO RIVER BASIN. ELECTRONICS ENGINEER AT DATA COLLECTION COMPUTER ROOM. - Waterways Experiment Station, Hydraulics Laboratory, Halls Ferry Road, 2 miles south of I-20, Vicksburg, Warren County, MS
System and method for transferring telemetry data between a ground station and a control center
NASA Technical Reports Server (NTRS)
Ray, Timothy J. (Inventor); Ly, Vuong T. (Inventor)
2012-01-01
Disclosed herein are systems, computer-implemented methods, and tangible computer-readable media for coordinating communications between a ground station, a control center, and a spacecraft. The method receives a call to a simple, unified application programmer interface implementing communications protocols related to outer space. When the instruction relates to receiving a command at the control center for the ground station, the method generates an abstract message by agreeing upon a format for each type of abstract message with the ground station and using a set of message definitions to configure the command in the agreed-upon format, encodes the abstract message to generate an encoded message, and transfers the encoded message to the ground station. It performs similar actions when the instruction relates to receiving a second command as a second encoded message at the ground station from the control center, and when the determined instruction type relates to transmitting information to the control center.
Space Station solar water heater
NASA Technical Reports Server (NTRS)
Horan, D. C.; Somers, Richard E.; Haynes, R. D.
1990-01-01
The feasibility of directly converting solar energy for crew water heating on the Space Station Freedom (SSF) and other human-tended missions such as a geosynchronous space station, lunar base, or Mars spacecraft was investigated. Computer codes were developed to model the systems, and a proof-of-concept thermal vacuum test was conducted to evaluate system performance in an environment simulating the SSF. The results indicate that a solar water heater is feasible. It could provide up to 100 percent of the design heating load without a significant configuration change to the SSF or other missions. The solar heater system requires only 15 percent of the electricity that an all-electric system on the SSF would require. This allows a reduction in the solar array or a surplus of electricity for onboard experiments.
StreamStats: A water resources web application
Ries, Kernell G.; Guthrie, John G.; Rea, Alan H.; Steeves, Peter A.; Stewart, David W.
2008-01-01
Streamflow statistics, such as the 1-percent flood, the mean flow, and the 7-day 10-year low flow, are used by engineers, land managers, biologists, and many others to help guide decisions in their everyday work. For example, estimates of the 1-percent flood (the flow that is exceeded, on average, once in 100 years and has a 1-percent chance of being exceeded in any year, sometimes referred to as the 100-year flood) are used to create flood-plain maps that form the basis for setting insurance rates and land-use zoning. This and other streamflow statistics also are used for dam, bridge, and culvert design; water-supply planning and management; water-use appropriations and permitting; wastewater and industrial discharge permitting; hydropower facility design and regulation; and the setting of minimum required streamflows to protect freshwater ecosystems. In addition, researchers, planners, regulators, and others often need to know the physical and climatic characteristics of the drainage basins (basin characteristics) and the influence of human activities, such as dams and water withdrawals, on streamflow upstream from locations of interest to understand the mechanisms that control water availability and quality at those locations. Knowledge of the streamflow network and downstream human activities also is necessary to adequately determine whether an upstream activity, such as a water withdrawal, can be allowed without adversely affecting downstream activities. Streamflow statistics could be needed at any location along a stream. Most often, streamflow statistics are needed at ungaged sites, where no streamflow data are available to compute the statistics. At U.S. Geological Survey (USGS) streamflow data-collection stations, which include streamgaging stations, partial-record stations, and miscellaneous-measurement stations, streamflow statistics can be computed from available data for the stations. Streamflow data are collected continuously at streamgaging stations. Streamflow measurements are collected systematically over a period of years at partial-record stations to estimate peak-flow or low-flow statistics. Streamflow measurements usually are collected at miscellaneous-measurement stations for specific hydrologic studies with various objectives. StreamStats is a Web-based Geographic Information System (GIS) application that was created by the USGS, in cooperation with Environmental Systems Research Institute, Inc. (ESRI), to provide users with access to an assortment of analytical tools that are useful for water-resources planning and management. StreamStats functionality is based on ESRI's ArcHydro Data Model and Tools, described on the Web at http://resources.arcgis.com/en/communities/hydro/01vn0000000s000000.htm. StreamStats allows users to easily obtain streamflow statistics, basin characteristics, and descriptive information for USGS data-collection stations and user-selected ungaged sites. It also allows users to identify stream reaches that are upstream and downstream from user-selected sites, and to identify and obtain information for locations along the streams where activities that may affect streamflow conditions are occurring. This functionality can be accessed through a map-based user interface that appears in the user's Web browser, or individual functions can be requested remotely as Web services by other Web or desktop computer applications.
StreamStats can perform these analyses much faster than historically used manual techniques. StreamStats was designed so that each state would be implemented as a separate application, with a reliance on local partnerships to fund the individual applications, and a goal of eventual full national implementation. Idaho became the first state to implement StreamStats in 2003. By mid-2008, 14 states had applications available to the public, and 18 other states were in various stages of implementation.
Space station data system analysis/architecture study. Task 3: Trade studies, DR-5, volume 2
NASA Technical Reports Server (NTRS)
1985-01-01
Results of a Space Station Data System Analysis/Architecture Study for the Goddard Space Flight Center are presented. This study, which emphasized a system engineering design for a complete, end-to-end data system, was divided into six tasks: (1) functional requirements definition; (2) options development; (3) trade studies; (4) system definitions; (5) program plan; and (6) study maintenance. The task interrelationships and documentation flow are described. Information in volume 2 is devoted to Task 3: Trade Studies. Trade studies have been carried out in the following areas: (1) software development test and integration capability; (2) fault-tolerant computing; (3) space-qualified computers; (4) distributed data base management system; (5) system integration test and verification; (6) crew workstations; (7) mass storage; (8) command and resource management; and (9) space communications. Results are presented for each task.
NASA Technical Reports Server (NTRS)
1985-01-01
The second task in the Space Station Data System (SSDS) Analysis/Architecture Study is the development of an information base that will support the conduct of trade studies and provide sufficient data to make key design/programmatic decisions. This volume identifies the preferred options in the technology category and characterizes these options with respect to performance attributes, constraints, cost, and risk. The technology category includes advanced materials, processes, and techniques that can be used to enhance the implementation of SSDS design structures. The specific areas discussed are mass storage, including space and ground on-line storage and off-line storage; man/machine interface; data processing hardware, including flight computers and advanced/fault-tolerant computer architectures; and software, including data compression algorithms, on-board high level languages, and software tools. Also discussed are artificial intelligence applications and hard-wire communications.
Octree-based Global Earthquake Simulations
NASA Astrophysics Data System (ADS)
Ramirez-Guzman, L.; Juarez, A.; Bielak, J.; Salazar Monroy, E. F.
2017-12-01
Seismological research has motivated recent efforts to construct more accurate three-dimensional (3D) velocity models of the Earth, to perform global simulations of wave propagation to validate models, and to study the interaction of seismic fields with 3D structures. However, traditional approaches to seismogram computation at global scales, primarily normal mode summation and two-dimensional numerical methods, are limited by computational resources. We present an octree-based mesh finite element implementation to perform global earthquake simulations with 3D models using topography and bathymetry with a staircase approximation, as modeled by the Carnegie Mellon Finite Element Toolchain Hercules (Tu et al., 2006). To verify the implementation, we compared the synthetic seismograms computed in a spherical earth against waveforms calculated using normal mode summation for the Preliminary Reference Earth Model (PREM) for a point source representation of the 2014 Mw 7.3 Papanoa, Mexico earthquake. We considered a 3 km-thick ocean layer for stations with predominantly oceanic paths. Eigenfrequencies and eigenfunctions were computed for toroidal, radial, and spheroidal oscillations in the first 20 branches. Simulations are valid at frequencies up to 0.05 Hz. The match between the waveforms computed by the two approaches, especially for long-period surface waves, is excellent. Additionally, we modeled the Mw 9.0 Tohoku-Oki earthquake using the USGS finite fault inversion. Topography and bathymetry from ETOPO1 are included in a mesh with more than 3 billion elements, constrained by the computational resources available. We compared estimated velocity and GPS synthetics against observations at regional and teleseismic stations of the Global Seismographic Network and discuss the differences between observations and synthetics, revealing that heterogeneity, particularly in the crust, needs to be considered.
A computer system for the storage and retrieval of gravity data, Kingdom of Saudi Arabia
Godson, Richard H.; Andreasen, Gordon H.
1974-01-01
A computer system has been developed for the systematic storage and retrieval of gravity data. All pertinent facts relating to gravity station measurements and computed Bouguer values may be retrieved either by project name or by geographical coordinates. Features of the system include visual display in the form of printer listings of gravity data and printer plots of station locations. The retrieved data format interfaces with the format of GEOPAC, a system of computer programs designed for the analysis of geophysical data.
Anderson uses laptop computer in the U.S. Laboratory during Joint Operations
2007-06-13
S117-E-07134 (12 June 2007) --- Astronaut Clayton Anderson, Expedition 15 flight engineer, uses a computer near the Microgravity Science Glovebox (MSG) in the Destiny laboratory of the International Space Station while Space Shuttle Atlantis (STS-117) was docked with the station. Astronaut Sunita Williams, flight engineer, is at right.
A Computerized Weather Station for the Apple IIe.
ERIC Educational Resources Information Center
Lorson, Mark V.
Predicting weather conditions is a topic of interest for students who want to make plans for outside activities. This paper discusses the development of an inexpensive computer-interfaced classroom weather station using an Apple IIe computer that provides the viewer with up to the minute digital readings of inside and outside temperature,…
Near-station terrain corrections for gravity data by a surface-integral technique
Gettings, M.E.
1982-01-01
A new method of computing gravity terrain corrections by use of a digitizer and digital computer can result in substantial savings in the time and manual labor required to perform such corrections by conventional manual ring-chart techniques. The method is typically applied to estimate terrain effects for topography near the station, for example within 3 km of the station, although it has been used successfully to a radius of 15 km to estimate corrections in areas where topographic mapping is poor. Points (about 20) that define topographic maxima, minima, and changes in the slope gradient are picked on the topographic map, within the desired radius of correction about the station. Particular attention must be paid to the area immediately surrounding the station to ensure a good topographic representation. The horizontal and vertical coordinates of these points are entered into the computer, usually by means of a digitizer. The computer then fits a multiquadric surface to the input points to form an analytic representation of the surface. By means of the divergence theorem, the gravity effect of an interior closed solid can be expressed as a surface integral, and the terrain correction is calculated by numerical evaluation of the integral over the surfaces of a cylinder, the vertical sides of which are at the correction radius about the station, the flat bottom surface at the topographic minimum, and the upper surface given by the multiquadric equation. The method has been tested with favorable results against models for which an exact result is available and against manually computed field-station locations in areas of rugged topography. By increasing the number of points defining the topographic surface, any desired degree of accuracy can be obtained. The method is more objective than manual ring-chart techniques because no average compartment elevations need be estimated.
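The multiquadric representation at the heart of the method interpolates the digitized points with radial basis functions of the form sqrt(r^2 + eps^2). A minimal fit-and-evaluate sketch follows; the points and shape parameter are invented, and the surface-integral evaluation of the terrain effect itself is not reproduced.

    import numpy as np

    def multiquadric_fit(xy, z, eps=100.0):
        # Solve for coefficients c so that sum_j c_j * phi(|x - x_j|) = z
        d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=-1)
        phi = np.sqrt(d**2 + eps**2)
        return np.linalg.solve(phi, z)

    def multiquadric_eval(coeffs, xy_nodes, xy_query, eps=100.0):
        d = np.linalg.norm(xy_query[:, None, :] - xy_nodes[None, :, :],
                           axis=-1)
        return np.sqrt(d**2 + eps**2) @ coeffs

    # Invented topographic control points (meters) and elevations
    nodes = np.array([[0., 0.], [500., 0.], [0., 500.],
                      [500., 500.], [250., 250.]])
    elev = np.array([100., 120., 110., 150., 130.])
    c = multiquadric_fit(nodes, elev)
    print(multiquadric_eval(c, nodes, np.array([[250., 100.]])))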
Automation of Precise Time Reference Stations (PTRS)
NASA Astrophysics Data System (ADS)
Wheeler, P. J.
1985-04-01
The U.S. Naval Observatory is presently engaged in a program of automating precise time stations (PTS) and precise time reference stations (PTRS) by using a versatile minicomputer-controlled data acquisition system (DAS). The data acquisition system is configured to monitor locally available PTTI signals such as LORAN-C, OMEGA, and/or the Global Positioning System. In addition, the DAS performs local standard intercomparison. Computer telephone communications provide automatic data transfer to the Naval Observatory. Subsequently, after analysis of the data, results and information can be sent back to the precise time reference station to provide automatic control of remote station timing. The DAS configuration is designed around state-of-the-art standard industrial high-reliability modules. The system integration and software are standardized but allow considerable flexibility to satisfy special local requirements such as stability measurements, performance evaluation, and printing of messages and certificates. The DAS operates completely independently and may be queried or controlled at any time with a computer or terminal device (control is protected for use by authorized personnel only). Such DAS-equipped PTS are operational in Hawaii, California, Texas, and Florida.
NASA Technical Reports Server (NTRS)
1983-01-01
Various parameters of the orbital space station are discussed. The space station environment, data management system, communication and tracking, environmental control, and life support system are considered. Specific topics reviewed include crew work stations, restraint systems, stowage, computer hardware, and expert systems.
Model implementation for dynamic computation of system cost
NASA Astrophysics Data System (ADS)
Levri, J.; Vaccari, D.
The Advanced Life Support (ALS) Program metric is the ratio of the equivalent system mass (ESM) of a mission based on International Space Station (ISS) technology to the ESM of that same mission based on ALS technology. ESM is a mission cost analog that converts the volume, power, cooling, and crewtime requirements of a mission into mass units to compute an estimate of the life support system emplacement cost. Traditionally, ESM has been computed statically, using nominal values for system sizing. However, computation of ESM with static, nominal sizing estimates cannot capture the peak sizing requirements driven by system dynamics. In this paper, a dynamic model for a near-term Mars mission is described. The model is implemented in MATLAB/Simulink for the purpose of dynamically computing ESM. This paper provides a general overview of the crew, food, biomass, waste, water, and air blocks in the Simulink model. Dynamic simulations of the life support system track mass flow, volume, and crewtime needs, as well as power and cooling requirement profiles. The mission's ESM is computed based upon simulation responses. Ultimately, computed ESM values for various system architectures will feed into a non-derivative optimization search algorithm to predict parameter combinations that result in reduced objective function values.
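For contrast with the dynamic computation described above, a static ESM evaluation is a single weighted sum: each resource requirement is multiplied by a mass-equivalency factor. The sketch below uses placeholder factor values, not the ALS reference values.

    # Static ESM sketch: convert volume, power, cooling, and crewtime
    # requirements to kilograms via mass-equivalency factors.
    # Factor values below are illustrative placeholders.
    def esm_kg(mass_kg, volume_m3, power_kw, cooling_kw,
               crewtime_hr_yr, duration_yr,
               v_eq=66.7, p_eq=237.0, c_eq=60.0, ct_eq=0.465):
        return (mass_kg
                + volume_m3 * v_eq        # kg per m^3 pressurized volume
                + power_kw * p_eq         # kg per kW of power generation
                + cooling_kw * c_eq       # kg per kW of heat rejection
                + crewtime_hr_yr * duration_yr * ct_eq)  # kg per crew-hour

    print(esm_kg(mass_kg=1500, volume_m3=10, power_kw=3.5,
                 cooling_kw=3.5, crewtime_hr_yr=120, duration_yr=2))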
A computer graphics system for visualizing spacecraft in orbit
NASA Technical Reports Server (NTRS)
Eyles, Don E.
1989-01-01
To carry out unanticipated operations with resources already in space is part of the rationale for a permanently manned space station in Earth orbit. The astronauts aboard a space station will require an on-board, spatial display tool to assist the planning and rehearsal of upcoming operations. Such a tool can also help astronauts to monitor and control such operations as they occur, especially in cases where first-hand visibility is not possible. A computer graphics visualization system designed for such an application and currently implemented as part of a ground-based simulation is described. The visualization system presents to the user the spatial information available in the spacecraft's computers by drawing a dynamic picture containing the planet Earth, the Sun, a star field, and up to two spacecraft. The point of view within the picture can be controlled by the user to obtain a number of specific visualization functions. The elements of the display, the methods used to control the display's point of view, and some of the ways in which the system can be used are described.
Divergence analysis report for the bodies of revolution model support systems
NASA Technical Reports Server (NTRS)
Rash, Larry C.
1983-01-01
This report documents the sting divergence analyses of nine different model and model support systems that were performed in preparation for a series of wind tunnel tests at the National Transonic Facility at NASA Langley Research Center in Hampton, Virginia. The models were missile-shaped bodies of revolution, and the model support systems included a force and moment balance and tapered sting sections. The sting divergence results were obtained from a computer program that solved a two-point boundary value problem using a second-order Runge-Kutta integration technique. Because the computer solution was based on constant section properties between discrete stations along the sting sections, a procedure was developed and included to evaluate the properties for the minimum number of stations along the tapered sections that would produce no more than one half of one percent error in the divergence results. Also included in the report are the development of the aerodynamic input data, listings of all input and output computer data, and summary sheets that highlight the input and the critical sting divergence dynamic pressure for each respective configuration.
Jia, Limin
2017-01-01
Aimed at the complicated problem of characterizing passenger flow attraction in an urban rail transit network, the concept of the gravity field of passenger flow is proposed in this paper. We establish computation methods for field strength and potential energy to reveal the potential attraction relationships among stations from the perspective of the collection and distribution of passenger flow and the topology of the network. For the computation of field strength, an optimum path concept is proposed to define the betweenness centrality parameter. For the computation of potential energy, the compound Simpson's rule formula is applied to obtain a solution to the function. Taking Line 10 of the Beijing Subway as a practical example, a simulation and verification analysis is conducted, with the following results. First, the larger the field strength between two stations, the stronger the passenger flow attraction, and the greater the probability that the largest section passenger flow forms there. Second, the greatest passenger flow volume and circulation capacity occur between two zones of high potential energy. PMID:28863175
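The compound (composite) Simpson's rule mentioned above is standard quadrature. A minimal sketch follows, with a stand-in integrand since the paper's field-strength function is not reproduced here.

    import numpy as np

    def composite_simpson(f, a, b, n=100):
        # Simpson's rule needs an even number of subintervals
        if n % 2:
            n += 1
        x = np.linspace(a, b, n + 1)
        y = f(x)
        h = (b - a) / n
        return h / 3 * (y[0] + y[-1]
                        + 4 * y[1:-1:2].sum() + 2 * y[2:-1:2].sum())

    # e.g. integrate an inverse-square "field strength" between stations
    print(composite_simpson(lambda r: 1.0 / (1.0 + r) ** 2, 0.0, 10.0))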
Near Zone: Basic scattering code user's manual with space station applications
NASA Technical Reports Server (NTRS)
Marhefka, R. J.; Silvestro, J. W.
1989-01-01
The Electromagnetic Code - Basic Scattering Code, Version 3, is a user oriented computer code to analyze near and far zone patterns of antennas in the presence of scattering structures, to provide coupling between antennas in a complex environment, and to determine radiation hazard calculations at UHF and above. The analysis is based on uniform asymptotic techniques formulated in terms of the Uniform Geometrical Theory of Diffraction (UTD). Complicated structures can be simulated by arbitrarily oriented flat plates and an infinite ground plane that can be perfectly conducting or dielectric. Also, perfectly conducting finite elliptic cylinder, elliptic cone frustum sections, and finite composite ellipsoids can be used to model the superstructure of a ship, the body of a truck, and airplane, a satellite, etc. This manual gives special consideration to space station modeling applications. This is a user manual designed to give an overall view of the operation of the computer code, to instruct a user in how to model structures, and to show the validity of the code by comparing various computed results against measured and alternative calculations such as method of moments whenever available.
NASA Astrophysics Data System (ADS)
Ji, Kun; Ren, Yefei; Wen, Ruizhi
2017-10-01
Reliable site classifications for the stations of the China National Strong Motion Observation Network System (NSMONS) have not yet been assigned because of a lack of borehole data. This study used an empirical horizontal-to-vertical (H/V) spectral ratio (hereafter, HVSR) site classification method to overcome this problem. First, according to their borehole data, stations selected from KiK-net in Japan were individually assigned a site class (CL-I, CL-II, or CL-III), as defined in the Chinese seismic code. Then, the mean HVSR curve for each site class was computed using strong motion recordings captured during the period 1996-2012. These curves were compared with those proposed by Zhao et al. (2006a) for the four site classes (SC-I, SC-II, SC-III, and SC-IV) defined in the Japanese seismic code (JRA, 1980). It was found that an approximate range of the predominant period Tg could be identified by the predominant peak of the HVSR curve for the CL-I and SC-I sites, the CL-II and SC-II sites, and the CL-III and SC-III + SC-IV sites. Second, an empirical site classification method was proposed based on comprehensive consideration of the peak period, amplitude, and shape of the HVSR curve. The selected stations from KiK-net were classified using the proposed method. The results showed that the success rates of the proposed method in identifying CL-I, CL-II, and CL-III sites were 63%, 64%, and 58%, respectively. Finally, the HVSRs of 178 NSMONS stations were computed based on recordings from 2007 to 2015, and the sites were classified using the proposed method. The mean HVSR curves were re-calculated for the three site classes and compared with those from KiK-net data. It was found that both the peak period and the amplitude were similar for the mean HVSR curves derived from the NSMONS classification results and the KiK-net borehole data, implying the effectiveness of the proposed method in identifying different site classes. The classification results agree well with site classes based on borehole data at 81 stations in China, which indicates that our site classification results are acceptable and that the proposed method is practicable.
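At its core, the HVSR is the ratio of the mean horizontal amplitude spectrum to the vertical one, with the predominant period read from its peak. The sketch below uses synthetic traces and simple moving-average smoothing in place of the windowing and smoothing choices of the study.

    import numpy as np

    def smooth(x, k=9):
        return np.convolve(x, np.ones(k) / k, mode="same")

    def hvsr(ns, ew, ud, dt):
        freqs = np.fft.rfftfreq(len(ud), dt)
        h = np.sqrt((np.abs(np.fft.rfft(ns))**2
                     + np.abs(np.fft.rfft(ew))**2) / 2)
        v = np.abs(np.fft.rfft(ud))
        ratio = smooth(h[1:]) / smooth(v[1:])  # skip zero-frequency bin
        return freqs[1:], ratio

    # Synthetic test: site resonance at 2 Hz (horizontal amplification)
    dt, n = 0.01, 4096
    t = np.arange(n) * dt
    rng = np.random.default_rng(1)
    ns = np.sin(2 * np.pi * 2.0 * t) + 0.1 * rng.normal(size=n)
    ew = np.sin(2 * np.pi * 2.0 * t + 0.5) + 0.1 * rng.normal(size=n)
    ud = 0.3 * np.sin(2 * np.pi * 2.0 * t) + 0.1 * rng.normal(size=n)
    f, r = hvsr(ns, ew, ud, dt)
    print(1.0 / f[np.argmax(r)])  # predominant period Tg, ~0.5 s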
NASA Technical Reports Server (NTRS)
Jules, Kenol; Lin, Paul P.; Weiss, Daniel S.
2002-01-01
This paper presents the preliminary performance results of the artificial intelligence monitoring system in full operational mode, using near real time acceleration data downlinked from the International Space Station. A preliminary microgravity environment characterization analysis result for the International Space Station (Increment-2), obtained using the monitoring system, is presented. Also presented is a comparison between the system's predicted performance, based on ground test data for the U.S. laboratory "Destiny" module, and actual on-orbit performance, using measured acceleration data from the U.S. laboratory module of the International Space Station. Finally, preliminary on-orbit disturbance magnitude levels are presented for the Experiment of Physics of Colloids in Space and compared with ground test data. The ground test data for the Experiment of Physics of Colloids in Space were acquired from the Microgravity Emission Laboratory, located at the NASA Glenn Research Center, Cleveland, Ohio. The monitoring system was developed by the NASA Glenn Principal Investigator Microgravity Services Project to help principal investigator teams identify the primary vibratory disturbance sources that are active, at any moment of time, on board the International Space Station and that might impact the microgravity environment their experiments are exposed to. From the Principal Investigator Microgravity Services web site, principal investigator teams can monitor, via a dynamic graphical display implemented in Java, in near real time, which events are on, such as crew activities, pumps, fans, centrifuges, compressors, crew exercise, and structural modes, and decide whether or not to run their experiments, whenever that is an option, based on the acceleration magnitude and frequency sensitivity associated with each experiment. The monitoring system detects primarily vibratory disturbance sources and has built-in capability to detect both known and unknown sources. Several soft computing techniques, such as Kohonen's Self-Organizing Feature Map, Learning Vector Quantization, Back-Propagation Neural Networks, and Fuzzy Logic, were used to design the system.
Assessing the impact of PACS on patient care in a medical intensive care unit
NASA Astrophysics Data System (ADS)
Shile, Peter E.; Kundel, Harold L.; Seshadri, Sridhar B.; Carey, Bruce; Brikman, Inna; Kishore, Sheel; Feingold, Eric R.; Lanken, Paul N.
1993-09-01
In this paper we present data from pilot studies to estimate the impact of an intensive care unit display station on patient care. The data were collected during two separate one-month periods in 1992. We compared these two periods in terms of the relative speeds with which images were first viewed by MICU physicians. First, we found that images for routine chest radiographs (CXRs) are viewed by a greater number of physicians, and slightly sooner, with the PACS display station operating in the MICU than when it is not. Thus, for routine exams, PACS provides the potential for shortening the time intervals between exam completion and image-based clinical action. A second finding is that the use of the display station for viewing non-routine CXRs is strongly influenced by the speed with which films are digitized. Hence, if film digitization is not rapid, the presence of a MICU display station is unlikely to contribute to a shortening of the time intervals between exam completion and image-based clinical action. This finding supports the use of computed radiography for CXRs in an intensive care unit.
Computer systems for automatic earthquake detection
Stewart, S.W.
1974-01-01
U.S. Geological Survey seismologists in Menlo Park, California, are utilizing the speed, reliability, and efficiency of minicomputers to monitor seismograph stations and to automatically detect earthquakes. An earthquake detection computer system, believed to be the only one of its kind in operation, automatically reports about 90 percent of all local earthquakes recorded by a network of over 100 central California seismograph stations. The system also monitors the stations for signs of malfunction or abnormal operation. Before the automatic system was put in operation, all of the earthquakes recorded had to be detected by manually searching the records, a time-consuming process. With the automatic detection system, the stations are efficiently monitored continuously.
NASA Astrophysics Data System (ADS)
Tsuboi, S.; Hirshorn, B. F.
2009-12-01
We have determined Mwp for the August 11, 2009 Suruga-Bay earthquake (MJMA=6.5) using broadband seismograms recorded at stations at close epicentral distances. We used two broadband seismograph stations: JHJ2 (epicentral distance 1.9 degrees) and FUJ (epicentral distance 0.44 degree). Because of the close epicentral distance of FUJ, the seismogram is clipped about 10 seconds after the P-wave arrival. However, it was possible to use the first 10 seconds of this seismogram to compute Mwp. We get Mwp=6.4 for JHJ2 and 6.8 for FUJ (Figure 1). After applying the correction of Whitmore et al. (2000) and averaging the two stations, we get Mwp=6.6 for this event. An epicentral distance of 0.44 degree is marginal for treating the seismogram of a magnitude 6.5 earthquake as far-field. However, considering the aftershock distribution, the fault area seems to be limited to within Suruga Bay, which may confirm that Mwp can be successfully computed at FUJ based on the far-field approximation. This result is significant for using Mwp from close-epicentral-distance seismograms to issue early tsunami warnings. A large earthquake with Mw=7.5 (GCMT) occurred in the Andaman Islands, India, 10 minutes before the Suruga-Bay event. This made it very difficult to estimate Mwp for the Suruga-Bay event from broadband seismograms at teleseismic distances because of the large amplitude of the Mw 7.5 Andaman Islands earthquake. In such a case, it is therefore difficult to issue accurate tsunami warnings based on teleseismic stations. We used broadband seismograms recorded by F-net, operated by the National Research Institute for Earth Science and Disaster Prevention.
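Mwp itself follows the recipe of Tsuboi et al. (1995): integrate the vertical P-wave displacement, scale the peak of the integral to seismic moment, and convert moment to magnitude. The sketch below uses typical values for density, P velocity, and the average radiation coefficient; these values and the synthetic trace are assumptions, not the operational settings.

    import numpy as np

    def mwp(displacement, dt, distance_m,
            rho=3400.0, alpha=7900.0, fp=0.385):
        """displacement: vertical P-wave displacement samples (m);
        distance_m: epicentral distance in meters."""
        integral = np.cumsum(displacement) * dt  # integrate u_z(t)
        m0 = (np.abs(integral).max()
              * 4 * np.pi * rho * alpha**3 * distance_m / fp)  # N*m
        return (np.log10(m0) - 9.1) / 1.5  # moment magnitude

    # Example with a synthetic one-sided displacement pulse
    u = np.concatenate([np.linspace(0, 1e-4, 50), np.zeros(50)])
    print(mwp(u, dt=0.1, distance_m=2.0e5))  # ~Mw 6 class event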
NASA Astrophysics Data System (ADS)
Chetty, S.; Field, L. A.
2013-12-01
The Arctic Ocean's continuing decrease of summer-time ice is related to rapidly diminishing multi-year ice due to the effects of climate change. Ice911 Research aims to develop environmentally respectful materials that, when deployed, will increase the albedo, enhancing the formation and/or preservation of multi-year ice. Small-scale deployments using various materials have been done in Canada, California's Sierra Nevada Mountains, and a pond in Minnesota to test the albedo performance and environmental characteristics of these materials. SWIMS is a sophisticated autonomous sensor system being developed to measure the albedo, weather, water temperature, and other environmental parameters. The system employs low-cost, high-accuracy/precision sensors, high-resolution cameras, and an extreme-environment command and data handling computer system using satellite and terrestrial wireless communication. The entire system is solar powered, with redundant battery backup, on a floating buoy platform engineered for low-temperature (-40 C) and high-wind conditions. The system also incorporates tilt sensors, sonar-based ice thickness sensors, and a weather station. To keep costs low, each SWIMS unit measures incoming and reflected radiation from the four quadrants around the buoy. This allows data from four sets of sensors and cameras, the weather station, and the water temperature probe to be collected and transmitted by a single on-board solar-powered computer. This presentation covers the technical, logistical, and cost challenges in designing, developing, and deploying these stations in remote, extreme environments. (Figure captions: image captured by camera #3 of the setting sun at the SWIMS station; one of the images captured by SWIMS camera #4.)
NASA Astrophysics Data System (ADS)
Lund, Matthew Lawrence
The space radiation environment is a significant challenge to future manned and unmanned space travel. Future missions will rely more on accurate simulations of radiation transport in space and through spacecraft to predict astronaut dose and energy deposition within spacecraft electronics. The International Space Station provides long-term measurements of the radiation environment in Low Earth Orbit (LEO); however, only the Apollo missions provided dosimetry data beyond LEO. Dosimetry analysis for deep space missions is thus poorly supported by currently available data, and there is a need to develop dosimetry-predicting models for extended deep space missions. GEANT4, a Monte Carlo radiation transport toolkit written in C++, provides a powerful framework for simulating radiation transport in arbitrary media, including spacecraft. The newest version of GEANT4 supports multithreading and MPI, resulting in faster distributed processing of simulations in high-performance computing clusters. This thesis introduces a new application based on GEANT4 that greatly reduces computational time, using the Kingspeak and Ember computational clusters at the Center for High Performance Computing (CHPC) to simulate radiation transport through full spacecraft geometry, reducing simulation time to hours instead of weeks without post-simulation processing. Additionally, this thesis introduces a new set of detectors besides the historically used International Commission on Radiation Units (ICRU) spheres for calculating dose distribution, including a thermoluminescent detector (TLD), a tissue-equivalent proportional counter (TEPC), and a human phantom, combined with a series of new primitive scorers in GEANT4 to calculate dose equivalence based on International Commission on Radiological Protection (ICRP) standards. The models developed in this thesis predict dose depositions in the International Space Station and during the Apollo missions, showing good agreement with experimental measurements. From these models, the greatest contributor to radiation dose for the Apollo missions was Galactic Cosmic Rays, owing to the short time spent within the radiation belts. The Apollo 14 dose measurements were an order of magnitude higher than those of the other Apollo missions. The GEANT4 model of the Apollo Command Module shows consistent doses due to Galactic Cosmic Rays and the radiation belts for all missions, with a small variation in dose distribution across the capsule. The model also predicts well the dose depositions and equivalent dose values in various human organs for the International Space Station and the Apollo Command Module.
NASA Technical Reports Server (NTRS)
2003-01-01
The same software controlling autonomous and crew-assisted operations for the International Space Station (ISS) is enabling commercial enterprises to integrate and automate manual operations, also known as decision logic, in real time across complex and disparate networked applications, databases, servers, and other devices, all with quantifiable business benefits. Auspice Corporation, of Framingham, Massachusetts, developed the Auspice TLX (The Logical Extension) software platform to effectively mimic the human decision-making process. Auspice TLX automates operations across extended enterprise systems, where any given infrastructure can include thousands of computers, servers, switches, and modems that are connected, and therefore, dependent upon each other. The concept behind the Auspice software grew out of a computer program originally developed in 1981 by Cambridge, Massachusetts-based Draper Laboratory for simulating tasks performed by astronauts aboard the Space Shuttle. At the time, the Space Shuttle Program was dependent upon paper-based procedures for its manned space missions, which typically averaged 2 weeks in duration. As the Shuttle Program progressed, NASA began increasing the length of manned missions in preparation for a more permanent space habitat. Acknowledging the need to relinquish paper-based procedures in favor of an electronic processing format to properly monitor and manage the complexities of these longer missions, NASA realized that Draper's task simulation software could be applied to its vision of year-round space occupancy. In 1992, Draper was awarded a NASA contract to build User Interface Language software to enable autonomous operations of a multitude of functions on Space Station Freedom (the station was redesigned in 1993 and converted into the international venture known today as the ISS).
NASA Technical Reports Server (NTRS)
Austin, F.; Markowitz, J.; Goldenberg, S.; Zetkov, G. A.
1973-01-01
A mathematical model for predicting the dynamic behavior of rotating flexible space station configurations was formulated. The overall objectives of the study were: (1) to develop the theoretical techniques for determining the behavior of a realistically modeled rotating space station, (2) to provide a versatile computer program for the numerical analysis, and (3) to present practical concepts for experimental verification of the analytical results. The mathematical model and its associated computer program are described.
Computer evaluation of existing and proposed fire lookouts
Romain M. Mees
1976-01-01
A computer simulation model has been developed for evaluating the fire detection capabilities of existing and proposed lookout stations. The model uses the coordinate locations of fires and lookouts, tower elevation, and topographic data to judge the placement of stations and to determine where a fire can be seen. The model was tested by comparing it with manual detection on a...
Linear genetic programming application for successive-station monthly streamflow prediction
NASA Astrophysics Data System (ADS)
Danandeh Mehr, Ali; Kahya, Ercan; Yerdelen, Cahit
2014-09-01
In recent decades, artificial intelligence (AI) techniques have been applied as a branch of computer science to model a wide range of hydrological phenomena. A number of studies have compared these techniques in order to find more effective approaches in terms of accuracy and applicability. In this study, we examined the ability of the linear genetic programming (LGP) technique to model the successive-station monthly streamflow process as an applied alternative for streamflow prediction. A comparative efficiency study between LGP and three different artificial neural network algorithms, namely feed forward back propagation (FFBP), generalized regression neural networks (GRNN), and radial basis function (RBF), is also presented in this study. To this end, we first put forward six different successive-station monthly streamflow prediction scenarios subjected to training by LGP and FFBP using the field data recorded at two gauging stations on the Çoruh River, Turkey. Based on Nash-Sutcliffe and root mean squared error measures, we then compared the efficiency of these techniques and selected the best prediction scenario. Eventually, GRNN and RBF algorithms were utilized to restructure the selected scenario and to compare with the corresponding FFBP and LGP results. Our results indicated the promising role of LGP for successive-station monthly streamflow prediction, providing more accurate results than all of the ANN algorithms. We found an explicit LGP-based expression, evolved using only the basic arithmetic functions, to be the best prediction model for the river; it uses the records of both the target and upstream stations.
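The two efficiency measures used to rank the candidate models are standard and easy to state in code; a minimal sketch follows (variable names are illustrative, not taken from the paper):

```python
import numpy as np

def nash_sutcliffe(observed, predicted):
    """Nash-Sutcliffe efficiency: 1 is a perfect fit; 0 means the model
    is no better than predicting the observed mean flow."""
    o, p = np.asarray(observed, float), np.asarray(predicted, float)
    return 1.0 - np.sum((o - p) ** 2) / np.sum((o - o.mean()) ** 2)

def rmse(observed, predicted):
    """Root mean squared error, in the units of the flow records."""
    o, p = np.asarray(observed, float), np.asarray(predicted, float)
    return float(np.sqrt(np.mean((o - p) ** 2)))
```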
NASA Astrophysics Data System (ADS)
Burba, G. G.; Johnson, D.; Velgersdyk, M.; Beaty, K.; Forgione, A.; Begashaw, I.; Allyn, D.
2015-12-01
Significant increases in data generation and computing power in recent years have greatly improved spatial and temporal flux data coverage on multiple scales, from a single station to continental flux networks. At the same time, operating budgets for flux teams and station infrastructure are becoming ever more difficult to acquire and sustain. With more stations and networks, larger data flows from each station, and smaller operating budgets, modern tools are needed to handle the entire process effectively and efficiently. This would help maximize time dedicated to answering research questions, and minimize time and expenses spent on data processing, quality control, and station management. Cross-sharing the stations with external institutions may also help leverage available funding, increase scientific collaboration, and promote data analyses and publications. FluxSuite, a new advanced tool combining hardware, software, and web-service, was developed to address these specific demands. It automates key stages of the flux workflow, minimizes day-to-day site management, and modernizes the handling of data flows: each next-generation station measures all parameters needed for flux computations; a field microcomputer calculates final fully corrected flux rates in real time, including computation-intensive Fourier transforms, spectra, co-spectra, multiple rotations, stationarity, footprint, etc.; final fluxes, radiation, weather, and soil data are merged into a single quality-controlled file; multiple flux stations are linked into an automated time-synchronized network; the flux network manager, or PI, can see all stations in real time, including fluxes, supporting data, automated reports, and email alerts; the PI can assign rights and allow or restrict access to stations and data, so that selected stations can be shared via rights-managed access internally or with external institutions; and researchers without stations could form "virtual networks" for specific projects by collaborating with PIs from different actual networks. This presentation provides detailed examples of FluxSuite as currently utilized by two large flux networks in China (National Academy of Sciences and Agricultural Academy of Sciences) and by smaller networks with stations in the USA, Germany, Ireland, Malaysia, and other locations around the globe.
2004-02-03
KENNEDY SPACE CENTER, FLA. - In the Space Station Processing Facility, workers check over the Italian-built Node 2, a future element of the International Space Station. The second of three Station connecting modules, the Node 2 attaches to the end of the U.S. Lab and provides attach locations for several other elements. Kopra is currently assigned technical duties in the Space Station Branch of the Astronaut Office, where his primary focus involves the testing of crew interfaces for two future ISS modules as well as the implementation of support computers and operational Local Area Network on ISS. Node 2 is scheduled to launch on mission STS-120, Station assembly flight 10A.
A method to establish seismic noise baselines for automated station assessment
McNamara, D.E.; Hutt, C.R.; Gee, L.S.; Benz, H.M.; Buland, R.P.
2009-01-01
We present a method for quantifying station noise baselines and characterizing the spectral shape of out-of-nominal noise sources. Our intent is to automate this method in order to ensure that only the highest-quality data are used in rapid earthquake products at NEIC. In addition, the station noise baselines provide a valuable tool to support the quality control of GSN and ANSS backbone data and metadata. The procedures addressed here are currently in development at the NEIC, and work is underway to understand how quickly changes from nominal can be observed and used within the NEIC processing framework. The spectral methods and software used to compute station baselines and described herein (PQLX) can be useful to operators of both permanent and portable seismic stations. Applications include: general seismic station and data quality control (QC), evaluation of instrument responses, assessment of near real-time communication system performance, characterization of site cultural noise conditions, evaluation of sensor vault design, and assessment of gross network capabilities (McNamara et al. 2005). Future PQLX development plans include incorporating station baselines for automated QC methods and automating station status report generation and notification based on user-defined QC parameters. The PQLX software is available through the USGS (http://earthquake.usgs.gov/research/software/pqlx.php) and IRIS (http://www.iris.edu/software/pqlx/).
RMT focal plane sensitivity to seismic network geometry and faulting style
Johnson, Kendra L.; Hayes, Gavin; Herrmann, Robert B.; Benz, Harley M.; McNamara, Daniel E.; Bergman, Eric A.
2016-01-01
Modern tectonic studies often use regional moment tensors (RMTs) to interpret the seismotectonic framework of an earthquake or earthquake sequence; however, despite extensive use, little existing work addresses RMT parameter uncertainty. Here, we quantify how network geometry and faulting style affect RMT sensitivity. We examine how data-model fits change with fault plane geometry (strike and dip) for varying station configurations. We calculate the relative data fit for incrementally varying geometries about a best-fitting solution, applying our workflow to real and synthetic seismograms for both real and hypothetical station distributions and earthquakes. Initially, we conduct purely observational tests, computing RMTs from synthetic seismograms for hypothetical earthquakes and a series of well-behaved network geometries. We then incorporate real data and station distributions from the International Maule Aftershock Deployment (IMAD), which recorded aftershocks of the 2010 MW 8.8 Maule earthquake, and a set of regional stations capturing the ongoing earthquake sequence in Oklahoma and southern Kansas. We consider RMTs computed under three scenarios: (1) real seismic records selected for high data quality; (2) synthetic seismic records with noise computed for the observed source-station pairings and (3) synthetic seismic records with noise computed for all possible station-source pairings. To assess RMT sensitivity for each test, we observe the ‘fit falloff’, which portrays how relative fit changes when strike or dip varies incrementally; we then derive the ranges of acceptable strikes and dips by identifying the span of solutions with relative fits larger than 90 per cent of the best fit. For the azimuthally incomplete IMAD network, Scenario 3 best constrains fault geometry, with average ranges of 45° and 31° for strike and dip, respectively. In Oklahoma, Scenario 3 best constrains fault dip with an average range of 46°; however, strike is best constrained by Scenario 1, with a range of 26°. We draw two main conclusions from this study. (1) Station distribution impacts our ability to constrain RMTs using waveform time-series; however, in some tectonic settings, faulting style also plays a significant role and (2) increasing station density and data quantity (both the number of stations and the number of individual channels) does not necessarily improve RMT constraint. These results may be useful when organizing future seismic deployments (e.g. by concentrating stations in alignment with anticipated nodal planes), and in computing RMTs, either by guiding a more rigorous data selection process for input data or informing variable weighting among the selected data (e.g. by eliminating the transverse component when strike-slip mechanisms are expected).
NASA Technical Reports Server (NTRS)
Mcgreevy, Michael W.
1990-01-01
An advanced human-system interface is being developed for evolutionary Space Station Freedom as part of the NASA Office of Space Station (OSS) Advanced Development Program. The human-system interface is based on body-pointed display and control devices. The project will identify and document the design accommodations ('hooks and scars') required to support virtual workstations and telepresence interfaces, and prototype interface systems will be built, evaluated, and refined. The project is a joint enterprise of Marquette University, Astronautics Corporation of America (ACA), and NASA's ARC. The project team is working with NASA's JSC and McDonnell Douglas Astronautics Company (the Work Package contractor) to ensure that the project is consistent with space station user requirements and program constraints. Documentation describing design accommodations and tradeoffs will be provided to OSS, JSC, and McDonnell Douglas, and prototype interface devices will be delivered to ARC and JSC. ACA intends to commercialize derivatives of the interface for use with computer systems developed for scientific visualization and system simulation.
Evaluation of thermograph data for California streams
Limerinos, J.T.
1978-01-01
Statistical analysis of water-temperature data from California streams indicates that, for most purposes, long-term operation of thermographs (automatic water-temperature recording instruments) does not provide a more useful record than either short-term operation of such instruments or periodic measurements. Harmonic analyses were made of thermograph records 5 to 14 years in length from 82 stations. More than 80 percent of the annual variation in water temperature is explained by the harmonic function for 77 of the 82 stations. Harmonic coefficients based on 8 years of thermograph record at 12 stations varied only slightly from coefficients computed using two equally split 4-year records. At five stations where both thermograph and periodic (10 to 23 measurements per year) data were collected concurrently, harmonic coefficients for periodic data were defined nearly as well as those for thermograph data. Results of this analysis indicate that, except where detailed surveillance of water temperatures is required or where there is a chance of temporal change, thermograph operations can be reduced substantially without affecting the usefulness of temperature records.
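The annual water-temperature cycle in such analyses is commonly described by a single annual harmonic, T(t) = M + A cos(2πt/365 − φ); a minimal least-squares fit of that function (illustrative variable names, not the report's code) might look like:

```python
import numpy as np

def fit_annual_harmonic(day_of_year, temperature):
    """Fit T(t) = M + a*cos(2*pi*t/365) + b*sin(2*pi*t/365) by linear
    least squares; return the annual mean M, amplitude A, and phase phi."""
    t = 2.0 * np.pi * np.asarray(day_of_year, float) / 365.0
    X = np.column_stack([np.ones_like(t), np.cos(t), np.sin(t)])
    (M, a, b), *_ = np.linalg.lstsq(X, np.asarray(temperature, float), rcond=None)
    return M, float(np.hypot(a, b)), float(np.arctan2(b, a))
```

The fraction of annual variance explained by the fitted harmonic is the quantity the report uses to judge whether sparse periodic measurements suffice.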
NASA Technical Reports Server (NTRS)
Hall, William A.
1990-01-01
Slave microprocessors in a multimicroprocessor computing system contain modified circuit cards programmed via a bus connecting the master processor with the slave microprocessors. Enables interactive, microprocessor-based, single-loop control. Confers ability to load and run programs from the master/slave bus, without need for a microprocessor development station. Tristate buffers latch all data and status information. Slave central processing unit is never connected directly to the bus.
Verification of the WFAS Lightning Efficiency Map
Paul Sopko; Don Latham; Isaac Grenfell
2007-01-01
A Lightning Ignition Efficiency map was added to the suite of daily maps offered by the Wildland Fire Assessment System (WFAS) in 1999. This map computes a lightning probability of ignition (POI) based on the estimated fuel type, fuel depth, and 100-hour fuel moisture interpolated from the Remote Automated Weather Station (RAWS) network. An attempt to verify the...
OVERVIEW OF HELIX HOUSE NO. 2 (S-87), WITH ANTENNA TOWERS, HELIX HOUSE NO. 1 (S-3) AND TRANSMITTER BLDG. (S-2) AT REAR, LOOKING WEST SOUTHWEST. - Naval Computer & Telecommunications Area Master Station, Eastern Pacific, Radio Transmitter Facility Lualualei, Helix House No. 2, Base of Radio Antenna Structure No. 427, Makaha, Honolulu County, HI
NASA Astrophysics Data System (ADS)
Gherghel-Lascu, A.; Apel, W. D.; Arteaga-Velázquez, J. C.; Bekk, K.; Bertania, M.; Blümer, J.; Bozdog, H.; Brancus, I. M.; Cantoni, E.; Chiavassa, A.; Cossavella, F.; Daumiller, K.; de Souza, V.; Di Pierro, F.; Doll, P.; Engel, R.; Fuhrmann, D.; Gils, H. J.; Glasstetter, R.; Grupen, C.; Haungs, A.; Heck, D.; Hörandel, J. R.; Huber, D.; Huege, T.; Kampert, K.-H.; Kang, D.; Klages, H. O.; Link, K.; Łuczak, P.; Mathes, H. J.; Mayer, H. J.; Milke, J.; Mitrica, B.; Morello, C.; Oehlschläger, J.; Ostapchenko, S.; Palmieri, N.; Pierog, T.; Rebel, H.; Roth, M.; Schieler, H.; Schoo, S.; Schröder, F. G.; Sima, O.; Toma, G.; Trinchero, G. C.; Ulrich, H.; Weindl, A.; Wochele, J.; Zabierowski, J.
2017-06-01
The charged particle densities obtained from CORSIKA-simulated EAS using the QGSJet-II.04 hadronic interaction model are used for primary energy reconstruction. Simulated data are reconstructed using Lateral Energy Correction Functions computed with a new realistic model of the Grande stations implemented in Geant4.10.
A Medical Decision Support System for the Space Station Health Maintenance Facility
Ostler, David V.; Gardner, Reed M.; Logan, James S.
1988-01-01
NASA is developing a Health Maintenance Facility (HMF) to provide the equipment and supplies necessary to deliver medical care in the Space Station. An essential part of the Health Maintenance Facility is a computerized Medical Decision Support System (MDSS) that will enhance the ability of the medical officer (“paramedic” or “physician”) to maintain the crew's health, and to provide emergency medical care. The computer system has four major functions: 1) collect and integrate medical information into an electronic medical record from Space Station medical officers, HMF instrumentation, and exercise equipment; 2) provide an integrated medical record and medical reference information management system; 3) manage inventory for logistical support of supplies and secure pharmaceuticals; 4) supply audio and electronic mail communications between the medical officer and ground based flight surgeons.
Simulated annealing in orbital flight planning
NASA Technical Reports Server (NTRS)
Soller, Jeffrey
1990-01-01
Simulated annealing is used to solve a minimum-fuel trajectory problem in the space station environment. The environment is unique because the space station will define the first true multivehicle environment in space. The optimization yields surfaces which are potentially complex, with multiple local minima. Because of the likelihood of these local minima, descent techniques are unable to offer robust solutions. Other deterministic optimization techniques were explored without success. The simulated annealing optimization is capable of identifying a minimum-fuel, two-burn trajectory subject to four constraints. Furthermore, the computational effort involved in the optimization is such that missions could be planned on board the space station. Potential applications could include the on-site planning of rendezvous with a target craft or the emergency rescue of an astronaut. Future research will include multiwaypoint maneuvers, using a knowledge base to guide the optimization.
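The appeal of annealing here is its acceptance of occasional uphill moves, which lets the search escape the local minima that defeat descent methods. A minimal generic sketch of the loop (the trajectory cost function and neighbor move are problem-specific and only assumed here):

```python
import math
import random

def simulated_annealing(cost, neighbor, x0, t0=1.0, cooling=0.995, steps=20000):
    """Generic simulated annealing: accept a worse candidate with
    probability exp(-delta/T), cooling T geometrically, so the search
    can climb out of local minima early and settle late."""
    x, fx = x0, cost(x0)
    best, fbest = x, fx
    T = t0
    for _ in range(steps):
        y = neighbor(x)                     # propose a nearby candidate
        fy = cost(y)
        if fy < fx or random.random() < math.exp(-(fy - fx) / T):
            x, fx = y, fy                   # accept the move
            if fx < fbest:
                best, fbest = x, fx
        T *= cooling                        # lower the temperature
    return best, fbest
```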
NASA Astrophysics Data System (ADS)
Ranzi, Roberto; Goatelli, Federica; Castioni, Camilla; Tomirotti, Massimo; Crespi, Alice; Mattea, Enrico; Brunetti, Michele; Maugeri, Maurizio
2017-04-01
A new time series of daily runoff reconstructed at the inflow to Lake Como in the Italian Alps is presented. The time series covers a 170-year period and includes the two largest floods ever recorded for the region, in 1868 and 1987. Statistics of annual maxima show a decrease that is not statistically significant, whereas annual runoff shows a statistically significant decrease. To investigate the possible reasons for these changes, monthly temperature and precipitation are analysed. The decrease in runoff peaks can be explained by the increase in reservoir storage volumes. Evapotranspiration indexes based on monthly temperature indicate an increase in evapotranspiration losses as a possible cause of the runoff decrease. Secular precipitation series for the Adda basin are then computed by a methodology projecting observational data onto a high-resolution grid (30-arc-second, DEM GTOPO30). It is based on the assumption that the spatio-temporal behaviour of a meteorological variable over a given area can be described by superimposing two fields: the climatological normals over a reference period, i.e. the climatologies, and the departures from them, i.e. the anomalies. The two fields can be reconstructed independently and are based on different datasets. To compute the precipitation climatologies, all the available stations within the Adda basin are considered, while for the anomalies only the longest and most homogeneous records are selected. To this aim, a great effort was made to extend these series as far into the past as possible, including digitising the historical records available from hardcopy archives. The climatological values at each DEM cell of the Adda basin are obtained by a local weighted linear regression (LWLR) of precipitation versus elevation, taking into account the closest stations with geographical characteristics similar to those of the cell itself. The anomaly field is obtained by a weighted average of the anomalies of neighbouring stations, considering both the distance and the elevation differences between the stations and the considered cell. Finally, the secular precipitation records at each DEM cell of the Adda basin are computed by multiplying the local estimated anomalies by the corresponding climatological values. A statistically significant decreasing trend in precipitation results from the Mann-Kendall and Theil-Sen tests.
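For reference, the Mann-Kendall test used in the trend assessment is nonparametric, built on the signs of all pairwise differences in the series; a minimal sketch (no tie correction, illustrative only) might look like:

```python
import numpy as np
from scipy.stats import norm

def mann_kendall(x):
    """Mann-Kendall trend test: return the S statistic and a two-sided
    p-value from the normal approximation (ties ignored for brevity)."""
    x = np.asarray(x, float)
    n = len(x)
    s = sum(np.sign(x[j] - x[i])
            for i in range(n - 1) for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    z = (s - np.sign(s)) / np.sqrt(var_s) if s != 0 else 0.0
    return s, 2.0 * (1.0 - norm.cdf(abs(z)))
```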
NASA Technical Reports Server (NTRS)
1982-01-01
The ventilation and fire safety requirements for subway tunnels with dipped profiles between stations as compared to subway tunnels with level profiles were evaluated. This evaluation is based upon computer simulations of a train fire emergency condition. Each of the tunnel configurations evaluated was developed from characteristics that are representative of modern transit systems. The results of the study indicate that: (1) The level tunnel system required about 10% more station cooling than dipped tunnel systems in order to meet design requirements; and (2) The emergency ventilation requirements are greater with dipped tunnel systems than with level tunnel systems.
Lucani, Daniel; Cataldo, Giancarlos; Cruz, Julio; Villegas, Guillermo; Wong, Sara
2006-01-01
A prototype of a portable ECG-monitoring device has been developed for clinical and non-clinical environments as part of a telemedicine system to provide remote and continuous surveillance of patients. The device can acquire, store and/or transmit ECG signals to computer-based platforms or specially configured access points (AP) with Intranet/Internet capabilities in order to reach remote monitoring stations. Acquired data can be stored in a flash memory card in FAT16 format for later recovery, or transmitted via Bluetooth or USB to a local station or AP. This data acquisition module (DAM) operates in two modes: Holter and on-line transmission.
Assessment of a human computer interface prototyping environment
NASA Technical Reports Server (NTRS)
Moore, Loretta A.
1993-01-01
A Human Computer Interface (HCI) prototyping environment with embedded evaluation capability has been successfully assessed which will be valuable in developing and refining HCI standards and evaluating program/project interface development, especially Space Station Freedom on-board displays for payload operations. The HCI prototyping environment is designed to include four components: (1) a HCI format development tool, (2) a test and evaluation simulator development tool, (3) a dynamic, interactive interface between the HCI prototype and simulator, and (4) an embedded evaluation capability to evaluate the adequacy of an HCI based on a user's performance.
The Remote Analysis Station (RAS) as an instructional system
NASA Technical Reports Server (NTRS)
Rogers, R. H.; Wilson, C. L.; Dye, R. H.; Jaworski, E.
1981-01-01
"Hands-on" training in LANDSAT data analysis techniques can be obtained using a desk-top, interactive remote analysis station (RAS) which consists of a color CRT imagery display, with alphanumeric overwrite and keyboard, as well as a cursor controller and modem. This portable station can communicate via modem and dial-up telephone with a host computer at 1200 baud or it can be hardwired to a host computer at 9600 baud. A Z80 microcomputer controls the display refresh memory and remote station processing. LANDSAT data is displayed as three-band false-color imagery, one-band color-sliced imagery, or color-coded processed imagery. Although the display memory routinely operates at 256 x 256 picture elements, a display resolution of 128 x 128 can be selected to fill the display faster. In the false color mode the computer packs the data into one 8-bit character. When the host is not sending pictorial information the characters sent are in ordinary ASCII code. System capabilities are described.
Space station Simulation Computer System (SCS) study for NASA/MSFC. Volume 5: Study analysis report
NASA Technical Reports Server (NTRS)
1989-01-01
The Simulation Computer System (SCS) is the computer hardware, software, and workstations that will support the Payload Training Complex (PTC) at the Marshall Space Flight Center (MSFC). The PTC will train the space station payload scientists, station scientists, and ground controllers to operate the wide variety of experiments that will be on board the Freedom Space Station. The further analysis performed on the SCS study as part of task 2 (Perform Studies and Parametric Analysis) of the SCS study contract is summarized. These analyses were performed to resolve open issues remaining after the completion of task 1 and the publishing of the SCS study issues report. The results of these studies provide inputs into SCS task 3 (Develop and Present SCS Requirements) and SCS task 4 (Develop SCS Conceptual Designs). The purpose of these studies is to resolve the issues into usable requirements given the best available information at the time of the study. A list of all the SCS study issues is given.
Dezhurov works in the sleep station in the U.S. Laboratory during Expedition Three
2001-09-09
ISS003-E-5558 (9 September 2001) --- Cosmonaut Vladimir Dezhurov of Rosaviakosmos, Expedition 3 flight engineer, works on a laptop computer in the temporary sleep station in the U.S. Laboratory Destiny onboard the International Space Station.
Using computer graphics to design Space Station Freedom viewing
NASA Technical Reports Server (NTRS)
Goldsberry, Betty S.; Lippert, Buddy O.; Mckee, Sandra D.; Lewis, James L., Jr.; Mount, Francis E.
1993-01-01
Viewing requirements were identified early in the Space Station Freedom program for both direct viewing via windows and indirect viewing via cameras and closed-circuit television (CCTV). These requirements reside in NASA Program Definition and Requirements Document (PDRD), Section 3: Space Station Systems Requirements. Currently, analyses are addressing the feasibility of direct and indirect viewing. The goal of these analyses is to determine the optimum locations for the windows, cameras, and CCTV's in order to meet established requirements, to adequately support space station assembly, and to operate on-board equipment. PLAID, a three-dimensional computer graphics program developed at NASA JSC, was selected for use as the major tool in these analyses. PLAID provides the capability to simulate the assembly of the station as well as to examine operations as the station evolves. This program has been used successfully as a tool to analyze general viewing conditions for many Space Shuttle elements and can be used for virtually all Space Station components. Additionally, PLAID provides the ability to integrate an anthropometric scale-modeled human (representing a crew member) with interior and exterior architecture.
Free geometric adjustment of the SECOR Equatorial Network (Solution SECOR-27)
NASA Technical Reports Server (NTRS)
Mueller, I. I.; Kumar, M.; Soler, T.
1973-01-01
The basic purpose of this experiment is to compute reduced normal equations from the observational data of the SECOR Equatorial Network obtained from the DMA/Topographic Center, D/Geodesy, Geosciences Div., Washington, D.C. These reduced normal equations are to be combined with the reduced normal equations of other satellite networks of the National Geodetic Satellite Program to provide station coordinates from a single least-squares adjustment. An individual SECOR solution was also obtained and is presented in this report, using direction constraints computed from BC-4 optical data from stations collocated with SECOR stations. Due to the critical configuration present in the range observations, weighted height constraints were also applied in order to break the near-coplanarity of the observing stations.
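Adding weighted constraints to a set of normal equations amounts to augmenting the normal matrix and right-hand side before solving; a minimal sketch of that idea (matrix names are illustrative, not from the report):

```python
import numpy as np

def solve_with_weighted_constraints(N, b, C, d, w):
    """Solve normal equations N x = b with weighted linear constraints
    C x ~ d (weight w), as when height constraints are added to break a
    near-singular station geometry: the constraint contributions are
    folded into N and b, and the combined system is solved."""
    N_aug = N + w * (C.T @ C)
    b_aug = b + w * (C.T @ d)
    return np.linalg.solve(N_aug, b_aug)
```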
ERIC Educational Resources Information Center
Mitra, Sugata; Dangwal, Ritu
2017-01-01
This article describes a study under the Reaching the Unreached component of the Chiphen Rigpel project between the governments of Bhutan and India. This initiative is an attempt to provide computer literacy to children of Bhutan through setting up "hole in the wall" (HiWEL) Playground Learning Station(s) (PLSs). The study described here…
NASA Astrophysics Data System (ADS)
Gao, S. S.; Kong, F.; Wu, J.; Liu, L.; Liu, K. H.
2017-12-01
Seismic azimuthal anisotropy is measured at 83 stations situated at the southeastern margin of the Tibetan Plateau and adjacent regions based on shear-wave splitting analyses. A total of 1701 individual pairs of splitting parameters (fast polarization orientations and splitting delay times) are obtained using the PKS, SKKS, and SKS phases. The splitting parameters from 21 stations exhibit systematic back-azimuthal variations with a 90° periodicity, which is consistent with a two-layer anisotropy model. The resulting upper-layer splitting parameters, computed with a grid-search algorithm, are comparable with crustal anisotropy measurements obtained independently from the sinusoidal moveout of P-to-S conversions from the Moho. The fast orientations of the upper-layer anisotropy, which are mostly parallel to major shear zones, are associated with crustal fabrics with a vertical foliation plane. The lower-layer anisotropy and the station-averaged splitting parameters at stations with azimuthally invariant splitting parameters can be adequately explained by the differential movement between the lithosphere and the asthenosphere. The NW-SE fast orientations obtained in the northern part of the study area probably reflect the southeastward extruded mantle flow from central Tibet. In contrast, the NE-SW to E-W fast orientations observed in the southern part of the study area are most likely related to the northeastward to eastward mantle flow induced by the subduction of the Burma microplate.
Mullaney, John R.; Schwarz, Gregory E.
2013-01-01
The total nitrogen load to Long Island Sound from Connecticut and contributing areas to the north was estimated for October 1998 to September 2009. Discrete measurements of total nitrogen concentrations and continuous flow data from 37 water-quality monitoring stations in the Long Island Sound watershed were used to compute total annual nitrogen yields and loads. Total annual computed yields and basin characteristics were used to develop a generalized-least squares regression model for use in estimating the total nitrogen yields from unmonitored areas in coastal and central Connecticut. Significant variables in the regression included the percentage of developed land, percentage of row crops, point-source nitrogen yields from wastewater-treatment facilities, and annual mean streamflow. Computed annual median total nitrogen yields at individual monitoring stations ranged from less than 2,000 pounds per square mile in mostly forested basins (typically less than 10 percent developed land) to more than 13,000 pounds per square mile in urban basins (greater than 40 percent developed) with wastewater-treatment facilities and in one agricultural basin. Medians of computed total annual nitrogen yields for water years 1999–2009 at most stations were similar to those previously computed for water years 1988–98. However, computed medians of annual yields at several stations, including the Naugatuck River, Quinnipiac River, and Hockanum River, were lower than during 1988–98. Nitrogen yields estimated for 26 unmonitored areas downstream from monitoring stations ranged from less than 2,000 pounds per square mile to 34,000 pounds per square mile. Computed annual total nitrogen loads at the farthest downstream monitoring stations were combined with the corresponding estimates for the downstream unmonitored areas for a combined estimate of the total nitrogen load from the entire study area. Resulting combined total nitrogen loads ranged from 38 to 68 million pounds per year during water years 1999–2009. Total annual loads from the monitored basins represent 63 to 74 percent of the total load. Computed annual nitrogen loads from four stations near the Massachusetts border with Connecticut represent 52 to 54 percent of the total nitrogen load during water years 2008–9, the only years with data for all the border sites. During the latter part of the 1999–2009 study period, total nitrogen loads to Long Island Sound from the study area appeared to increase slightly. The apparent increase in loads may be due to higher than normal streamflows, which consequently increased nonpoint nitrogen loads during the study, offsetting major reductions of nitrogen from wastewater-treatment facilities. Nitrogen loads from wastewater treatment facilities declined as much as 2.3 million pounds per year in areas of Connecticut upstream from the monitoring stations and as much as 5.8 million pounds per year in unmonitored areas downstream in coastal and central Connecticut.
Bayesian analysis of stage-fall-discharge rating curves and their uncertainties
NASA Astrophysics Data System (ADS)
Mansanarez, V.; Le Coz, J.; Renard, B.; Lang, M.; Pierrefeu, G.; Vauchel, P.
2016-09-01
Stage-fall-discharge (SFD) rating curves are traditionally used to compute streamflow records at sites where the energy slope of the flow is variable due to variable backwater effects. We introduce a model with hydraulically interpretable parameters for estimating SFD rating curves and their uncertainties. Conventional power functions for channel and section controls are used. The transition to a backwater-affected channel control is computed based on a continuity condition, solved either analytically or numerically. The practical use of the method is demonstrated with two real twin-gauge stations, the Rhône River at Valence, France, and the Guthusbekken stream at station 0003.0033, Norway. Those stations are typical of a channel control and a section control, respectively, when backwater-unaffected conditions apply. The performance of the method is investigated through sensitivity analysis to prior information on controls and to observations (i.e., available gaugings) for the station at Valence. These analyses suggest that precisely identifying SFD rating curves requires an adapted gauging strategy and/or informative priors. The Madeira River, one of the largest tributaries of the Amazon, provides a challenging case typical of large, flat, tropical river networks where bed roughness can also be variable in addition to slope. In this case, the difference in staff gauge reference levels must be estimated as another uncertain parameter of the SFD model. The proposed Bayesian method is a valuable alternative to the graphical and empirical techniques still proposed in hydrometry guidance and standards.
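The building block is the conventional power-law control, Q = a (h − b)^c, scaled by the fall measured between the twin gauges; a toy sketch of that classical SFD form (not the paper's full Bayesian model; all coefficients are illustrative assumptions):

```python
def sfd_flow(stage, fall, ref_fall, a=20.0, b=0.3, c=1.6, p=0.5):
    """Classical stage-fall-discharge form: a power-law control
    Q = a*(h - b)**c, scaled by the measured fall between twin gauges
    as (fall/ref_fall)**p, with p near 0.5 under a friction-slope
    interpretation. Returns zero below the control activation stage b."""
    h = max(stage - b, 0.0)
    return a * h ** c * (fall / ref_fall) ** p
```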
Preliminary Observations of Noise Spectra at the SRO and ASRO Stations
Peterson, Jon
1980-01-01
Introduction: The seismic noise spectra presented in this report were derived from SRO and ASRO station data for the purpose of evaluating the performance of the seismic instruments. They are also useful for constructing a spectral estimate of earth noise at a quiet site based on noise samples obtained from a network of globally distributed sites. It is hoped that the spectra will be useful for other purposes as well. The term 'noise' is used here to describe the ambient signals recorded during a quiet period when earthquake signals have not been detected by visual inspection of the analog seismogram. The total recorded noise is the sum of instrumental noise, environmental noise (such as effects of temperature, pressure, and wind), earth background noise from both natural and cultural sources, and very possibly low-level signals from earthquakes that cannot be visually identified. It is not possible to separate and quantify the signals generated by these independent noise sources using a single sample of station data, although instrumental problems may be indicated by gross changes in noise levels, if the changes are not in the microseismic bands. Since seismic data at the SRO and ASRO stations are recorded in a digital format, spectral computations can be automated so that station noise levels can be monitored as part of data-review procedures. The noise spectra presented in this study are intended to serve as an initial baseline against which relative changes in noise levels can be measured. Total noise power was computed separately for the short- and long-period bands, which are recorded separately at the stations. Power spectral densities were derived by averaging the spectral estimates of a number of contiguous data segments. The mean value and slope were removed from each segment, cosine-tapered windows were applied, and the estimates were obtained using a fast Fourier transform. In the short-period analyses 16 segments were used, each segment being 1024 samples in length. Because the sampling interval is 0.05 seconds, the total record length is nearly 13.7 minutes. Normally, the short-period SRO and ASRO data are recorded in an event-only mode. However, several days of continuous short-period data were acquired from most stations for the purpose of this study. Where there was appreciable diurnal variation in short-period noise, spectral data were computed for both day and night intervals. In most cases the long-period spectral densities were obtained by averaging the estimates from 16 data segments, each segment having a length of 2048 samples. Since the long-period sampling interval is 1 second, the total record length used was nearly 9.1 hours. In a few instances, a smaller number of segments was averaged. Spectral data were computed from the vertical-component short-period signals and all three components of long-period signals. All of the spectral plots have been corrected for known instrument response and are presented in units of earth displacement. With a few exceptions, the samples of noise data used were acquired during the early months of 1980, winter at some of the stations and summer at others. The starting times for the intervals analyzed are listed in Table 1. A seasonal variation of noise levels in the microseismic bands is to be expected. However, none of the stations were experiencing a noticeably high level of microseisms during the intervals analyzed.
Weltman and others (1979) have studied and reported daily and seasonal RMS (root-mean-square) noise trends at the SRO and ASRO stations.
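The segment-averaging procedure described above (remove mean and slope, apply a cosine taper, FFT, average the periodograms) is straightforward to express in code; a minimal sketch, using a Hann window to stand in for the report's cosine taper and illustrative parameter defaults:

```python
import numpy as np

def segment_averaged_psd(x, fs, nseg=16, seglen=1024):
    """Average the periodograms of contiguous segments: remove the mean
    and slope from each, apply a cosine taper, FFT, then average.
    Returns one-sided frequencies and PSD in (input units)^2 per Hz."""
    n = np.arange(seglen)
    w = np.hanning(seglen)                      # cosine taper
    psds = []
    for i in range(nseg):
        seg = np.asarray(x[i * seglen:(i + 1) * seglen], dtype=float)
        seg = seg - np.polyval(np.polyfit(n, seg, 1), n)  # mean and slope removed
        spec = np.fft.rfft(seg * w)
        # scale by sampling rate and taper power for a one-sided PSD
        psds.append(2.0 * np.abs(spec) ** 2 / (fs * np.sum(w ** 2)))
    return np.fft.rfftfreq(seglen, d=1.0 / fs), np.mean(psds, axis=0)
```

With fs = 20 samples per second (0.05-second interval) and the default 16 segments of 1024 samples, the analyzed record length is about 13.7 minutes, matching the short-period case in the report.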
Statewide analysis of the drainage-area ratio method for 34 streamflow percentile ranges in Texas
Asquith, William H.; Roussel, Meghan C.; Vrabel, Joseph
2006-01-01
The drainage-area ratio method commonly is used to estimate streamflow for sites where no streamflow data are available using data from one or more nearby streamflow-gaging stations. The method is intuitive and straightforward to implement and is in widespread use by analysts and managers of surface-water resources. The method equates the ratio of streamflow at two stream locations to the ratio of the respective drainage areas. In practice, unity often is assumed as the exponent on the drainage-area ratio, and unity also is assumed as a multiplicative bias correction. These two assumptions are evaluated in this investigation through statewide analysis of daily mean streamflow in Texas. The investigation was made by the U.S. Geological Survey in cooperation with the Texas Commission on Environmental Quality. More than 7.8 million values of daily mean streamflow for 712 U.S. Geological Survey streamflow-gaging stations in Texas were analyzed. To account for the influence of streamflow probability on the drainage-area ratio method, 34 percentile ranges were considered. The 34 ranges are the 4 quartiles (0-25, 25-50, 50-75, and 75-100 percent), the 5 intervals of the lower tail of the streamflow distribution (0-1, 1-2, 2-3, 3-4, and 4-5 percent), the 20 quintiles of the 4 quartiles (0-5, 5-10, 10-15, 15-20, 20-25, 25-30, 30-35, 35-40, 40-45, 45-50, 50-55, 55-60, 60-65, 65-70, 70-75, 75-80, 80-85, 85-90, 90-95, and 95-100 percent), and the 5 intervals of the upper tail of the streamflow distribution (95-96, 96-97, 97-98, 98-99, and 99-100 percent). For each of the 253,116 (712 × 711/2) unique pairings of stations and for each of the 34 percentile ranges, the concurrent daily mean streamflow values available for the two stations provided for station-pair application of the drainage-area ratio method. For each station pair, specific statistical summaries (median, mean, and standard deviation) of both the exponent and bias-correction components of the drainage-area ratio method were computed. Statewide statistics (median, mean, and standard deviation) of the station-pair specific statistics subsequently were computed and are tabulated herein. A separate analysis considered conditioning station pairs to those stations within 100 miles of each other and with the absolute value of the logarithm (base-10) of the ratio of the drainage areas greater than or equal to 0.25. Statewide statistics of the conditional station-pair specific statistics were computed and are tabulated. The conditional analysis is preferable because of the anticipation that small separation distances reflect similar hydrologic conditions and the observation of large variation in exponent estimates for similar-sized drainage areas. The conditional analysis determined that the exponent is about 0.89 for streamflow percentiles from 0 to about 50 percent, is about 0.92 for percentiles from about 50 to about 65 percent, and is about 0.93 for percentiles from about 65 to about 85 percent. The exponent decreases rapidly to about 0.70 for percentiles nearing 100 percent. The computation of the bias-correction factor is sensitive to the range analysis interval (range of streamflow percentile); however, evidence suggests that in practice the drainage-area method can be considered unbiased. Finally, for general application, suggested values of the exponent are tabulated for 54 percentiles of daily mean streamflow in Texas; when these values are used, the bias correction is unity.
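The method itself reduces to a one-line formula; a minimal sketch (parameter defaults reflect the common unity assumptions the report evaluates):

```python
def drainage_area_ratio_flow(q_gaged, area_ungaged, area_gaged,
                             exponent=1.0, bias=1.0):
    """Estimate streamflow at an ungaged site from a gaged site:
    Q_u = bias * Q_g * (A_u / A_g)**exponent.
    The report's conditional analysis suggests exponents near
    0.89-0.93 for most percentile ranges, with bias taken as unity."""
    return bias * q_gaged * (area_ungaged / area_gaged) ** exponent
```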
A PDA-based flexible telecommunication system for telemedicine applications.
Nazeran, Homer; Setty, Sunil; Haltiwanger, Emily; Gonzalez, Virgilio
2004-01-01
Technology has been used to deliver health care at a distance for many years. Telemedicine is a rapidly growing area, and recent studies have been devoted to the prehospital care of patients in emergency cases. In this work we have developed a compact, reliable, and low-cost PDA-based telecommunication device for telemedicine applications to transmit audio, still images, and vital signs from a remote site to a fixed station, such as a clinic or a hospital, in real time. This was achieved based on a client-server architecture. A Pocket PC, a miniature camera, and a hands-free microphone were used at the client site, and a desktop computer running the Windows XP operating system was used as a server. The server was located at a fixed station. The system was implemented on the TCP/IP and HTTP protocols. Field tests have shown that the system can reliably transmit still images, audio, and sample vital signs from a simulated remote site to a fixed station, either via a wired or wireless network, in real time. The Pocket PC was used at the client site because of its compact size, low cost, and processing capabilities.
Thompson, Ronald E.; Hoffman, Scott A.
2006-01-01
A suite of 28 streamflow statistics, ranging from extreme low to high flows, was computed for 17 continuous-record streamflow-gaging stations and predicted for 20 partial-record stations in Monroe County and contiguous counties in northeastern Pennsylvania. The predicted statistics for the partial-record stations were based on regression analyses relating intermittent flow measurements made at the partial-record stations indexed to concurrent daily mean flows at continuous-record stations during base-flow conditions. The same statistics also were predicted for 134 ungaged stream locations in Monroe County on the basis of regression analyses relating the statistics to GIS-determined basin characteristics for the continuous-record station drainage areas. The prediction methodology for developing the regression equations used to estimate the statistics was originally developed for estimating low-flow frequencies. This study and a companion study found that the methodology also has application potential for predicting intermediate- and high-flow statistics. The statistics included mean monthly flows, mean annual flow, 7-day low flows for three recurrence intervals, nine flow durations, mean annual base flow, and annual mean base flows for two recurrence intervals. Low standard errors of prediction and high coefficients of determination (R2) indicated good results in using the regression equations to predict the statistics. Regression equations for the larger flow statistics tended to have lower standard errors of prediction and higher coefficients of determination (R2) than equations for the smaller flow statistics. The report discusses the methodologies used in determining the statistics and the limitations of the statistics and the equations used to predict them. Caution is indicated in using the predicted statistics for small drainage areas. Study results constitute input needed by water-resource managers in Monroe County for planning purposes and evaluation of water-resources availability.
The CGE-PLATO Electronic Laboratory Station Structure and Operation.
ERIC Educational Resources Information Center
Neal, J. P.
An electronic laboratory station was designed for student use in learning electronic instrumentation and measurement by means of the computer-guided experimentation (CGE) system. The station features rack-mounted electronic laboratory equipment on a laboratory table adjacent to a PLATO IV terminal. An integrated logic system behind the laboratory…
2011-12-29
ISS030-E-017789 (29 Dec. 2011) --- Working in chorus with the International Space Station team in Houston's Mission Control Center, this astronaut and his Expedition 30 crewmates on the station install a set of Enhanced Processor and Integrated Communications (EPIC) computer cards in one of seven primary computers onboard. The upgrade will allow more experiments to operate simultaneously, and prepare for the arrival of commercial cargo ships later this year.
Development and applications of nondestructive evaluation at Marshall Space Flight Center
NASA Technical Reports Server (NTRS)
Whitaker, Ann F.
1990-01-01
A brief description of facility design and equipment, facility usage, and typical investigations are presented for the following: Surface Inspection Facility; Advanced Computer Tomography Inspection Station (ACTIS); NDE Data Evaluation Facility; Thermographic Test Development Facility; Radiographic Test Facility; Realtime Radiographic Test Facility; Eddy Current Research Facility; Acoustic Emission Monitoring System; Advanced Ultrasonic Test Station (AUTS); Ultrasonic Test Facility; and Computer Controlled Scanning (CONSCAN) System.
Revision of IRIS/IDA Seismic Station Metadata
NASA Astrophysics Data System (ADS)
Xu, W.; Davis, P.; Auerbach, D.; Klimczak, E.
2017-12-01
Trustworthy data quality assurance has always been one of the goals of seismic network operators and data management centers. This task is considerably complex and evolving due to the huge quantities as well as the rapidly changing characteristics and complexities of seismic data. Published metadata usually reflect instrument response characteristics and their accuracies, which include the zero-frequency sensitivity of both the seismometer and the data logger, as well as other, frequency-dependent elements. In this work, we focus mainly on studying the variation with time of the seismometer sensitivity of IRIS/IDA seismic recording systems, with the goal of improving the metadata accuracy for the history of the network. There are several ways to measure the accuracy of seismometer sensitivity for seismic stations in service. An effective practice recently developed is to collocate a reference seismometer in proximity to verify the in-situ sensors' calibration. For those stations with a secondary broadband seismometer, IRIS' MUSTANG metric computation system introduced a transfer function metric to reflect the two sensors' gain ratios in the microseism frequency band. In addition, a simulation approach based on M2 tidal measurements has been proposed and proven to be effective. In this work, we compare and analyze the results from the three different methods, and conclude that the collocated-sensor method is the most stable and reliable, with the minimum uncertainties throughout. However, for epochs without both the collocated sensor and the secondary seismometer, we rely on the analysis results from the tide method. For the data since 1992 at IDA stations, we computed over 600 revised seismometer sensitivities for all the IRIS/IDA network calibration epochs. Further revision procedures should help to guarantee that the data are accurately reflected by the metadata of these stations.
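A gain-ratio comparison of this kind can be reduced to the ratio of spectral amplitudes of two collocated records in the microseism band; a minimal sketch in the spirit of such a transfer-function metric (band limits, function names, and the median estimator are illustrative assumptions, not MUSTANG's implementation):

```python
import numpy as np

def gain_ratio(x_primary, x_secondary, fs, fmin=0.1, fmax=0.3):
    """Estimate the relative gain of two collocated seismometers as the
    median spectral amplitude ratio in the microseism band (~0.1-0.3 Hz),
    where both sensors record essentially the same ground motion."""
    xp = np.asarray(x_primary, float)
    xs = np.asarray(x_secondary, float)
    X, Y = np.fft.rfft(xp), np.fft.rfft(xs)
    f = np.fft.rfftfreq(len(xp), d=1.0 / fs)
    band = (f >= fmin) & (f <= fmax)
    return float(np.median(np.abs(X[band]) / np.abs(Y[band])))
```

A ratio drifting away from unity over successive epochs would flag a sensitivity change of the kind the metadata revision targets.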
The computer-communication link for the innovative use of Space Station
NASA Technical Reports Server (NTRS)
Carroll, C. C.
1984-01-01
The potential capability of the computer-communications system link of the space station is related to innovative utilization for industrial applications. Conceptual computer network architectures are presented, and their respective accommodation of innovative industrial projects is discussed. A possible design goal is to achieve maximum system availability for industrialization, which would place the industrial community in an interactive mode with facilities in space. A worthy design goal would be to minimize the computer-communication management function and thereby optimize the system availability for industrial users. Quasi-autonomous modes and subnetworks are key design issues, since they would be the system elements directly affecting the system performance for industrial use.
Instructor/Operator Station Design Handbook for Aircrew Training Devices.
1987-10-01
...to only the necessary work areas and baffles it from the CRT; (f) use of a selective-spectrum lighting system, in which the spectral output of the... While the device provides some new features which support training, such as a debrief facility and a computer-based instructor training module, the...
NASA Technical Reports Server (NTRS)
Yanosy, James L.
1988-01-01
The Emulation/Simulation Computer Model (ESCM) computes the transient performance of a Space Station air revitalization subsystem with carbon dioxide removal provided by a solid amine, water-desorbed subsystem called SAWD. This manual describes the mathematical modeling and equations used in the ESCM. For the system as a whole and for each individual component, the fundamental physical and chemical laws that govern their operation are presented. Assumptions are stated and, when necessary, data are presented to support empirically developed relationships.
NASA Technical Reports Server (NTRS)
Stagl, T. W.; Singh, J. P.
1972-01-01
A description and listings of computer programs are presented for plotting geographical and political features of the world or a specified portion of it, for plotting spot-beam coverages from an earth-synchronous satellite over the computer-generated map, and for plotting polar perspective views of the earth and earth-station antenna elevation contours for a given satellite location. The programs were prepared in connection with a project on the Application of Communication Satellites to Educational Development.
NASA Astrophysics Data System (ADS)
Piana Agostinetti, N.; Amato, A.; Cattaneo, M.; de Gori, P.; di Bona, M.
In the framework of the Italian PNRA (Progetto Nazionale di Ricerche in Antartide), we have started to re-analyze teleseismic waveforms recorded with three-component seismometers (equipped with 5-second sensors, Lennartz 3D-5s) during five summer campaigns from 1993 to 2000. Seismic stations were deployed around the Terra Nova Bay (TNB) Italian base, from the sea to the interior of the Transantarctic Mountains (TAM), the most striking example of a non-contractional mountain belt. During the last campaign (1999-2000), seismic stations were deployed deep into Northern Victoria Land to reach the Rennick and Lillie Glacier areas and the George V Coast region, the northernmost part of the TAM. Our main goals were: to compute, using the frequency-domain deconvolution method of Di Bona [1998], receiver functions covering the whole area around the TNB Italian Antarctic base; to map the Moho depth and intracrustal S-wave velocity discontinuities from 1-D velocity models computed using Sambridge's inversion scheme [Sambridge, 1999]; to analyze new teleseismic waveforms recorded near the TNB base, where continuous recording from 1999 to the present permits more accurate modelling of the crustal S-velocity structure in this critical area situated at the edge of the hypothesized rift [Stern and ten Brink, 1989; Stump and Fitzgerald, 1992; ten Brink et al., 1997]; and to present final results from the BACKTAM expedition.
NASA Technical Reports Server (NTRS)
Babrauckas, Theresa
2000-01-01
The Affordable High Performance Computing (AHPC) project demonstrated that high-performance computing based on a distributed network of computer workstations is a cost-effective alternative to vector supercomputers for running CPU- and memory-intensive design and analysis tools. The AHPC project created an integrated system called a Network Supercomputer. By connecting computer workstations through a network and utilizing the workstations when they are idle, the resulting distributed-workstation environment has the same performance and reliability levels as the Cray C90 vector supercomputer at less than 25 percent of the C90 cost. In fact, the cost comparison between a Cray C90 supercomputer and Sun workstations showed that the number of distributed networked workstations equivalent to a C90 costs approximately 8 percent of the C90.
An application for multi-person task synchronization
NASA Technical Reports Server (NTRS)
Brown, Robert L.; Doyle, Dee
1990-01-01
Computer applications are studied that will enable a group of people to synchronize their actions when following a predefined task sequence. It is assumed that the people involved only have computer workstations available to them for communication. Hence, the approach is to study how the computer can be used to help a group remain synchronized. A series of applications were designed and developed that can be used as vehicles for experimentation. An example of how this technique can be used for a remote coaching capability is explained in a report describing an experiment that simulated a Life Sciences experiment on-board Space Station Freedom, with a ground based principal investigator providing the expertise by coaching the on-orbit mission specialist.
Geolocation of LTE Subscriber Stations Based on the Timing Advance Ranging Parameter
2010-12-01
provides the maximum achievable data rates. The specifications for LTE include FDD and TDD in all of its descriptions since there is little to no... parameters used during LTE network entry are examined as they relate to calculating these distances. Computer simulation is used to demonstrate...
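The snippet above elides the ranging arithmetic, but the relation it rests on is standard in LTE: the Timing Advance (TA) command is issued in units of 16 Ts, where Ts = 1/(15000 * 2048) s is the LTE basic time unit, and it compensates round-trip delay, so one TA unit corresponds to roughly 78 m of one-way range. A minimal sketch under those standard 3GPP assumptions (the report's own simulation code is not reproduced here):

    # Hedged sketch: range implied by an LTE Timing Advance (TA) index.
    C = 299_792_458.0              # speed of light, m/s
    TS = 1.0 / (15_000 * 2_048)    # LTE basic time unit, s
    TA_STEP = 16 * TS              # one TA unit = 16*Ts of round-trip delay

    def ta_to_range_m(ta_index: int) -> float:
        """One-way base-station-to-handset distance implied by a TA index (metres)."""
        return ta_index * TA_STEP * C / 2.0  # halve the round-trip distance

    print(ta_to_range_m(1))    # ~78.1 m: the TA quantization limit on geolocation
    print(ta_to_range_m(100))  # ~7.8 km range ring around the base station

Intersecting such range rings from several base stations is what turns TA readings into a position estimate.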
Computer networks for remote laboratories in physics and engineering
NASA Technical Reports Server (NTRS)
Starks, Scott; Elizandro, David; Leiner, Barry M.; Wiskerchen, Michael
1988-01-01
This paper addresses a relatively new approach to scientific research, telescience, which is the conduct of scientific operations in locations remote from the site of central experimental activity. A testbed based on the concepts of telescience is being developed to ultimately enable scientific researchers on earth to conduct experiments onboard the Space Station. This system, along with background materials, is discussed.
NASA Technical Reports Server (NTRS)
Schneider, Michelle
2003-01-01
This viewgraph presentation provides an overview of the Telescience Resource Kit, a PC-based telemetry and command system that will be used by scientists and engineers to monitor and control experiments located on board the International Space Station (ISS). Topics covered include: ISS payload telemetry and command flow, kit computer applications, kit telemetry capabilities, command capabilities, and training/testing capabilities.
Changes in the Arctic: Background and Issues for Congress
2014-04-28
knowledge of the physical environment. Data must be obtained by a suite of remote sensors (satellites, radars), autonomous sensors (data buoys... unmanned vehicles), and manned sensors (shipboard, coastal observing stations). Computer-based ocean and atmospheric models must be adjusted to the... soot). 6. Implementation: In carrying out this policy as it relates to environmental protection and conservation of natural resources, the
Urban Crowns: crown analysis software to assist in quantifying urban tree benefits
Matthew F. Winn; Sang-Mook Lee Bradley; Philip A. Araman
2010-01-01
UrbanCrowns is a Microsoft® Windows®-based computer program developed by the U.S. Forest Service Southern Research Station. The software assists urban forestry professionals, arborists, and community volunteers in assessing and monitoring the crown characteristics of urban trees (both deciduous and coniferous) using a single side-view digital photograph. Program output...
2011-01-01
Background Evidence about a possible causal relationship between non-specific physical symptoms (NSPS) and exposure to electromagnetic fields (EMF) emitted by sources such as mobile phone base stations (BS) and powerlines is insufficient. So far, little epidemiological research has been published on the contribution of psychological components to the occurrence of EMF-related NSPS. The primary objective of the current study is to explore the relative importance of actual and perceived proximity to base stations and of psychological components as determinants of NSPS, adjusting for demographic, residency, and area characteristics. Methods Analysis was performed on data obtained in a cross-sectional study on environment and health in 2006 in the Netherlands. In the current study, 3611 adult respondents (response rate: 37%) in twenty-two Dutch residential areas completed a questionnaire. Self-reported instruments included a symptom checklist and assessments of environmental and psychological characteristics. The computation of the distance between household addresses and the locations of base stations and powerlines was based on geo-coding. Multilevel regression models were used to test the hypotheses regarding the determinants related to the occurrence of NSPS. Results After adjustment for demographic and residential characteristics, the analyses yielded a number of statistically significant associations: increased report of NSPS was predominantly predicted by higher levels of self-reported environmental sensitivity; perceived proximity to base stations and powerlines, lower perceived control, and increased avoidance (coping) behavior were also associated with NSPS. A trend towards a moderator effect of perceived environmental sensitivity on the relation between perceived proximity to BS and NSPS was observed (p = 0.055). There was no significant association between symptom occurrence and actual distance to BS or powerlines. Conclusions Perceived proximity to BS, psychological components, and socio-demographic characteristics are associated with the report of symptomatology. Actual distance to the EMF source did not emerge as a determinant of NSPS. PMID:21631930
ERIC Educational Resources Information Center
Eastman, Susan T.
1984-01-01
Argues that the telecommunications field has specific computer applications; therefore courses on how to use computer programs for audience analysis, station accounting, newswriting, etc., should be included in the telecommunications curriculum. (PD)
Temporal and spatial deviation in F2 peak parameters derived from FORMOSAT-3/COSMIC
NASA Astrophysics Data System (ADS)
Kumar, Sanjay; Singh, R. P.; Tan, Eng Leong; Singh, A. K.; Ghodpage, R. N.; Siingh, Devendraa
2016-06-01
The plasma frequency profiles derived from the Constellation Observing System for Meteorology, Ionosphere and Climate (COSMIC) radio occultation measurements are compared with ground-based ionosonde data during the year 2013. Five equatorial and midlatitude stations located in the Northern and Southern Hemispheres are considered: Jicamarca, Jeju, Darwin, Learmonth, and Juliusruh. The aim is to validate the COSMIC-derived data against ground-based measurements and to estimate the difference in plasma frequency (which represents electron density) and the height of the F2 layer peak during daytime/nighttime and during different seasons by comparing the two data sets. The analysis showed that the nighttime data are better correlated than the daytime data, and that the maximum difference occurs at the equatorial ionospheric anomaly (EIA) station as compared to lower- and midlatitude stations during the equinox months. The difference between daytime and nighttime correlations becomes insignificant at midlatitude stations. The statistical analysis of the computed errors in foF2 (hmF2) showed a Gaussian character, with a most probable error range of ±15% (±10%) at the equatorial and EIA stations and ±9% (±7%) outside the EIA region, which reduced to ±8% (±6%) at midlatitude stations. The reduction in error at midlatitudes is attributed to the decrease in latitudinal electron density gradients. Comparing the analyzed data during three geomagnetic storms and quiet days of the same months, it is observed that the differences are significantly enhanced during storm periods and that the magnitude of the difference in foF2 increases with the intensity of the geomagnetic storm.
Simulating Future GPS Clock Scenarios with Two Composite Clock Algorithms
NASA Technical Reports Server (NTRS)
Suess, Matthias; Matsakis, Demetrios; Greenhall, Charles A.
2010-01-01
Using the GPS Toolkit, the GPS constellation is simulated using 31 satellites (SV) and a ground network of 17 monitor stations (MS). At every 15-minute measurement epoch, the monitor stations measure the time signals of all satellites above a parameterized elevation angle. Once a day, the station and satellite clocks are estimated from these measurements. The first composite clock (B) is based on the Brown algorithm and is now used by GPS. The second one (G) is based on the Greenhall algorithm. The performance of the B and G composite clocks is investigated using three ground-clock models. Model C simulates the current GPS configuration, in which all stations are equipped with cesium clocks, except for masers at the USNO and Alternate Master Clock (AMC) sites. Model M is an improved situation in which every station is equipped with active hydrogen masers. Finally, Models F and O are future scenarios in which the USNO and AMC stations are equipped with fountain clocks instead of masers: Model F uses a rubidium fountain, while Model O uses a more precise but futuristic optical fountain. Each model is evaluated using three performance metrics. The first performance index (PI1) is the timing-related user range error with all satellites available. The second performance index (PI2) relates to the stability of the broadcast GPS system time itself. The third performance index (PI3) evaluates the stability of the time scales computed by the two composite clocks. A distinction is made between the "Signal-in-Space" accuracy and that available through a GNSS receiver.
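Clock and time-scale stability of the kind measured by PI2 and PI3 is conventionally quantified with the Allan deviation; the paper's exact metric definitions are not given above, so the following is only the standard overlapping form, computed from clock phase (time error) samples at the 15-minute epochs mentioned in the abstract:

    # Hedged sketch: overlapping Allan deviation from phase data x at spacing tau0.
    import numpy as np

    def allan_deviation(x: np.ndarray, tau0: float, m: int) -> float:
        """Overlapping Allan deviation at averaging time m*tau0."""
        x = np.asarray(x, dtype=float)
        n = x.size
        if n < 2 * m + 1:
            raise ValueError("need at least 2*m + 1 phase samples")
        d2 = x[2 * m:] - 2.0 * x[m:n - m] + x[:n - 2 * m]   # second differences
        avar = np.sum(d2 ** 2) / (2.0 * (n - 2 * m) * (m * tau0) ** 2)
        return np.sqrt(avar)

    # White-FM-like clock noise should give ADEV falling roughly as tau**-0.5.
    rng = np.random.default_rng(0)
    x = np.cumsum(rng.normal(scale=1e-11, size=100_000))    # integrated freq noise
    for m in (1, 10, 100):
        print(m, allan_deviation(x, tau0=900.0, m=m))       # 15-min epochs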
Shope, William G.; ,
1987-01-01
The US Geological Survey is utilizing a national network of more than 1000 satellite data-collection stations, four satellite-relay direct-readout ground stations, and more than 50 computers linked together in a private telecommunications network to acquire, process, and distribute hydrological data in near real-time. The four Survey offices operating a satellite direct-readout ground station provide near real-time hydrological data to computers located in other Survey offices through the Survey's Distributed Information System. The computerized distribution system permits automated data processing and distribution to be carried out in a timely manner under the control and operation of the Survey office responsible for the data-collection stations and for the dissemination of hydrological information to the water-data users.
NASA Astrophysics Data System (ADS)
McGuire, P. C.; Gross, C.; Wendt, L.; Bonnici, A.; Souza-Egipsy, V.; Ormö, J.; Díaz-Martínez, E.; Foing, B. H.; Bose, R.; Walter, S.; Oesker, M.; Ontrup, J.; Haschke, R.; Ritter, H.
2010-01-01
In previous work, a platform was developed for testing computer-vision algorithms for robotic planetary exploration. This platform consisted of a digital video camera connected to a wearable computer for real-time processing of images at geological and astrobiological field sites. The real-time processing included image segmentation and the generation of interest points based upon uncommonness in the segmentation maps. Also in previous work, this platform for testing computer-vision algorithms has been ported to a more ergonomic alternative platform, consisting of a phone camera connected via the Global System for Mobile Communications (GSM) network to a remote-server computer. The wearable-computer platform has been tested at geological and astrobiological field sites in Spain (Rivas Vaciamadrid and Riba de Santiuste), and the phone camera has been tested at a geological field site in Malta. In this work, we (i) apply a Hopfield neural-network algorithm for novelty detection based upon colour, (ii) integrate a field-capable digital microscope on the wearable computer platform, (iii) test this novelty detection with the digital microscope at Rivas Vaciamadrid, (iv) develop a Bluetooth communication mode for the phone-camera platform, in order to allow access to a mobile processing computer at the field sites, and (v) test the novelty detection on the Bluetooth-enabled phone camera connected to a netbook computer at the Mars Desert Research Station in Utah. This systems engineering and field testing have together allowed us to develop a real-time computer-vision system that is capable, for example, of identifying lichens as novel within a series of images acquired in semi-arid desert environments. We acquired sequences of images of geologic outcrops in Utah and Spain consisting of various rock types and colours to test this algorithm. The algorithm robustly recognized previously observed units by their colour, while requiring only a single image or a few images to learn colours as familiar, demonstrating its fast learning capability.
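The abstract names a Hopfield neural network for colour-based novelty detection but gives no internals, so the sketch below substitutes a deliberately simple stand-in: a colour-histogram memory in which bins populated by training images count as familiar and pixels in unseen bins are flagged as novel. It reproduces the fast, few-image learning behaviour described, not the authors' algorithm:

    # Hedged sketch: colour novelty detection via a coarse histogram memory.
    import numpy as np

    BINS = 16  # per channel; coarse bins allow learning from one or a few images

    def _bin_indices(img: np.ndarray) -> np.ndarray:
        """Map an (H, W, 3) uint8 image to flat histogram bin indices."""
        q = img.astype(np.uint16) * BINS // 256            # quantize each channel
        return (q[..., 0] * BINS + q[..., 1]) * BINS + q[..., 2]

    class ColourNoveltyDetector:
        def __init__(self):
            self.familiar = np.zeros(BINS ** 3, dtype=bool)

        def learn(self, img: np.ndarray) -> None:
            """Mark every colour present in img as familiar (single-image learning)."""
            self.familiar[np.unique(_bin_indices(img))] = True

        def novelty_mask(self, img: np.ndarray) -> np.ndarray:
            """Boolean mask of pixels whose colour bin was never seen in training."""
            return ~self.familiar[_bin_indices(img)]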
Optimization of Close Range Photogrammetry Network Design Applying Fuzzy Computation
NASA Astrophysics Data System (ADS)
Aminia, A. S.
2017-09-01
Measuring object 3D coordinates with optimum accuracy is one of the most important issues in close range photogrammetry. In this context, network design plays an important role in determining the optimum positions of imaging stations. This is, however, not a trivial task due to various geometric and radiometric constraints affecting the quality of the measurement network. As a result, most camera stations in the network are defined on a trial-and-error basis drawing on the user's experience and generic network concepts. In this paper, we propose a post-processing task that investigates the quality of camera positions right after image capture to achieve the best result. To do this, a new fuzzy reasoning approach is adopted, in which the constraints affecting the network design are all modeled. As a result, the position of each camera location is assessed based on fuzzy rules and inappropriate stations are identified. The experiments carried out show that after determination and elimination of the inappropriate images using the proposed fuzzy reasoning system, the accuracy of measurements is improved by about 17% for the resulting network.
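The paper's constraints and rule base are not spelled out in the abstract; as a toy illustration of the fuzzy-reasoning step, two hypothetical inputs (ray intersection angle and image-scale consistency) are fuzzified with triangular memberships and combined with a min rule to score a station:

    # Hedged sketch: scoring a camera station with a toy fuzzy inference step.
    def tri(x, a, b, c):
        """Triangular membership function peaking at b, zero outside [a, c]."""
        if x <= a or x >= c:
            return 0.0
        return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

    def station_quality(angle_deg: float, scale_ratio: float) -> float:
        """Degree (0..1) to which a camera station is 'appropriate'."""
        good_angle = tri(angle_deg, 30.0, 90.0, 150.0)   # near-orthogonal rays
        good_scale = tri(scale_ratio, 0.5, 1.0, 2.0)     # consistent image scale
        # Rule: a station is appropriate if the angle AND the scale are good.
        return min(good_angle, good_scale)

    # Stations scoring below a cutoff would be flagged for elimination:
    print(station_quality(85.0, 1.1))   # close to 1: keep
    print(station_quality(15.0, 1.0))   # 0: an inappropriate station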
Application of time-variable process noise in terrestrial reference frames determined from VLBI data
NASA Astrophysics Data System (ADS)
Soja, Benedikt; Gross, Richard S.; Abbondanza, Claudio; Chin, Toshio M.; Heflin, Michael B.; Parker, Jay W.; Wu, Xiaoping; Balidakis, Kyriakos; Nilsson, Tobias; Glaser, Susanne; Karbon, Maria; Heinkelmann, Robert; Schuh, Harald
2018-05-01
In recent years, Kalman filtering has emerged as a suitable technique to determine terrestrial reference frames (TRFs), a prime example being JTRF2014. The time series approach allows variations of station coordinates that are neither reduced by observational corrections nor considered in the functional model to be taken into account. These variations are primarily due to non-tidal geophysical loading effects that are not reduced according to the current IERS Conventions (2010). It is standard practice that the process noise models applied in Kalman filter TRF solutions are derived from time series of loading displacements and account for station dependent differences. So far, it has been assumed that the parameters of these process noise models are constant over time. However, due to the presence of seasonal and irregular variations, this assumption does not truly reflect reality. In this study, we derive a station coordinate process noise model allowing for such temporal variations. This process noise model and one that is a parameterized version of the former are applied in the computation of TRF solutions based on very long baseline interferometry data. In comparison with a solution based on a constant process noise model, we find that the station coordinates are affected at the millimeter level.
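Since the abstract describes replacing a constant process-noise variance with a time-varying one, a toy version of the idea is easy to state. A minimal sketch for one station coordinate, assuming a random-walk state model and an illustrative annual modulation of the process-noise variance (the study's actual noise model, derived from loading displacement time series, is not reproduced):

    # Hedged sketch: scalar random-walk Kalman filter, time-varying process noise.
    import numpy as np

    def kalman_track(obs, obs_sigma, q_floor, q_seasonal, t_yr):
        """Filter noisy coordinate observations (mm).

        obs        : observed coordinate offsets, one per epoch
        obs_sigma  : observation standard deviation
        q_floor    : baseline process-noise variance per epoch
        q_seasonal : amplitude of the assumed annual variance modulation
        t_yr       : epoch times in decimal years
        """
        x, p = obs[0], obs_sigma ** 2
        r = obs_sigma ** 2
        out = []
        for z, t in zip(obs, t_yr):
            q = q_floor + q_seasonal * (1.0 + np.sin(2.0 * np.pi * t)) / 2.0
            p = p + q                      # predict: random walk inflates variance
            k = p / (p + r)                # Kalman gain
            x = x + k * (z - x)            # update with the new observation
            p = (1.0 - k) * p
            out.append(x)
        return np.array(out)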
The proposed monitoring system for the Fermilab D0 colliding beams detector
NASA Astrophysics Data System (ADS)
Goodwin, Robert; Florian, Robert; Johnson, Marvin; Jones, Alan; Shea, Mike
1986-06-01
The Fermilab D0 Detector is a collaborative effort that includes seventeen universities and national laboratories. The monitoring and control system for this detector will be separate from the online detector data system. A distributed, stand-alone, microprocessor-based system is being designed to allow monitoring and control functions to be available to the collaborators at their home institutions during the design, fabrication, and testing phases of the project. Individual stations are VMEbus-based 68000 systems that are networked together during installation using an ARCnet (by Datapoint Corporation) Local Area Network. One station, perhaps a MicroVAX, would have a hard disk to store a backup copy of the distributed database located in non-volatile RAM in the local stations. This station would also serve as a gateway to the online system, so that data from the control system will be available for logging with the detector data. Apple Macintosh personal computers are being adapted for use as the local control consoles. Each would be interfaced to ARCnet to provide access to all control system data. Through the use of bit-mapped graphics with multiple windows and pull-down menus, a cost-effective, flexible display system can be provided, taking advantage of familiar modern software tools to support the operator interface.
CBESW: sequence alignment on the Playstation 3.
Wirawan, Adrianto; Kwoh, Chee Keong; Hieu, Nim Tri; Schmidt, Bertil
2008-09-17
The exponential growth of available biological data has caused bioinformatics to be rapidly moving towards a data-intensive, computational science. As a result, the computational power needed by bioinformatics applications is growing exponentially as well. The recent emergence of accelerator technologies has made it possible to achieve an excellent improvement in execution time for many bioinformatics applications, compared to current general-purpose platforms. In this paper, we demonstrate how the PlayStation 3, powered by the Cell Broadband Engine, can be used as a computational platform to accelerate the Smith-Waterman algorithm. For large datasets, our implementation on the PlayStation 3 provides a significant improvement in running time compared to other implementations such as SSEARCH, Striped Smith-Waterman and CUDA. Our implementation achieves a peak performance of up to 3,646 MCUPS. The results from our experiments demonstrate that the PlayStation 3 console can be used as an efficient low cost computational platform for high performance sequence alignment applications.
CBESW: Sequence Alignment on the Playstation 3
Wirawan, Adrianto; Kwoh, Chee Keong; Hieu, Nim Tri; Schmidt, Bertil
2008-01-01
Background The exponential growth of available biological data has caused bioinformatics to be rapidly moving towards a data-intensive, computational science. As a result, the computational power needed by bioinformatics applications is growing exponentially as well. The recent emergence of accelerator technologies has made it possible to achieve an excellent improvement in execution time for many bioinformatics applications, compared to current general-purpose platforms. In this paper, we demonstrate how the PlayStation® 3, powered by the Cell Broadband Engine, can be used as a computational platform to accelerate the Smith-Waterman algorithm. Results For large datasets, our implementation on the PlayStation® 3 provides a significant improvement in running time compared to other implementations such as SSEARCH, Striped Smith-Waterman and CUDA. Our implementation achieves a peak performance of up to 3,646 MCUPS. Conclusion The results from our experiments demonstrate that the PlayStation® 3 console can be used as an efficient low cost computational platform for high performance sequence alignment applications. PMID:18798993
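The Smith-Waterman recurrence accelerated in the two records above is compact, and the quoted MCUPS figure (millions of matrix-cell updates per second) counts updates of exactly this dynamic-programming matrix. A plain reference version with a linear gap penalty; the Cell/SPE vectorization that is the paper's actual contribution is not shown:

    # Hedged sketch: textbook Smith-Waterman local alignment, linear gap penalty.
    def smith_waterman(a: str, b: str, match=2, mismatch=-1, gap=-2) -> int:
        """Return the best local alignment score of sequences a and b."""
        rows, cols = len(a) + 1, len(b) + 1
        h = [[0] * cols for _ in range(rows)]
        best = 0
        for i in range(1, rows):
            for j in range(1, cols):
                s = match if a[i - 1] == b[j - 1] else mismatch
                h[i][j] = max(0,                      # local alignment floor
                              h[i - 1][j - 1] + s,    # (mis)match
                              h[i - 1][j] + gap,      # gap in b
                              h[i][j - 1] + gap)      # gap in a
                best = max(best, h[i][j])
        return best

    # The matrix has len(a) * len(b) cells; MCUPS counts these updates per second.
    print(smith_waterman("ACACACTA", "AGCACACA"))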
Theoretical magnetograms based on quantitative simulation of a magnetospheric substorm
NASA Technical Reports Server (NTRS)
Chen, C.-K.; Wolf, R. A.; Karty, J. L.; Harel, M.
1982-01-01
Substorm currents derived from the Rice University computer simulation of the September 19, 1976 substorm event are used to compute theoretical magnetograms as a function of universal time for various stations, integrating the Biot-Savart law over a maze of about 2700 wires and bands that carry the ring, Birkeland and horizontal ionospheric currents. A comparison of theoretical results with corresponding observations leads to a claim of general agreement, especially for stations at high and middle magnetic latitudes. Model results suggest that the ground magnetic field perturbations arise from complicated combinations of different kinds of currents, and that magnetic field disturbances due to different but related currents cancel each other out despite the inapplicability of Fukushima's (1973) theorem. It is also found that the dawn-dusk asymmetry in the horizontal magnetic field disturbance component at low latitudes is due to a net downward Birkeland current at noon, a net upward current at midnight, and, generally, antisunward-flowing electrojets.
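Numerically, the magnetogram computation described (integrating the Biot-Savart law over ~2700 current-carrying wires and bands) reduces to summing segment contributions dB = (mu0/4pi) I dl x r / |r|^3. A minimal sketch of that summation; the model's actual wire geometry is of course not reproduced:

    # Hedged sketch: Biot-Savart field of a discretized current-carrying polyline.
    import numpy as np

    MU0_OVER_4PI = 1e-7  # T*m/A

    def biot_savart(points: np.ndarray, current: float, obs: np.ndarray) -> np.ndarray:
        """Magnetic field (tesla) at obs from current along the polyline `points`.

        points : (N, 3) vertices of the wire path, metres
        current: amperes, flowing from points[0] toward points[-1]
        obs    : (3,) observation point, metres
        """
        dl = points[1:] - points[:-1]                 # segment vectors
        mid = 0.5 * (points[1:] + points[:-1])        # segment midpoints
        r = obs - mid                                 # midpoint -> observer
        rnorm = np.linalg.norm(r, axis=1, keepdims=True)
        db = MU0_OVER_4PI * current * np.cross(dl, r) / rnorm ** 3
        return db.sum(axis=0)

    # Sanity check: a long straight wire on the z-axis gives B = mu0*I/(2*pi*d).
    z = np.linspace(-500.0, 500.0, 20_001)
    wire = np.column_stack([np.zeros_like(z), np.zeros_like(z), z])
    print(biot_savart(wire, 1.0, np.array([1.0, 0.0, 0.0])))  # ~2e-7 T, in +y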
Fiber optic configurations for local area networks
NASA Technical Reports Server (NTRS)
Nassehi, M. M.; Tobagi, F. A.; Marhic, M. E.
1985-01-01
A number of fiber optic configurations for a new class of demand assignment multiple-access local area networks requiring a physical ordering among stations are proposed. In such networks, the data transmission and linear-ordering functions may be distinguished and be provided by separate data and control subnetworks. The configurations proposed for the data subnetwork are based on the linear, star, and tree topologies. To provide the linear-ordering function, the control subnetwork must always have a linear unidirectional bus structure. Due to the reciprocity and excess loss of optical couplers, the number of stations that can be accommodated on a linear fiber optic bus is severely limited. Two techniques are proposed to overcome this limitation. For each of the data and control subnetwork configurations, the maximum number of stations as a function of the power margin, for both reciprocal and nonreciprocal couplers, is computed.
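A simplified version of the "maximum number of stations as a function of the power margin" computation can be written down directly. Assume, illustratively, that the worst-case path on a linear unidirectional bus enters through one tap, exits through another, and traverses N-2 coupler through-paths, with each coupler adding its excess loss; loss then grows linearly with N in decibels:

    # Hedged sketch: station count limit on a linear fiber-optic bus (toy model).
    import math

    def max_stations(margin_db: float, tap_db: float, through_db: float,
                     excess_db: float) -> int:
        """Largest N whose worst-case end-to-end loss fits within margin_db."""
        fixed = 2 * (tap_db + excess_db)              # enter and leave the bus
        per_station = through_db + excess_db          # each intermediate coupler
        if margin_db < fixed:
            return 0
        return 2 + math.floor((margin_db - fixed) / per_station)

    # e.g. 40 dB margin, 10 dB taps, 0.5 dB through loss, 1 dB excess per coupler:
    print(max_stations(40.0, 10.0, 0.5, 1.0))   # -> 14 stations

This linear growth of loss in dB with station count is the severe limitation noted in the abstract, and it motivates the star and tree alternatives proposed for the data subnetwork.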
Space Shuttle and Space Station Radio Frequency (RF) Exposure Analysis
NASA Technical Reports Server (NTRS)
Hwu, Shian U.; Loh, Yin-Chung; Sham, Catherine C.; Kroll, Quin D.
2005-01-01
This paper outlines the modeling techniques and important parameters to define a rigorous but practical procedure that can verify the compliance of RF exposure with the NASA standards for astronauts and electronic equipment. The electromagnetic modeling techniques are applied to analyze RF exposure in Space Shuttle and Space Station environments with reasonable computing time and resources. The modeling techniques are capable of taking into account the field interactions with Space Shuttle and Space Station structures. The obtained results illustrate the multipath effects due to the presence of the space vehicle structures. It is necessary to include the field interactions with the space vehicle in the analysis for an accurate assessment of the RF exposure. Based on the obtained results, RF keep-out zones are identified for appropriate operational scenarios, flight rules, and necessary RF transmitter constraints to ensure a safe operating environment and mission success.
High-Temperature RF Probe Station For Device Characterization Through 500 deg C and 50 GHz
NASA Technical Reports Server (NTRS)
Schwartz, Zachary D.; Downey, Alan N.; Alterovitz, Samuel A.; Ponchak, George E.; Williams, W. D. (Technical Monitor)
2003-01-01
A high-temperature measurement system capable of performing on-wafer microwave testing of semiconductor devices has been developed. This high temperature probe station can characterize active and passive devices and circuits at temperatures ranging from room temperature to above 500 C. The heating system uses a ceramic heater mounted on an insulating block of NASA shuttle tile material. The temperature is adjusted by a graphical computer interface and is controlled by the software-based feedback loop. The system is used with a Hewlett-Packard 8510C Network Analyzer to measure scattering parameters over a frequency range of 1 to 50 GHz. The microwave probes, cables, and inspection microscope are all shielded to protect from heat damage. The high temperature probe station has been successfully used to characterize gold transmission lines on silicon carbide at temperatures up to 540 C.
User manual of the CATSS system (version 1.0) communication analysis tool for space station
NASA Technical Reports Server (NTRS)
Tsang, C. S.; Su, Y. T.; Lindsey, W. C.
1983-01-01
The Communication Analysis Tool for the Space Station (CATSS) is a FORTRAN language software package capable of predicting the communications link performance of the Space Station (SS) communication and tracking (C & T) system. The interactive software package was developed to run on DEC/VAX computers. CATSS models and evaluates the various C & T links of the SS, including modulation schemes such as Binary Phase-Shift Keying (BPSK), BPSK with Direct-Sequence Spread Spectrum (PN/BPSK), and M-ary Frequency-Shift Keying with Frequency Hopping (FH/MFSK). An optical space communication link is also included. CATSS is a C & T system engineering tool used to predict and analyze system performance for different link environments. Identification of system weaknesses is achieved through evaluation of performance with varying system parameters. System tradeoffs for different values of system parameters are made based on the performance prediction.
Zhou, Wenliang; Yang, Xia; Deng, Lianbo
2014-01-01
Not only is the operating plan the basis for organizing a marshalling station's operation, but it is also used to analyze in detail the capacity utilization of each facility in the station. In this paper, a long-term operating plan is optimized mainly for capacity utilization analysis. Firstly, a model is developed to minimize railcars' average staying time subject to constraints such as minimum time intervals and marshalling track capacity. Secondly, an algorithm is designed to solve this model based on a genetic algorithm (GA) and a simulation method. It divides the plan for the whole planning horizon into many subplans and optimizes them with the GA one by one in order to obtain a satisfactory plan with less computing time. Finally, some numeric examples are constructed to analyze (1) the convergence of the algorithm, (2) the effect of some algorithm parameters, and (3) the influence of arriving train flow on the algorithm. PMID:25525614
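The GA-plus-simulation loop described has a generic shape that can be sketched; the real decision variables (sorting and assembly schedules) and the yard simulation are far richer, so the `simulate` below is only an illustrative stub returning a staying-time proxy for a permutation of railcar handling order:

    # Hedged sketch: GA skeleton of the kind paired with a yard simulation above.
    import random

    def simulate(order):                    # stand-in for the yard simulation
        return sum(i * x for i, x in enumerate(order))  # toy staying-time proxy

    def ga(n_cars=20, pop_size=30, generations=200, mut_p=0.2):
        pop = [random.sample(range(n_cars), n_cars) for _ in range(pop_size)]
        for _ in range(generations):
            pop.sort(key=simulate)                      # lower staying time first
            survivors = pop[: pop_size // 2]
            children = []
            while len(children) < pop_size - len(survivors):
                a, b = random.sample(survivors, 2)
                cut = random.randrange(1, n_cars)       # one-point order crossover
                child = a[:cut] + [g for g in b if g not in a[:cut]]
                if random.random() < mut_p:             # swap mutation
                    i, j = random.sample(range(n_cars), 2)
                    child[i], child[j] = child[j], child[i]
                children.append(child)
            pop = survivors + children
        return min(pop, key=simulate)

    print(simulate(ga()))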
On-Orbit Performance Degradation of the International Space Station P6 Photovoltaic Arrays
NASA Technical Reports Server (NTRS)
Kerslake, Thomas W.; Gustafson, Eric D.
2003-01-01
This paper discusses the on-orbit performance and performance degradation of the International Space Station P6 solar array wings (SAWs) from the period of December 2000 through February 2003. Data selection considerations and data reduction methods are reviewed along with the approach for calculating array performance degradation based on measured string shunt current levels. Measured degradation rates are compared with those predicted by the computational tool SPACE and prior degradation rates measured with the same SAW technology on the Mir space station. Initial results show that the measured SAW short-circuit current is degrading 0.2 to 0.5 percent per year. This degradation rate is below the predicted rate of 0.8 percent per year and is well within the 3 percent estimated uncertainty in measured SAW current levels. General contributors to SAW degradation are briefly discussed.
NetMOD version 1.0 user's manual
DOE Office of Scientific and Technical Information (OSTI.GOV)
Merchant, Bion John
2014-01-01
NetMOD (Network Monitoring for Optimal Detection) is a Java-based software package for conducting simulation of seismic networks. Specifically, NetMOD simulates the detection capabilities of seismic monitoring networks. Network simulations have long been used to study network resilience to station outages and to determine where additional stations are needed to reduce monitoring thresholds. NetMOD makes use of geophysical models to determine the source characteristics, signal attenuation along the path between the source and station, and the performance and noise properties of the station. These geophysical models are combined to simulate the relative amplitudes of signal and noise that are observed at each of the stations. From these signal-to-noise ratios (SNR), the probability of detection can be computed given a detection threshold. This manual describes how to configure and operate NetMOD to perform seismic detection simulations. In addition, NetMOD is distributed with a simulation dataset for the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO) International Monitoring System (IMS) seismic network for the purpose of demonstrating NetMOD's capabilities and providing user training. The tutorial sections of this manual use this dataset when describing how to perform the steps involved when running a simulation.
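The final step described (turning a simulated SNR and a threshold into a detection probability) is commonly modeled with a Gaussian error on the logarithm of the SNR. NetMOD's exact statistical model is not given above, so the following sketch uses only that common textbook form:

    # Hedged sketch: detection probability from SNR, assuming Gaussian log10(SNR) error.
    from math import erf, log10, sqrt

    def detection_probability(snr: float, snr_threshold: float,
                              sigma_log10: float = 0.25) -> float:
        """P(detect) for a station given its predicted SNR and detection threshold."""
        z = (log10(snr) - log10(snr_threshold)) / sigma_log10
        return 0.5 * (1.0 + erf(z / sqrt(2.0)))      # standard normal CDF

    print(detection_probability(4.0, 2.0))   # SNR twice the threshold: ~0.89
    print(detection_probability(2.0, 2.0))   # exactly at threshold: 0.5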
Energy Efficient In-network RFID Data Filtering Scheme in Wireless Sensor Networks
Bashir, Ali Kashif; Lim, Se-Jung; Hussain, Chauhdary Sajjad; Park, Myong-Soon
2011-01-01
RFID (Radio frequency identification) and wireless sensor networks are backbone technologies for pervasive environments. In integration of RFID and WSN, RFID data uses WSN protocols for multi-hop communications. Energy is a critical issue in WSNs; however, RFID data contains a lot of duplication. These duplications can be eliminated at the base station, but unnecessary transmissions of duplicate data within the network still occurs, which consumes nodes’ energy and affects network lifetime. In this paper, we propose an in-network RFID data filtering scheme that efficiently eliminates the duplicate data. For this we use a clustering mechanism where cluster heads eliminate duplicate data and forward filtered data towards the base station. Simulation results prove that our approach saves considerable amounts of energy in terms of communication and computational cost, compared to existing filtering schemes. PMID:22163999
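The clustering mechanism described, in which cluster heads eliminate duplicates before forwarding toward the base station, can be sketched as a simple time-windowed filter; the names and window policy below are illustrative, not the paper's protocol:

    # Hedged sketch: duplicate suppression at a cluster head before forwarding.
    class ClusterHeadFilter:
        def __init__(self, window_s: float = 5.0):
            self.window_s = window_s
            self.last_seen: dict[str, float] = {}    # tag_id -> last forwarded time

        def accept(self, tag_id: str, t: float) -> bool:
            """True if this reading should be forwarded toward the base station."""
            last = self.last_seen.get(tag_id)
            if last is not None and t - last < self.window_s:
                return False                          # duplicate within the window
            self.last_seen[tag_id] = t
            return True

    f = ClusterHeadFilter()
    print(f.accept("EPC-3034F1", 0.0))   # True  - first sighting, forward it
    print(f.accept("EPC-3034F1", 1.2))   # False - duplicate, filtered in-network
    print(f.accept("EPC-3034F1", 9.0))   # True  - outside the window again

Filtering at the cluster head rather than the base station is what saves the multi-hop transmission energy the paper quantifies.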
Surface Wave Tomography of South China Sea from Ambient Seismic Noise and Two-station Measurements
NASA Astrophysics Data System (ADS)
Liang, W.-T.; Gung, Y.-C.
2012-04-01
We have applied the cross-correlation of seismic ambient noise technique as well as the two-station method to analyze the velocity structure in the South China Sea region. The dataset used in this study includes broadband waveforms recorded by the Taiwan BATS (Broadband Array in Taiwan for Seismology), Japan OHP (Ocean Hemisphere Project), Malaysia, and Vietnam seismic networks. We remove the instrument response from the daily data and filter the waveforms in various frequency bands according to the length of each station pair. Then we apply the commonly used 1-bit normalization to minimize the effect of earthquakes, instrumental irregularities, and non-stationary noise sources near the stations. With the derived daily cross-correlation function (CCF), we are able to examine the timing quality for each station pair. We then obtain Rayleigh-wave dispersion curves from the stacked CCF for each station pair. To cover the longer-period band of the dispersion curves, we adopt the two-station method to compute both the group and phase velocities of surface waves. A new surface wave tomography based on the ambient seismic noise study and the traditional two-station technique has been achieved in this study. Raypaths that travel through the Central Basin present higher velocity, in agreement with the idea of a thin crust. On the other hand, the slower velocity between Taiwan and northern Luzon, Philippines, is mainly due to a thick accretionary prism above the Manila Trench.
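The daily processing chain in the abstract (mean removal, 1-bit normalization, cross-correlation, stacking) is compact enough to sketch. Instrument-response removal and bandpass filtering are omitted, and a production code would use FFT-based correlation for day-long traces:

    # Hedged sketch: daily ambient-noise cross-correlation for one station pair.
    import numpy as np

    def daily_ccf(tr_a: np.ndarray, tr_b: np.ndarray) -> np.ndarray:
        """One day's cross-correlation function between two station traces."""
        a = np.sign(tr_a - tr_a.mean())              # 1-bit normalization
        b = np.sign(tr_b - tr_b.mean())
        return np.correlate(a, b, mode="full")       # lags -N+1 .. N-1

    def stacked_ccf(days_a, days_b):
        """Stack daily CCFs; the empirical Green's function emerges in the stack."""
        return np.mean([daily_ccf(a, b) for a, b in zip(days_a, days_b)], axis=0)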
The Jalisco Seismic Telemetric Network (RESJAL)
NASA Astrophysics Data System (ADS)
Nunez-Cornu, F. J.; Nunez-Cornu, F. J.; Reyes-Davila, G.; Reyes-Davila, G.; Suarez-Plascencia, C.; Suarez-Plascencia, C.; Gonzalez-Ledezma, M.; Garcia-Puga, J.
2001-12-01
The region of Jalisco is one of the most seismically active regions in Mexico; the main tectonic units in this region are the Jalisco Block and the Rivera Plate. The greatest earthquake (M=8.2) to occur in Mexico in the twentieth century (1932) took place on the coast of Jalisco and was followed by another (Ms=7.8) fifteen days later. In 1995 an earthquake of magnitude 8.0 took place on the coast of Jalisco, but its rupture area was only the southern half of the rupture area proposed for the 1932 earthquakes; these facts suggest the existence of an important seismic gap along the north coast of Jalisco, which includes the area of Bahía de Banderas. Not only subduction earthquakes occur in this region, however; there are also large inland earthquakes, such as the December 27, 1568 and February 11, 1872 events. There are also three active volcanoes: Sangangüey, Ceboruco, and the most active volcano in Mexico, Colima. In spite of these facts and the risk associated with these processes, there was only one permanent seismological station, at Chamela on the coast of Jalisco, and an analog telemetric network (RESCO) located on the Colima Volcano and the southern part of the Colima Rift Zone (CRZ). For these reasons, the Unidad Estatal de Protección Civil de Jalisco (Jalisco Civil Defense) began a project to install a digital telemetric network in the region in several phases; this project is being carried out jointly with SisVOc UdeG. Owing to the size of the area and the topography of the region it is very difficult to establish direct telemetric links, so the network is designed as cells with nodes, where the nodes are the different campuses of the University of Guadalajara located in the region, all linked by a computer network. The first phase started in August 2001 and includes the installation of six stations, each with a Kinemetrics Everest 24-bit datalogger, GPS time, and a Lennartz LE3Dlite 1-Hz sensor, using KNI NMS for control and data acquisition; these stations were deployed in two cells, each with three stations. The first cell is in the area of Bahía de Banderas, with direct telemetric links to SisVOc at Campus Puerto Vallarta, where the central station is located. The second cell extends from the Colima Volcano to the north of the CRZ; the first three stations of this cell were installed on the volcano to complement RESCO and to improve the quantity and quality of data from the volcano. These stations transmit to the Jalisco Civil Defense base in Cd. Guzman (Zapotlan), which is linked to Campus Cd. Guzman located beside the base; from Campus Cd. Guzman the data are sent through the UdeG computer network to Campus Puerto Vallarta, where they are processed and analyzed and then returned to the Civil Defense base in Cd. Guzman. To guarantee continuity in the transmission of data, the data will be sent using INTERNET-2 protocols with Quality of Service. The second phase will start as soon as the first phase is completely operational; it includes six additional seismic stations, three for each cell, and installation of the Antelope system for data acquisition and control. In the third phase two more cells will be added in the north and east of the region; meteorological instruments will also be installed at each seismic station, and video cameras and GPS instruments at selected stations.
GEOSCOPE Observatory Recent Developments
NASA Astrophysics Data System (ADS)
Leroy, N.; Pardo, C.; Bonaime, S.; Stutzmann, E.; Maggi, A.
2010-12-01
The GEOSCOPE observatory consists of a global seismic network and a data center. The 31 GEOSCOPE stations are installed in 19 countries, across all continents and on islands throughout the oceans. They are equipped with three-component very broadband seismometers (STS1 or STS2) and 24- or 26-bit digitizers, as required by the Federation of Digital Seismograph Networks (FDSN). In most stations, a pressure gauge and a thermometer are also installed. Currently, 23 stations send data in real or near-real time to the GEOSCOPE Data Center and tsunami warning centers. In 2009, two stations (SSB and PPTF) were equipped with warpless base plates. Analysis of one year of data shows that the new installation decreases long-period noise (20 s to 1000 s) by 10 dB on the horizontal components. SSB is now rated among the top ten long-period stations for horizontal components according to the LDEO criteria. In 2010, stations COYC, PEL and RER were upgraded with Q330HR digitizers, Metrozet electronics and warpless base plates. They were calibrated with the CT-EW1 calibration table and the jSeisCal and Calex-EW software. Aluminum jars are now installed instead of glass bells; a vacuum of 100 mbar is applied in the jars, which improves the thermal insulation of the seismometers and reduces moisture and long-term corrosion in the sensor. A new station, RODM, has just been installed on Rodrigues Island in Mauritius with the standard GEOSCOPE STS2 setup: an STS2 seismometer on a granite base plate, covered by a cooking pot and thermal insulation, connected to a Q330HR digitizer, active lightning protection, a SeisComP PC and a real-time internet connection. Continuous data from all stations are collected in real time or with a delay by the GEOSCOPE Data Center in Paris, where they are validated, archived and made available to the international scientific community. Data are freely available to users through different interfaces according to data type (see http://geoscope.ipgp.fr): - Continuous real-time data from 23 stations are available automatically using the seedlink protocol developed by GEOFON (GFZ, Germany); seedlink also makes these data accessible in real time to tsunami warning centers and to other data centers. - Validated continuous waveforms and metadata of all stations are available through the NetDC system (Networked Data Centers) and the Data Handler Interface (DHI, IRIS-DMC) via DHI clients; data can be requested from the GEOSCOPE Data Center and from other networked centers associated with the FDSN. - A selection of seismograms corresponding to large earthquakes is available through the GEOSCOPE web portal. - Power spectrum estimates of the seismic noise, averaged over sequences of 24 hours for each station, are also provided; the noise level of the last 10 years of continuous data has been computed and is accessible via the web, and the noise level of real-time data is computed at day-8. The GEOSCOPE data center is networked with the French virtual data center, FOSFORE/RESIF, in order to give unified access to French seismological data. In Europe, EIDA (European Integrated Data Archive) has been operational since June 2009. GEOSCOPE/IPGP is one of the four primary nodes archiving and distributing data inside EIDA. All GEOSCOPE data are available via the European Seismic Portal (http://www.seismicportal.eu).
A facility for training Space Station astronauts
NASA Technical Reports Server (NTRS)
Hajare, Ankur R.; Schmidt, James R.
1992-01-01
The Space Station Training Facility (SSTF) will be the primary facility for training the Space Station Freedom astronauts and the Space Station Control Center ground support personnel. Conceptually, the SSTF will consist of two parts: a Student Environment and an Author Environment. The Student Environment will contain trainers, instructor stations, computers and other equipment necessary for training. The Author Environment will contain the systems that will be used to manage, develop, integrate, test and verify, operate and maintain the equipment and software in the Student Environment.
NASA Astrophysics Data System (ADS)
Morlot, T.; Mathevet, T.; Perret, C.; Favre Pugin, A. C.
2014-12-01
Streamflow uncertainty estimation has recently received large attention in the literature. A dynamic rating curve assessment method has been introduced (Morlot et al., 2014). This dynamic method allows a rating curve to be computed for each gauging and a continuous streamflow time series to be derived, while calculating streamflow uncertainties. The streamflow uncertainty takes into account many sources of uncertainty (water level, rating curve interpolation and extrapolation, gauging aging, etc.) and produces an estimated distribution of streamflow for each day. In order to characterize streamflow uncertainty, a probabilistic framework has been applied to a large sample of hydrometric stations of the Division Technique Générale (DTG) of Électricité de France (EDF) hydrometric network (>250 stations) in France. A reliability diagram (Wilks, 1995) has been constructed for some stations, based on the streamflow distribution estimated for a given day and compared to a real streamflow observation estimated via a gauging. To build a reliability diagram, we computed the probability of an observed streamflow (gauging) given the estimated streamflow distribution. The reliability diagram then allows us to check that the distribution of the probabilities of non-exceedance of the gaugings follows a uniform law (i.e., the quantiles should be equiprobable). Given the shape of the reliability diagram, the probabilistic calibration is characterized (underdispersion, overdispersion, bias) (Thyer et al., 2009). In this paper, we present case studies where reliability diagrams have different statistical properties for different periods. Compared to our knowledge of the river bed morphology dynamics of these hydrometric stations, we show how the reliability diagram gives us invaluable information on river bed movements, such as continuous digging or backfilling of the hydraulic control due to erosion or sedimentation processes. Hence, careful analysis of reliability diagrams allows us to reconcile statistics and long-term river bed morphology processes. This knowledge improves our real-time management of hydrometric stations, given a better characterization of erosion/sedimentation processes and the stability of the hydrometric station's hydraulic control.
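The reliability-diagram construction described above amounts to a probability integral transform (PIT) check. A minimal sketch, assuming each day's estimated streamflow distribution is available as an ensemble of equiprobable values:

    # Hedged sketch: PIT-based reliability check of streamflow uncertainty.
    import numpy as np

    def non_exceedance_probs(ensembles: list[np.ndarray], gaugings: np.ndarray):
        """PIT value of each gauging under its day's streamflow ensemble."""
        return np.array([np.mean(ens <= q) for ens, q in zip(ensembles, gaugings)])

    def reliability_curve(pit: np.ndarray, n_bins: int = 10):
        """Empirical CDF of the PIT values at bin edges; the 1:1 line is perfect.

        Bowing above/below the diagonal indicates over/underdispersion and a
        horizontal offset indicates bias, as interpreted in the text.
        """
        edges = np.linspace(0.0, 1.0, n_bins + 1)
        return edges, np.array([np.mean(pit <= e) for e in edges])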
Statistical summaries of New Jersey streamflow records
Laskowski, Stanley L.
1970-01-01
In 1961 the U.S. Geological Survey prepared a report which was published by the State of New Jersey as Water Resources Circular 6, "New Jersey Streamflow Records Analyzed with Electronic Computer," by Miller and McCall. Basic discharge data for periods of record through 1958 were analyzed for 59 stream-gaging stations in New Jersey, and flow-duration, low-flow, and high-flow tables were presented. The purpose of the current report is to update and expand Circular 6 by presenting, with a few meaningful statistics and tables, the bulk of the information that may be obtained from the mass of streamflow records available. The records for 79 of approximately 110 stream-gaging stations presently or previously operated in New Jersey, plus records for three stations in Pennsylvania and one in New York, are presented in summarized form. In addition to including a greater number of stations in this report, more years of record and more tables are listed for each station. A description of the station, three arrangements of data summarizing the daily flow records, and one table listing statistics of the monthly mean flows are provided. No data representing instantaneous extreme flows are given. Plotting positions for the three types of curves describing the characteristics of daily discharge are listed for each station. Statistical parameters are also presented so that alternate curves may be drawn. All stations included in this report have 5 or more years of record. The data presented herein are based on observed flow past the gaging station. For any station where the observed flow is affected by regulation or diversion, a "Remarks" paragraph explaining the possible effect on the data is included in the station description. Since any streamflow record is a sample in time, the data derived from these records can provide only a guide to expected future flows. For this reason the flow records are analyzed by statistical techniques, and the magnitude of sampling errors should be recognized. These analyzed data will be useful to a large number of municipal, state, and federal agencies, industries, utilities, engineers, and hydrologists concerned with the availability, conservation, control, and use of surface waters. The tabulated data and curves illustrated herein can be used to select sites for water supplies, to determine flood or drought storage requirements, and to appraise the adequacy of flows for dilution of wastes or generation of power. The statistical values presented herein can be used in computer programs available in many universities, Federal and State agencies, and engineering firms for a broad spectrum of research and other studies.
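The flow-duration tables this report presents come from a simple exceedance computation. A minimal sketch using the Weibull plotting position m/(n+1), one conventional choice in USGS practice:

    # Hedged sketch: flow-duration curve from a daily discharge record.
    import numpy as np

    def flow_duration(daily_q):
        """(exceedance probability, discharge) pairs for a flow-duration curve."""
        q = np.sort(np.asarray(daily_q, dtype=float))[::-1]  # largest flow first
        m = np.arange(1, q.size + 1)                         # rank, 1 = largest
        return m / (q.size + 1.0), q                         # Weibull position

    # Example: the discharge exceeded 90% of the time, a common low-flow index.
    rng = np.random.default_rng(1)
    p, q = flow_duration(rng.lognormal(mean=3.0, sigma=1.0, size=3650))
    print(np.interp(0.90, p, q))   # p increases monotonically, so interp is valid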
NASA Astrophysics Data System (ADS)
Rakvic, Ryan N.; Ives, Robert W.; Lira, Javier; Molina, Carlos
2011-01-01
General purpose computer designers have recently begun adding cores to their processors in order to increase performance. For example, Intel has adopted a homogeneous quad-core processor as a base for general purpose computing. PlayStation3 (PS3) game consoles contain a multicore heterogeneous processor known as the Cell, which is designed to perform complex image processing algorithms at a high level. Can modern image-processing algorithms utilize these additional cores? On the other hand, modern advancements in configurable hardware, most notably field-programmable gate arrays (FPGAs) have created an interesting question for general purpose computer designers. Is there a reason to combine FPGAs with multicore processors to create an FPGA multicore hybrid general purpose computer? Iris matching, a repeatedly executed portion of a modern iris-recognition algorithm, is parallelized on an Intel-based homogeneous multicore Xeon system, a heterogeneous multicore Cell system, and an FPGA multicore hybrid system. Surprisingly, the cheaper PS3 slightly outperforms the Intel-based multicore on a core-for-core basis. However, both multicore systems are beaten by the FPGA multicore hybrid system by >50%.
Electronic media use and addiction among youth in psychiatric clinic versus school populations.
Baer, Susan; Saran, Kelly; Green, David A; Hong, Irene
2012-12-01
Electronic media use is highly prevalent among today's youth, and its overuse in the general population has been consistently associated with the presence of psychiatric symptoms. In contrast, little information exists about electronic media use among youth with psychiatric disorders. Our study aims to compare patterns of television and computer and gaming station use among youth in psychiatric clinic and community-based school populations. Surveys were completed by 210 youth and parents, from school (n = 110) and psychiatric clinic (n = 100) populations. Duration and frequency of television, video gaming, and nongaming computer activities were ascertained, along with addictive features of use. Descriptive and comparative analyses were conducted, with a statistical threshold of P < 0.05. Quantitative and qualitative differences were identified between the patterns of use reported by the 2 groups. The mean reported daily duration of exposure to electronic media use was 6.6 hours (SD 4.1) for the clinic sample and 4.6 hours (SD 2.6) for the school sample (P < 0.01). Self-reported rates of addictive patterns related to computer and gaming station use were similar between the 2 populations. However, the clinically based sample favoured more violent games, with 29% reporting playing mature-rated games, compared with 13% reported by the school-based sample (P = 0.02). Youth with externalizing disorders expended greater time video gaming, compared with youth with internalizing disorders (P = 0.01). Clinically based samples of youth with mental illnesses spend more time engaged in electronic media activities and are more likely to play violent video games, compared with youth in the general population. Further research is needed to determine the long-term implications of these differences.
The Shuttle Mission Simulator computer generated imagery
NASA Technical Reports Server (NTRS)
Henderson, T. H.
1984-01-01
Equipment available in the primary training facility for the Space Transportation System (STS) flight crews includes the Fixed Base Simulator, the Motion Base Simulator, the Spacelab Simulator, and the Guidance and Navigation Simulator. The Shuttle Mission Simulator (SMS) consists of the Fixed Base Simulator and the Motion Base Simulator. The SMS utilizes four visual Computer Generated Image (CGI) systems. The Motion Base Simulator has a forward crew station with six-degrees of freedom motion simulation. Operation of the Spacelab Simulator is planned for the spring of 1983. The Guidance and Navigation Simulator went into operation in 1982. Aspects of orbital visual simulation are discussed, taking into account the earth scene, payload simulation, the generation and display of 1079 stars, the simulation of sun glare, and Reaction Control System jet firing plumes. Attention is also given to landing site visual simulation, and night launch and landing simulation.
CSM research: Methods and application studies
NASA Technical Reports Server (NTRS)
Knight, Norman F., Jr.
1989-01-01
Computational mechanics is that discipline of applied science and engineering devoted to the study of physical phenomena by means of computational methods based on mathematical modeling and simulation, utilizing digital computers. The discipline combines theoretical and applied mechanics, approximation theory, numerical analysis, and computer science. Computational mechanics has had a major impact on engineering analysis and design. When applied to structural mechanics, the discipline is referred to herein as computational structural mechanics. Complex structures being considered by NASA for the 1990's include composite primary aircraft structures and the space station. These structures will be much more difficult to analyze than today's structures and necessitate a major upgrade in computerized structural analysis technology. NASA has initiated a research activity in structural analysis called Computational Structural Mechanics (CSM). The broad objective of the CSM activity is to develop advanced structural analysis technology that will exploit modern and emerging computers, such as those with vector and/or parallel processing capabilities. Here, the current research directions for the Methods and Application Studies Team of the Langley CSM activity are described.
Tools for Analyzing Computing Resource Management Strategies and Algorithms for SDR Clouds
NASA Astrophysics Data System (ADS)
Marojevic, Vuk; Gomez-Miguelez, Ismael; Gelonch, Antoni
2012-09-01
Software defined radio (SDR) clouds centralize the computing resources of base stations. The computing resource pool is shared between radio operators and dynamically loads and unloads digital signal processing chains for providing wireless communications services on demand. In particular, each new user session request requires the allocation of computing resources for executing the corresponding SDR transceivers. The huge amount of computing resources in SDR cloud data centers and the numerous session requests at certain hours of a day require efficient computing resource management. We propose a hierarchical approach, where the data center is divided into clusters that are managed in a distributed way. This paper presents a set of computing resource management tools for analyzing computing resource management strategies and algorithms for SDR clouds. We use the tools for evaluating different strategies and algorithms. The results show that more sophisticated algorithms can achieve higher resource occupations and that a tradeoff exists between cluster size and algorithm complexity.
2004-02-03
KENNEDY SPACE CENTER, FLA. - Astronaut Tim Kopra aids in Intravehicular Activity (IVA) constraints testing on the Italian-built Node 2, a future element of the International Space Station. The second of three Station connecting modules, the Node 2 attaches to the end of the U.S. Lab and provides attach locations for several other elements. Kopra is currently assigned technical duties in the Space Station Branch of the Astronaut Office, where his primary focus involves the testing of crew interfaces for two future ISS modules as well as the implementation of support computers and operational Local Area Network on ISS. Node 2 is scheduled to launch on mission STS-120, Station assembly flight 10A.
International Space Station (ISS)
1995-04-17
This computer-generated scene of the International Space Station (ISS) represents the first addition of hardware following the completion of Phase II. The 8A phase shows the addition of the S-9 truss.
NASA Technical Reports Server (NTRS)
Butler, J. H.
1971-01-01
A preliminary analysis was made of the relative motion of a free-flying experiment module in the vicinity of a space station under the perturbative effects of drag and earth oblateness. A listing of a computer program developed for determining the relative motion of a module utilizing the Cowell procedure is presented, as well as instructions for its use.
ERIC Educational Resources Information Center
Padilla Mercado, Jeralyne B.; Coombs, Eri M.; De Jesus, Jenny P.; Bretz, Stacey Lowery; Danielson, Neil D.
2018-01-01
Multifunctional chemical analysis (MCA) systems provide a viable alternative for large scale instruction while supporting a hands-on approach to more advanced instrumentation. These systems are robust and typically use student stations connected to a remote central computer for data collection, minimizing the need for computers at every student…
Computer-aided controllability assessment of generic manned Space Station concepts
NASA Technical Reports Server (NTRS)
Ferebee, M. J.; Deryder, L. J.; Heck, M. L.
1984-01-01
NASA's Concept Development Group assessment methodology for the on-orbit rigid body controllability characteristics of each generic configuration proposed for the manned space station is presented; the preliminary results obtained represent the first step in the analysis of these eight configurations. Analytical computer models of each configuration were developed by means of the Interactive Design Evaluation of Advanced Spacecraft CAD system, which created three-dimensional geometry models of each configuration to establish dimensional requirements for module connectivity, payload accommodation, and Space Shuttle berthing; mass, center-of-gravity, inertia, and aerodynamic drag areas were then derived. Attention was also given to the preferred flight attitude of each station concept.
Developing the human-computer interface for Space Station Freedom
NASA Technical Reports Server (NTRS)
Holden, Kritina L.
1991-01-01
For the past two years, the Human-Computer Interaction Laboratory (HCIL) at the Johnson Space Center has been involved in prototyping and prototype reviews in support of the definition phase of the Space Station Freedom program. On the Space Station, crew members will be interacting with multi-monitor workstations where interaction with several displays at one time will be common. The HCIL has conducted several experiments to begin to address design issues for this complex system. Experiments have dealt with the design of ON/OFF indicators, the movement of the cursor across multiple monitors, and the importance of various windowing capabilities for users performing multiple tasks simultaneously.
DomeGene Experiment at Cell Biology Experiment Facility (CBEF) in JPM
2009-03-18
ISS018-E-040985 (18 March 2009) --- Japan Aerospace Exploration Agency (JAXA) astronaut Koichi Wakata, Expedition 18 flight engineer, uses a computer at the Japanese Remote Manipulator System (JEM-RMS) work station in the Kibo laboratory of the International Space Station while Space Shuttle Discovery (STS-119) remains docked with the station.
DomeGene Experiment at Cell Biology Experiment Facility (CBEF) in JPM
2009-03-18
ISS018-E-040986 (18 March 2009) --- Japan Aerospace Exploration Agency (JAXA) astronaut Koichi Wakata, Expedition 18 flight engineer, uses a computer at the Japanese Remote Manipulator System (JEM-RMS) work station in the Kibo laboratory of the International Space Station while Space Shuttle Discovery (STS-119) remains docked with the station.
Space Station 20-kHz power management and distribution system
NASA Technical Reports Server (NTRS)
Hansen, Irving G.; Sundberg, Gale R.
1986-01-01
During the conceptual design phase a 20-kHz power distribution system was selected as the reference for the Space Station. The system is single-phase 400 VRMS, with a sinusoidal wave form. The initial user power level will be 75 kW with growth to 300 kW. The high-frequency system selection was based upon considerations of efficiency, weight, safety, ease of control, interface with computers, and ease of paralleling for growth. Each of these aspects will be discussed as well as the associated trade-offs involved. An advanced development program has been instituted to accelerate the maturation of the high-frequency system. Some technical aspects of the advanced development will be discussed.
Design of a monitor and simulation terminal (master) for space station telerobotics and telescience
NASA Technical Reports Server (NTRS)
Lopez, L.; Konkel, C.; Harmon, P.; King, S.
1989-01-01
Based on Space Station and planetary spacecraft communication time delays and bandwidth limitations, it will be necessary to develop an intelligent, general purpose ground monitor terminal capable of sophisticated data display and control of on-orbit facilities and remote spacecraft. The basic elements that make up a Monitor and Simulation Terminal (MASTER) include computer overlay video, data compression, forward simulation, mission resource optimization and high level robotic control. Hardware and software elements of a MASTER are being assembled for testbed use. Applications of Neural Networks (NNs) to some key functions of a MASTER are also discussed. These functions are overlay graphics adjustment, object correlation and kinematic-dynamic characterization of the manipulator.
DPOD2005: An extension of ITRF2005 for Precise Orbit Determination
NASA Astrophysics Data System (ADS)
Willis, P.; Ries, J. C.; Zelensky, N. P.; Soudarin, L.; Fagard, H.; Pavlis, E. C.; Lemoine, F. G.
2009-09-01
For Precise Orbit Determination of altimetry missions, we have computed a data set of DORIS station coordinates defined for specific time intervals, called DPOD2005. This terrestrial reference set is an extension of ITRF2005. However, it includes all new DORIS stations and is more reliable, as we disregard stations with large velocity formal errors that could contaminate POD computations in the near future. About 1/4 of the station coordinates need to be defined, as they do not appear in the original ITRF2005 realization. These results were verified against available DORIS and GPS results, as the integrity of DPOD2005 is almost as critical as its accuracy. Besides station coordinates and velocities, we also provide additional information, such as periods for which DORIS data should be disregarded for specific DORIS stations, and epochs of coordinate and velocity discontinuities (related to geophysical events, equipment problems, or human intervention). The DPOD model was tested for orbit determination for TOPEX/Poseidon (T/P), Jason-1, and Jason-2. Test results show that DPOD2005 offers improvement over the original ITRF2005, improvement that rapidly and significantly increases after 2005. Improvement is also significant for the early T/P cycles, indicating improved station velocities in the DPOD2005 model and a more complete station set. After 2005, the radial accuracy and centering of the ITRF2005-original orbits rapidly degrade due to station loss.
Validation of heart and lung teleauscultation on an Internet-based system.
Fragasso, Gabriele; De Benedictis, Marialuisa; Palloshi, Altin; Moltrasio, Marco; Cappelletti, Alberto; Carlino, Mauro; Marchisi, Angelo; Pala, Mariagrazia; Alfieri, Ottavio; Margonato, Alberto
2003-11-01
The feasibility and accuracy of an Internet-based system for teleauscultation was evaluated in 103 cardiac patients, who were auscultated by the same cardiologist with a conventional stethoscope and with an Internet-based method, using an electronic stethoscope and transmitting heart and lung sounds between computer work stations. In 92% of patients, the results of electronic and acoustic auscultation coincided, indicating that teleauscultation may be considered a reliable method for assessing cardiac patients and could, therefore, be adopted in the context of comprehensive telecare programs.
Lopez-Alegria with records experiment data
2006-10-03
ISS014-E-05129 (3 Oct. 2006) --- Astronaut Michael E. Lopez-Alegria, Expedition 14 commander and NASA space station science officer, uses a computer in the Destiny laboratory of the International Space Station.
Outlaw, G.S.; Butner, D.E.; Kemp, R.L.; Oaks, A.T.; Adams, G.S.
1992-01-01
Rainfall, stage, and streamflow data in the Murfreesboro area, Middle Tennessee, were collected from March 1989 through July 1992 from a network of 68 gaging stations. The network consists of 10 tipping-bucket rain gages, 2 continuous-record streamflow gages, 4 partial-record flood hydrograph gages, and 72 crest-stage gages. Data collected by the gages include 5-minute time-step rainfall hyetographs, 15-minute time-step flood hydrographs, and peak-stage elevations. Data are stored in a computer data base and are available for many computer modeling and engineering applications.
Robust estimators for speech enhancement in real environments
NASA Astrophysics Data System (ADS)
Sandoval-Ibarra, Yuma; Diaz-Ramirez, Victor H.; Kober, Vitaly
2015-09-01
Common statistical estimators for speech enhancement rely on several assumptions about the stationarity of speech signals and noise. These assumptions may not always be valid in real life due to the nonstationary characteristics of speech and noise processes. We propose new estimators that extend existing ones by incorporating rank-order statistics. The proposed estimators are better adapted to the non-stationary characteristics of speech signals and noise processes. Through computer simulations we show that the proposed estimators yield better performance in terms of objective metrics than known estimators when speech signals are contaminated with airport, babble, restaurant, and train-station noise.
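The abstract does not give the estimators in closed form; as a minimal illustration of how a rank-order statistic can replace a moving-average noise estimate, the Python sketch below (hypothetical parameters, not the authors' estimator) tracks the noise magnitude spectrum with a running median over past STFT frames and applies spectral subtraction.

```python
import numpy as np
from scipy.signal import stft, istft

def median_tracked_subtraction(x, fs, nperseg=512, k_frames=25, floor=0.05):
    """Spectral subtraction with a rank-order (median) noise tracker.

    Illustrative only: the noise magnitude in each frequency bin is the
    median of the last k_frames frames, which is less sensitive to
    non-stationary speech bursts than a running mean.
    """
    f, t, X = stft(x, fs=fs, nperseg=nperseg)
    mag, phase = np.abs(X), np.angle(X)
    noise = np.empty_like(mag)
    for j in range(mag.shape[1]):
        lo = max(0, j - k_frames + 1)
        noise[:, j] = np.median(mag[:, lo:j + 1], axis=1)  # rank-order estimate
    cleaned = np.maximum(mag - noise, floor * mag)         # spectral floor
    _, y = istft(cleaned * np.exp(1j * phase), fs=fs, nperseg=nperseg)
    return y
```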
NASA Astrophysics Data System (ADS)
Yoon, S.
2016-12-01
To define a geodetic reference frame using GPS data collected by the Continuously Operating Reference Stations (CORS) network, historical GPS data need to be reprocessed regularly. Reprocessing the GPS data collected by up to 2,000 CORS sites over the last two decades requires a lot of computational resources. At the National Geodetic Survey (NGS), one reprocessing was completed in 2011, and the second reprocessing is currently under way. For the first reprocessing effort, in-house computing resources were utilized. In the current second reprocessing effort, an outsourced cloud computing platform is being utilized. In this presentation, the outline of the data processing strategy at NGS is described, as well as the effort to parallelize the data processing procedure in order to maximize the benefit of cloud computing. The time and cost savings realized by utilizing the cloud computing approach will also be discussed.
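The presentation does not detail NGS's processing stack; the sketch below only illustrates why the workload parallelizes well on a cloud platform: each site-day solution is an independent job. solve_daily_solution is a hypothetical stand-in for the real per-day GNSS solution.

```python
from concurrent.futures import ProcessPoolExecutor
from itertools import product

def solve_daily_solution(site, day):
    # Hypothetical stand-in for one site-day GNSS solution
    # (fetch RINEX, apply orbits/clocks, estimate coordinates).
    ...

def reprocess(sites, days, max_workers=32):
    # Site-day solutions are independent of one another, so the campaign
    # parallelizes trivially across local cores or cloud worker nodes.
    jobs = list(product(sites, days))
    with ProcessPoolExecutor(max_workers=max_workers) as pool:
        return list(pool.map(solve_daily_solution, *zip(*jobs)))
```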
2004-02-03
KENNEDY SPACE CENTER, FLA. - Astronaut Tim Kopra (second from right) talks with workers in the Space Station Processing Facility about the Intravehicular Activity (IVA) constraints testing on the Italian-built Node 2, a future element of the International Space Station. The second of three Station connecting modules, the Node 2 attaches to the end of the U.S. Lab and provides attach locations for several other elements. Kopra is currently assigned technical duties in the Space Station Branch of the Astronaut Office, where his primary focus involves the testing of crew interfaces for two future ISS modules as well as the implementation of support computers and an operational Local Area Network on ISS. Node 2 is scheduled to launch on mission STS-120, Station assembly flight 10A.
Plouff, Donald
2000-01-01
Gravity observations are directly made or are obtained from other sources by the U.S. Geological Survey in order to prepare maps of the anomalous gravity field and consequently to interpret the subsurface distribution of rock densities and associated lithologic or geologic units. Observations are made in the field with gravity meters at new locations and at reoccupations of previously established gravity "stations." This report illustrates an interactively prompted series of steps needed to convert gravity "readings" to values that are tied to established gravity datums and includes computer programs to implement those steps. Inasmuch as individual gravity readings have small variations, gravity-meter (instrument) drift may not be smoothly variable, and accommodations may be needed for ties to previously established stations, the reduction process is iterative. Decision-making by the program user is prompted by lists of best values and graphical displays. Notes about irregularities of topography, which affect the value of observed gravity but are not shown in sufficient detail on topographic maps, must be recorded in the field. This report illustrates ways to record field notes (distances, heights, and slope angles) and includes computer programs to convert field notes to gravity terrain corrections. This report includes approaches that may serve as models for other applications, for example: portrayal of system flow; style of quality control to document and validate computer applications; lack of dependence on proprietary software except source code compilation; method of file-searching with a dwindling list; interactive prompting; computer code to write directly in the PostScript (Adobe Systems Incorporated) printer language; and highlighting the four-digit year on the first line of time-dependent data sets for assured Y2K compatibility. Computer source codes provided are written in the Fortran scientific language. In order for the programs to operate, they first must be converted (compiled) into an executable form on the user's computer. Although program testing was done in a UNIX (tradename of American Telephone and Telegraph Company) computer environment, it is anticipated that only a system-dependent date-and-time function may need to be changed for adaptation to other computer platforms that accept standard Fortran code.
Source effects on the simulation of the strong ground motion of the 2011 Lorca earthquake
NASA Astrophysics Data System (ADS)
Saraò, Angela; Moratto, Luca; Vuan, Alessandro; Mucciarelli, Marco; Jimenez, Maria Jose; Garcia Fernandez, Mariano
2016-04-01
On May 11, 2011, a moderate seismic event (Mw=5.2) struck the city of Lorca (southeast Spain), causing nine deaths, a large number of injuries, and damage to civil buildings. The largest PGA value (360 cm/s2) recorded so far in Spain was observed at the accelerometric station located in Lorca (LOR), and it was explained as due to source directivity rather than to local site effects. In recent years, different source models, retrieved from inversions of geodetic or seismological data, or a combination of the two, have been published. To investigate the variability that equivalent source models of the same earthquake can introduce into the computation of strong motion, we calculated seismograms (up to 1 Hz) using an approach based on wavenumber integration and, as input, four different source models taken from the literature. The source models differ mainly in the slip distribution on the fault. Our results show that, as an effect of the different sources, the ground-motion variability, in terms of pseudo-spectral velocity at 1 s, can reach one order of magnitude for near-source receivers or for sites influenced by the forward-directivity effect. Finally, we computed the strong motion at frequencies higher than 1 Hz using Empirical Green Functions and the source-model parameters that best reproduce the recorded shaking up to 1 Hz: the computed seismograms fit the signals recorded at the LOR station, as well as at the other stations close to the source, satisfactorily.
The determination of the most applicable PWV model for Turkey
NASA Astrophysics Data System (ADS)
Deniz, Ilke; Gurbuz, Gokhan; Mekik, Cetin
2016-07-01
Water vapor is a key component in modelling the atmosphere and in climate studies. Moreover, long-term water vapor changes can be an independent source for detecting climate change. Since Global Navigation Satellite Systems (GNSS) use microwaves passing through the atmosphere, atmospheric effects can be modeled with high accuracy. Tropospheric effects on GNSS signals are estimated with the zenith total delay parameter (ZTD), which is the sum of the hydrostatic (ZHD) and wet (ZWD) zenith delays. The first component can be obtained from meteorological observations with high accuracy; the second component can be computed by subtracting ZHD from ZTD (ZWD = ZTD - ZHD). Afterwards, the weighted mean temperature (Tm) or the conversion factor (Q) is used to convert ZWD to precipitable water vapor (PWV). The parameters Tm and Q are derived from the analysis of radiosonde stations' profile observations. Numerous Q and Tm models have been developed for individual radiosonde stations, radiosonde station groups, countries, and global fields, such as the Bevis Tm model and Emardson and Derks' Q models. Accordingly, PWV models (Tm and Q models) for Turkey have been developed using a year of radiosonde data (2011) from 8 radiosonde stations. In this study, the models developed are tested by comparing PWVGNSS, computed by applying the Tm and Q models to ZTD estimates derived with the Bernese and GAMIT/GLOBK software at GNSS stations established at Istanbul and Ankara, with PWVRS from the collocated radiosonde stations, from October 2013 to December 2014, using data obtained from a project (no. 112Y350) supported by the Scientific and Technological Research Council of Turkey (TUBITAK). The comparison results show that PWVGNSS and PWVRS are highly correlated (86% for Ankara and 90% for Istanbul). Thus, the most applicable model for Turkey and the accuracy of GNSS meteorology are investigated. In addition, the Tm model was applied to the ZTD estimates of 20 TUSAGA-Active (CORS-TR) stations between 38.0°-42.0° northern latitude and 28.0°-34.0° eastern longitude in Turkey, and PWV was computed. ZTD estimates for these stations were computed using Bernese GNSS Software v5.0 for the period from June 2013 to June 2014. Preceding the PWV estimation, meteorological parameters for these stations (temperature, pressure, and humidity) were derived by applying spherical harmonics modelling and interpolation to the same parameters measured by meteorological stations surrounding the TUSAGA-Active stations. The spherical harmonics modelling and interpolation yield precisions of ±1.74 K in temperature, ±0.95 hPa in pressure, and ±14.88% in humidity. The PWV of the selected TUSAGA-Active stations was then estimated.
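For reference, the ZTD-to-PWV chain described above can be written compactly. The sketch below uses the Saastamoinen hydrostatic delay and the Bevis et al. (1992) refractivity constants as stand-ins; the study derives its own Tm and Q models for Turkey, so these constants are illustrative, not the authors' values.

```python
import math

K2P, K3 = 22.1, 3.739e5    # K/hPa and K^2/hPa (Bevis et al., 1992)
RV, RHO_W = 461.5, 1000.0  # J kg^-1 K^-1; kg m^-3

def pwv_from_ztd(ztd_m, p_hpa, lat_rad, h_m, tm_k):
    # Saastamoinen zenith hydrostatic delay from surface pressure
    zhd = 0.0022768 * p_hpa / (1.0 - 0.00266 * math.cos(2.0 * lat_rad)
                               - 2.8e-7 * h_m)
    zwd = ztd_m - zhd                      # ZWD = ZTD - ZHD
    # dimensionless conversion factor Q (~0.15 for Tm near 275 K);
    # the factor of 100 converts K/hPa to K/Pa
    q = 1.0e6 / (RHO_W * RV * (K2P + K3 / tm_k) / 100.0)
    return q * zwd                         # PWV in the same units as ZTD
```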
Lopez-Alegria working in the U.S. Laboratory
2006-09-23
ISS013-E-84249 (23 Sept. 2006) --- Astronaut Michael E. Lopez-Alegria, Expedition 14 commander and NASA space station science officer, uses a computer in the Destiny laboratory of the International Space Station.
An objective structured clinical exam to measure intrinsic CanMEDS roles.
Kassam, Aliya; Cowan, Michèle; Donnon, Tyrone
2016-01-01
Background The CanMEDS roles provide a comprehensive framework to organize competency-based curricula; however, there is a challenge in finding feasible, valid, and reliable assessment methods to measure intrinsic roles such as Communicator and Collaborator. The objective structured clinical exam (OSCE) is more commonly used in postgraduate medical education for the assessment of clinical skills beyond medical expertise. Method We developed the CanMEDS In-Training Exam (CITE), a six-station OSCE designed to assess two different CanMEDS roles (one primary and one secondary) and general communication skills at each station. Correlation coefficients were computed for CanMEDS roles within and between stations, and for general communication, global rating, and total scores. One-way analysis of variance (ANOVA) was used to investigate differences between year of residency, sex, and type of residency program. Results In total, 63 residents participated in the CITE; 40 residents (63%) were from internal medicine programs, whereas the remaining 23 (37%) were pursuing other specialties. There was satisfactory internal consistency for all stations, and the total scores of the stations were strongly correlated with the global scores (r=0.86, p<0.05). Non-internal medicine residents scored higher in the Professional competency overall, whereas internal medicine residents scored significantly higher in the Collaborator competency overall. Discussion The OSCE checklists developed for the assessment of intrinsic CanMEDS roles were functional, but the specific items within stations required more uniformity to be used between stations. More generic types of checklists may also improve correlations across stations. Conclusion An OSCE measuring intrinsic competence is feasible; however, further development of our cases and checklists is needed. We provide a model of how to develop an OSCE to measure intrinsic CanMEDS roles that educators may adopt as residency programs move into competency-based medical education.
Modernization of the Slovenian National Seismic Network
NASA Astrophysics Data System (ADS)
Vidrih, R.; Godec, M.; Gosar, A.; Sincic, P.; Tasic, I.; Zivcic, M.
2003-04-01
The Environmental Agency of the Republic of Slovenia, Seismology Office, is responsible for fast and reliable information about earthquakes originating in the area of Slovenia and nearby. In the year 2000, the project Modernization of the Slovenian National Seismic Network started. The purpose of the modernized seismic network is to enable fast and accurate automatic location of earthquakes, to determine earthquake parameters, and to collect data on local, regional, and global earthquakes. The modernized network will be finished in 2004 and will consist of 25 seismic station subsystems based on Q730 remote broadband data loggers, transmitting data in real time to the Data Center in Ljubljana, where the Seismology Office is located. The remote broadband station subsystems include 16 surface broadband seismometers CMG-40T, 5 broadband seismometers CMG-40T with strong-motion accelerographs EpiSensor, and 4 borehole broadband seismometers CMG-40T, all with accurate timing provided by GPS receivers. The seismic network will cover the entire Slovenian territory, an area of 20,256 km2. The network is planned so that more seismic stations will be around bigger urban centres and in regions with greater vulnerability (NW Slovenia, Krsko-Brezice region). By the end of 2002, three old seismic stations had been modernized and ten new seismic stations had been built. All seismic stations transmit data to UNIX-based computers running Antelope system software. The data are transmitted in real time using TCP/IP protocols over the Government Wide Area Network. Real-time data are also exchanged with seismic networks in the neighbouring countries, where the data are collected from seismic stations close to the Slovenian border. A typical seismic station consists of the seismic shaft with the sensor and the data acquisition system, and the service shaft with communication equipment (modem, router) and a power supply with a battery box, which provides energy in case of mains failure. The data acquisition systems record continuous time series sampled at 200 sps, 20 sps, and 1 sps.
The Data Base of the International Geodynamics and Earth Tide Service (IGETS)
NASA Astrophysics Data System (ADS)
Voigt, Christian; Förste, Christoph; Wziontek, Hartmut; Crossley, David; Meurers, Bruno; Pálinkáš, Vojtech; Hinderer, Jacques; Boy, Jean-Paul; Barriot, Jean-Pierre; Sun, Heping
2017-04-01
The International Geodynamics and Earth Tide Service (IGETS) was established in 2015 by the International Association of Geodesy (IAG). IGETS continues the activities of the Global Geodynamics Project (GGP, 1997-2015) to provide support to geodetic and geophysical research activities using superconducting gravimeter data within the context of an international network. The primary objective of IGETS is to provide a service for continuous ground-based measurements to monitor temporal variations of the Earth's gravity field and deformation of the Earth's surface through long-term records from ground gravimeters, tiltmeters, strainmeters, and other geodynamic sensors. IGETS also continues the activities of the International Center for Earth Tides (ICET), in particular in collecting, archiving, and distributing Earth tide records from long series of the various geodynamic sensors. This presentation introduces the IGETS data base, hosted by GFZ and accessible via http://igets.gfz-potsdam.de, to the geodetic and geodynamics community as well as to all other interested data producers and users. At present, records from superconducting gravimeters at 34 stations worldwide are available. Level 1 products are raw gravity and local pressure records decimated to 1-minute samples. As a new feature, records with 1- or 2-second samples are already provided for a few stations. Level 2 products consist of gravity and pressure data corrected for instrumental perturbations and ready for tidal analysis; they are derived from Level 1 datasets and computed by the University of French Polynesia (Tahiti, French Polynesia). Gravity residuals after particular geophysical corrections (including solid Earth tides, polar motion, and tidal and non-tidal loading effects), considered Level 3 products, are derived from Level 2 datasets and computed by EOST (Ecole et Observatoire des Sciences de la Terre, Strasbourg, France). The IGETS data sets are stored by GFZ on an FTP server and are freely available after a compulsory user registration. A major benefit of IGETS is the provision of digital object identifiers (DOI) by the research repository of GFZ Data Services for the data sets of every station. This ensures long-term storage and increased visibility as part of an international network, as well as proper data citation. At present, the IGETS data base is supported by 24 data producers providing records to almost 100 registered users. All relevant information on the data base, i.e., data availability and access, stations and sensors, conventional data formats, etc., is compiled in a specific scientific technical report (see http://doi.org/10.2312/GFZ.b103-16087). As IGETS seeks to provide all kinds of long-term geodynamic time series, interested station operators are cordially invited to provide their data sets to the IGETS data base and, in return, benefit from being part of the IAG service IGETS.
Simulated building energy demand biases resulting from the use of representative weather stations
Burleyson, Casey D.; Voisin, Nathalie; Taylor, Z. Todd; ...
2017-11-06
Numerical building models are typically forced with weather data from a limited number of “representative cities” or weather stations representing different climate regions. The use of representative weather stations reduces computational costs, but often fails to capture spatial heterogeneity in weather that may be important for simulations aimed at understanding how building stocks respond to a changing climate. Here, we quantify the potential reduction in temperature and load biases from using an increasing number of weather stations over the western U.S. Our novel approach is based on deriving temperature and load time series using incrementally more weather stations, ranging from 8 to roughly 150, to evaluate the ability to capture weather patterns across different seasons. Using 8 stations across the western U.S., one from each IECC climate zone, results in an average absolute summertime temperature bias of ~4.0 °C with respect to a high-resolution gridded dataset. The mean absolute bias drops to ~1.5 °C using all available weather stations. Temperature biases of this magnitude could translate to absolute summertime mean simulated load biases as high as 13.5%. Increasing the size of the domain over which biases are calculated reduces their magnitude as positive and negative biases may cancel out. Using 8 representative weather stations can lead to a 20–40% bias of peak building loads during both summer and winter, a significant error for capacity expansion planners who may use these types of simulations. Using weather stations close to population centers reduces both mean and peak load biases. Our approach could be used by others designing aggregate building simulations to understand the sensitivity to their choice of weather stations used to drive the models.
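The abstract does not spell out how grid cells are matched to representative stations; the sketch below shows one plausible version of the bias calculation, representing each grid cell by its nearest station and averaging the absolute temperature difference. Array names are illustrative.

```python
import numpy as np

def mean_abs_temp_bias(grid_t, grid_xy, stn_t, stn_xy):
    """grid_t: (n_cells, n_hours) gridded 'truth'; stn_t: (n_stn, n_hours).

    Each cell is represented by its nearest station, mimicking the use of
    a small set of representative weather stations.
    """
    d2 = ((grid_xy[:, None, :] - stn_xy[None, :, :]) ** 2).sum(axis=-1)
    nearest = d2.argmin(axis=1)                  # station index per cell
    return np.abs(stn_t[nearest] - grid_t).mean()

# Bias versus network density: call mean_abs_temp_bias with incrementally
# larger station subsets (e.g., 8, 16, ..., ~150 stations) and compare.
```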
Simulated building energy demand biases resulting from the use of representative weather stations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Burleyson, Casey D.; Voisin, Nathalie; Taylor, Z. Todd
Numerical building models are typically forced with weather data from a limited number of “representative cities” or weather stations representing different climate regions. The use of representative weather stations reduces computational costs, but often fails to capture spatial heterogeneity in weather that may be important for simulations aimed at understanding how building stocks respond to a changing climate. We quantify the potential reduction in bias from using an increasing number of weather stations over the western U.S. The approach is based on deriving temperature and load time series using incrementally more weather stations, ranging from 8 to roughly 150, to capture weather across different seasons. Using 8 stations, one from each climate zone, across the western U.S. results in an average absolute summertime temperature bias of 7.2°F with respect to a spatially-resolved gridded dataset. The mean absolute bias drops to 2.8°F using all available weather stations. Temperature biases of this magnitude could translate to absolute summertime mean simulated load biases as high as 13.8%, a significant error for capacity expansion planners who may use these types of simulations. Increasing the size of the domain over which biases are calculated reduces their magnitude as positive and negative biases may cancel out. Using 8 representative weather stations can lead to a 20-40% overestimation of peak building loads during both summer and winter. Using weather stations close to population centers reduces both mean and peak load biases. This approach could be used by others designing aggregate building simulations to understand the sensitivity to their choice of weather stations used to drive the models.
Carreón, Gustavo; Gershenson, Carlos; Pineda, Luis A
2017-01-01
The equal headway instability (the fact that a configuration with regular time intervals between vehicles tends to be volatile) is a common regulation problem in public transportation systems. An unsatisfactory regulation results in low efficiency and possible collapses of the service. Computational simulations have shown that self-organizing methods can regulate the headway adaptively beyond the theoretical optimum. In this work, we develop a computer simulation for metro systems, fed with real data from the Mexico City Metro, to test the current regulatory method against a novel self-organizing approach. The current model considers overall system data, such as minimum and maximum waiting times at stations, while the self-organizing method regulates the headway in a decentralized manner using local information, such as the passengers' inflow and the positions of neighboring trains. The simulation shows that the self-organizing method improves performance over the current one, as it adapts to environmental changes at the timescale at which they occur. The correlation between the simulation of the current model and empirical observations carried out in the Mexico City Metro provides a basis for calculating the expected performance of the self-organizing method should it be implemented in the real system. We also performed a pilot study at the Balderas station to regulate the alighting and boarding of passengers through guide signs on platforms. The analysis of empirical data shows a reduction of the waiting-time delay of trains at stations. Finally, we provide recommendations to improve public transportation systems.
NASA Technical Reports Server (NTRS)
Barber, Bryan; Kahn, Laura; Wong, David
1990-01-01
Offshore operations such as oil drilling and radar monitoring require semisubmersible platforms to remain stationary at specific locations in the Gulf of Mexico. Ocean currents, wind, and waves in the Gulf of Mexico tend to move platforms away from their desired locations. A computer model was created to predict the station-keeping requirements of a platform. The computer simulation uses remote sensing data from satellites and buoys as input. A background of the project, alternate approaches to the project, and the details of the simulation are presented.
Status of NGS CORS Network and Its Contribution to the GGOS Infrastructure
NASA Astrophysics Data System (ADS)
Choi, K. K.; Haw, D.; Sun, L.
2017-12-01
Recent advances in satellite geodesy techniques can now contribute to the global frame realization needed to improve worldwide accuracies. These techniques rely on coordinates computed using continuously observed GPS data and corresponding satellite orbits. The GPS-based reference system continues to depend on the physical stability of a ground-based network of points as the primary foundation for these observations. NOAA's National Geodetic Survey (NGS) has been operating Continuously Operating Reference Stations (CORS) to provide direct access to the National Spatial Reference System (NSRS). By virtue of NGS's scientific reputation and leadership in national and international geospatial issues, NGS has decided to increase its participation in the maintenance of the U.S. component of the global GPS tracking network in order to realize a long-term stable national terrestrial reference frame. NGS can do so by leveraging its national leadership role, coupled with its scientific expertise, in designating and upgrading a subset of the current tracking network for this purpose. This subset of stations must meet the highest operational standards to serve dual functions: being the U.S. contribution to the international frame, and providing the link to the national datum. These stations deserve special attention to ensure that the highest possible levels of quality and stability are maintained. To meet this need, NGS is working with international scientific groups to add and designate these reference stations based on scientific merit, such as colocation with other geodetic techniques, geographic area, and monumentation stability.
Installation Restoration Program Preliminary Assessment, Big Mountain Radio Relay Station, Alaska
1989-04-01
The whole space three-dimensional magnetotelluric inversion algorithm with static shift correction
NASA Astrophysics Data System (ADS)
Zhang, K.
2016-12-01
Based on previous studies of static shift correction and 3D inversion algorithms, we improve the NLCG 3D inversion method and propose a new static shift correction method that works within the inversion. The static shift correction method is based on 3D theory and real data. The static shift can be detected by quantitative analysis of the apparent parameters (apparent resistivity and impedance phase) of MT in the high-frequency range, and the correction is completed within the inversion. The method is an automatic computer processing technique with no added cost, and it avoids additional field work and indoor processing, with good results. The 3D inversion algorithm is improved (Zhang et al., 2013) based on the NLCG method of Newman & Alumbaugh (2000) and Rodi & Mackie (2001). We added a parallel structure, improved the computational efficiency, reduced the memory requirements, and added topographic and marine factors, so the 3D inversion can run on an ordinary PC with high efficiency and accuracy. All MT data from surface stations, seabed stations, and underground stations can be used in the inversion algorithm. A verification and application example of the 3D inversion algorithm is shown in Figure 1. From the comparison in Figure 1, the inversion model reflects all the anomalous bodies and the terrain clearly regardless of the type of data (impedance, tipper, or impedance and tipper), and the resolution of the bodies' boundaries can be improved by using tipper data. The algorithm is very effective for inversion with terrain, so it is very useful for the study of continental shelves, with continuous exploration of land, marine, and underground settings. The three-dimensional electrical model of the ore zone reflects the basic information of strata, rocks, and structure. Although it cannot indicate the ore body position directly, important clues are provided for prospecting work by the delineation of the diorite pluton uplift range. The test results show that high-quality data processing and an efficient inversion method for electromagnetic data are an important guarantee for porphyry-ore exploration.
NASA Technical Reports Server (NTRS)
Groom, N. J.; Anderson, W. W.; Phillips, W. H. (Inventor)
1981-01-01
The invention includes an angular momentum control device (AMCD) having a rim and several magnetic bearing stations. The AMCD is in a strapped down position on a spacecraft. Each magnetic bearing station comprises means, including an axial position sensor, for controlling the position of the rim in the axial direction; and means, including a radial position sensor, for controlling the position of the rim in the radial direction. A first computer receives the signals from all the axial position sensors and computes the angular rates about first and second mutually perpendicular axes in the plane of the rim and computes the linear acceleration along a third axis perpendicular to the first and second axes. A second computer receives the signals from all the radial position sensors and computes the linear accelerations along the first and second axes.
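The patent describes the computation functionally rather than algorithmically. A minimal digital analogue (assumed sensor geometry, not the invention's circuitry) fits a plane to the axial gap readings: the plane's offset tracks motion along the spin axis, and its two slopes give rim tilts whose time differences yield the angular rates.

```python
import numpy as np

def rim_plane_fit(z, theta, r):
    """z: axial-gap readings from sensors at azimuths theta on a rim of radius r.

    Least-squares fit z ~ z0 + a*x + b*y; z0 is the mean axial offset and
    (a, b) are the small tilt slopes about the two in-plane axes.
    """
    x, y = r * np.cos(theta), r * np.sin(theta)
    A = np.column_stack([np.ones_like(x), x, y])
    z0, a, b = np.linalg.lstsq(A, z, rcond=None)[0]
    return z0, a, b

# Differencing successive (a, b) estimates over the sample interval gives the
# angular rates about the two in-plane axes; double-differencing z0 gives the
# linear acceleration along the axis perpendicular to the rim plane.
```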
1983-08-01
AD-R136 99: The Integrated Mission-Planning Station: Functional Requirements, Aviator-... (U). Anacapa Sciences Inc., Santa Barbara, CA; S. P. Rogers. Keywords: interactive systems; aviation; control-display functional requirements; plan-computer dialogue; avionics systems; map display; Army aviation; design criteria; helicopters; mission planning; cartography; digital map; human factors; navigation.
STS-41 Commander Richards uses DTO 1206 portable computer onboard OV-103
NASA Technical Reports Server (NTRS)
1990-01-01
STS-41 Commander Richard N. Richards, at the pilot's station, uses the Detailed Test Objective (DTO) Space Station Cursor Control Device Evaluation MACINTOSH portable computer on the forward flight deck of Discovery, Orbiter Vehicle (OV) 103. Richards tests the roller-ball cursor control device. Surrounding Richards are checklists, the forward flight deck windows, his lightweight communications kit assembly headset, a beverage container (orange-mango drink), and the pilot's seat back and headrest.
Conceptual spacecraft systems design and synthesis
NASA Technical Reports Server (NTRS)
Wright, R. L.; Deryder, D. D.; Ferebee, M. J., Jr.
1984-01-01
An interactive systems design and synthesis is performed on future spacecraft concepts using the Interactive Design and Evaluation of Advanced Systems (IDEAS) computer-aided design and analysis system. The capabilities and advantages of the systems-oriented interactive computer-aided design and analysis system are described. The synthesis of both large antenna and space station concepts, and space station evolutionary growth designs is demonstrated. The IDEAS program provides the user with both an interactive graphics and an interactive computing capability which consists of over 40 multidisciplinary synthesis and analysis modules. Thus, the user can create, analyze, and conduct parametric studies and modify earth-orbiting spacecraft designs (space stations, large antennas or platforms, and technologically advanced spacecraft) at an interactive terminal with relative ease. The IDEAS approach is useful during the conceptual design phase of advanced space missions when a multiplicity of parameters and concepts must be analyzed and evaluated in a cost-effective and timely manner.
Interactive systems design and synthesis of future spacecraft concepts
NASA Technical Reports Server (NTRS)
Wright, R. L.; Deryder, D. D.; Ferebee, M. J., Jr.
1984-01-01
An interactive systems design and synthesis is performed on future spacecraft concepts using the Interactive Design and Evaluation of Advanced spacecraft (IDEAS) computer-aided design and analysis system. The capabilities and advantages of the systems-oriented interactive computer-aided design and analysis system are described. The synthesis of both large antenna and space station concepts, and space station evolutionary growth is demonstrated. The IDEAS program provides the user with both an interactive graphics and an interactive computing capability which consists of over 40 multidisciplinary synthesis and analysis modules. Thus, the user can create, analyze and conduct parametric studies and modify Earth-orbiting spacecraft designs (space stations, large antennas or platforms, and technologically advanced spacecraft) at an interactive terminal with relative ease. The IDEAS approach is useful during the conceptual design phase of advanced space missions when a multiplicity of parameters and concepts must be analyzed and evaluated in a cost-effective and timely manner.
Merritt, M.L.
1977-01-01
A computerized index of water-data collection activities, and retrieval software to generate publication lists of this information, was developed for Florida. This system serves a vital need in the administration of the many and diverse water-data collection activities. Previously, needed data were very difficult to assemble for use in program planning or project implementation. Largely descriptive, the report tells how a file of computer card images has been established which contains entries for all sites in Florida at which there is currently a water-data collection activity. Entries include information such as identification number, station name, location, type of site, county, information about data collection, funding, and other pertinent details. The computer program FINDEX selectively retrieves entries and lists them in a format suitable for publication. Updating the index is done routinely. (Woodard-USGS)
Lunar laser ranging data processing in a Unix/X windows environment
NASA Technical Reports Server (NTRS)
Ricklefs, Randall L.; Ries, Judit G.
1993-01-01
In cooperation with the NASA Crustal Dynamics Project initiative placing workstation computers at each of its laser ranging stations to handle data filtering and normal-pointing, MLRS personnel have developed a new generation of software to provide the same services for the lunar laser ranging data type. The Unix operating system and X Windows/Motif provide an environment for both batch and interactive filtering and normal-pointing as well as prediction calculations. The goal is to provide a transportable and maintainable data reduction environment. This software and some sample displays are presented. Previously, the lunar (or satellite) data could be processed on one computer while data was taken on the other. The reduction of the data was totally interactive and in no way automated. In addition, lunar predictions were produced on-site, another first in the effort to down-size historically mainframe-based applications. Extraction of earth rotation parameters was at one time attempted on site in near-realtime. In 1988, the Crustal Dynamics Project SLR Computer Panel mandated the installation of Hewlett-Packard 9000/360 Unix workstations at each NASA-operated laser ranging station to relieve the aging controller computers of much of their data and communications handling responsibility and to provide on-site data filtering and normal-pointing for a growing list of artificial satellite targets. This was seen by MLRS staff as an opportunity to provide a better lunar data processing environment as well.
Modular space station, phase B extension. Program operations plan
NASA Technical Reports Server (NTRS)
1971-01-01
An organized approach is defined for establishing the most significant requirements pertaining to mission operations, information management, and computer program design and development for the modular space station program. The operations plan pertains to the space station and experiment module program elements and to the ground elements required for mission management and mission support operations.
Intelligent man/machine interfaces on the space station
NASA Technical Reports Server (NTRS)
Daughtrey, Rodney S.
1987-01-01
Some important topics in the development of good, intelligent, usable man/machine interfaces for the Space Station are discussed. These computer interfaces should adhere strictly to three concepts or doctrines: generality, simplicity, and elegance. The motivation for natural language interfaces and their use and value on the Space Station, both now and in the future, are discussed.
Ries, Kernell G.; Eng, Ken
2010-01-01
The U.S. Geological Survey, in cooperation with the Maryland Department of the Environment, operated a network of 20 low-flow partial-record stations during 2008 in a region that extends from southwest of Baltimore to the northeastern corner of Maryland to obtain estimates of selected streamflow statistics at the station locations. The study area is expected to face a substantial influx of new residents and businesses as a result of military and civilian personnel transfers associated with the Federal Base Realignment and Closure Act of 2005. The estimated streamflow statistics, which include monthly 85-percent duration flows, the 10-year recurrence-interval minimum base flow, and the 7-day, 10-year low flow, are needed to provide a better understanding of the availability of water resources in the area to be affected by base-realignment activities. Streamflow measurements collected for this study at the low-flow partial-record stations and measurements collected previously for 8 of the 20 stations were related to concurrent daily flows at nearby index streamgages to estimate the streamflow statistics. Three methods were used to estimate the streamflow statistics and two methods were used to select the index streamgages. Of the three methods used to estimate the streamflow statistics, two of them--the Moments and MOVE1 methods--rely on correlating the streamflow measurements at the low-flow partial-record stations with concurrent streamflows at nearby, hydrologically similar index streamgages to determine the estimates. These methods, recommended for use by the U.S. Geological Survey, generally require about 10 streamflow measurements at the low-flow partial-record station. The third method transfers the streamflow statistics from the index streamgage to the partial-record station based on the average of the ratios of the measured streamflows at the partial-record station to the concurrent streamflows at the index streamgage. This method can be used with as few as one pair of streamflow measurements made on a single streamflow recession at the low-flow partial-record station, although additional pairs of measurements will increase the accuracy of the estimates. Errors associated with the two correlation methods generally were lower than the errors associated with the flow-ratio method, but the advantages of the flow-ratio method are that it can produce reasonably accurate estimates from streamflow measurements much faster and at lower cost than estimates obtained using the correlation methods. The two index-streamgage selection methods were (1) selection based on the highest correlation coefficient between the low-flow partial-record station and the index streamgages, and (2) selection based on Euclidean distance, where the Euclidean distance was computed as a function of geographic proximity and the basin characteristics: drainage area, percentage of forested area, percentage of impervious area, and the base-flow recession time constant, t. Method 1 generally selected index streamgages that were significantly closer to the low-flow partial-record stations than method 2. The errors associated with the estimated streamflow statistics generally were lower for method 1 than for method 2, but the differences were not statistically significant. The flow-ratio method for estimating streamflow statistics at low-flow partial-record stations was shown to be independent from the two correlation-based estimation methods. 
As a result, final estimates were determined for eight low-flow partial-record stations by weighting estimates from the flow-ratio method with estimates from one of the two correlation methods according to the respective variances of the estimates. Average standard errors of estimate for the final estimates ranged from 90.0 to 7.0 percent, with an average value of 26.5 percent. Average standard errors of estimate for the weighted estimates were, on average, 4.3 percent less than the best average standard errors of estimate.
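For readers unfamiliar with the estimation methods named above, here is a compact sketch of two of them: MOVE.1 fits a line through the means with slope s_y/s_x (typically on log-transformed flows, assuming positive correlation), which preserves the variance of the estimated record, and the flow-ratio method scales the index-station statistic by the mean of the measured-flow ratios. Variable names are illustrative.

```python
import numpy as np

def move1(index_flows, partial_flows, index_value):
    """MOVE.1 estimate at a partial-record station (log space is typical).

    index_flows and partial_flows are concurrent measurements; flows must
    be positive for the log transform.
    """
    x, y = np.log(index_flows), np.log(partial_flows)
    slope = y.std(ddof=1) / x.std(ddof=1)       # preserves variance
    return np.exp(y.mean() + slope * (np.log(index_value) - x.mean()))

def flow_ratio(partial_meas, index_concurrent, index_statistic):
    """Transfer a statistic using the average of measured-flow ratios."""
    ratios = np.asarray(partial_meas) / np.asarray(index_concurrent)
    return ratios.mean() * index_statistic
```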
Williams in the U.S. Laboratory during Expedition 13
2006-08-22
ISS013-E-70806 (22 Aug. 2006) --- Astronaut Jeffrey N. Williams, Expedition 13 NASA space station science officer and flight engineer, uses a computer in the Destiny laboratory of the International Space Station.
Space Station power distribution and control
NASA Technical Reports Server (NTRS)
Willis, A. H.
1986-01-01
A general description of the Space Station is given with the basic requirements of the power distribution and controls system presented. The dual bus and branch circuit concepts are discussed and a computer control method presented.
2007-07-31
David L. Iverson of NASA Ames Research Center, Moffett Field, California, led development of computer software to monitor the condition of the gyroscopes that keep the International Space Station (ISS) properly oriented in space as the ISS orbits Earth. The gyroscopes are flywheels that control the station's attitude without the use of propellant. NASA computer scientists designed the new software, the Inductive Monitoring System, to detect warning signs that precede a gyroscope failure. According to NASA officials, engineers will add the new software tool to a group of existing tools to identify and track problems related to the gyroscopes. If the software detects warning signs, it will quickly warn the space station's mission control center.
Hanigan, Ivan; Hall, Gillian; Dear, Keith B G
2006-09-13
To explain the possible effects of exposure to weather conditions on population health outcomes, weather data need to be calculated at a level in space and time that is appropriate for the health data. There are various ways of estimating exposure values from raw data collected at weather stations but the rationale for using one technique rather than another; the significance of the difference in the values obtained; and the effect these have on a research question are factors often not explicitly considered. In this study we compare different techniques for allocating weather data observations to small geographical areas and different options for weighting averages of these observations when calculating estimates of daily precipitation and temperature for Australian Postal Areas. Options that weight observations based on distance from population centroids and population size are more computationally intensive but give estimates that conceptually are more closely related to the experience of the population. Options based on values derived from sites internal to postal areas, or from nearest neighbour sites--that is, using proximity polygons around weather stations intersected with postal areas--tended to include fewer stations' observations in their estimates, and missing values were common. Options based on observations from stations within 50 kilometres radius of centroids and weighting of data by distance from centroids gave more complete estimates. Using the geographic centroid of the postal area gave estimates that differed slightly from the population weighted centroids and the population weighted average of sub-unit estimates. To calculate daily weather exposure values for analysis of health outcome data for small areas, the use of data from weather stations internal to the area only, or from neighbouring weather stations (allocated by the use of proximity polygons), is too limited. The most appropriate method conceptually is the use of weather data from sites within 50 kilometres radius of the area weighted to population centres, but a simpler acceptable option is to weight to the geographic centroid.
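As one concrete reading of the preferred option, the sketch below averages observations from stations within a 50-kilometre radius of a population-weighted centroid, weighting by inverse distance. The study compares several weighting schemes, so this shows only one of them, with illustrative names.

```python
import numpy as np

def area_estimate(obs, stn_xy_km, centroid_xy_km, radius_km=50.0):
    """Daily exposure estimate for one postal area.

    obs: one day's station observations (NaN where missing);
    stn_xy_km: (n_stations, 2) station coordinates; centroid_xy_km:
    population-weighted centroid of the postal area.
    """
    d = np.hypot(*(stn_xy_km - centroid_xy_km).T)
    use = (d <= radius_km) & np.isfinite(obs)
    if not use.any():
        return np.nan                      # no usable station that day
    w = 1.0 / np.maximum(d[use], 0.1)      # inverse-distance weights
    return np.average(obs[use], weights=w)
```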
NASA Astrophysics Data System (ADS)
Matos, Catarina; Silveira, Graça; Custódio, Susana; Domingues, Ana; Dias, Nuno; Fonseca, João F. B.; Matias, Luís; Krueger, Frank; Carrilho, Fernando
2014-05-01
Noise cross-correlations are now widely used to extract Green functions between station pairs. But do all the cross-correlations routinely computed produce successful Green functions? What is the relationship between the noise recorded at a pair of stations and the cross-correlation between them? During the last decade, we have been involved in the deployment of several temporary dense broadband (BB) networks within the scope of both national projects and international collaborations. From 2000 to 2002, a pool of 8 BB stations operated continuously in the Azores in the scope of the Memorandum of Understanding COSEA (COordinated Seismic Experiment in the Azores). Thanks to the project WILAS (West Iberia Lithosphere and Astenosphere Structure, PTDC/CTE-GIX/097946/2008), we temporarily increased the number of BB stations deployed in mainland Portugal to more than 50 (permanent + temporary) during the period 2010-2012. In 2011/12, a temporary pool of 12 seismometers continuously recorded BB data in the Madeira archipelago as part of the DOCTAR (Deep Ocean Test Array Experiment) project. Project CV-PLUME (Investigation on the geometry and deep signature of the Cape Verde mantle plume, PTDC/CTE-GIN/64330/2006) covered the archipelago of Cape Verde, North Atlantic, with 40 temporary BB stations in 2007/08. Project MOZART (Mozambique African Rift Tomography, PTDC/CTE-GIX/103249/2008) covered Mozambique, East Africa, with 30 temporary BB stations in the period 2011-2013. These networks, located in very distinct geographical and tectonic environments, offer an interesting opportunity to study seasonal and spatial variations of noise sources and their impact on empirical Green functions computed from noise cross-correlation. Seismic noise recorded at the different seismic stations is evaluated by computing the probability density functions of the power spectral density (PSD) of the continuous data. To assess seasonal variations of ambient noise sources in frequency content, time series of PSD in different frequency bands have been computed. The influence of spatial and seasonal variation is evaluated by analysis of the one-day-length cross-correlations, stacked with a 30-day moving window. To inspect the effects of frequency-content variations, 30-day cross-correlograms have also been computed in different frequency bands. This work is supported by project QuakeLoc-PT (PTDC/GEO-FIQ/3522/2012) and is a contribution to project AQUAREL (PTDC/CTE-GIX/116819/2010).
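The processing chain outlined above (one-day cross-correlations stacked in a moving window) reduces to a few lines; the normalization and window parameters below are assumptions, not the authors' exact choices.

```python
import numpy as np

def daily_correlogram(u, v, max_lag):
    """Normalized one-day noise cross-correlation between two stations.

    u and v are equal-length one-day traces; max_lag is in samples and
    must be smaller than the trace length.
    """
    u = (u - u.mean()) / (u.std() * len(u))
    v = (v - v.mean()) / v.std()
    full = np.correlate(u, v, mode="full")
    mid = len(full) // 2                 # zero-lag index
    return full[mid - max_lag: mid + max_lag + 1]

def moving_stack(correlograms, window=30, step=1):
    """Stack one-day correlograms over a moving window to track seasonality."""
    ccfs = np.asarray(correlograms)
    return np.array([ccfs[i:i + window].mean(axis=0)
                     for i in range(0, len(ccfs) - window + 1, step)])
```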
NASA Astrophysics Data System (ADS)
Tibi, R.; Young, C. J.; Gonzales, A.; Ballard, S.; Encarnacao, A. V.
2016-12-01
The matched filtering technique, involving the cross-correlation of a waveform of interest with archived signals from a template library, has proven to be a powerful tool for detecting events in regions with repeating seismicity. However, waveform correlation is computationally expensive, and therefore impractical for large template sets unless dedicated distributed computing hardware and software are used. In this study, we introduce an Approximate Nearest Neighbor (ANN) approach that enables the use of very large template libraries for waveform correlation without requiring a complex distributed computing system. Our method begins with a projection into a reduced-dimensionality space based on correlation with a randomized subset of the full template archive. Searching for a specified number of nearest neighbors is accomplished by using randomized K-dimensional trees. We used the approach to search for matches to each of 2700 analyst-reviewed signal detections reported for May 2010 for the IMS station MKAR. The template library in this case consists of a dataset of more than 200,000 analyst-reviewed signal detections for the same station from 2002-2014 (excluding May 2010). Of these signal detections, 60% are teleseismic first P and 15% are regional phases (Pn, Pg, Sn, and Lg). The analyses, performed on a standard desktop computer, show that the proposed approach searches the large template library about 20 times faster than a standard full linear search, while achieving recall rates greater than 80%, with the recall rate increasing for higher correlation values. To decide whether to confirm a match, we use a hybrid method involving a cluster approach for queries with two or more matches and a correlation score for single matches. Of the signal detections that passed our confirmation process, 52% were teleseismic first P and 30% were regional phases.
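The following sketch illustrates the general idea under stated assumptions, with scipy's cKDTree standing in for the randomized k-d trees and random synthetic waveforms standing in for the template archive: templates and queries are projected onto their correlations with a random template subset, approximate neighbours are found in the reduced space, and candidates are confirmed with exact correlations.

    import numpy as np
    from scipy.spatial import cKDTree

    rng = np.random.default_rng(1)

    def normalize(w):
        w = w - w.mean(axis=-1, keepdims=True)
        return w / np.linalg.norm(w, axis=-1, keepdims=True)

    templates = normalize(rng.standard_normal((20000, 400)))   # stand-in archive
    queries   = normalize(rng.standard_normal((100, 400)))

    # Project into a reduced space: correlation with a random subset of templates.
    basis = templates[rng.choice(len(templates), size=32, replace=False)]
    t_proj = templates @ basis.T     # dot product of unit vectors == correlation
    q_proj = queries @ basis.T

    tree = cKDTree(t_proj)           # k-d tree stand-in for randomized k-d trees
    _, idx = tree.query(q_proj, k=5) # 5 approximate nearest neighbours per query

    # Confirm candidates with exact correlation on the full waveforms.
    cc = np.einsum('ij,ikj->ik', queries, templates[idx])
    print(cc.shape, cc.max())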
Forghani-Arani, Farnoush; Behura, Jyoti; Haines, Seth S.; Batzle, Mike
2013-01-01
In studies on heavy oil, shale reservoirs, tight gas, and enhanced geothermal systems, the use of surface passive seismic data to monitor microseismicity induced by fluid flow in the subsurface is becoming more common. However, in most studies passive seismic records contain days and months of data, and manually analysing the data can be expensive and inaccurate. Moreover, in the presence of noise, detecting the arrival of weak microseismic events becomes challenging. Hence, the use of an automated, accurate, and computationally fast technique for event detection in passive seismic data is essential. The conventional automatic event-identification algorithm computes a running-window energy ratio of the short-term average to the long-term average of the passive seismic data for each trace. We show that for the common case of a low signal-to-noise ratio in surface passive records, the conventional method is not sufficiently effective at event identification. Here, we extend the conventional algorithm by introducing a technique that is based on the cross-correlation of the energy ratios computed by the conventional method. With our technique we can measure the similarities amongst the computed energy ratios at different traces. Our approach is successful at improving the detectability of events with a low signal-to-noise ratio that are not detectable with the conventional algorithm. Also, our algorithm has the advantage of identifying whether an event is common to all stations (a regional event) or to a limited number of stations (a local event). We provide examples of applying our technique to synthetic data and to a field surface passive dataset recorded at a geothermal site.
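A toy version of the two-stage detector, assuming synthetic traces and placeholder window lengths: a running STA/LTA energy ratio is computed per trace, and the zero-lag correlations between the ratio series indicate whether a detection is shared across stations (regional) or confined to a few (local).

    import numpy as np

    def sta_lta(trace, nsta=20, nlta=200):
        """Running short-term / long-term average energy ratio of one trace."""
        e = trace ** 2
        csum = np.cumsum(np.concatenate(([0.0], e)))
        sta = (csum[nsta:] - csum[:-nsta]) / nsta
        lta = (csum[nlta:] - csum[:-nlta]) / nlta
        n = min(len(sta), len(lta))
        return sta[-n:] / (lta[:n] + 1e-12)

    def ratio_similarity(traces, **kw):
        """Zero-lag correlation between the energy-ratio series of all trace
        pairs; high values suggest an event common to those stations."""
        ratios = [sta_lta(t, **kw) for t in traces]
        r = np.array([(x - x.mean()) / x.std() for x in ratios])
        return (r @ r.T) / r.shape[1]      # pairwise correlation matrix

    rng = np.random.default_rng(2)
    event = np.exp(-np.linspace(-3, 3, 100) ** 2) * np.sin(np.arange(100))
    traces = [rng.standard_normal(5000) for _ in range(4)]
    for t in traces[:3]:                   # event visible at 3 of 4 stations
        t[2500:2600] += 5 * event
    print(np.round(ratio_similarity(traces), 2))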
Performance analysis of a laser propelled interorbital transfer vehicle
NASA Technical Reports Server (NTRS)
Minovitch, M. A.
1976-01-01
Performance capabilities of a laser-propelled interorbital transfer vehicle receiving propulsive power from one ground-based transmitter were investigated. The laser transmits propulsive energy to the vehicle during successive station fly-overs. By applying a series of these propulsive maneuvers, large payloads can be economically transferred between low earth orbits and synchronous orbits. Operations involving the injection of large payloads onto escape trajectories are also studied. The duration of each successive engine burn must be carefully timed so that the vehicle reappears over the laser station to receive additional propulsive power within the shortest possible time. The analytical solution for determining these time intervals is presented, as is a solution to the problem of determining maximum injection payloads. Parametric computer analysis based on these optimization studies is presented. The results show that relatively low beam powers, on the order of 50 MW to 60 MW, produce significant performance capabilities.
NASA Astrophysics Data System (ADS)
Machalek, P.; Kim, S. M.; Berry, R. D.; Liang, A.; Small, T.; Brevdo, E.; Kuznetsova, A.
2012-12-01
We describe how the Climate Corporation uses Python and Clojure, a language implemented on top of Java, to generate climatological forecasts for precipitation based on the Advanced Hydrologic Prediction Service (AHPS) radar-based daily precipitation measurements. A 2-year-long forecast is generated on each of the ~650,000 CONUS land-based 4-km AHPS grids by constructing 10,000 ensembles sampled from a 30-year reconstructed AHPS history for each grid. The spatial and temporal correlations between neighboring AHPS grids and the sampling of the analogues are handled by Python. The parallelization across all 650,000 CONUS grids is achieved by utilizing the MapReduce framework (http://code.google.com/edu/parallel/mapreduce-tutorial.html). Each full-scale computational run requires hundreds of nodes with up to 8 processors each on the Amazon Elastic MapReduce (http://aws.amazon.com/elasticmapreduce/) distributed computing service, resulting in 3-terabyte datasets. We further describe how we have productionized a monthly run of the simulation process at the full scale of the 4-km AHPS grids and how the resultant terabyte-sized datasets are handled.
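A greatly reduced sketch of the per-grid ensemble idea, with Python's ProcessPoolExecutor standing in for the MapReduce cluster and gamma-distributed fake precipitation standing in for the AHPS history; the spatial and temporal correlation handling described above is deliberately omitted.

    import numpy as np
    from concurrent.futures import ProcessPoolExecutor  # stand-in for MapReduce

    def simulate_grid(args):
        """Map step for one grid cell: draw ensembles of 2-year daily
        precipitation by resampling whole years from that cell's history."""
        seed, history = args                 # history: (30, 365) daily totals
        rng = np.random.default_rng(seed)
        years = rng.integers(0, history.shape[0], size=(10_000, 2))
        ensembles = history[years].reshape(10_000, -1)   # (10000, 730)
        return ensembles.mean(), np.percentile(ensembles.sum(axis=1), [10, 90])

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        grids = [(i, rng.gamma(0.3, 8.0, size=(30, 365))) for i in range(8)]
        with ProcessPoolExecutor() as pool:  # one task per grid cell
            for mean, (p10, p90) in pool.map(simulate_grid, grids):
                print(f"mean {mean:.2f}  2-yr total P10-P90: {p10:.0f}-{p90:.0f}")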
NASA Technical Reports Server (NTRS)
Dunne, Matthew J.
2011-01-01
The development of computer software as a tool to generate visual displays has led to an overall expansion of automated computer generated images in the aerospace industry. These visual overlays are generated by combining raw data with pre-existing data on the object or objects being analyzed on the screen. The National Aeronautics and Space Administration (NASA) uses this computer software to generate on-screen overlays when a Visiting Vehicle (VV) is berthing with the International Space Station (ISS). In order for Mission Control Center personnel to be a contributing factor in the VV berthing process, computer software similar to that on the ISS must be readily available on the ground to be used for analysis. In addition, this software must perform engineering calculations and save data for further analysis.
2004-02-03
KENNEDY SPACE CENTER, FLA. - Astronaut Tim Kopra (facing camera) aids in Intravehicular Activity (IVA) constraints testing on the Italian-built Node 2, a future element of the International Space Station. The second of three Station connecting modules, the Node 2 attaches to the end of the U.S. Lab and provides attach locations for several other elements. Kopra is currently assigned technical duties in the Space Station Branch of the Astronaut Office, where his primary focus involves the testing of crew interfaces for two future ISS modules as well as the implementation of support computers and operational Local Area Network on ISS. Node 2 is scheduled to launch on mission STS-120, Station assembly flight 10A.
2004-02-03
KENNEDY SPACE CENTER, FLA. - Astronaut Tim Kopra talks to a technician (off-camera) during Intravehicular Activity (IVA) constraints testing on the Italian-built Node 2, a future element of the International Space Station. The second of three Station connecting modules, the Node 2 attaches to the end of the U.S. Lab and provides attach locations for several other elements. Kopra is currently assigned technical duties in the Space Station Branch of the Astronaut Office, where his primary focus involves the testing of crew interfaces for two future ISS modules as well as the implementation of support computers and operational Local Area Network on ISS. Node 2 is scheduled to launch on mission STS-120, Station assembly flight 10A.
NASA Technical Reports Server (NTRS)
Cebeci, T.; Kaups, K.; Ramsey, J. A.
1977-01-01
The method described utilizes a nonorthogonal coordinate system for boundary-layer calculations. It includes a geometry program that represents the wing analytically, and a velocity program that computes the external velocity components from a given experimental pressure distribution when the external velocity distribution is not computed theoretically. The boundary-layer method is general, however, and can also be used with an external velocity distribution computed theoretically. Several test cases were computed by this method, and the results were checked against other numerical calculations and against experiments when available. A typical computation time (CPU) on an IBM 370/165 computer for one surface of a wing, consisting of roughly 30 spanwise stations and 25 streamwise stations with 30 points across the boundary layer, is less than 30 seconds for an incompressible flow and a little more for a compressible flow.
Monte Carlo dose calculation using a cell processor based PlayStation 3 system
NASA Astrophysics Data System (ADS)
Chow, James C. L.; Lam, Phil; Jaffray, David A.
2012-02-01
This study investigates the performance of the EGSnrc computer code coupled with Cell-based hardware in Monte Carlo simulation of radiation dose in radiotherapy. Performance evaluations of two processor-intensive functions, namely HOWNEAR and RANMAR_GET in the EGSnrc code, were carried out based on the 20-80 rule (Pareto principle). The execution speeds of the two functions were measured by the profiler gprof, which reports the number of executions and the total time spent in each function. A testing architecture designed for the Cell processor was implemented in the evaluation using a PlayStation 3 (PS3) system. The evaluation results show that the algorithms examined are readily parallelizable on the Cell platform, provided that an architectural change of the EGSnrc is made. However, as the EGSnrc performance was limited by the PowerPC Processing Element in the PS3, a PC coupled with graphics processing units (GPGPU computing) may provide a more viable avenue for acceleration.
Hyperspectral anomaly detection using Sony PlayStation 3
NASA Astrophysics Data System (ADS)
Rosario, Dalton; Romano, João; Sepulveda, Rene
2009-05-01
We present a proof-of-principle demonstration using Sony's IBM Cell processor-based PlayStation 3 (PS3) to run, in near real time, a hyperspectral anomaly detection algorithm (HADA) on real hyperspectral (HS) long-wave infrared imagery. The PS3 console proved to be ideal for doing precisely the kind of heavy computational lifting HS-based algorithms require, and the fact that it is a relatively open platform makes programming scientific applications feasible. The PS3 HADA is a unique parallel random-sampling-based anomaly detection approach that does not require prior spectra of the clutter background. The PS3 HADA is designed to handle known underlying difficulties (e.g., target shape/scale uncertainties) often ignored in the development of autonomous anomaly detection algorithms. The effort is part of an ongoing cooperative contribution between the Army Research Laboratory and the Army's Armament, Research, Development and Engineering Center, which aims at demonstrating the performance of innovative algorithmic approaches for applications requiring autonomous anomaly detection using passive sensors.
An accurate Kriging-based regional ionospheric model using combined GPS/BeiDou observations
NASA Astrophysics Data System (ADS)
Abdelazeem, Mohamed; Çelik, Rahmi N.; El-Rabbany, Ahmed
2018-01-01
In this study, we propose a regional ionospheric model (RIM) based on both GPS-only and combined GPS/BeiDou observations for single-frequency precise point positioning (SF-PPP) users in Europe. GPS/BeiDou observations from 16 reference stations are processed in the zero-difference mode. A least-squares algorithm is developed to determine the vertical total electron content (VTEC) bi-linear function parameters for each 15-minute time interval. The Kriging interpolation method is used to estimate the VTEC values on a 1° × 1° grid. The resulting RIMs are validated for PPP applications using GNSS observations from another set of stations. The SF-PPP accuracy and convergence time obtained through the proposed RIMs are computed and compared with those obtained through the International GNSS Service global ionospheric maps (IGS-GIM). The results show that the RIMs speed up the convergence time and enhance the overall positioning accuracy in comparison with the IGS-GIM model, particularly the combined GPS/BeiDou-based model.
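A compact sketch of the two-step estimation under simplified assumptions (synthetic stations, a toy exponential variogram, and residual kriging rather than the authors' exact formulation): a bi-linear VTEC function is fitted by least squares for one interval, and ordinary kriging interpolates the residual field to a grid point.

    import numpy as np

    # Fit a bi-linear VTEC function for one 15-minute interval:
    #   VTEC(dlat, dlon) = a0 + a1*dlat + a2*dlon   (toy coefficients a_i)
    rng = np.random.default_rng(3)
    lat, lon = rng.uniform(36, 56, 16), rng.uniform(-10, 20, 16)  # 16 stations
    vtec = 12 + 0.4 * (lat - 46) - 0.2 * (lon - 5) + rng.normal(0, 0.3, 16)
    A = np.column_stack([np.ones_like(lat), lat - 46, lon - 5])
    coef, *_ = np.linalg.lstsq(A, vtec, rcond=None)
    resid = vtec - A @ coef

    # Ordinary kriging of the residuals onto a grid point (assumed variogram).
    def ok_predict(x, y, r, xg, yg, sill=1.0, rng_km=1000.0):
        d = lambda x1, y1, x2, y2: np.hypot((x1 - x2) * 111.0, (y1 - y2) * 78.0)
        gamma = lambda h: sill * (1 - np.exp(-3 * h / rng_km))  # exponential model
        n = len(x)
        G = np.ones((n + 1, n + 1)); G[-1, -1] = 0.0
        G[:n, :n] = gamma(d(x[:, None], y[:, None], x[None, :], y[None, :]))
        g = np.ones(n + 1)
        g[:n] = gamma(d(x, y, xg, yg))
        w = np.linalg.solve(G, g)        # kriging weights (plus Lagrange term)
        return w[:n] @ r

    grid_est = coef @ [1.0, 50 - 46, 10 - 5] + ok_predict(lat, lon, resid, 50.0, 10.0)
    print(f"VTEC estimate at 50N,10E: {grid_est:.1f} TECU")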
International Space Station (ISS)
1995-04-17
International Cooperation Phase III: A Space Shuttle docked to the International Space Station (ISS) in this computer generated representation of the ISS in its completed and fully operational state with elements from the U.S., Europe, Canada, Japan, and Russia.
2013-09-01
Report documentation fragment: Detecting Key Inter-Joint Distances and Anthropometry Effects for Static Gesture... (dates covered: 1 Sep 2013-30 Sep 2013). Supplementary notes: "Nintendo Wii" is a registered trademark of Nintendo Company, Ltd.; "PlayStation" and PlayStation "Move" are registered trademarks of Sony Computer Entertainment; "Kinect" is a registered trademark of Microsoft Corporation.
Operations research investigations of satellite power stations
NASA Technical Reports Server (NTRS)
Cole, J. W.; Ballard, J. L.
1976-01-01
A systems model reflecting the design concepts of Satellite Power Stations (SPS) was developed. The model is of sufficient scope to include the interrelationships of the following major design parameters: the transportation to and between orbits; assembly of the SPS; and maintenance of the SPS. The systems model is composed of a set of equations that are nonlinear with respect to the system parameters and decision variables. The model determines a figure of merit from which alternative concepts concerning transportation, assembly, and maintenance of satellite power stations are studied. A hybrid optimization model was developed to optimize the system's decision variables. The optimization model consists of a random search procedure and the optimal-steepest descent method. A FORTRAN computer program was developed to enable the user to optimize nonlinear functions using the model. Specifically, the computer program was used to optimize Satellite Power Station system components.
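A small Python sketch in the spirit of the hybrid optimizer described above (the original was FORTRAN, and the figure of merit below is a made-up nonlinear function): a random search over the decision-variable box seeds a steepest-descent refinement.

    import numpy as np

    def hybrid_optimize(f, grad, lo, hi, n_random=2000, n_descent=200, lr=0.01):
        """Hybrid scheme: global random search, then steepest descent
        (a sketch, not the original FORTRAN program)."""
        rng = np.random.default_rng(4)
        # Stage 1: random search over the box [lo, hi]
        X = rng.uniform(lo, hi, size=(n_random, len(lo)))
        x = X[np.argmin([f(xi) for xi in X])]
        # Stage 2: steepest descent from the best random point
        for _ in range(n_descent):
            x = np.clip(x - lr * grad(x), lo, hi)
        return x, f(x)

    # Toy figure of merit with interacting decision variables
    f = lambda x: (x[0] - 2) ** 2 * (1 + x[1] ** 2) + (x[1] - 1) ** 2
    g = lambda x: np.array([2 * (x[0] - 2) * (1 + x[1] ** 2),
                            2 * (x[0] - 2) ** 2 * x[1] + 2 * (x[1] - 1)])
    x, fx = hybrid_optimize(f, g, lo=np.array([-5., -5.]), hi=np.array([5., 5.]))
    print(x, fx)    # near [2, 1], the nonlinear optimum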
Energy consumption analysis of the Venus Deep Space Station (DSS-13)
NASA Technical Reports Server (NTRS)
Hayes, N. V.
1983-01-01
This report continues the energy consumption analysis and verification study of the tracking stations of the Goldstone Deep Space Communications Complex, presenting an audit of the Venus Deep Space Station (DSS 13). Because radioastronomy research and development operations at the station are not continuous, estimates of energy usage were employed in the energy consumption simulation of both the 9-meter and 26-meter antenna buildings. A 17.9% decrease in station energy consumption occurred over the 1979-1981 study period. A comparison of the ECP computer simulations and the station's main watt-hour meter readings showed good agreement.
Computer networking at SLR stations
NASA Technical Reports Server (NTRS)
Novotny, Antonin
1993-01-01
There are several existing communication methods to deliver data from a satellite laser ranging (SLR) station to the SLR data center and back: telephone modem, telex, and computer networks. The SLR scientific community has been exploiting mainly INTERNET, BITNET/EARN, and SPAN. A total of 56 countries are connected to INTERNET, and the number of nodes is growing exponentially. The computer networks mentioned above and others are connected through e-mail protocols. The scientific progress of SLR requires increases in communication speed and in the amount of transmitted data. The TOPEX/POSEIDON test campaign required delivery of Quick Look data (1.7 kB/pass) from an SLR site to the SLR data center within 8 hours and full rate data (up to 500 kB/pass) within 24 hours. We developed networking for the remote SLR station in Helwan, Egypt. The reliable scheme for data delivery consists of compression of the MERIT2 format (up to 89 percent) and encoding to ASCII files, with e-mail sending from the SLR station, then e-mail receiving, decoding, and decompression at the center. We propose to use the ZIP method for compression/decompression and the UUCODE method for ASCII encoding/decoding. This method will be useful for stations connected via telephone modems or commercial networks. Electronic delivery could solve the problem of full rate (FR) data reaching the SLR data center too late.
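A sketch of the compress-then-encode delivery scheme using Python's standard library, with zlib standing in for ZIP and uuencoding done via binascii; the MERIT2 record content below is fabricated for the example.

    import zlib, binascii

    def encode_pass(merit2_bytes):
        """Compress one pass of MERIT2-format data and uuencode it into
        mailable ASCII lines (sketch of the compress-then-encode scheme)."""
        packed = zlib.compress(merit2_bytes, level=9)
        lines = [binascii.b2a_uu(packed[i:i + 45]).decode("ascii").rstrip("\n")
                 for i in range(0, len(packed), 45)]   # 45 bytes per uu line
        return packed, lines

    def decode_pass(lines):
        """Reverse step at the data center: uudecode, then decompress."""
        packed = b"".join(binascii.a2b_uu(l) for l in lines)
        return zlib.decompress(packed)

    data = b"8402701 93 06 12 00 00 00 " * 2000   # fake, highly regular records
    packed, lines = encode_pass(data)
    assert decode_pass(lines) == data
    print(f"compression saved {100 * (1 - len(packed) / len(data)):.0f}%, "
          f"{len(lines)} mail lines")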
LaRC local area networks to support distributed computing
NASA Technical Reports Server (NTRS)
Riddle, E. P.
1984-01-01
The Langley Research Center's (LaRC) Local Area Network (LAN) effort is discussed. LaRC initiated the development of a LAN to support a growing distributed computing environment at the Center. The purpose of the network is to provide an improved capability (over interactive and RJE terminal access) for sharing multivendor computer resources. Specifically, the network will provide a data highway for the transfer of files between mainframe computers, minicomputers, workstations, and personal computers. An important influence on the overall network design was the vital need of LaRC researchers to efficiently utilize the large CDC mainframe computers in the central scientific computing facility. Although there has been a steady migration from a centralized to a distributed computing environment at LaRC in recent years, the workload on the central resources has increased. Major emphasis in the network design was on communication with the central resources within the distributed environment. The network to be implemented will allow researchers to utilize the central resources, distributed minicomputers, workstations, and personal computers to obtain the proper level of computing power to efficiently perform their jobs.
Morin, Robert L.; Glen, Jonathan M.G.
2003-01-01
Gravity data were collected between 1999 and 2002 along transects in the Talkeetna Mountains of south-central Alaska as part of a geological and geophysical study of the framework geology of the region. The study area lies between 61° 30’ and 63° 45’ N. latitude and 145° and 151° W. longitude. This data set includes 408 gravity stations. These data, combined with the pre-existing 3,286 stations, bring the total for this area to 3,694 gravity stations. Principal facts for the 408 new gravity stations and the 15 gravity base stations used for control are listed in this report. During the summer of 1999, a gravity survey was conducted in the western Talkeetna Mountains. Measurements at 55 gravity stations were made. One gravity base station was used for control for this survey. This base station, STEP, is located at the Stephan Lake Lodge on Stephan Lake. The observed gravity of this station was calculated based on an indirect tie to base station ANCL in Anchorage. The temporary base used to tie between STEP and ANCL was REGL in Anchorage. During the summer of 2000, a gravity survey was conducted in the western Talkeetna Mountains. Measurements at 56 gravity stations were made. One gravity base station was used for control for this survey. This base station, GRHS, is located at the Gracious House Lodge on the Denali Highway. The observed gravity of this station was calculated based on multiple ties to base stations D87 and D57 along the Denali Highway. During the summer of 2001, a gravity survey was conducted in the western Talkeetna Mountains. Measurements at 90 gravity stations were made. One gravity base station was used for control for this survey. This base station, HLML, is located at the High Lake Lodge. The observed gravity of this station was calculated based on multiple ties to base stations ANCU in Anchorage, PALH in Palmer, WASA in Wasilla, and TLKM in Talkeetna. Also during the summer of 2001, a gravity survey was conducted in the vicinity of Tangle Lakes. Measurements at 86 gravity stations were made. The Tangle Lakes area is located about 25 km west of Paxson and north of the Denali Highway. One gravity base station was used for control for this survey. This base station, TLIN, is located at the Tangle Lakes Inn. The observed gravity of this station was calculated based on multiple ties to base stations ANCU in Anchorage, PALH in Palmer, BD27 in Gulkana, B-07 on the Richardson Highway, and base stations D42 and D57 along the Denali Highway. During the summer of 2002, measurements at an additional 107 gravity stations were made in the vicinity of Tangle Lakes. Base station TLIN at the Tangle Lakes Inn was again used for control. Additional ties to base stations ANCU and B-07 were made.
NASA Astrophysics Data System (ADS)
Matsumoto, Monica M. S.; Beig, Niha G.; Udupa, Jayaram K.; Archer, Steven; Torigian, Drew A.
2014-03-01
Lung cancer is associated with the highest cancer mortality rates among men and women in the United States. The accurate and precise identification of the lymph node stations on computed tomography (CT) images is important for staging disease and potentially for prognosticating outcome in patients with lung cancer, as well as for pretreatment planning and response assessment purposes. To facilitate a standard means of referring to lymph nodes, the International Association for the Study of Lung Cancer (IASLC) has recently proposed a definition of the different lymph node stations and zones in the thorax. However, nodal station identification is typically performed manually by visual assessment in clinical radiology. This approach leaves room for error due to the subjective and potentially ambiguous nature of visual interpretation, and is labor intensive. We present a method of automatically recognizing the mediastinal IASLC-defined lymph node stations by modifying a hierarchical fuzzy modeling approach previously developed for body-wide automatic anatomy recognition (AAR) in medical imagery. Our AAR-lymph node (AAR-LN) system follows the AAR methodology and consists of two steps. In the first step, the various lymph node stations are manually delineated on a set of CT images following the IASLC definitions. These delineations are then used to build a fuzzy hierarchical model of the nodal stations which are considered as 3D objects. In the second step, the stations are automatically located on any given CT image of the thorax by using the hierarchical fuzzy model and object recognition algorithms. Based on 23 data sets used for model building, 22 independent data sets for testing, and 10 lymph node stations, a mean localization accuracy of within 1-6 voxels has been achieved by the AAR-LN system.
Modeling a Wireless Network for International Space Station
NASA Technical Reports Server (NTRS)
Alena, Richard; Yaprak, Ece; Lamouri, Saad
2000-01-01
This paper describes the application of wireless local area network (LAN) simulation modeling methods to the hybrid LAN architecture designed for supporting crew-computing tools aboard the International Space Station (ISS). These crew-computing tools, such as wearable computers and portable advisory systems, will provide crew members with real-time vehicle and payload status information and access to digital technical and scientific libraries, significantly enhancing human capabilities in space. A wireless network, therefore, will provide wearable computer and remote instruments with the high performance computational power needed by next-generation 'intelligent' software applications. Wireless network performance in such simulated environments is characterized by the sustainable throughput of data under different traffic conditions. This data will be used to help plan the addition of more access points supporting new modules and more nodes for increased network capacity as the ISS grows.
Delay/Disruption Tolerant Networking for the International Space Station (ISS)
NASA Technical Reports Server (NTRS)
Schlesinger, Adam; Willman, Brett M.; Pitts, Lee; Davidson, Suzanne R.; Pohlchuck, William A.
2017-01-01
Disruption Tolerant Networking (DTN) is an emerging data networking technology designed to abstract the hardware communication layer from the spacecraft/payload computing resources. DTN is specifically designed to operate in environments where link delays and disruptions are common (e.g., space-based networks). The National Aeronautics and Space Administration (NASA) has demonstrated DTN on several missions, such as the Deep Impact Networking (DINET) experiment, the Earth Observing Mission 1 (EO-1) and the Lunar Laser Communication Demonstration (LLCD). To further the maturation of DTN, NASA is implementing DTN protocols on the International Space Station (ISS). This paper explains the architecture of the ISS DTN network, the operational support for the system, the results from integrated ground testing, and the future work for DTN expansion.
ERIC Educational Resources Information Center
Lombardi, Don
1991-01-01
Studies suggest that computer work stations may induce high levels of physical and psychological stress. Advises school districts to take a proactive stance on ergonomics. Cites laws and pending litigation regulating computer use in the workspace and offers guidelines for computer users. (MLF)
Continuously Deformation Monitoring of Subway Tunnel Based on Terrestrial Point Clouds
NASA Astrophysics Data System (ADS)
Kang, Z.; Tuo, L.; Zlatanova, S.
2012-07-01
The deformation monitoring of subway tunnels is of extraordinary necessity. Therefore, a method for deformation monitoring based on terrestrial point clouds is proposed in this paper. First, the traditional adjacent-stations registration is replaced by section-controlled registration, so that the common control points can be used by each station and error accumulation is thus avoided within a section. Afterwards, the central axis of the subway tunnel is determined through the RANSAC (Random Sample Consensus) algorithm and curve fitting. Although of very high resolution, laser points are still discrete, and thus the vertical section is computed via quadric fitting of the vicinity of interest, instead of fitting the whole model of the subway tunnel; the section is determined by the intersection line rotated about the central axis of the tunnel within a vertical plane. The extraction of the vertical section is then optimized using RANSAC for the purpose of filtering out noise. Based on the extracted vertical sections, the volume of tunnel deformation is estimated by comparison between vertical sections extracted at the same position from different epochs of point clouds. Furthermore, the continuously extracted vertical sections are deployed to evaluate the convergent tendency of the tunnel. The proposed algorithms are verified using real datasets in terms of accuracy and computational efficiency. The experimental fitting-accuracy analysis shows the maximum deviation between interpolated point and real point is 1.5 mm, and the minimum is 0.1 mm; the convergent tendency of the tunnel was detected by comparison of adjacent fitting radii. The maximum error is 6 mm, while the minimum is 1 mm. The computation cost of vertical-section extraction is within 3 seconds per section, which demonstrates high efficiency.
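A toy illustration of RANSAC-based section fitting, using a circle rather than the full quadric and synthetic scan points with simulated clutter: minimal samples are fitted repeatedly, the largest consensus set is kept, and the final radius comes from a refit on all inliers, so radii can be compared across epochs.

    import numpy as np

    def fit_circle(pts):
        """Algebraic (Kasa) circle fit: solve for center (a,b) and radius r."""
        x, y = pts[:, 0], pts[:, 1]
        A = np.column_stack([2 * x, 2 * y, np.ones(len(x))])
        (a, b, c), *_ = np.linalg.lstsq(A, x ** 2 + y ** 2, rcond=None)
        return np.array([a, b]), np.sqrt(c + a ** 2 + b ** 2)

    def ransac_circle(pts, n_iter=500, tol=0.01, rng=np.random.default_rng(5)):
        """RANSAC wrapper: fit minimal samples, keep the consensus set."""
        best_inliers = None
        for _ in range(n_iter):
            c, r = fit_circle(pts[rng.choice(len(pts), 3, replace=False)])
            inliers = np.abs(np.linalg.norm(pts - c, axis=1) - r) < tol
            if best_inliers is None or inliers.sum() > best_inliers.sum():
                best_inliers = inliers
        return fit_circle(pts[best_inliers])    # refit on all inliers

    theta = np.random.default_rng(6).uniform(0, 2 * np.pi, 2000)
    section = np.c_[2.7 * np.cos(theta), 2.7 * np.sin(theta)]   # 2.7 m radius
    section += np.random.default_rng(7).normal(0, 0.002, section.shape)
    section[:100] += 0.5                    # clutter: cables, equipment points
    center, radius = ransac_circle(section)
    print(f"fitted radius {radius:.4f} m")  # compare across epochs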
ERIC Educational Resources Information Center
IntelliSys, Inc., Syracuse, NY.
This was Phase I of a three-phased project. This phase of the project investigated the feasibility of a computer-based instruction (CBI) workstation, designed for use by teachers of handicapped students within a school structure. This station is to have as a major feature the ability to produce in-house full-motion video using one of the…
Proceedings of GeoTech 85: Personal computers in geology conference
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1985-01-01
This book presents the papers given at a conference which considered the use of microprocessors in the exploration of petroleum and natural gas deposits. Topics covered at the conference included seismic surveys, geochemistry, expert systems, artificial intelligence, data base management systems, a portable exploration work station, open pit planning on a microcomputer, well logging, fracture analysis, production scheduling of open pit mines, resistivity logging, and coal washability.
Crew appliance computer program manual, volume 1
NASA Technical Reports Server (NTRS)
Russell, D. J.
1975-01-01
Trade studies of numerous appliance concepts for advanced spacecraft galley, personal hygiene, housekeeping, and other areas were made to determine which best satisfy the space shuttle orbiter and modular space station mission requirements. Analytical models of selected appliance concepts not currently included in the G-189A Generalized Environmental/Thermal Control and Life Support Systems (ETCLSS) Computer Program subroutine library were developed. The new appliance subroutines are given along with complete analytical model descriptions, solution methods, user's input instructions, and validation run results. The appliance components modeled were integrated with G-189A ETCLSS models for shuttle orbiter and modular space station, and results from computer runs of these systems are presented.
NASA Technical Reports Server (NTRS)
Robinson, Peter; Shirley, Mark; Fletcher, Daryl; Alena, Rick; Duncavage, Dan; Lee, Charles
2003-01-01
All of the International Space Station (ISS) systems which require computer control depend upon the hardware and software of the Command and Data Handling (C&DH) system, currently a network of over 30 386-class computers called Multiplexers/Demultiplexers (MDMs) [18]. The Caution and Warning System (C&W) [7], a set of software tasks that runs on the MDMs, is responsible for detecting, classifying, and reporting errors in all ISS subsystems including the C&DH. Fault Detection, Isolation and Recovery (FDIR) of these errors is typically handled with a combination of automatic and human effort. We are developing an Advanced Diagnostic System (ADS) to augment the C&W system with decision support tools to aid in root cause analysis as well as to resolve differing human and machine C&DH state estimates. These tools, which draw from sources in model-based reasoning [16,29], will improve the speed and accuracy of flight controllers by reducing the uncertainty in C&DH state estimation, allowing for a more complete assessment of risk. We have run tests with ISS telemetry, focusing on those C&W events which relate to the C&DH system itself. This paper describes our initial results and subsequent plans.
A design of the u-health monitoring system using a Nintendo DS game machine.
Lee, Sangjoon; Kim, Jinkwon; Kim, Jungkuk; Lee, Myoungho
2009-01-01
In this paper, we used a handheld Nintendo DS game machine to build a u-health monitoring system. The system consists of four parts: a biosignal acquisition device; a wireless sensor network device; a wireless base station connecting to the internet; and display units, namely a personal computer and a Nintendo DS game machine. The biosignal measurement device can acquire 7 channels of data: 3-channel ECG (electrocardiogram), 3-axis accelerometer, and tilt sensor data. Acquired data reach the internet network through the wireless sensor network and the base station. In the experiment, we concurrently displayed the biosignals on a personal computer monitor and on the LCD of a Nintendo DS using a wireless internet protocol, with the monitoring devices placed off to one side of an office building. The results of the experiment show that the proposed system can effectively transmit a patient's biosignal data over long durations and long distances. The proposed u-health monitoring system is suitable for operation in ambulances, general hospitals, and geriatric institutions as a u-health monitoring device.
MSNoise: a Python Package for Monitoring Seismic Velocity Changes using Ambient Seismic Noise
NASA Astrophysics Data System (ADS)
Lecocq, T.; Caudron, C.; Brenguier, F.
2013-12-01
Earthquakes occur every day all around the world and are recorded by thousands of seismic stations. In between earthquakes, the stations record "noise". In the last 10 years, the understanding of this noise and of its potential usage has increased rapidly. The method, called "seismic interferometry", uses the principle that seismic waves travel between two recorders and are multiply scattered in the medium. By cross-correlating the two records, one obtains information about the medium below and between the stations. The cross-correlation function (CCF) is a proxy for the Green's function of the medium. Recent developments of the technique have shown that these CCFs can be used to image the earth at depth (3D seismic tomography) or to study changes of the medium with time. We present MSNoise, a complete software suite to compute relative seismic velocity changes under a seismic network using ambient seismic noise. The whole suite is written in Python, from the monitoring of data archives to the production of high-quality figures. All steps have been optimized to compute only the necessary steps and to use 'job'-based processing. We present a validation of the software on a dataset acquired during the UnderVolc[1] project on the Piton de la Fournaise volcano, La Réunion Island, France, for which precursory relative changes of seismic velocity are visible for three eruptions between 2009 and 2011.
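For illustration, here is a minimal stretching-technique measurement of a relative velocity change between a reference and a current correlation function; this is a generic sketch of the underlying idea, not MSNoise's internal API, and the coda waveform is synthetic.

    import numpy as np

    def stretch_dvv(reference, current, lags, eps_grid=np.linspace(-0.02, 0.02, 401)):
        """Stretching technique: find the dilation eps that best maps the
        current CCF onto the reference; dv/v = -eps (one common way to
        measure velocity change)."""
        cc = [np.corrcoef(np.interp(lags * (1 + e), lags, current), reference)[0, 1]
              for e in eps_grid]
        best = int(np.argmax(cc))
        return eps_grid[best], cc[best]

    lags = np.linspace(-20, 20, 2001)                         # lag time, seconds
    ref = np.exp(-(np.abs(lags) - 8) ** 2) * np.cos(6 * lags) # synthetic coda
    cur = np.interp(lags / 1.004, lags, ref)    # medium slowed ~0.4%: delayed coda
    eps, cc = stretch_dvv(ref, cur, lags)
    print(f"dv/v = {-eps * 100:.2f}%  (correlation {cc:.3f})")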
NASA Technical Reports Server (NTRS)
1989-01-01
The results of the refined conceptual design phase (task 5) of the Simulation Computer System (SCS) study are reported. The SCS is the computational portion of the Payload Training Complex (PTC), providing simulation-based training on payload operations of the Space Station Freedom (SSF). In task 4 of the SCS study, the range of architectures suitable for the SCS was explored. Identified system architectures, along with their relative advantages and disadvantages for the SCS, were presented in the Conceptual Design Report. Six integrated designs, combining the most promising features from the architectural formulations, were additionally identified in the report. The six integrated designs were evaluated further to distinguish the more viable designs to be refined as conceptual designs. The three designs that were selected represent distinct approaches to achieving a capable and cost-effective SCS configuration for the PTC. Here, the results of task 4 (input to this task) are briefly reviewed. Then, prior to describing the individual conceptual designs, the PTC facility configuration and the SSF systems architecture that must be supported by the SCS are reviewed. Next, basic features of SCS implementation that have been incorporated into all selected SCS designs are considered. The details of the individual SCS designs are then presented before a final comparison of the three designs is made.
NASA Astrophysics Data System (ADS)
Fulton, J. W.; Bjerklie, D. M.; Jones, J. W.; Minear, J. T.
2015-12-01
Measuring streamflow and developing and maintaining rating curves at new streamgaging stations are time-consuming and problematic. Hydro 21 was an initiative by the U.S. Geological Survey to provide vision and leadership in identifying and evaluating new technologies and methods that had the potential to change the way in which streamgaging is conducted. Since 2014, additional trials have been conducted to evaluate some of the methods promoted by the Hydro 21 Committee. Emerging technologies such as continuous-wave radars and computationally efficient methods such as the Probability Concept require significantly less field time, permit real-time velocity and streamflow measurements, and apply to unsteady flow conditions such as looped ratings and unsteady flood flows. Portable and fixed-mount radars have advanced beyond the development phase, are cost-effective, and are readily available in the marketplace. The Probability Concept is based on an alternative velocity-distribution equation developed by C.-L. Chiu, who pioneered the concept. By measuring the surface-water velocity and correcting for environmental influences such as wind drift, radars offer a reliable alternative for measuring and computing real-time streamflow for a variety of hydraulic conditions. If successful, these tools may allow us to establish ratings more efficiently, assess unsteady flow conditions, and report real-time streamflow at new streamgaging stations.
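A sketch of how a single radar surface-velocity reading might be turned into discharge under the Probability Concept, using Chiu's mean-to-maximum velocity ratio phi(M) = e^M/(e^M - 1) - 1/M; the station-calibrated entropy parameter M, the wind-drift factor, and the cross-section area below are all hypothetical values.

    import numpy as np

    def phi(M):
        """Chiu's ratio of cross-sectional mean to maximum velocity."""
        return np.exp(M) / (np.exp(M) - 1.0) - 1.0 / M

    def radar_discharge(u_surface, area_m2, M=2.1, k_wind=1.0):
        """Streamflow from one surface-velocity reading (sketch of a
        probability-concept rating; M is calibrated per station, k_wind
        is a hypothetical wind-drift correction)."""
        u_max = k_wind * u_surface       # treat corrected surface speed as u_max
        return phi(M) * u_max * area_m2  # Q = phi(M) * u_max * A

    print(f"phi(2.1) = {phi(2.1):.3f}")            # dimensionless mean/max ratio
    print(f"Q = {radar_discharge(1.8, 42.0):.1f} m^3/s")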
Designing an Ergonomically Correct CNC Workstation on a Shoe String Budget.
ERIC Educational Resources Information Center
Lightner, Stan
2001-01-01
Describes research to design and construct ergonomically correct work stations for Computer Numerical Control machine tools. By designing ergonomically correct work stations, industrial technology teachers help protect students from repetitive motion injuries. (Contains 12 references.) (JOW)
View of FE Stott working in the JPM
2009-11-22
ISS021-E-031695 (22 Nov. 2009) --- Astronaut Nicole Stott, STS-129 mission specialist, uses a communication system near a computer in the Kibo laboratory of the International Space Station while space shuttle Atlantis remains docked with the station.
Dezhurov works in the sleep station in the U.S. Laboratory during Expedition Three
2001-09-09
ISS003-E-5560 (9 September 2001) --- Cosmonaut Vladimir Dezhurov of Rosaviakosmos, Expedition Three flight engineer, works on a laptop computer in the Temporary Sleep Station (TSS) in the U.S. Laboratory.
NASA Astrophysics Data System (ADS)
Shakibay Senobari, N.; Funning, G.
2016-12-01
Repeating earthquakes (REs) are the regular or semi-regular failures of the same patch on a fault, producing near-identical waveforms at a given station. Sequences of REs are commonly interpreted as slip on small locked patches surrounded by large areas of fault that are creeping (Nadeau and McEvilly, 1999). Detecting them, therefore, places important constraints on the extent of fault creep at depth. In addition, the magnitude and recurrence interval of these RE sequences can be related to the creep rate and used as constraints on slip models. In this study we search for REs in northern California fault systems upon which creep is suspected, but not well constrained, including the Rodgers Creek, Maacama, Bartlett Springs, Concord-Green Valley, West Napa and Greenville faults, targeting events recorded at stations where the instrument was not changed for 10 years or more. A pair of events can be identified as REs based on a high cross-correlation coefficient (CCC) between their waveforms. Thus a fundamental step in RE searches is calculating the CCC for all event waveform pairs recorded at common stations. This becomes computationally expensive for large data sets. To expedite our search, we use a fast and accurate similarity search algorithm developed by the computer science community (Mueen et al., 2015; Zhu et al., 2016). Our initial tests on a data set including 1500 waveforms suggest it is around 40 times faster than the algorithm that we used previously (Shakibay Senobari and Funning, AGU Fall Meeting 2014). We search for event pairs with CCC>0.85 and cluster them based on their similarity. A second, location based filter, based on the differential S-P times for each event pair at 5 or more stations, is used as an independent check. We consider a cluster of events a RE sequence if the source location separation distance for each pair is less than the estimated circular size of the source (e.g. Chen et al., 2008); these are gathered into an RE catalogue. In future, we plan to use this information in combination with geodetic data to produce a robust creep distribution model for all of the faults in this region.
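A minimal FFT-based computation of the maximum normalized cross-correlation coefficient (CCC) for one waveform pair, of the kind run over all pairs recorded at a common station; the waveforms below are synthetic, and the 0.85 threshold is the one quoted above.

    import numpy as np

    def max_ccc(w1, w2):
        """Maximum normalized cross-correlation coefficient between two
        equal-length waveforms over all lags (FFT-based, so all-pairs
        searches stay cheap)."""
        w1 = (w1 - w1.mean()) / np.linalg.norm(w1 - w1.mean())
        w2 = (w2 - w2.mean()) / np.linalg.norm(w2 - w2.mean())
        n = 2 * len(w1) - 1
        cc = np.fft.irfft(np.fft.rfft(w1, n) * np.conj(np.fft.rfft(w2, n)), n)
        return float(np.max(cc))

    rng = np.random.default_rng(8)
    src = np.convolve(rng.standard_normal(50), np.hanning(15), "same")
    pad = np.zeros(200); pad[60:110] = src                   # same fault patch...
    rep = np.roll(pad, 3) + 0.1 * rng.standard_normal(200)   # ...failing again
    unrelated = rng.standard_normal(200)
    print(f"repeat pair CCC: {max_ccc(pad, rep):.2f}")       # high, near 1
    print(f"random pair CCC: {max_ccc(pad, unrelated):.2f}") # low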
Clinical decision making using teleradiology in urology.
Lee, B R; Allaf, M; Moore, R; Bohlman, M; Wang, G M; Bishoff, J T; Jackman, S V; Cadeddu, J A; Jarrett, T W; Khazan, R; Kavoussi, L R
1999-01-01
Using a personal computer-based teleradiology system, we compared accuracy, confidence, and diagnostic ability in the interpretation of digitized radiographs to determine if teleradiology-imported studies convey sufficient information to make relevant clinical decisions involving urology. Variables of diagnostic accuracy, confidence, image quality, interpretation, and the impact of clinical decisions made after viewing digitized radiographs were compared with those of original radiographs. We evaluated 956 radiographs that included 94 IV pyelograms, four voiding cystourethrograms, and two nephrostograms. The radiographs were digitized and transferred over an Ethernet network to a remote personal computer-based viewing station. The digitized images were viewed by urologists and graded according to confidence in making a diagnosis, image quality, diagnostic difficulty, clinical management based on the image itself, and brief patient history. The hard-copy radiographs were then interpreted immediately afterward, and diagnostic decisions were reassessed. All analog radiographs were reviewed by an attending radiologist. Ninety-seven percent of the decisions made from the digitized radiographs did not change after reviewing conventional radiographs of the same case. When comparing the variables of clinical confidence, quality of the film on the teleradiology system versus analog films, and diagnostic difficulty, we found no statistical difference (p > .05) between the two techniques. Overall accuracy in interpreting the digitized images on the teleradiology system was 88% by urologists compared with that of the attending radiologist's interpretation of the analog radiographs. However, urologists detected findings on five (5%) analog radiographs that had been previously unreported by the radiologist. Viewing radiographs transmitted to a personal computer-based viewing station is an appropriate means of reviewing films with sufficient quality on which to base clinical decisions. Our focus was whether decisions made after viewing the transmitted radiographs would change after viewing the hard-copy images of the same case. In 97% of the cases, the decision did not change. In those cases in which management was altered, recommendation of further imaging studies was the most common factor.
20. SITE BUILDING 002 SCANNER BUILDING IN COMPUTER ...
20. SITE BUILDING 002 - SCANNER BUILDING - IN COMPUTER ROOM LOOKING AT "CONSOLIDATED MAINTENANCE OPERATIONS CENTER" JOB AREA AND OPERATION WORK CENTER. TASKS INCLUDE RADAR MAINTENANCE, COMPUTER MAINTENANCE, CYBER COMPUTER MAINTENANCE AND RELATED ACTIVITIES. - Cape Cod Air Station, Technical Facility-Scanner Building & Power Plant, Massachusetts Military Reservation, Sandwich, Barnstable County, MA
NASA Technical Reports Server (NTRS)
1972-01-01
Information backing up the key features of the manipulator system concept and detailed technical information on the subsystems are presented. Space station assembly and shuttle cargo handling tasks are emphasized in the concept analysis because they involve shuttle berthing, transferring the manipulator boom between shuttle and station, station assembly, and cargo handling. Emphasis is also placed on maximizing commonality in the system areas of manipulator booms, general purpose end effectors, control and display, data processing, telemetry, dedicated computers, and control station design.
Morlock, Scott E.; Stewart, James A.
2000-01-01
An acoustic Doppler current profiler (ADCP) mounted on a boat was used to collect velocity and depth data and to compute positions of the velocity and depth data relative to the boat track. A global positioning system (GPS) was used to collect earth-referenced position data, and a GPS base station receiver was used to improve the accuracy of the earth-referenced position data. The earth-referenced position data were used to transform the ADCP-computed positions (which were relative to boat tracks) to positions referenced to a point on the spillway tower.
STS-120 crew along with Expedition crew members Dan Tani and Sandra Magnus
2007-08-09
JSC2007-E-41538 (9 Aug. 2007) --- Astronauts Stephanie Wilson, STS-120 mission specialist; Sandra Magnus, Expedition 17 flight engineer; and Dan Tani, Expedition 16 flight engineer, use the virtual reality lab at Johnson Space Center to train for their duties aboard the space shuttle and space station. This type of computer interface, paired with virtual reality training hardware and software, helps to prepare the entire team for dealing with space station elements. A computer display is visible in the foreground.
ATS simultaneous and turnaround ranging experiments
NASA Technical Reports Server (NTRS)
Watson, J. S.; Putney, B. H.
1971-01-01
This report explains the data reduction and spacecraft position determination used in conjunction with two ATS experiments, trilateration and turnaround ranging, and describes in detail a multilateration program that is used for part of the data reduction process. The process described is for the determination of the inertial position of the satellite and for formatting input for related programs. In the trilateration procedure, a geometric determination of satellite position is made from near-simultaneous range measurements made by three different tracking stations. Turnaround ranging involves two stations: the master station transmits the signal to the satellite, the satellite retransmits it to the slave station, the slave station turns the signal around and sends it back to the satellite, and the satellite in turn retransmits it to the master station. The results of the satellite position computations using the multilateration program are compared to the results of other position determination programs used at Goddard. All programs give nearly the same results, which indicates that, because of its simplicity and computational speed, the trilateration technique is useful in obtaining spacecraft positions for near-synchronous satellites.
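A sketch of the geometric (trilateration-style) fix under toy assumptions, solving the range equations r_i = |x - s_i| by Gauss-Newton from a rough synchronous-altitude guess; the station coordinates and ranges are fabricated, and a fourth station is included to break the mirror ambiguity of a three-station fix.

    import numpy as np

    def trilaterate(stations, ranges, n_iter=25):
        """Gauss-Newton solution of the range equations r_i = |x - s_i|
        (illustrative sketch, not the Goddard multilateration program)."""
        x = np.array([42164.0, 0.0, 0.0])             # rough initial guess (km)
        for _ in range(n_iter):
            diff = x - stations                       # (n, 3)
            rho = np.linalg.norm(diff, axis=1)        # modelled ranges
            J = diff / rho[:, None]                   # Jacobian d(rho)/dx
            dx, *_ = np.linalg.lstsq(J, ranges - rho, rcond=None)
            x += dx
        return x

    stations = np.array([[6378.0, 0.0, 0.0],          # toy ground stations (km)
                         [4498.0, 4498.0, 500.0],
                         [4498.0, -4498.0, 500.0],
                         [5000.0, 1000.0, 3900.0]])   # out-of-plane fourth site
    truth = np.array([36000.0, 5000.0, 2000.0])       # near-synchronous satellite
    ranges = np.linalg.norm(truth - stations, axis=1) # simultaneous measurements
    print(trilaterate(stations, ranges))              # recovers `truth`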
Space station full-scale docking/berthing mechanisms development
NASA Technical Reports Server (NTRS)
Burns, Gene C.; Price, Harold A.; Buchanan, David B.
1988-01-01
One of the most critical operational functions for the space station is the orbital docking between the station and the STS orbiter. The program to design, fabricate, and test docking/berthing mechanisms for the space station is described. The design reflects space station overall requirements and consists of two mating docking mechanism halves. One half is designed for use on the shuttle orbiter and incorporates capture and energy attenuation systems using computer controlled electromechanical actuators and/or attenuators. The mating half incorporates a flexible feature to allow two degrees of freedom at the module-to-module interface of the space station pressurized habitat volumes. The design concepts developed for the prototype units may be used for the first space station flight hardware.
2013-09-29
ISS037-E-004299 (29 Sept. 2013) --- NASA astronaut Karen Nyberg, Expedition 37 flight engineer, uses a payload and general support computer (PGSC) in the Harmony node of the International Space Station.
Influence of seismic anisotropy on the cross correlation tensor: numerical investigations
NASA Astrophysics Data System (ADS)
Saade, M.; Montagner, J. P.; Roux, P.; Cupillard, P.; Durand, S.; Brenguier, F.
2015-05-01
Temporal changes in seismic anisotropy can be interpreted as variations in the orientation of cracks in seismogenic zones, and thus as variations in the stress field. Such temporal changes have been observed in seismogenic zones before and after earthquakes, although they are still not well understood. In this study, we investigate numerically the azimuthal polarization of surface waves in anisotropic media with respect to the orientation of anisotropy. This technique is based on the observation of the signature of anisotropy on the nine-component cross-correlation tensor (CCT) computed from seismic ambient noise recorded on pairs of three-component sensors. If noise sources are spatially distributed in a homogeneous medium, the CCT allows the reconstruction of the surface-wave Green's tensor between the station pairs. In a homogeneous, isotropic medium, four off-diagonal terms of the surface-wave Green's tensor are null, but in an anisotropic medium they are not. This technique is applied to three-component synthetic seismograms computed in a transversely isotropic medium with a horizontal symmetry axis, using a spectral-element code. The CCT is computed between each pair of stations and then rotated to approximate the surface-wave Green's tensor by minimizing the off-diagonal components. This procedure allows the calculation of the azimuthal variation of quasi-Rayleigh and quasi-Love waves. In an anisotropic medium, in some cases, the azimuth of seismic anisotropy can induce a large variation in the horizontal polarization of surface waves. This variation depends on the relative angle between a pair of stations and the direction of anisotropy, the amplitude of the anisotropy, the frequency band of the signal, and the depth of the anisotropic layer.
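The rotation-and-minimization step can be illustrated on a zero-lag 3x3 tensor (the real CCT carries nine time series): a toy diagonal tensor is observed in a frame rotated by 25 degrees, as anisotropy-induced polarization might produce, and a grid search over azimuth recovers the angle that minimizes the off-diagonal energy.

    import numpy as np

    def rotate_cct(C, theta):
        """Rotate the horizontal components of a 3x3 cross-correlation tensor."""
        c, s = np.cos(theta), np.sin(theta)
        R = np.array([[c, s, 0.0], [-s, c, 0.0], [0.0, 0.0, 1.0]])
        return R @ C @ R.T

    def best_azimuth(C, n=3600):
        """Grid-search the rotation that minimizes the off-diagonal energy."""
        thetas = np.linspace(0.0, np.pi, n)    # 180-degree ambiguity
        off = [np.sum((rotate_cct(C, t) - np.diag(np.diag(rotate_cct(C, t)))) ** 2)
               for t in thetas]
        return thetas[int(np.argmin(off))]

    # Toy CCT: a diagonal tensor (radial, transverse, vertical) observed in a
    # frame rotated by 25 degrees.
    true = np.radians(25.0)
    D = np.diag([3.0, 1.0, 2.0])
    C_obs = rotate_cct(D, -true)
    print(np.degrees(best_azimuth(C_obs)))     # ~25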
Prediction of monthly regional groundwater levels through hybrid soft-computing techniques
NASA Astrophysics Data System (ADS)
Chang, Fi-John; Chang, Li-Chiu; Huang, Chien-Wei; Kao, I.-Feng
2016-10-01
Groundwater systems are intrinsically heterogeneous with dynamic temporal-spatial patterns, which causes great difficulty in quantifying their complex processes, while reliable predictions of regional groundwater levels are commonly needed for managing water resources to ensure proper service of water demands within a region. In this study, we proposed a novel and flexible soft-computing technique that could effectively extract the complex high-dimensional input-output patterns of basin-wide groundwater-aquifer systems in an adaptive manner. The soft-computing models combined the Self-Organizing Map (SOM) and the Nonlinear Autoregressive with Exogenous Inputs (NARX) network for predicting monthly regional groundwater levels based on hydrologic forcing data. The SOM could effectively classify the temporal-spatial patterns of regional groundwater levels, the NARX could accurately predict the mean of regional groundwater levels for adjusting the selected SOM, and Kriging was used to interpolate the predictions of the adjusted SOM onto finer grids of locations, so that a prediction of the monthly regional groundwater level map could be obtained. The Zhuoshui River basin in Taiwan was the study case, and its monthly data sets collected from 203 groundwater stations, 32 rainfall stations, and 6 flow stations between 2000 and 2013 were used for modelling purposes. The results demonstrated that the hybrid SOM-NARX model could reliably and suitably predict monthly basin-wide groundwater levels with high correlations (R2 > 0.9 in both training and testing cases). The proposed methodology presents a milestone in modelling regional environmental issues and offers an insightful and promising way to predict monthly basin-wide groundwater levels, which is beneficial to authorities for sustainable water resources management.
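A drastically reduced sketch of the SOM-plus-NARX idea under toy assumptions: a hand-rolled 1-D SOM stands in for the full map, a linear lagged regression stands in for the NARX network, and all data are synthetic.

    import numpy as np

    rng = np.random.default_rng(9)

    # Toy data: monthly groundwater-level maps (months x wells) driven by
    # lagged rainfall, plus the rainfall series as exogenous input.
    rain = rng.gamma(2.0, 40.0, 170)
    pattern = rng.uniform(0.5, 1.5, 20)                  # 20 wells
    levels = np.array([pattern * (0.6 * rain[t] + 0.4 * rain[t - 1]) / 50
                       for t in range(2, 170)])          # (168, 20)

    # 1) Mini SOM: cluster the monthly spatial patterns onto 6 map units.
    units = levels[rng.choice(len(levels), 6)].copy()
    for lr in np.linspace(0.5, 0.01, 400):               # shrinking learning rate
        x = levels[rng.integers(len(levels))]
        bmu = np.argmin(((units - x) ** 2).sum(axis=1))  # best-matching unit
        for j in range(6):                               # neighbourhood update
            units[j] += lr * np.exp(-((j - bmu) ** 2) / 2.0) * (x - units[j])

    # 2) NARX-style step: regress the regional mean level on its own lag and
    #    lagged rainfall (linear stand-in for the NARX network).
    m = levels.mean(axis=1)
    X = np.column_stack([m[:-1], rain[2:169], np.ones(167)])
    beta, *_ = np.linalg.lstsq(X, m[1:], rcond=None)
    pred_mean = X[-1] @ beta                             # next-month mean

    # 3) Scale the best-matching SOM pattern so its mean matches the prediction.
    bmu = np.argmin(((units - levels[-1]) ** 2).sum(axis=1))
    map_pred = units[bmu] * pred_mean / units[bmu].mean()
    print(f"predicted regional mean {pred_mean:.2f}, map of {map_pred.size} wells")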
NASA Technical Reports Server (NTRS)
1990-01-01
NASA formally launched Project LASER (Learning About Science, Engineering and Research) in March 1990, a program designed to help teachers improve science and mathematics education and to provide 'hands on' experiences. It featured the first LASER Mobile Teacher Resource Center (MTRC), designed to reach educators all over the nation. NASA hopes to operate several MTRCs with funds provided by private industry. The mobile unit is a 22-ton tractor-trailer stocked with NASA educational publications and outfitted with six work stations. Each work station, which can accommodate two teachers at a time, has a computer providing access to NASA Spacelink. Each also has video recorders and photocopy/photographic equipment for the teachers' use. The MTRC is only one of five major elements within LASER. The others are: a Space Technology Course, to promote integration of space science studies with traditional courses; the Volunteer Databank, in which NASA employees are encouraged to volunteer as tutors, instructors, etc.; Mobile Discovery Laboratories that will carry simple laboratory equipment and computers to provide hands-on activities for students and demonstrations of classroom activities for teachers; and the Public Library Science Program, which will present library-based science and math programs.
Distributed cooperating processes in a mobile robot control system
NASA Technical Reports Server (NTRS)
Skillman, Thomas L., Jr.
1988-01-01
A mobile inspection robot has been proposed for the NASA Space Station. It will be a free-flying autonomous vehicle that will leave a berthing unit to accomplish a variety of inspection tasks around the Space Station, and then return to its berth to recharge, refuel, and transfer information. The Flying Eye robot will receive voice commands to change its attitude, move at a constant velocity, and move to a predefined location along a self-generated path. This mobile robot control system requires integration of traditional command and control techniques with a number of AI technologies. Speech recognition, natural language understanding, task and path planning, sensory abstraction and pattern recognition are all required for successful implementation. The interface between the traditional numeric control techniques and the symbolic processing of the AI technologies must be developed, and a distributed computing approach will be needed to meet the real-time computing requirements. To study the integration of the elements of this project, a novel mobile robot control architecture and a simulation based on the blackboard architecture were developed. The control system operation and structure are discussed.
Radiofrequency-electromagnetic field exposures in kindergarten children.
Bhatt, Chhavi Raj; Redmayne, Mary; Billah, Baki; Abramson, Michael J; Benke, Geza
2017-09-01
The aim of this study was to assess environmental and personal radiofrequency-electromagnetic field (RF-EMF) exposures in kindergarten children. Ten children and 20 kindergartens in Melbourne, Australia participated in personal and environmental exposure measurements, respectively. Order statistics of RF-EMF exposures were computed for 16 frequency bands between 88 MHz and 5.8 GHz. Of the 16 bands, the three highest sources of environmental RF-EMF exposures were: Global System for Mobile Communications (GSM) 900 MHz downlink (82 mV/m); Universal Mobile Telecommunications System (UMTS) 2100MHz downlink (51 mV/m); and GSM 900 MHz uplink (45 mV/m). Similarly, the three highest personal exposure sources were: GSM 900 MHz downlink (50 mV/m); UMTS 2100 MHz downlink, GSM 900 MHz uplink and GSM 1800 MHz downlink (20 mV/m); and Frequency Modulation radio, Wi-Fi 2.4 GHz and Digital Video Broadcasting-Terrestrial (10 mV/m). The median environmental exposures were: 179 mV/m (total all bands), 123 mV/m (total mobile phone base station downlinks), 46 mV/m (total mobile phone base station uplinks), and 16 mV/m (Wi-Fi 2.4 GHz). Similarly, the median personal exposures were: 81 mV/m (total all bands), 62 mV/m (total mobile phone base station downlinks), 21 mV/m (total mobile phone base station uplinks), and 9 mV/m (Wi-Fi 2.4 GHz). The measurements showed that environmental RF-EMF exposure levels exceeded the personal RF-EMF exposure levels at kindergartens.
NASA Technical Reports Server (NTRS)
Barber, Peter W.; Demerdash, Nabeel A. O.; Wang, R.; Hurysz, B.; Luo, Z.
1991-01-01
The goal is to analyze the potential effects of electromagnetic interference (EMI) originating from power system processing and transmission components for Space Station Freedom. The approach consists of four steps: (1) develop analytical tools (models and computer programs); (2) conduct parameterization studies; (3) predict the global space station EMI environment; and (4) provide a basis for modification of EMI standards.
Operational procedures for ground station operation: ATS-3 Hawaii-Ames satellite link experiment
NASA Technical Reports Server (NTRS)
Nishioka, K.; Gross, E. H.
1979-01-01
Hardware description and operational procedures for the ATS-3 Hawaii-Ames satellite computer link are presented as basic step-by-step instructions. Transmit and receive channels and frequencies are given. Details are provided ranging from switch settings for activating the station to the sequence for turning the switches on. Methods and procedures for troubleshooting common problems encountered with communication stations are also provided.
Army AL&T, October-December 2008
2008-12-01
during the WIN-T technology demonstration Nov. 8, 2007, at Naval Air Engineering Station, Lakehurst, NJ. (U.S. Army photo by Russ Messeroll.) 16 OCTOBER...worldwide communications architecture, enabling connectivity from the global backbone to regional networks to posts/camps/stations, and, lastly, to...Force Tracker. • Tacticomp™ wireless and Global Positioning System (GPS)-enabled hand-held computer. • One Station Remote Video Terminal. • Counter
Software-Implemented Fault Tolerance in Communications Systems
NASA Technical Reports Server (NTRS)
Gantenbein, Rex E.
1994-01-01
Software-implemented fault tolerance (SIFT) is used in many computer-based command, control, and communications (C3) systems to provide the nearly continuous availability that they require. In the communications subsystem of Space Station Alpha, SIFT algorithms are used to detect and recover from failures in the data and command link between the Station and its ground support. The paper presents a review of these algorithms and discusses how such techniques can be applied to similar systems found in applications such as manufacturing control, military communications, and programmable devices such as pacemakers. With support from the Tracking and Communication Division of NASA's Johnson Space Center, researchers at the University of Wyoming are developing a testbed for evaluating the effectiveness of these algorithms prior to their deployment. This testbed will be capable of simulating a variety of C3 system failures and recording the response of the Space Station SIFT algorithms to these failures. The design of this testbed and the applicability of the approach in other environments are described.
Considerations for a design and operations knowledge support system for Space Station Freedom
NASA Technical Reports Server (NTRS)
Erickson, Jon D.; Crouse, Kenneth H.; Wechsler, Donald B.; Flaherty, Douglas R.
1989-01-01
Engineering and operations of modern engineered systems depend critically upon detailed design and operations knowledge that is accurate and authoritative. A design and operations knowledge support system (DOKSS) is a modern computer-based information system providing knowledge about the creation, evolution, and growth of an engineered system. The purpose of a DOKSS is to provide convenient and effective access to this multifaceted information. The complexity of Space Station Freedom's (SSF's) systems, elements, interfaces, and organizations makes convenient access to design knowledge especially important when compared to simpler systems. The life-cycle length, being 30 or more years, adds a new dimension to space operations, maintenance, and evolution. Provided here is a review and discussion of design knowledge support systems to be delivered and operated as a critical part of the engineered system. A concept of a DOKSS for Space Station Freedom (SSF) is presented. This is followed by a detailed discussion of a DOKSS for the Lyndon B. Johnson Space Center and Work Package-2 portions of SSF.
Two-agent cooperative search using game models with endurance-time constraints
NASA Astrophysics Data System (ADS)
Sujit, P. B.; Ghose, Debasish
2010-07-01
In this article, the problem of two Unmanned Aerial Vehicles (UAVs) cooperatively searching an unknown region is addressed. The search region is discretized into hexagonal cells and each cell is assumed to possess an uncertainty value. The UAVs have to cooperatively search these cells, taking limited endurance, sensor, and communication range constraints into account. Due to limited endurance, the UAVs need to return to the base station for refuelling and also need to select a base station when multiple base stations are present. This article proposes a route planning algorithm that takes endurance time constraints into account and uses game-theoretical strategies to reduce the uncertainty. The route planning algorithm selects only those cells that ensure the agent will return to any one of the available bases. A set of paths is formed using these cells, from which the game-theoretical strategies select a path that yields maximum uncertainty reduction. We explore non-cooperative Nash, cooperative, and security strategies from game theory to enhance the search effectiveness. Monte Carlo simulations are carried out which show the superiority of the game-theoretical strategies over a greedy strategy for different look-ahead step length paths. Within the game-theoretical strategies, the non-cooperative Nash and cooperative strategies perform similarly in an ideal case, but the Nash strategy performs better than the cooperative strategy when the perceived information is different. We also propose a heuristic based on partitioning of the search space into sectors to reduce computational overhead without performance degradation.
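The feasibility-then-payoff structure of the route planner described above can be sketched compactly: admit only paths that leave enough endurance to reach a base, then score each admissible path by the uncertainty it removes. The sketch below uses a greedy payoff for brevity (the article's game-theoretic strategies replace this selection rule); all data structures and numbers are illustrative assumptions.

```python
# Endurance-constrained path selection (illustrative sketch; a greedy payoff
# stands in for the article's game-theoretic selection rules).

uncertainty = {"A": 0.9, "B": 0.4, "C": 0.7, "D": 0.2}   # per-cell values (toy)

def feasible(path_cost, return_cost, endurance_left):
    # Admit a path only if the UAV can still reach some base afterwards.
    return path_cost + return_cost <= endurance_left

def select_path(candidate_paths, endurance_left):
    """candidate_paths: list of (cells, path_cost, cost_to_nearest_base)."""
    best, best_gain = None, -1.0
    for cells, path_cost, return_cost in candidate_paths:
        if not feasible(path_cost, return_cost, endurance_left):
            continue
        gain = sum(uncertainty[c] for c in cells)    # uncertainty removed
        if gain > best_gain:
            best, best_gain = cells, gain
    return best

paths = [(["A", "C"], 5.0, 3.0), (["B", "D"], 2.0, 1.0), (["A", "B"], 7.0, 4.0)]
print(select_path(paths, endurance_left=9.0))        # -> ['A', 'C']
```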
Low-flow characteristics of Virginia streams
Austin, Samuel H.; Krstolic, Jennifer L.; Wiegand, Ute
2011-01-01
Low-flow annual non-exceedance probabilities (ANEP), called probability-percent chance (P-percent chance) flow estimates, regional regression equations, and transfer methods are provided describing the low-flow characteristics of Virginia streams. Statistical methods are used to evaluate streamflow data. Analysis of Virginia streamflow data collected from 1895 through 2007 is summarized. Methods are provided for estimating low-flow characteristics of gaged and ungaged streams. The 1-, 4-, 7-, and 30-day average streamgaging station low-flow characteristics for 290 long-term, continuous-record, streamgaging stations are determined, adjusted for instances of zero flow using a conditional probability adjustment method, and presented for non-exceedance probabilities of 0.9, 0.8, 0.7, 0.6, 0.5, 0.4, 0.3, 0.2, 0.1, 0.05, 0.02, 0.01, and 0.005. Stream basin characteristics computed using spatial data and a geographic information system are used as explanatory variables in regional regression equations to estimate annual non-exceedance probabilities at gaged and ungaged sites and are summarized for 290 long-term, continuous-record streamgaging stations, 136 short-term, continuous-record streamgaging stations, and 613 partial-record streamgaging stations. Regional regression equations for six physiographic regions use basin characteristics to estimate 1-, 4-, 7-, and 30-day average low-flow annual non-exceedance probabilities at gaged and ungaged sites. Weighted low-flow values that combine computed streamgaging station low-flow characteristics and annual non-exceedance probabilities from regional regression equations provide improved low-flow estimates. Regression equations developed using the Maintenance of Variance with Extension (MOVE.1) method describe the line of organic correlation (LOC) with an appropriate index site for low-flow characteristics at 136 short-term, continuous-record streamgaging stations and 613 partial-record streamgaging stations. Monthly streamflow statistics computed on the individual daily mean streamflows of selected continuous-record streamgaging stations and curves describing flow-duration are presented. Text, figures, and lists are provided summarizing low-flow estimates, selected low-flow sites, delineated physiographic regions, basin characteristics, regression equations, error estimates, definitions, and data sources. This study supersedes previous studies of low flows in Virginia.
Erdinc, Oguzhan
2011-01-01
This study explored the prevalence and work interference (WI) of upper extremity musculoskeletal discomfort (UEMSD) and investigated the associations of individual and work-related risk factors and of using a notebook stand or docking station with UEMSD among symptomatic occupational notebook personal computer (PC) users. The participant group included 45 Turkish occupational notebook PC users. The study used self-reports of participants. The Turkish version of the Cornell Musculoskeletal Discomfort Questionnaire (T-CMDQ) was used to collect symptom data. UEMSD prevailed mostly in the neck, the upper back, and the lower back, with prevalence rates of 77.8%, 73.3%, and 60.0%, respectively, and with WI rates of 28.9%, 24.4%, and 26.7%, respectively. Aggregated results showed that 44% of participants reported WI due to UEMSD in at least one body region. Significant risk factors were: being female, being aged <31 years, having computer work experience <10 years, and physical discomfort during computer use. UEMSD prevalence and WI rates were considerable in the neck, the upper back, and the lower back. Significant associations between certain risk factors and UEMSD were identified, but no association was found between using a notebook stand or docking station and UEMSD among participants.
Improvement of the Earth's gravity field from terrestrial and satellite data
NASA Technical Reports Server (NTRS)
1987-01-01
The terrestrial gravity data base was updated. Studies related to the Geopotential Research Mission (GRM) have primarily considered the local recovery of gravity anomalies on the surface of the Earth based on satellite-to-satellite tracking or gradiometer data. A simulation study was used to estimate the accuracy of 1-degree mean anomalies which could be recovered from the GRM data. Numerous procedures were developed with the intent of performing computations at the laser stations in the SL6 system to improve geoid undulation calculations.
Evaluation of the Interplate and Intraplate Deformations of the African Continent Using cGNSS Data
NASA Astrophysics Data System (ADS)
Apolinário, J. P.; Fernandes, R. M. S.; Bos, M. S.; Meghraoui, M.; Miranda, J. M. A.
2014-12-01
Two main plates, Nubia and Somalia, plus a few more tectonic blocks in the East African Rift System (EARS), delimit the African continent. The major part of the external plate boundaries of Africa is well defined by oceanic ridge systems, with the exception of the Nubia-Eurasia complex convergence-collision tectonic zone. In addition, the number and distribution of the tectonic blocks along the EARS region is a major scientific issue that has not been completely answered so far. Nevertheless, the increased number of cGNSS (continuous Global Navigation Satellite Systems) stations in Africa with sufficiently long data spans is helping to better understand and constrain the complex sub-plate distribution in the EARS as well as in the other plate boundaries of Africa. This work is the geodetic contribution to the IGCP Project 601, "Seismotectonics and Seismic Hazards in Africa". It presents the current tectonic relative motions of the African continent based on the analysis of the estimated velocity field derived from the existing network of cGNSS stations in Africa and on bordering tectonic plates. For the majority of the plate pairs, we present the most recent estimation of their relative velocity using a dedicated processing. The velocity solutions are computed using HECTOR, a software package that takes into account the existing temporal correlations between the daily solutions of the stations. It allows the velocity uncertainties to be estimated properly and any artifacts in the time series to be detected. For some of the plate pairs, we compare our solutions for the angular velocities with other geodetic and geophysical models. In addition, we also study the sensitivity of the derived angular velocity to changes in the data (longer data spans for some stations) for tectonic units with few stations, in particular for the Victoria and Rovuma blocks of the EARS. Finally, we compute estimates of velocity fields for several sub-regions correlated with the seismotectonic provinces and discuss the level of interplate and intraplate deformations in Africa.
NASA Astrophysics Data System (ADS)
Ren, Feixiang; Huang, Jinsheng; Terauchi, Mutsuhiro; Jiang, Ruyi; Klette, Reinhard
A robust and efficient lane detection system is an essential component of Lane Departure Warning Systems, which are commonly used in many vision-based Driver Assistance Systems (DAS) in intelligent transportation. Various computation platforms have been proposed in the past few years for the implementation of driver assistance systems (e.g., PC, laptop, integrated chips, PlayStation, and so on). In this paper, we propose a new platform for the implementation of lane detection, based on a mobile phone (the iPhone). Due to the physical limitations of the iPhone with respect to memory and computing power, a simple and efficient lane detection algorithm using a Hough transform is developed and implemented on the iPhone, as existing algorithms developed for the PC platform are not (currently) suitable for mobile phone devices. Experiments with the lane detection algorithm are carried out both on a PC and on the iPhone.
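A Hough-transform lane detector of the general kind described above typically reduces each frame to an edge map, restricts attention to the road region, and keeps near-vertical line segments. The sketch below is an assumed, generic OpenCV pipeline, not the paper's iPhone implementation; the thresholds are illustrative.

```python
# Generic Hough-transform lane detection sketch (assumed pipeline, not the
# paper's implementation; thresholds are illustrative).
import cv2
import numpy as np

def detect_lanes(frame_bgr):
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(gray, 50, 150)                 # edge map
    h, w = edges.shape
    mask = np.zeros_like(edges)
    mask[h // 2:, :] = 255                           # keep lower half (road)
    edges = cv2.bitwise_and(edges, mask)
    # Probabilistic Hough transform returns candidate line segments.
    lines = cv2.HoughLinesP(edges, rho=1, theta=np.pi / 180, threshold=40,
                            minLineLength=40, maxLineGap=20)
    lanes = []
    if lines is not None:
        for x1, y1, x2, y2 in lines[:, 0]:
            slope = (y2 - y1) / (x2 - x1 + 1e-6)
            if abs(slope) > 0.3:                     # reject near-horizontal clutter
                lanes.append((x1, y1, x2, y2))
    return lanes
```

The restriction to the lower half of the frame and the slope filter are the usual tricks that keep the per-frame cost low enough for a constrained device.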
Space Shuttle Underside Astronaut Communications Performance Evaluation
NASA Technical Reports Server (NTRS)
Hwu, Shian U.; Dobbins, Justin A.; Loh, Yin-Chung; Kroll, Quin D.; Sham, Catherine C.
2005-01-01
The Space Shuttle Ultra High Frequency (UHF) communications system is planned to provide Radio Frequency (RF) coverage for astronauts working on the underside of the Space Shuttle Orbiter (SSO) for thermal tile inspection and repair. This study assesses the Space Shuttle UHF communication performance for astronauts in the shadow region without line-of-sight (LOS) to the Space Shuttle and Space Station UHF antennas. To ensure the RF coverage performance at anticipated astronaut worksites, the link margin between the UHF antennas and Extravehicular Activity (EVA) astronauts with significant vehicle structure blockage was analyzed. A series of near-field measurements were performed using the NASA/JSC Anechoic Chamber Antenna test facilities. Computational investigations were also performed using electromagnetic modeling techniques. A computer simulation tool based on the Geometrical Theory of Diffraction (GTD) was used to compute the signal strengths. The signal strength was obtained by computing the reflected and diffracted fields along the propagation paths between the transmitting and receiving antennas. Based on the results obtained in this study, RF coverage for UHF communication links was determined for the anticipated astronaut worksite in the shadow region underneath the Space Shuttle.
Space Station Workstation Technology Workshop Report
NASA Technical Reports Server (NTRS)
Moe, K. L.; Emerson, C. M.; Eike, D. R.; Malone, T. B.
1985-01-01
This report describes the results of a workshop conducted at Goddard Space Flight Center (GSFC) to identify current and anticipated trends in human-computer interface technology that may influence the design or operation of a space station workstation. The workshop was attended by approximately 40 persons from government and academia who were selected for their expertise in some aspect of human-machine interaction research. The focus of the workshop was a 1 1/2-day brainstorming/forecasting session in which the attendees were assigned to interdisciplinary working groups and instructed to develop predictions for each of the following technology areas: (1) user interface, (2) resource management, (3) control language, (4) data base systems, (5) automatic software development, (6) communications, (7) training, and (8) simulation. This report is significant in that it provides a unique perspective on workstation design for the space station. This perspective, which is characterized by a major emphasis on user requirements, should be most valuable to Phase B contractors involved in design development of the space station workstation. One of the more compelling results of the workshop is the recognition that no major technological breakthroughs are required to implement the current workstation concept. What is required is the creative application of existing knowledge and technology.
The computing and data infrastructure to interconnect EEE stations
NASA Astrophysics Data System (ADS)
Noferini, F.; EEE Collaboration
2016-07-01
The Extreme Energy Event (EEE) experiment is devoted to the search for high energy cosmic rays through a network of telescopes installed in about 50 high schools distributed throughout the Italian territory. This project requires a peculiar data management infrastructure to collect data registered in stations very far from each other and to allow a coordinated analysis. Such an infrastructure is realized at INFN-CNAF, which operates a Cloud facility based on the OpenStack open-source Cloud framework and provides Infrastructure as a Service (IaaS) for its users. In 2014 EEE started to use it for collecting, monitoring and reconstructing the data acquired in all the EEE stations. For the synchronization between the stations and the INFN-CNAF infrastructure we used BitTorrent Sync, a free peer-to-peer software package designed to optimize data synchronization between distributed nodes. All data folders are synchronized with the central repository in real time to allow an immediate reconstruction of the data and their publication on a monitoring webpage. We present the architecture and the functionalities of this data management system, which provides a flexible environment for the specific needs of the EEE project.
GRAVSAT/GEOPAUSE refraction study
NASA Technical Reports Server (NTRS)
Llewellyn, S. K.
1977-01-01
A ground station network tracked a high altitude spacecraft which in turn tracked a low orbiting satellite. Orbit data are relayed back to the ground stations. A refraction study was performed on this configuration to compute ionospheric and tropospheric refraction effects along the satellite and ground links.
2014-08-04
ISS040-E-088730 (4 Aug. 2014) --- In the International Space Station's Harmony node, NASA astronauts Steve Swanson (foreground), Expedition 40 commander; and Reid Wiseman, flight engineer, perform a portable onboard computer Dynamic Onboard Ubiquitous Graphics (DOUG) software review in preparation for two upcoming U.S. spacewalks.
Tibi, Rigobert; Young, Christopher; Gonzales, Antonio; ...
2017-07-04
The matched filtering technique that uses the cross correlation of a waveform of interest with archived signals from a template library has proven to be a powerful tool for detecting events in regions with repeating seismicity. However, waveform correlation is computationally expensive and therefore impractical for large template sets unless dedicated distributed computing hardware and software are used. In this paper, we introduce an approximate nearest neighbor (ANN) approach that enables the use of very large template libraries for waveform correlation. Our method begins with a projection into a reduced dimensionality space, based on correlation with a randomized subset of the full template archive. Searching for a specified number of nearest neighbors for a query waveform is accomplished by iteratively comparing it with the neighbors of its immediate neighbors. We used the approach to search for matches to each of ~2300 analyst-reviewed signal detections reported in May 2010 for the International Monitoring System station MKAR. The template library in this case consists of a data set of more than 200,000 analyst-reviewed signal detections for the same station from February 2002 to July 2016 (excluding May 2010). Of these signal detections, 73% are teleseismic first P and 17% regional phases (Pn, Pg, Sn, and Lg). Finally, the analyses performed on a standard desktop computer show that the proposed ANN approach performs a search of the large template libraries about 25 times faster than the standard full linear search and achieves recall rates greater than 80%, with the recall rate increasing for higher correlation thresholds.
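The dimension-reduction step described above can be sketched briefly: represent every waveform by its correlations with a small random subset of templates, then search in that reduced space. For brevity the sketch replaces the paper's iterative neighbor-of-neighbor search with a brute-force scan over the reduced vectors; sizes and data are toy assumptions.

```python
# Reduced-space template search sketch (toy data; brute-force scan stands in
# for the paper's neighbor-of-neighbor iteration).
import numpy as np

rng = np.random.default_rng(0)

def normalize(x):
    x = x - x.mean()
    return x / (np.linalg.norm(x) + 1e-12)

def project(waveform, anchors):
    """Reduced representation: correlation with each anchor template."""
    w = normalize(waveform)
    return np.array([float(w @ normalize(a)) for a in anchors])

templates = [rng.standard_normal(500) for _ in range(2000)]   # toy library
anchors = [templates[i] for i in rng.choice(2000, size=32, replace=False)]
library = np.stack([project(t, anchors) for t in templates])

def query(waveform, k=5):
    q = project(waveform, anchors)
    d = np.linalg.norm(library - q, axis=1)      # distance in reduced space
    return np.argsort(d)[:k]                     # indices of k nearest templates

print(query(templates[42] + 0.1 * rng.standard_normal(500)))  # should rank 42 high
```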
Development of a Real-Time GPS/Seismic Displacement Meter: GPS Component
NASA Astrophysics Data System (ADS)
Bock, Y.; Canas, J.; Andrew, A.; Vernon, F.
2002-12-01
We report on the status of the Orange County Real-Time GPS Network (OCRTN), an upgrade of the SCIGN sites in Orange County and Catalina Island to low latency (1 sec), high-rate (1 Hz) data streaming, analysis, and dissemination. The project is a collaborative effort of the California Spatial Reference Center (CSRC) and the Orange County Dept. of Geomatics, with partners from the geophysical community (SCIGN), local and state government, and the private sector. As part of Phase 1 of the project, nine sites are streaming data by dedicated, point-to-point radio modems to a central data server located in Santa Ana. Instantaneous positions are computed for each site. Data are converted from 1 Hz Ashtech binary MBEN format to (1) 1 Hz RTCM format, and (2) decimated (15 sec) RINEX format. A second computer outside a firewall and located in another building at the Orange County's Computer Center is a TCP-based client of RTCM data (messages 18, 19, 3, and 22) from the data server, as well as a TCP-based server of RTCM data to the outside world. An external computer can access the RTCM data from all active sites through an IP socket connection. Data latency, in the best case, is less than 1 sec from real-time. Once a day, the decimated RINEX data are transferred by ftp from the data server to the SOPAC-CSRC archive at Scripps. Data recovery is typically 99-100%. As part of the second phase of the project, the RTCM server provides data to field receivers to perform RTK surveying. On connection to the RTCM server the user gets a list of active stations, and can then choose from which site to retrieve RTCM data. This site then plays the role of the RTK base station and a CDPD-based wireless Internet device plays the role of the normal RTK radio link. If an Internet connection is available, we will demonstrate how the system operates. This system will serve as a prototype for the GPS component of the GPS/seismic displacement meter.
47 CFR 90.305 - Location of stations.
Code of Federal Regulations, 2010 CFR
2010-10-01
... stations. (a) The transmitter site(s) for base station(s), including mobile relay stations, shall be... (b) Mobile units shall be operated within 48 km (30 mi.) of their associated base station or... (c) Control stations must be located within the area of operation of the mobile units. (d) Base and...
NASA Astrophysics Data System (ADS)
Gao, J. L.
2002-04-01
In this article, we present a system-level characterization of the energy consumption for sensor network application scenarios. We compute a power efficiency metric -- average watt-per-meter -- for each radio transmission and extend this local metric to find the global energy consumption. This analysis shows how overall energy consumption varies with transceiver characteristics, node density, data traffic distribution, and base-station location.
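The local metric described above lends itself to a very small sketch: take each transmission's power draw per meter of hop distance, then aggregate across transmissions. The exact definition used in the article is not reproduced here; the formula and numbers below are illustrative assumptions.

```python
# Illustrative watt-per-meter bookkeeping (definition assumed, not the
# article's exact formulation).

def watt_per_meter(tx_power_w, distance_m):
    """Local efficiency of one radio transmission over a given hop length."""
    return tx_power_w / distance_m

transmissions = [(0.050, 40.0), (0.050, 55.0), (0.075, 80.0)]  # (W, meters)
local = [watt_per_meter(p, d) for p, d in transmissions]
print(local)                         # per-transmission local metric
print(sum(local) / len(local))       # simple global average (W per meter)
```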
NASA Astrophysics Data System (ADS)
Saikia, C. K.; Roman-nieves, J. I.; Woods, M. T.
2013-12-01
Source parameters of nuclear and chemical explosions are often estimated by matching either the corner frequency and spectral level of a single event or the spectral ratio when spectra from two events are available with known source parameters for one. In this study, we propose an alternative method in which waveforms from two or more events can be simultaneously equalized by setting the differential of the processed seismograms at one station from any two individual events to zero. The method involves convolving the equivalent Mueller-Murphy displacement source time function (MMDSTF) of one event with the seismogram of the second event and vice versa, and then computing their difference seismogram. The MMDSTF is computed at the elastic radius, including both near-field and far-field terms. For this method to yield accurate source parameters, an inherent assumption is that the Green's functions for any two paired events from the source to a receiver are the same. In the frequency limit of the seismic data, this is a reasonable assumption, concluded based on the comparison of Green's functions computed for flat-earth models at various source depths ranging from 100 m to 1 km. Frequency-domain analysis of the initial P wave is, however, sensitive to the depth-phase interaction, and if tracked meticulously can help estimate the event depth. We applied this method to the local waveforms recorded from the three SPE shots and precisely determined their yields. These high-frequency seismograms exhibit significant lateral path effects in spectrogram analysis and 3D numerical computations, but the source equalization technique is independent of any such variation as long as the instrument characteristics are well preserved. We are currently estimating the uncertainty in the derived source parameters assuming the yields of the SPE shots are unknown. We also collected regional waveforms from 95 NTS explosions at regional stations ALQ, ANMO, CMB, COR, JAS, LON, PAS, PFO and RSSD. We are currently employing a station-based analysis using the equalization technique to estimate depths and yields of many events relative to those of the announced explosions, and to develop their relationship with the Mw and Mo for the NTS explosions.
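The equalization identity at the heart of the method described above is easy to demonstrate: if two events share the same path response, convolving each event's seismogram with the other event's source time function yields identical traces, so their difference vanishes. The sketch below uses toy synthetic traces and a toy triangular source time function rather than the Mueller-Murphy form.

```python
# Source equalization demonstration on synthetic data (toy traces and a
# triangular source time function, not SPE data or the MMDSTF).
import numpy as np

rng = np.random.default_rng(1)
green = rng.standard_normal(300)                  # common path response (toy)

def stf(duration, n=50):
    """Toy triangular source time function with the given width (samples)."""
    t = np.arange(n)
    return np.maximum(0.0, 1.0 - np.abs(t - duration) / duration)

stf1, stf2 = stf(8), stf(20)                      # two different sources
seis1 = np.convolve(green, stf1)                  # recorded seismograms
seis2 = np.convolve(green, stf2)

# Equalize: seis1*stf2 and seis2*stf1 both equal green*stf1*stf2, so the
# difference seismogram is zero when the path assumption holds.
diff = np.convolve(seis1, stf2) - np.convolve(seis2, stf1)
print(np.max(np.abs(diff)))                       # ~0 up to numerical error
```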
NASA Astrophysics Data System (ADS)
Rude, C. M.; Li, J. D.; Gowanlock, M.; Herring, T.; Pankratius, V.
2016-12-01
Surface subsidence due to depletion of groundwater can lead to permanent compaction of aquifers and damaged infrastructure. However, studies of such effects on a large scale are challenging and compute-intensive because they involve fusing a variety of data sets beyond direct measurements from groundwater wells, such as gravity change measurements from the Gravity Recovery and Climate Experiment (GRACE) or surface displacements measured by GPS receivers. Our work therefore leverages Amazon cloud computing to enable these types of analyses spanning the entire continental US. Changes in groundwater storage are inferred from surface displacements measured by GPS receivers stationed throughout the country. Receivers located on bedrock are anti-correlated with changes in water levels because of elastic deformation due to loading, while stations on aquifers correlate with groundwater changes due to poroelastic expansion and compaction. Correlating linearly detrended equivalent water thickness measurements from GRACE with linearly detrended and Kalman-filtered vertical displacements of GPS stations located throughout the United States helps compensate for the spatial and temporal limitations of GRACE. Our results show that the majority of GPS stations are negatively correlated with GRACE in a statistically relevant way, as most GPS stations are located on bedrock in order to provide stable reference locations and measure geophysical processes such as tectonic deformations. Additionally, stations located on the Central Valley California aquifer show statistically significant positive correlations. Through the identification of positive and negative correlations, deformation phenomena can be classified as loading or poroelastic expansion due to changes in groundwater. This method facilitates further studies of terrestrial water storage on a global scale. This work is supported by NASA AIST-NNX15AG84G (PI: V. Pankratius) and Amazon.
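The core correlation test described above reduces to a few lines: linearly detrend both series and compute a Pearson correlation, with the sign separating loading (bedrock) from poroelastic (aquifer) behavior. The synthetic series below are stand-ins, not GRACE or GPS data.

```python
# Detrend-and-correlate sketch (synthetic stand-in series, not GRACE/GPS data).
import numpy as np
from scipy.signal import detrend
from scipy.stats import pearsonr

t = np.linspace(0, 6, 72)                          # six years, monthly samples
water = 0.3 * t + 5 * np.sin(2 * np.pi * t)        # trend + seasonal storage
gps_up = 0.1 * t - 2 * np.sin(2 * np.pi * t)       # bedrock: loading response

r, p = pearsonr(detrend(water), detrend(gps_up))
print(f"r = {r:.2f}, p = {p:.1e}")                 # strongly negative: loading
```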
Conrads, Paul; Journey, Celeste A.; Clark, Jimmy M.; Levesque, Victor A.
2013-01-01
To effectively plan site-specific studies to understand the connection between wastewater effluent and shellfish beds, data are needed concerning flow dynamics and background fluorescence in the Atlantic Intracoastal Waterway near the effluent outfalls on Sullivan’s Island and the Isle of Palms. Tidal flows were computed by the U.S. Geological Survey for three stations and longitudinal water-quality profiles were collected at high and low tide. Flows for the three U.S. Geological Survey stations, the Atlantic Intracoastal Waterway by the Isle of Palms Marina, the Atlantic Intracoastal Waterway by the Ben M. Sawyer Memorial Bridge at Sullivan’s Island, and Breach Inlet, were computed for the 53-day period from December 4, 2011, to January 26, 2012. The largest flows occurred at Breach Inlet and ranged from -58,600 cubic feet per second (ft3/s) toward the Atlantic Intracoastal Waterway to 63,300 ft3/s toward the Atlantic Ocean. Of the two stations on the Atlantic Intracoastal Waterway, the Sullivan’s Island station had the larger flows and ranged from -6,360 ft3/s to the southwest (toward Charleston Harbor) to 8,930 ft3/s to the northeast. Computed tidal flow at the Isle of Palms station ranged from -3,460 ft3/s toward the southwest to 6,410 ft3/s toward the northeast. The synoptic water-quality study showed that the stations were well mixed vertically and horizontally. All fluorescence measurements (recorded as rhodamine concentration) were below the accuracy of the sensor and the background fluorescence would not likely interfere with a dye-tracer study.
47 CFR 95.139 - Adding a small base station or a small control station.
Code of Federal Regulations, 2010 CFR
2010-10-01
(a) Except for a GMRS system licensed to a non-individual, one or more small base stations or a small control station may be added to a GMRS system at any point...
Hybrid evolutionary computing model for mobile agents of wireless Internet multimedia
NASA Astrophysics Data System (ADS)
Hortos, William S.
2001-03-01
The ecosystem is used as an evolutionary paradigm of natural laws for distributed information retrieval via mobile agents, allowing computational load to be added to server nodes of wireless networks while reducing the traffic on communication links. Based on the Food Web model, a set of computational rules of natural balance forms the outer stage to control the evolution of mobile agents providing multimedia services with a wireless Internet protocol (WIP). The evolutionary model shows how mobile agents should behave with the WIP, in particular, how mobile agents can cooperate, compete and learn from each other, based on an underlying competition for radio network resources to establish the wireless connections to support the quality of service (QoS) of user requests. Mobile agents are also allowed to clone themselves, propagate and communicate with other agents. A two-layer model is proposed for agent evolution: the outer layer is based on the law of natural balancing, and the inner layer on a discrete version of a Kohonen self-organizing feature map (SOFM) to distribute network resources to meet QoS requirements. The former is embedded in the higher OSI layers of the WIP, while the latter is used in the resource management procedures of Layers 2 and 3 of the protocol. Algorithms for the distributed computation of mobile agent evolutionary behavior are developed by adding a learning state to the agent evolution state diagram. When an agent is in an indeterminate state, it can communicate with other agents. Computing models can be replicated from other agents. The agent then transitions to the mutating state to wait for a new information-retrieval goal. When a wireless terminal or station lacks a network resource, an agent in the suspending state can change its policy to submit to the environment before it transitions to the searching state. The agents learn from agent state information entered into an external database. In the cloning process, two agents on a host station sharing a common goal can be merged or married to compose a new agent. Application of the two-layer set of algorithms for mobile agent evolution, performed in a distributed processing environment, is made to the QoS management functions of the IP multimedia (IM) sub-network of the third-generation (3G) Wideband Code-Division Multiple Access (W-CDMA) wireless network.
NASA Astrophysics Data System (ADS)
Sur, D.; Paul, A.
2017-12-01
The equatorial ionosphere shows sharp diurnal and latitudinal Total Electron Content (TEC) variations over a major part of the day. The equatorial ionosphere also exhibits intense post-sunset ionospheric irregularities. Accurate prediction of TEC at these low latitudes is not possible from standard ionospheric models. An Artificial Neural Network (ANN) based Vertical TEC (VTEC) model has been designed using TEC data in the low-latitude Indian longitude sector for accurate prediction of VTEC. GPS TEC data from the stations Calcutta (22.58°N, 88.38°E geographic, magnetic dip 32°), Baharampore (24.09°N, 88.25°E geographic, magnetic dip 35°) and Siliguri (26.72°N, 88.39°E geographic, magnetic dip 40°) are used as the training dataset for the duration January 2007-September 2011. Poleward VTEC gradients from the northern EIA crest to the region beyond the EIA crest have been calculated from measured VTEC and compared with those obtained from the ANN-based VTEC model. TEC data from Calcutta and Siliguri are used to compute VTEC gradients during April 2013 and August-September 2013. Poleward VTEC gradients computed from the ANN-based TEC model show good correlation with measured values during the vernal and autumnal equinoxes of the high solar activity period of 2013. A possible correlation between measured poleward TEC gradients and post-sunset scintillations (S4 ≥ 0.4) from the northern crest of the EIA is observed in this paper. From this observation, a suitable threshold poleward VTEC gradient is proposed for the possible occurrence of post-sunset scintillations at the northern crest of the EIA along 88°E longitude. Poleward VTEC gradients obtained from the ANN-based VTEC model are used to forecast possible ionospheric scintillation in the post-sunset period using the threshold value. These predicted VTEC gradients can forecast post-sunset L-band scintillation with an accuracy of 67% to 82% in this dynamic low-latitude region. The use of VTEC gradients from the ANN-based VTEC model removes the necessity of continuous operation of multi-station ground-based TEC receivers in this low-latitude region.
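An ANN-based VTEC predictor of this general kind can be sketched with a small feed-forward regressor: train on geophysical inputs, then difference predictions at two latitudes to get a poleward gradient. The input set, network size, and synthetic training data below are assumptions for illustration, not the paper's actual design.

```python
# Illustrative ANN VTEC regressor (assumed inputs and synthetic "truth";
# not the paper's network design or data).
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(2)
n = 5000
doy = rng.uniform(1, 365, n)                 # day of year
lt = rng.uniform(0, 24, n)                   # local time (hours)
f107 = rng.uniform(70, 200, n)               # solar activity proxy
lat = rng.uniform(22, 27, n)                 # geographic latitude (deg N)

# Toy "truth": diurnal bulge scaled by solar flux, falling off with latitude.
vtec = (f107 / 10) * np.exp(-((lt - 14) / 5) ** 2) * (1 - (lat - 22) / 20) \
       + 2 * np.sin(2 * np.pi * doy / 365) + rng.normal(0, 1, n)

X = np.column_stack([doy, lt, f107, lat])
model = MLPRegressor(hidden_layer_sizes=(32, 16), max_iter=2000, random_state=0)
model.fit(X, vtec)

# Poleward gradient: difference in predicted VTEC between two latitudes.
north, south = model.predict([[80, 19, 150, 26.7], [80, 19, 150, 22.6]])
print(north - south)       # compare against a scintillation threshold gradient
```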
Johnson, Kevin K.; Goodwin, Greg E.
2013-01-01
Lake Michigan diversion accounting is the process used by the U.S. Army Corps of Engineers to quantify the amount of water that is diverted from the Lake Michigan watershed into the Illinois and Mississippi River Basins. A network of streamgages within the Chicago area waterway system monitors tributary river flows and the major river flow on the Chicago Sanitary and Ship Canal near Lemont as one of the instrumental tools used for Lake Michigan diversion accounting. The mean annual discharges recorded by these streamgages are used as additions or deductions to the mean annual discharge recorded by the main stream gaging station currently used in the Lake Michigan diversion accounting process, which is the Chicago Sanitary and Ship Canal near Lemont, Illinois (station number 05536890). A new stream gaging station, Summit Conduit near Summit, Illinois (station number 414757087490401), was installed on September 23, 2010, for the purpose of monitoring stage, velocity, and discharge through the Summit Conduit for the U.S. Army Corps of Engineers in accordance with Lake Michigan diversion accounting. Summit Conduit conveys flow from a small part of the lower Des Plaines River watershed underneath the Des Plaines River directly into the Chicago Sanitary and Ship Canal. Because the Summit Conduit discharges into the Chicago Sanitary and Ship Canal upstream from the stream gaging station at Lemont, Illinois, but does not contain flow diverted from the Lake Michigan watershed, it is considered a flow deduction to the discharge measured by the Lemont stream gaging station in the Lake Michigan diversion accounting process. This report offers a technical summary of the techniques and methods used for the collection and computation of the stage, velocity, and discharge data at the Summit Conduit near Summit, Illinois, stream gaging station for the 2011 and 2012 Water Years. The stream gaging station Summit Conduit near Summit, Illinois (station number 414757087490401) is an example of a nonstandard stream gage; traditional methods of equating stage to discharge historically were not effective. Examples of the nonstandard conditions include the converging tributary flows directly upstream of the gage; the trash rack and walkway near the opening of the conduit introducing turbulence and occasionally entraining air bubbles into the flow; and debris within the conduit creating conditions of variable backwater, with a constant influx of smaller debris that escapes the trash rack and catches or settles in the conduit and on the equipment. An acoustic Doppler velocity meter was installed to measure stage and velocity to compute discharge. The stage is used to calculate area based on the stage-area rating. The index velocity from the acoustic Doppler velocity meter is applied to the index-velocity rating, and the product of the two rated values is a rated discharge by the index-velocity method. Nonstandard site conditions prevalent at the Summit Conduit stream gaging station generally are overcome through the index-velocity method. Despite the difficulties in gaging and measurements, improvements continue to be made in data collection, transmission, and measurements. Efforts to improve the site and the ratings continue to improve the quality and quantity of the data available for Lake Michigan diversion accounting.
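The index-velocity computation described above is the product of two rating-curve evaluations. The sketch below shows the arithmetic with toy linear ratings; the coefficients are illustrative assumptions, not the Summit Conduit ratings.

```python
# Index-velocity discharge arithmetic (toy linear ratings; coefficients are
# illustrative, not the Summit Conduit values).

def area_from_stage(stage_ft, a0=2.0, a1=8.5):
    """Stage-area rating (toy linear form): cross-section area in sq ft."""
    return a0 + a1 * stage_ft

def mean_velocity(index_velocity_fps, b0=0.05, b1=0.92):
    """Index-velocity rating: mean channel velocity from the ADVM index velocity."""
    return b0 + b1 * index_velocity_fps

def discharge(stage_ft, index_velocity_fps):
    # Rated discharge = rated area x rated mean velocity.
    return area_from_stage(stage_ft) * mean_velocity(index_velocity_fps)

print(discharge(stage_ft=3.2, index_velocity_fps=1.4))   # cubic feet per second
```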
Space Station Astronauts Return Safely to Earth on This Week @NASA – December 11, 2015
2015-12-11
On Dec. 11 aboard the International Space Station, NASA's Kjell Lindgren, Russian cosmonaut Oleg Kononenko and Kimiya Yui of the Japan Aerospace Exploration Agency bid farewell to crew members remaining on the station -- including Commander Scott Kelly, NASA's one-year mission astronaut. The returning members of Expedition 45 then climbed aboard their Soyuz spacecraft for the trip back to Earth. They safely touched down hours later in Kazakhstan, closing out a 141-day stay in space. Also: the next space station crew prepares for launch, a supply mission arrives at the space station, a quantum computing lab, and more!
Present status and future of the sophisticated work station
NASA Astrophysics Data System (ADS)
Ishida, Haruhisa
The advantages of the workstation are explained by comparing the software and hardware functions of the workstation with those of the personal computer. Desktop publishing is described as one example that exploits the workstation's capabilities. The future of UNIX as a workstation operating system is predicted by describing the competition between the AT&T and Sun Microsystems group, which intends to take the leadership by integrating the currently most popular Berkeley version with System V, and the group led by IBM. The development of RISC processors, the TRON Plan, and MITI's Sigma Project are also mentioned as background.
NASA Technical Reports Server (NTRS)
Wang, Ren H.
1991-01-01
A method for the combined use of magnetic vector potential (MVP) based finite element (FE) formulations and magnetic scalar potential (MSP) based FE formulations for computation of three-dimensional (3D) magnetostatic fields is developed. This combined MVP-MSP 3D-FE method leads to a considerable reduction, by nearly a factor of 3, in the number of unknowns in comparison to the number of unknowns which must be computed in global MVP-based FE solutions. This method allows one to incorporate portions of iron cores sandwiched in between coils (conductors) in current-carrying regions. Thus, it greatly simplifies the geometries of current-carrying regions (in comparison with exclusively MSP-based methods) in electric machinery applications. A unique feature of this approach is that the global MSP solution is single-valued in nature; that is, no branch cut is needed. This is again an advantage over exclusively MSP-based methods. A Newton-Raphson procedure with the concept of an adaptive relaxation factor was developed and successfully used in solving the 3D-FE problem with magnetic material anisotropy and nonlinearity. Accordingly, this combined MVP-MSP 3D-FE method is best suited for the solution of large-scale global magnetic field computations in rotating electric machinery with very complex magnetic circuit geometries, as well as nonlinear and anisotropic material properties.
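The two potential formulations being combined can be stated compactly under standard magnetostatics conventions (the notation below is conventional and assumed here, not taken from the paper): the vector potential carries the current-carrying regions, while the scalar potential covers the current-free regions with a single scalar unknown per node.

```latex
% MVP formulation in current-carrying regions (reluctivity \nu, current density J);
% MSP formulation in current-free regions (permeability \mu).
\begin{align*}
\mathbf{B} &= \nabla \times \mathbf{A}, &
\nabla \times \left( \nu \, \nabla \times \mathbf{A} \right) &= \mathbf{J}
  && \text{(MVP, current-carrying regions)}\\
\mathbf{H} &= -\nabla \Omega, &
\nabla \cdot \left( \mu \, \nabla \Omega \right) &= 0
  && \text{(MSP, current-free regions)}
\end{align*}
```

Continuity of the normal component of B and the tangential component of H across the interface between the two kinds of regions couples the two solutions.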
Usachev typing while in sleep station in the Service Module
2001-03-23
ISS002-E-5730 (23 March 2001) --- Cosmonaut Yury V. Usachev, Expedition Two commander, works at a laptop computer in his crew compartment in the Zvezda Service Module aboard the International Space Station (ISS). The image was recorded with a digital still camera.
Archuleta, Christy-Ann M.; Gonzales, Sophia L.; Maltby, David R.
2012-01-01
The U.S. Geological Survey (USGS), in cooperation with the Texas Commission on Environmental Quality, developed computer scripts and applications to automate the delineation of watershed boundaries and compute watershed characteristics for more than 3,000 surface-water-quality monitoring stations in Texas that were active during 2010. Microsoft Visual Basic applications were developed using ArcGIS ArcObjects to format the source input data required to delineate watershed boundaries. Several automated scripts and tools were developed or used to calculate watershed characteristics using Python, Microsoft Visual Basic, and the RivEX tool. Automated methods were augmented by the use of manual methods, including those done using ArcMap software. Watershed boundaries delineated for the monitoring stations are limited to the extent of the Subbasin boundaries in the USGS Watershed Boundary Dataset, which may not include the total watershed boundary from the monitoring station to the headwaters.
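As a rough illustration of what an automated delineation step can look like, the sketch below chains standard ArcGIS Spatial Analyst (arcpy) raster tools: flow direction, flow accumulation, pour-point snapping, and watershed extraction. The workspace, layer names, and snap distance are placeholders; the USGS scripts themselves are not reproduced here.

```python
# Hedged watershed-delineation sketch using ArcGIS Spatial Analyst (arcpy).
# Paths, layer names, and the snap distance are placeholder assumptions.
import arcpy
from arcpy.sa import FlowDirection, FlowAccumulation, SnapPourPoint, Watershed

arcpy.CheckOutExtension("Spatial")
arcpy.env.workspace = "C:/data/tx_monitoring.gdb"    # placeholder workspace

flow_dir = FlowDirection("filled_dem")               # from a filled DEM
flow_acc = FlowAccumulation(flow_dir)

# Snap each monitoring station to the cell of highest accumulated flow within
# 30 map units, then delineate the upstream watershed of each station.
pour_pts = SnapPourPoint("stations", flow_acc, 30, "STATION_ID")
basins = Watershed(flow_dir, pour_pts, "Value")
basins.save("station_watersheds")
```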
NASA Technical Reports Server (NTRS)
1973-01-01
Results of a two-phase study of the Data Handling and Management System (DHMS) are presented. An original baseline DHMS is described, and its estimated costs are presented in detail. The DHMS automates the Tracking and Data Relay Satellite System (TDRSS) ground station's functions and handles both the forward and return link user and relay satellite data passing through the station. Direction of the DHMS is effected via a TDRSS Operations Control Central (OCC) that is remotely located. A composite ground station system, a modified DHMS (MDHMS), was conceptually developed. The MDHMS performs both the DHMS and OCC functions. Configurations and costs are presented for systems using minicomputers and midicomputers. It is concluded that an MDHMS should be configured with a combination of the two computer types. The midicomputers provide the system's organizational direction and computational power, and the minicomputers (or interface processors) perform repetitive data handling functions that relieve the midicomputers of these burdensome tasks.
NASA's OCA Mirroring System: An Application of Multiagent Systems in Mission Control
NASA Technical Reports Server (NTRS)
Sierhuis, Maarten; Clancey, William J.; vanHoof, Ron J. J.; Seah, Chin H.; Scott, Michael S.; Nado, Robert A.; Blumenberg, Susan F.; Shafto, Michael G.; Anderson, Brian L.; Bruins, Anthony C.;
2009-01-01
Orbital Communications Adaptor (OCA) Flight Controllers, in NASA's International Space Station Mission Control Center, use different computer systems to uplink, downlink, mirror, archive, and deliver files to and from the International Space Station (ISS) in real time. The OCA Mirroring System (OCAMS) is a multiagent software system (MAS) that is operational in NASA's Mission Control Center. This paper presents OCAMS and its workings in an operational setting where flight controllers rely on the system 24x7. We also discuss the return on investment, based on a simulation baseline, six months of 24x7 operations at NASA Johnson Space Center in Houston, Texas, and a projection of future capabilities. This paper ends with a discussion of the value of MAS and future planned functionality and capabilities.
Telescience testbed pilot program, volume 3: Experiment summaries
NASA Technical Reports Server (NTRS)
Leiner, Barry M.
1989-01-01
Space Station Freedom and its associated labs, coupled with the availability of new computing and communications technologies, have the potential for significantly enhancing scientific research. A Telescience Testbed Pilot Program (TTPP) was conducted, aimed at developing the experience base needed to deal with issues in the design of the future information system of the Space Station era. The testbeds represented four scientific disciplines (astronomy and astrophysics, earth science, life sciences, and microgravity sciences) and studied issues in payload design, operation, and data analysis. This volume, part of a three-volume set containing the results of the TTPP, presents summaries of the experiments. This experiment involves evaluating the current Internet for file and image transfer between SIRTF instrument teams. The main issue addressed was current network response times.
The Integrated Radiation Mapper Assistant
DOE Office of Scientific and Technical Information (OSTI.GOV)
Carlton, R.E.; Tripp, L.R.
1995-03-01
The Integrated Radiation Mapper Assistant (IRMA) system combines state-of-the-art radiation sensors and microprocessor-based analysis techniques to perform radiation surveys. Control of the survey function is from a control station located outside the radiation area, thus reducing time spent in radiation areas performing radiation surveys. The system consists of a directional radiation sensor, a laser range finder, two area radiation sensors, and a video camera mounted on a pan-and-tilt platform. This sensor package is deployable on a remotely operated vehicle. The outputs of the system are radiation intensity maps identifying both radiation source intensities and radiation levels throughout the room being surveyed. After completion of the survey, the data can be removed from the control station computer for further analysis or archiving.
Water recovery and management test support modeling for Space Station Freedom
NASA Technical Reports Server (NTRS)
Mohamadinejad, Habib; Bacskay, Allen S.
1990-01-01
The water-recovery and management (WRM) subsystem proposed for the Space Station Freedom program is outlined, and its computerized modeling and simulation based on a Computer Aided System Engineering and Analysis (CASE/A) program are discussed. A WRM test model consisting of a pretreated urine processing (TIMES), hygiene water processing (RO), RO brine processing using TIMES, and hygiene water storage is presented. Attention is drawn to such end-user equipment characteristics as the shower, dishwasher, clotheswasher, urine-collection facility, and handwash. The transient behavior of pretreated-urine, RO waste-hygiene, and RO brine tanks is assessed, as well as the total input/output to or from the system. The model is considered to be beneficial for pretest analytical predictions as a program cost-saving feature.
NASA Astrophysics Data System (ADS)
Betz, Jessie M. Bethly
1993-12-01
The Video Distribution Subsystem (VDS) for Space Station Freedom provides onboard video communications. The VDS includes three major functions: external video switching; internal video switching; and sync and control generation. The Video Subsystem Routing (VSR) is a part of the VDS Manager Computer Software Configuration Item (VSM/CSCI). The VSM/CSCI is the software which controls and monitors the VDS equipment. VSR activates, terminates, and modifies video services in response to Tier-1 commands to connect video sources to video destinations. VSR selects connection paths based on availability of resources and updates the video routing lookup tables. This project involves investigating the current methodology to automate the Video Subsystem Routing and developing and testing a prototype as 'proof of concept' for designers.
Standard payload computer for the international space station
NASA Astrophysics Data System (ADS)
Knott, Karl; Taylor, Chris; Koenig, Horst; Schlosstein, Uwe
1999-01-01
This paper describes the development and application of a Standard PayLoad Computer (SPLC) which is being applied by the majority of ESA payloads accommodated on the International Space Station (ISS). The strategy of adopting of a standard computer leads to a radical rethink in the payload data handling procurement process. Traditionally, this has been based on a proprietary development with repeating costs for qualification, spares, expertise and maintenance for each new payload. Implementations have also tended to be unique with very little opportunity for reuse or utilisation of previous developments. While this may to some extent have been justified for short duration one-off missions, the availability of a standard, long term space infrastructure calls for a quite different approach. To support a large number of concurrent payloads, the ISS implementation relies heavily on standardisation, and this is particularly true in the area of payloads. Physical accommodation, data interfaces, protocols, component quality, operational requirements and maintenance including spares provisioning must all conform to a common set of standards. The data handling system and associated computer used by each payload must also comply with these common requirements, and thus it makes little sense to instigate multiple developments for the same task. The opportunity exists to provide a single computer suitable for all payloads, but with only a one-off development and qualification cost. If this is combined with the benefits of multiple procurement, centralised spares and maintenance, there is potential for great savings to be made by all those concerned in the payload development process. In response to the above drivers, the SPLC is based on the following concepts: • A one-off development and qualification process • A modular computer, configurable according to the payload developer's needs from a list of space-qualified items • An `open system' which may be added to by payload developers • Core software providing a suite of common communications services including a verified protocol implementation required to communicate with the ISS • A standardized ground support equipment and accompanying software development environment • The use of commercial hardware and software standards and products.
Lunar Orbiter II - Photographic Mission Summary
NASA Technical Reports Server (NTRS)
1967-01-01
Lunar Orbiter II photography of landing sites, and spacecraft systems performance. The second of five Lunar Orbiter spacecraft was successfully launched from Launch Complex 13 at the Air Force Eastern Test Range by an Atlas-Agena launch vehicle at 23:21 GMT on November 6, 1966. Tracking data from the Cape Kennedy and Grand Bahama tracking stations were used to control and guide the launch vehicle during Atlas powered flight. The Agena-spacecraft combination was maneuvered into a 100-nautical-mile-altitude Earth orbit by the preset on-board Agena computer. In addition, the Agena computer determined the maneuver and engine-burn period required to inject the spacecraft on the cislunar trajectory 20 minutes after launch. Tracking data from the downrange stations and the Johannesburg, South Africa, station were used to monitor the entire boost trajectory.
The impact of joint responses of devices in an airport security system.
Nie, Xiaofeng; Batta, Rajan; Drury, Colin G; Lin, Li
2009-02-01
In this article, we consider a model for an airport security system in which the declaration of a threat is based on the joint responses of inspection devices. This is in contrast to the typical system in which each check station independently declares a passenger as having a threat or not having a threat. In our framework the declaration of threat/no-threat is based upon the passenger scores at the check stations he/she goes through. To do this we use concepts from classification theory in the field of multivariate statistical analysis and focus on the main objective of minimizing the expected cost of misclassification. The corresponding correct classification and misclassification probabilities can be obtained by using a simulation-based method. After computing the overall false alarm and false clear probabilities, we compare our joint response system with two other independently operated systems. A model that groups passengers in a manner that minimizes the false alarm probability while maintaining the false clear probability within specifications set by a security authority is considered. We also analyze the staffing needs at each check station for such an inspection scheme. An illustrative example is provided along with sensitivity analysis on key model parameters. A discussion is provided on some implementation issues, on the various assumptions made in the analysis, and on potential drawbacks of the approach.
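The contrast between joint and independent declarations can be illustrated with a small Monte Carlo sketch: score passengers at two stations, then compare a combined-score threshold against per-station alarms. Distributions, thresholds, and the threat rate below are illustrative assumptions, not the article's parameters.

```python
# Monte Carlo comparison of joint vs. independent threat declarations
# (all distributions and thresholds are illustrative assumptions).
import numpy as np

rng = np.random.default_rng(3)
n = 200_000
threat = rng.random(n) < 0.001                    # rare true threats

# Scores at two check stations: threats score higher on average at both.
s1 = rng.normal(np.where(threat, 2.0, 0.0), 1.0)
s2 = rng.normal(np.where(threat, 1.5, 0.0), 1.0)

joint = s1 + s2 > 4.0                             # joint-response rule
indep = (s1 > 2.0) | (s2 > 2.0)                   # independent declarations

for name, alarm in [("joint", joint), ("independent", indep)]:
    false_alarm = np.mean(alarm[~threat])         # clears flagged as threats
    false_clear = np.mean(~alarm[threat])         # threats missed
    print(f"{name}: false alarm {false_alarm:.4f}, false clear {false_clear:.3f}")
```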
Autonomous power expert system
NASA Technical Reports Server (NTRS)
Ringer, Mark J.; Quinn, Todd M.
1990-01-01
The goal of the Autonomous Power System (APS) program is to develop and apply intelligent problem solving and control technologies to the Space Station Freedom Electrical Power Systems (SSF/EPS). The objectives of the program are to establish artificial intelligence/expert system technology paths, to create knowledge-based tools with advanced human-operator interfaces, and to integrate and interface knowledge-based and conventional control schemes. This program is being developed at NASA-Lewis. The APS Brassboard represents a subset of a 20 kHz Space Station Power Management And Distribution (PMAD) testbed. A distributed control scheme is used to manage multiple levels of computers and switchgear. The brassboard is composed of a set of intelligent switchgear used to effectively switch power from the sources to the loads. The Autonomous Power Expert System (APEX) portion of the APS program integrates a knowledge-based fault diagnostic system, a power resource scheduler, and an interface to the APS Brassboard. The system includes knowledge bases for system diagnostics, fault detection and isolation, and recommended actions. The scheduler autonomously assigns start times to the attached loads based on temporal and power constraints. The scheduler is able to work in a near real-time environment for both scheduling and dynamic replanning.
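A scheduler that assigns start times under temporal and power constraints can be sketched with a simple earliest-feasible-slot rule; the actual APEX scheduler is more sophisticated, and the loads, limit, and horizon below are toy assumptions.

```python
# Toy power-constrained load scheduler (earliest-feasible-slot rule stands in
# for APEX's scheduler; loads, limit, and horizon are assumptions).

def schedule(loads, power_limit, horizon):
    """loads: list of (name, power, duration); returns name -> start slot."""
    profile = [0.0] * horizon               # committed power per time slot
    starts = {}
    for name, power, duration in loads:
        for t0 in range(horizon - duration + 1):
            window = profile[t0:t0 + duration]
            if all(p + power <= power_limit for p in window):
                for t in range(t0, t0 + duration):
                    profile[t] += power     # commit the load's draw
                starts[name] = t0
                break
        else:
            starts[name] = None             # could not be scheduled
    return starts

loads = [("heater", 3.0, 4), ("pump", 2.5, 3), ("experiment", 2.0, 5)]
print(schedule(loads, power_limit=5.0, horizon=12))
```

Dynamic replanning amounts to rerunning the same assignment over the remaining horizon whenever a fault or a new load changes the committed profile.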
Electrochemical carbon dioxide concentrator subsystem math model. [for manned space station
NASA Technical Reports Server (NTRS)
Marshall, R. D.; Carlson, J. N.; Schubert, F. H.
1974-01-01
A steady-state computer simulation model has been developed to describe the performance of a total six-man, self-contained electrochemical carbon dioxide concentrator subsystem built for the space station prototype. The math model combines previously developed expressions describing the performance of the electrochemical depolarized carbon dioxide concentrator (EDC) cells and modules with expressions describing the performance of the other major CS-6 components. The model is capable of accurately predicting CS-6 performance over EDC operating ranges, and the computer simulation results agree with experimental data obtained over the prediction range.