LOFT L2-3 blowdown experiment safety analyses D, E, and G; LOCA analyses H, K, K1
DOE Office of Scientific and Technical Information (OSTI.GOV)
Perryman, J.L.; Keeler, C.D.; Saukkoriipi, L.O.
1978-12-01
Three calculations using conservative off-nominal conditions and evaluation model options were made using RELAP4/MOD5 for blowdown-refill and RELAP4/MOD6 for reflood for Loss-of-Fluid Test Experiment L2-3 to support the experiment safety analysis effort. The three analyses are as follows: Analysis D: Loss of commercial power during Experiment L2-3; Analysis E: Hot leg quick-opening blowdown valve (QOBV) does not open during Experiment L2-3; and Analysis G: Cold leg QOBV does not open during Experiment L2-3. In addition, the results of three LOFT loss-of-coolant accident (LOCA) analyses using a power of 56.1 MW and a primary coolant system flow rate of 3.6 million lbm/hr are presented: Analysis H: Intact loop 200% hot leg break, emergency core cooling (ECC) system B unavailable; Analysis K: Pressurizer relief valve stuck in the open position, ECC system B unavailable; and Analysis K1: Same as Analysis K, but using a primary coolant system flow rate of 1.92 million lbm/hr (the L2-4 pre-LOCE flow rate). For Analysis D, the maximum cladding temperature reached was 1762°F, 22 sec into reflood. In Analyses E and G, the blowdowns were slower because one of the QOBVs did not function. The maximum cladding temperature reached in Analysis E was 1700°F, 64.7 sec into reflood; for Analysis G, it was 1300°F at the start of reflood. For Analysis H, the maximum cladding temperature reached was 1825°F, 0.01 sec into reflood. Analysis K was a very slow blowdown, and the cladding temperatures followed the saturation temperature of the system. The results of Analysis K1 were nearly identical to those of Analysis K; system depressurization was not affected by the primary coolant system flow rate.
Independent Orbiter Assessment (IOA): Analysis of the Orbiter Experiment (OEX) subsystem
NASA Technical Reports Server (NTRS)
Compton, J. M.
1987-01-01
The results of the Independent Orbiter Assessment (IOA) of the Failure Modes and Effects Analysis (FMEA) and Critical Items List (CIL) are presented. The IOA approach features a top-down analysis of the hardware to determine failure modes, criticality, and potential critical items. To preserve independence, this analysis was accomplished without reliance upon the results contained within the NASA FMEA/CIL documentation. This report documents the independent analysis results corresponding to the Orbiter Experiments hardware. Each level of hardware was evaluated and analyzed for possible failure modes and effects. Criticality was assigned based upon the severity of the effect for each failure mode. The Orbiter Experiments (OEX) Program consists of a multiple set of experiments for the purpose of gathering environmental and aerodynamic data to develop more accurate ground models for Shuttle performance and to facilitate the design of future spacecraft. This assessment only addresses currently manifested experiments and their support systems. Specifically, this list consists of: Shuttle Entry Air Data System (SEADS); Shuttle Upper Atmosphere Mass Spectrometer (SUMS); Forward Fuselage Support System for OEX (FFSSO); Shuttle Infrared Leeside Temperature Sensing (SILTS); Aerodynamic Coefficient Identification Package (ACIP); and Support System for OEX (SSO). There are only two potential critical items for the OEX, since the experiments only gather data for post-mission analysis and are totally independent systems except for power. Failure of any experiment component usually only causes a loss of experiment data and in no way jeopardizes the crew or mission.
NASA Technical Reports Server (NTRS)
Minor, Robert
2002-01-01
Two ISS (International Space Station) experiment payloads will vent a volume of gas overboard via either the ISS Vacuum Exhaust System or the Vacuum Resource System. A system of ducts, valves and sensors, under design, will connect the experiments to the ISS systems. The following tasks are required: Create an analysis tool that will verify the rack vacuum system design with respect to design requirements, more specifically approximate pressure at given locations within the vacuum systems; Determine the vent duration required to achieve desired pressure within the experiment modules; Update the analysis as systems and operations definitions mature.
Overview of the Systems Special Investigation Group investigation
NASA Technical Reports Server (NTRS)
Mason, James B.; Dursch, Harry; Edelman, Joel
1993-01-01
The Long Duration Exposure Facility (LDEF) carried a remarkable variety of electrical, mechanical, thermal, and optical systems, subsystems, and components. Nineteen of the fifty-seven experiments flown on LDEF contained functional systems that were active on-orbit. Almost all of the other experiments possessed at least a few specific components of interest to the Systems Special Investigation Group (Systems SIG), such as adhesives, seals, fasteners, optical components, and thermal blankets. Almost all top level functional testing of the active LDEF and experiment systems has been completed. Failure analysis of both LDEF hardware and individual experiments that failed to perform as designed has also been completed. Testing of system components and experimenter hardware of interest to the Systems SIG is ongoing. All available testing and analysis results were collected and integrated by the Systems SIG. An overview of our findings is provided. An LDEF Optical Experiment Database containing information for all 29 optical related experiments is also discussed.
Gao, Wenyue; Muzyka, Kateryna; Ma, Xiangui; Lou, Baohua; Xu, Guobao
2018-04-28
Developing low-cost and simple electrochemical systems is becoming increasingly important but remains challenging for multiplex experiments. Here we report a single-electrode electrochemical system (SEES) using only one electrode, not only for a single experiment but also for multiplex experiments, based on a resistance-induced potential difference. SEESs for a single experiment and for multiplex experiments are fabricated by attaching a self-adhesive label with one hole or multiple holes, respectively, onto an ITO electrode. This enables multiplex electrochemiluminescence analysis with high sensitivity at a very low, safe voltage using a smartphone as a detector. For multiplex analysis, the SEES using a single electrode is much simpler, cheaper, and more user-friendly than conventional electrochemical systems and bipolar electrochemical systems using electrode arrays. Moreover, SEESs are free from the electrochemiluminescent background problem caused by driving electrodes in bipolar electrochemical systems. Since numerous electrodes and cover materials can be used to fabricate SEESs readily, and electrochemistry is extensively used, SEESs are very promising for broad applications such as drug screening and high-throughput analysis.
An Analysis of the Crash Experience of Vehicles Equipped with Antilock Braking System
DOT National Transportation Integrated Search
1995-06-01
National Center for Statistics and Analysis has recently completed an initial : analysis of the crash experience of passenger cars (PCs) and light trucks and : vans (LTVs) equipped with antilock braking systems (ABS). Four types of crashes : were ide...
Data handling and analysis for the 1971 corn blight watch experiment.
NASA Technical Reports Server (NTRS)
Anuta, P. E.; Phillips, T. L.; Landgrebe, D. A.
1972-01-01
Review of the data handling and analysis methods used in the near-operational test of remote sensing systems provided by the 1971 corn blight watch experiment. The general data analysis techniques and, particularly, the statistical multispectral pattern recognition methods for automatic computer analysis of aircraft scanner data are described. Some of the results obtained are examined, and the implications of the experiment for future data communication requirements of earth resource survey systems are discussed.
Integrated Safety Analysis Teams
NASA Technical Reports Server (NTRS)
Wetherholt, Jonathan C.
2008-01-01
Today's complex systems require understanding beyond one person's capability to comprehend. Each system requires a team to divide the system into understandable subsystems, which can then be analyzed with an Integrated Hazard Analysis. The team must have both specific experiences and diversity of experience; safety experience and system understanding are not always manifested in one individual. Group dynamics make the difference between success and failure, as well as the difference between a difficult task and a rewarding experience. There are examples in the news which demonstrate the need to connect the pieces of a system into a complete picture. The Columbia disaster is now a standard example: a low-consequence hazard in one part of the system, the External Tank, was a catastrophic hazard cause for a companion subsystem, the Space Shuttle Orbiter. The interaction between the hardware, the manufacturing process, the handling, and the operations contributed to the problem. Each of these had analysis performed, but who constituted the team that integrated this analysis together? This paper will explore some of the methods used for dividing up a complex system and how one integration team has analyzed the parts. How this analysis has been documented in one particular launch space vehicle case will also be discussed.
Data processing for a cosmic ray experiment onboard the solar probes Helios 1 and 2: Experiment 6
NASA Technical Reports Server (NTRS)
Mueller-Mellin, R.; Green, G.; Iwers, B.; Kunow, H.; Wibberenz, G.; Fuckner, J.; Hempe, H.; Witte, M.
1982-01-01
The data processing system for the Helios experiment 6, measuring energetic charged particles of solar, planetary and galactic origin in the inner solar system, is described. The aim of this experiment is to extend knowledge on origin and propagation of cosmic rays. The different programs for data reduction, analysis, presentation, and scientific evaluation are described as well as hardware and software of the data processing equipment. A chronological presentation of the data processing operation is given. Procedures and methods for data analysis which were developed can be used with minor modifications for analysis of other space research experiments.
The application of digital techniques to the analysis of metallurgical experiments
NASA Technical Reports Server (NTRS)
Rathz, T. J.
1977-01-01
The application of a specific digital computer system (known as the Image Data Processing System) to the analysis of three NASA-sponsored metallurgical experiments is discussed in some detail. The basic hardware and software components of the Image Data Processing System are presented. Many figures are presented in the discussion of each experimental analysis in an attempt to show the accuracy and speed that the Image Data Processing System affords in analyzing photographic images dealing with metallurgy, and in particular with material processing.
Commissioning of a CERN Production and Analysis Facility Based on xrootd
NASA Astrophysics Data System (ADS)
Campana, Simone; van der Ster, Daniel C.; Di Girolamo, Alessandro; Peters, Andreas J.; Duellmann, Dirk; Coelho Dos Santos, Miguel; Iven, Jan; Bell, Tim
2011-12-01
The CERN facility hosts the Tier-0 of the four LHC experiments, but as part of WLCG it also offers a platform for production activities and user analysis. The CERN CASTOR storage technology has been extensively tested and utilized for LHC data recording and for exporting to external sites according to the experiments' computing models. On the other hand, to accommodate Grid data processing activities and, more importantly, chaotic user analysis, it was realized that additional functionality was needed, including a different throttling mechanism for file access. This paper will describe the xrootd-based CERN production and analysis facility for the ATLAS experiment and in particular the experiment use case and data access scenario, the xrootd redirector setup on top of the CASTOR storage system, the commissioning of the system, and real-life experience with data processing and data analysis.
The Impact of Programming Experience on Successfully Learning Systems Analysis and Design
ERIC Educational Resources Information Center
Wong, Wang-chan
2015-01-01
In this paper, the author reports the results of an empirical study on the relationship between a student's programming experience and their success in a traditional Systems Analysis and Design (SA&D) class where technical skills such as dataflow analysis and entity relationship data modeling are covered. While it is possible to teach these…
Powerful Raman Lidar systems for atmospheric analysis and high-energy physics experiments
NASA Astrophysics Data System (ADS)
Avdikos, George
2015-03-01
In this paper the author presents modern commercial Raman Lidar systems which can be applied to high-energy physics experiments. Raymetrics is a world leader in laser remote sensing (lidar) applications. Its product series includes lidar systems for various applications such as atmospheric analysis and meteorology, and more recently operational applications including volcanic ash detection systems, visual range measurement for airports, etc.
NASA Technical Reports Server (NTRS)
Hinson, E. W.
1981-01-01
The preliminary analysis and data analysis system development for the shuttle upper atmosphere mass spectrometer (SUMS) experiment are discussed. The SUMS experiment is designed to provide free stream atmospheric density, pressure, temperature, and mean molecular weight for the high altitude, high Mach number region.
MECDAS: A distributed data acquisition system for experiments at MAMI
NASA Astrophysics Data System (ADS)
Krygier, K. W.; Merle, K.
1994-02-01
For the coincidence experiments with the three spectrometer setup at MAMI an experiment control and data acquisition system has been built and was put successfully into final operation in 1992. MECDAS is designed as a distributed system using communication via Ethernet and optical links. As the front end, VME bus systems are used for real time purposes and direct hardware access via CAMAC, Fastbus or VMEbus. RISC workstations running UNIX are used for monitoring, data archiving and online and offline analysis of the experiment. MECDAS consists of several fixed programs and libraries, but large parts of readout and analysis can be configured by the user. Experiment specific configuration files are used to generate efficient and powerful code well adapted to special problems without additional programming. The experiment description is added to the raw collection of partially analyzed data to get self-descriptive data files.
An Ion-Selective Electrode/Flow-Injection Analysis Experiment: Determination of Potassium in Serum.
ERIC Educational Resources Information Center
Meyerhoff, Mark E.; Kovach, Paul M.
1983-01-01
Describes a low-cost, senior-level, instrumental analysis experiment in which a home-made potassium tubular flow-through electrode is constructed and incorporated into a flow injection analysis system (FIA). Also describes experiments for evaluating the electrode's response properties, examining basic FIA concepts, and determining potassium in…
Development of Optimal Stressor Scenarios for New Operational Energy Systems
2017-12-01
Analyzing the previous model using a design of experiments (DOE) and regression analysis provides critical information about the associated operational...from experimentation. The resulting system requirements can be used to revisit the design requirements and develop a more robust system. This process...stressor scenarios for acceptance testing.
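The DOE-plus-regression step this abstract mentions can be illustrated with a minimal sketch: fit a linear model with an interaction term to a two-factor factorial design. The design, response values, and coefficient names here are invented for illustration, not taken from the report.

```python
import numpy as np

# Hypothetical 2^2 factorial design in coded units (-1/+1) with an
# invented response (e.g. a measured stressor-scenario outcome).
X = np.array([[-1, -1], [-1, 1], [1, -1], [1, 1]], dtype=float)
y = np.array([10.0, 12.0, 15.0, 21.0])

# Regression model: y ~ b0 + b1*x1 + b2*x2 + b12*x1*x2
design = np.column_stack([np.ones(4), X[:, 0], X[:, 1], X[:, 0] * X[:, 1]])
coef, *_ = np.linalg.lstsq(design, y, rcond=None)
# coef holds [b0, b1, b2, b12]; here [14.5, 3.5, 2.0, 1.0],
# i.e. main effects of 3.5 and 2.0 plus a small interaction.
```

With a saturated two-level design like this, the least-squares coefficients are simply signed averages of the runs, which is why DOE results are easy to interpret as main effects and interactions.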
Magnetic Field Experiment Data Analysis System
NASA Technical Reports Server (NTRS)
Holland, D. B.; Zanetti, L. J.; Suther, L. L.; Potemra, T. A.; Anderson, B. J.
1995-01-01
The Johns Hopkins University Applied Physics Laboratory (JHU/APL) Magnetic Field Experiment Data Analysis System (MFEDAS) has been developed to process and analyze satellite magnetic field experiment data from the TRIAD, MAGSAT, AMPTE/CCE, Viking, Polar BEAR, DMSP, HILAT, UARS, and Freja satellites. The MFEDAS provides extensive data management and analysis capabilities. The system is based on standard data structures and a standard user interface. The MFEDAS has two major elements: (1) a set of satellite unique telemetry processing programs for uniform and rapid conversion of the raw data to a standard format and (2) the program Magplot which has file handling, data analysis, and data display sections. This system is an example of software reuse, allowing new data sets and software extensions to be added in a cost effective and timely manner. Future additions to the system will include the addition of standard format file import routines, modification of the display routines to use a commercial graphics package based on X-Window protocols, and a generic utility for telemetry data access and conversion.
IT Educational Experience and Workforce Development for Information Systems and Technology Students
ERIC Educational Resources Information Center
Legier, John T., Jr.; Soares, Andrey
2014-01-01
This study involves an analysis of a cohort of students during their pursuit of a Bachelor of Science degree in Information Systems Technologies (IST) at a Midwestern university. Demographics and analysis of this cohort include basic demographic information, student home life and personal responsibilities, employment and work experience, and…
ERIC Educational Resources Information Center
Burns, Timothy J.
2012-01-01
This paper reports the results of a survey and follow-up interviews that were administered to instructors of the undergraduate systems analysis and design course, a core course of the Information Systems curriculum. The goal of this research was to learn if the background of the instructor, in terms of industry experience, affects the purpose and…
Principles of cost-benefit analysis for ERTS experiments, volumes 1 and 2
NASA Technical Reports Server (NTRS)
1973-01-01
The basic elements of a cost-benefit study are discussed along with special considerations for ERTS experiments. Elements required for a complete economic analysis of ERTS are considered to be: statement of objectives, specification of assumptions, enumeration of system alternatives, benefit analysis, cost analysis, non-efficiency considerations, and final system selection. A hypothetical cost-benefit example is presented with the assumed objective of an increase in remote sensing surveys of grazing lands to better utilize available forage and lower meat prices.
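The enumerated elements lend themselves to a simple tabular comparison. A toy sketch of the "enumeration of alternatives, benefit analysis, cost analysis, final selection" chain follows; all alternative names and figures are invented, and the efficiency criterion shown ignores the non-efficiency considerations the report includes.

```python
# Toy cost-benefit comparison across system alternatives (all figures hypothetical,
# in arbitrary units of annual benefit and cost).
alternatives = {
    "ground survey only":      {"benefit": 4.0, "cost": 3.5},
    "aircraft scanner":        {"benefit": 6.5, "cost": 4.0},
    "ERTS satellite coverage": {"benefit": 9.0, "cost": 5.5},
}

def net_benefit(entry):
    """Benefit analysis minus cost analysis for one alternative."""
    return entry["benefit"] - entry["cost"]

# Final system selection on the efficiency criterion alone:
best = max(alternatives, key=lambda name: net_benefit(alternatives[name]))
```

Under these invented numbers the satellite option wins with a net benefit of 3.5; a real study would also weigh the non-efficiency considerations before selecting.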
Data Reprocessing on Worldwide Distributed Systems
NASA Astrophysics Data System (ADS)
Wicke, Daniel
The DØ experiment faces many challenges in terms of enabling access to large datasets for physicists on four continents. The strategy for solving these problems on worldwide distributed computing clusters is presented. Since the beginning of Run II of the Tevatron (March 2001), all Monte Carlo simulations for the experiment have been produced at remote systems. For data analysis, a system of regional analysis centers (RACs) was established which supplies the associated institutes with the data. This structure, which is similar to the tiered structure foreseen for the LHC, was used in Fall 2003 to reprocess all DØ data with a much improved version of the reconstruction software. This makes DØ the first running experiment that has implemented and operated all important computing tasks of a high energy physics experiment on systems distributed worldwide.
NASA Technical Reports Server (NTRS)
Williams, Daniel M.; Consiglio, Maria C.; Murdoch, Jennifer L.; Adams, Catherine H.
2005-01-01
This paper provides an analysis of Flight Technical Error (FTE) from recent SATS experiments, called the Higher Volume Operations (HVO) Simulation and Flight experiments, which NASA conducted to determine pilot acceptability of the HVO concept for normal operating conditions. Reported are FTE results from simulation and flight experiment data indicating the SATS HVO concept is viable and acceptable to low-time instrument-rated pilots when compared with today's system (baseline). Described is the comparative FTE analysis of lateral, vertical, and airspeed deviations from the baseline and SATS HVO experimental flight procedures. Based on FTE analysis, all evaluation subjects, low-time instrument-rated pilots, flew the HVO procedures safely and proficiently in comparison to today's system. In all cases, the results of the flight experiment validated the results of the simulation experiment and confirm the utility of the simulation platform for comparative Human-in-the-Loop (HITL) studies of SATS HVO and baseline operations.
NASA Technical Reports Server (NTRS)
Gramling, C. J.; Long, A. C.; Lee, T.; Ottenstein, N. A.; Samii, M. V.
1991-01-01
A Tracking and Data Relay Satellite System (TDRSS) Onboard Navigation System (TONS) is currently being developed by NASA to provide a high-accuracy autonomous navigation capability for users of TDRSS and its successor, the Advanced TDRSS (ATDRSS). The fully autonomous user onboard navigation system will support orbit determination, time determination, and frequency determination, based on observation of a continuously available, unscheduled navigation beacon signal. A TONS experiment will be performed in conjunction with the Explorer Platform (EP) Extreme Ultraviolet Explorer (EUVE) mission to flight qualify TONS Block 1. An overview is presented of TONS and a preliminary analysis of the navigation accuracy anticipated for the TONS experiment. Descriptions of the TONS experiment and the associated navigation objectives, as well as a description of the onboard navigation algorithms, are provided. The accuracy of the selected algorithms is evaluated based on the processing of realistic simulated TDRSS one-way forward-link Doppler measurements. The analysis process is discussed and the associated navigation accuracy results are presented.
NASA Astrophysics Data System (ADS)
Yue, Kang; Wang, Danli; Yang, Xinpan; Hu, Haichen; Liu, Yuqing; Zhu, Xiuqing
2016-10-01
To date, because their application fields differ, most VR-based training systems have differed as well. We should therefore take the characteristics of the application field into consideration and adopt different evaluation methods when evaluating the user experience of these training systems. In this paper, we propose a method to evaluate the user experience of a virtual astronaut training system, and we design an experiment based on the proposed method. The proposed method takes learning performance as one of the evaluation dimensions and combines it with others such as presence, immersion, pleasure, satisfaction, and fatigue to evaluate the user experience of the system. We collect subjective and objective data. The subjective data come mainly from a questionnaire designed around the evaluation dimensions and from user interviews conducted before and after the experiment, while the objective data consist of electrocardiogram (ECG) recordings, reaction times, numbers of reaction errors, and the video data recorded during the experiment. For the data analysis, we calculate an integrated score for each evaluation dimension using factor analysis. To improve the credibility of the assessment, we use the ECG signal and reaction test data from before and after the experiment to validate the changes in fatigue during the experiment, and the typical behavioral features extracted from the experiment video to explain the results of the subjective questionnaire. Experimental results show that the system provides a good user experience and learning performance, but slight visual fatigue exists after the experiment.
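The scoring step described above (reducing several questionnaire items to one integrated score per evaluation dimension) can be sketched as follows. This is not the authors' code: the response data are random stand-ins, and the first principal component is used here as a simple one-factor stand-in for a full factor-analysis model.

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical data: 20 participants x 6 questionnaire items for one
# evaluation dimension (e.g. "presence"), each rated on a 1-7 scale.
responses = rng.integers(1, 8, size=(20, 6)).astype(float)

# Center the items, then project participants onto the first principal
# component as a one-factor approximation; each participant gets one
# score summarizing the dimension.
centered = responses - responses.mean(axis=0)
u, s, vt = np.linalg.svd(centered, full_matrices=False)
scores = centered @ vt[0]   # shape (20,): one score per participant
```

A dimension-level summary (e.g. `scores.mean()` per experimental group) could then be compared across conditions, which is the role the integrated scores play in the evaluation.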
Hsu, Chi-Lin; Chou, Chih-Hsuan; Huang, Shih-Chuan; Lin, Chia-Yi; Lin, Meng-Ying; Tung, Chun-Che; Lin, Chun-Yen; Lai, Ivan Pochou; Zou, Yan-Fang; Youngson, Neil A; Lin, Shau-Ping; Yang, Chang-Hao; Chen, Shih-Kuo; Gau, Susan Shur-Fen; Huang, Hsien-Sung
2018-03-15
Visual system development is light-experience dependent, which strongly implicates epigenetic mechanisms in light-regulated maturation. Among many epigenetic processes, genomic imprinting is an epigenetic mechanism through which monoallelic gene expression occurs in a parent-of-origin-specific manner. It is unknown if genomic imprinting contributes to visual system development. We profiled the transcriptome and imprintome during critical periods of mouse visual system development under normal- and dark-rearing conditions using B6/CAST F1 hybrid mice. We identified experience-regulated, isoform-specific and brain-region-specific imprinted genes. We also found imprinted microRNAs were predominantly clustered into the Dlk1-Dio3 imprinted locus with light experience affecting some imprinted miRNA expression. Our findings provide the first comprehensive analysis of light-experience regulation of the transcriptome and imprintome during critical periods of visual system development. Our results may contribute to therapeutic strategies for visual impairments and circadian rhythm disorders resulting from a dysfunctional imprintome.
Computer-aided visualization and analysis system for sequence evaluation
Chee, M.S.
1998-08-18
A computer system for analyzing nucleic acid sequences is provided. The computer system is used to perform multiple methods for determining unknown bases by analyzing the fluorescence intensities of hybridized nucleic acid probes. The results of individual experiments are improved by processing nucleic acid sequences together. Comparative analysis of multiple experiments is also provided by displaying reference sequences in one area and sample sequences in another area on a display device. 27 figs.
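The base-calling idea in this abstract (determining an unknown base from the fluorescence intensities of hybridized probes) can be illustrated with a minimal sketch. The quartet-of-probes layout, intensity values, and `min_ratio` threshold are assumptions for illustration, not the patented method.

```python
# Hypothetical sketch: call each base from the intensities of four probes
# (one per base); the brightest probe wins, with 'N' when no probe
# clearly dominates its runner-up.

def call_bases(intensities, min_ratio=1.2):
    """intensities: list of dicts mapping base -> fluorescence intensity."""
    calls = []
    for quartet in intensities:
        ranked = sorted(quartet.items(), key=lambda kv: kv[1], reverse=True)
        (best, top), (_, runner_up) = ranked[0], ranked[1]
        # Require the top probe to outshine the runner-up by min_ratio.
        calls.append(best if top >= min_ratio * runner_up else "N")
    return "".join(calls)

seq = call_bases([
    {"A": 120, "C": 30, "G": 25, "T": 40},   # clear A
    {"A": 55, "C": 60, "G": 58, "T": 52},    # ambiguous -> N
    {"A": 10, "C": 15, "G": 90, "T": 20},    # clear G
])
# seq == "ANG"
```

The abstract's point about processing sequences together would correspond to calling several such experiments jointly so that ambiguous positions in one can be resolved by the others.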
Computer-aided visualization and analysis system for sequence evaluation
Chee, Mark S.; Wang, Chunwei; Jevons, Luis C.; Bernhart, Derek H.; Lipshutz, Robert J.
2004-05-11
A computer system for analyzing nucleic acid sequences is provided. The computer system is used to perform multiple methods for determining unknown bases by analyzing the fluorescence intensities of hybridized nucleic acid probes. The results of individual experiments are improved by processing nucleic acid sequences together. Comparative analysis of multiple experiments is also provided by displaying reference sequences in one area and sample sequences in another area on a display device.
Computer-aided visualization and analysis system for sequence evaluation
Chee, Mark S.
1998-08-18
A computer system for analyzing nucleic acid sequences is provided. The computer system is used to perform multiple methods for determining unknown bases by analyzing the fluorescence intensities of hybridized nucleic acid probes. The results of individual experiments are improved by processing nucleic acid sequences together. Comparative analysis of multiple experiments is also provided by displaying reference sequences in one area and sample sequences in another area on a display device.
Computer-aided visualization and analysis system for sequence evaluation
Chee, Mark S.
2003-08-19
A computer system for analyzing nucleic acid sequences is provided. The computer system is used to perform multiple methods for determining unknown bases by analyzing the fluorescence intensities of hybridized nucleic acid probes. The results of individual experiments may be improved by processing nucleic acid sequences together. Comparative analysis of multiple experiments is also provided by displaying reference sequences in one area and sample sequences in another area on a display device.
A Modular Artificial Intelligence Inference Engine System (MAIS) for support of on orbit experiments
NASA Technical Reports Server (NTRS)
Hancock, Thomas M., III
1994-01-01
This paper describes a Modular Artificial Intelligence Inference Engine System (MAIS) support tool that would provide health and status monitoring, cognitive replanning, analysis and support of on-orbit Space Station, Spacelab experiments and systems.
Data Driven Trigger Design and Analysis for the NOvA Experiment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kurbanov, Serdar
This thesis primarily describes an analysis of the Moon shadow in cosmic rays, an analysis using upward-going muon trigger data, and other work done as part of MSc thesis work conducted at Fermi National Accelerator Laboratory. While at Fermilab I made hardware and software contributions to two experiments, NOvA and Mu2e. NOvA is a neutrino experiment whose primary goal is measuring parameters related to neutrino oscillation. It is a running experiment, so it is possible to provide analysis of real beam and cosmic data. Most of this work was related to the Data-Driven Trigger (DDT) system of NOvA. The results of the upward-going muon analysis were presented at ICHEP in August 2016. The analysis demonstrates a proof of principle for a low-mass dark matter search. Mu2e is an experiment currently being built at Fermilab. Its primary goal is to detect the hypothetical neutrinoless conversion of a muon into an electron. I contributed to the production and tests of Cathode Strip Chambers (CSCs), which are required for testing the Cosmic Ray Veto (CRV) system for the experiment. This contribution is described in the last chapter, along with a short description of the technical work provided for the DDT system of the NOvA experiment. All of the work described in this thesis will be extended by the next generation of UVA graduate students and postdocs as new data are collected by the experiments. I hope my efforts have helped lay the foundation for many years of beautiful results from Mu2e and NOvA.
Computer-aided visualization and analysis system for sequence evaluation
Chee, Mark S.
1999-10-26
A computer system (1) for analyzing nucleic acid sequences is provided. The computer system is used to perform multiple methods for determining unknown bases by analyzing the fluorescence intensities of hybridized nucleic acid probes. The results of individual experiments may be improved by processing nucleic acid sequences together. Comparative analysis of multiple experiments is also provided by displaying reference sequences in one area (814) and sample sequences in another area (816) on a display device (3).
Computer-aided visualization and analysis system for sequence evaluation
Chee, Mark S.
2001-06-05
A computer system (1) for analyzing nucleic acid sequences is provided. The computer system is used to perform multiple methods for determining unknown bases by analyzing the fluorescence intensities of hybridized nucleic acid probes. The results of individual experiments may be improved by processing nucleic acid sequences together. Comparative analysis of multiple experiments is also provided by displaying reference sequences in one area (814) and sample sequences in another area (816) on a display device (3).
Space Construction System Analysis. Part 2: Executive summary
NASA Technical Reports Server (NTRS)
1980-01-01
A detailed, end-to-end analysis of the activities, techniques, equipment and Shuttle provisions required to construct a reference project system is described. Included are: platform definition; construction analysis; cost and programmatics; and space construction experiments concepts.
Spectral atmospheric observations at Nantucket Island, May 7-14, 1981
NASA Technical Reports Server (NTRS)
Talay, T. A.; Poole, L. R.
1981-01-01
An experiment was conducted by the NASA Langley Research Center to measure atmospheric optical conditions using a 10-channel solar spectral photometer system. This experiment was part of a larger series of multidisciplinary experiments performed in the area of Nantucket Shoals aimed at studying the dynamics of phytoplankton production processes. Analysis of the collected atmospheric data yields total and aerosol optical depths, transmittances, normalized sky radiance distributions, and total and sky irradiances. Results of this analysis may aid in atmospheric corrections of remote sensor data obtained by several sensors overflying the Nantucket Shoals area. Recommendations are presented concerning future experiments using the described solar photometer system and concerning calibration and operational deficiencies uncovered during the experiment.
NASA Astrophysics Data System (ADS)
Simola, Kaisa; Laakso, Kari
1992-01-01
Eight years of operating experiences of 104 motor operated closing valves in different safety systems in nuclear power units were analyzed in a systematic way. The qualitative methods used were Failure Mode and Effect Analysis (FMEA) and Maintenance Effects and Criticality Analysis (MECA). These reliability engineering methods are commonly used in the design stage of equipment. The successful application of these methods for analysis and utilization of operating experiences was demonstrated.
Loran digital phase-locked loop and RF front-end system error analysis
NASA Technical Reports Server (NTRS)
Mccall, D. L.
1979-01-01
An analysis of the system performance of the digital phase-locked loops (DPLLs) and the RF front end implemented in the MINI-L4 Loran receiver is presented. Three of the four experiments deal with the performance of the digital phase-locked loops. The fourth deals with the RF front end and the DPLL system errors that arise in the front end due to poor signal-to-noise ratios. The ability of the DPLLs to track the offsets is studied.
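As background on the offset-tracking behavior studied in these experiments, a first-order digital phase-locked loop can be sketched in a few lines. This is a generic illustration of the loop structure under an assumed loop gain, not the MINI-L4 implementation.

```python
def dpll_track(phase_in, gain=0.2):
    """First-order DPLL: phase detector measures the error between the input
    phase and the local estimate; the loop filter/NCO update moves the
    estimate a fraction (gain) of that error each sample."""
    est = 0.0
    out = []
    for p in phase_in:
        err = p - est        # phase detector
        est += gain * err    # first-order loop update
        out.append(est)
    return out

# step phase offset of 1.0 rad: the residual error decays as (1 - gain)**n
track = dpll_track([1.0] * 40)
print(abs(track[-1] - 1.0) < 1e-3)  # True (the loop has pulled in the offset)
```

With noisy input, lowering the gain averages over more samples at the cost of slower acquisition, which is the trade-off that poor signal-to-noise ratios force on the front end.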
Overview of the systems special investigation. [long duration exposure facility
NASA Technical Reports Server (NTRS)
Mason, James B.; Dursch, Harry; Edelman, Joel
1992-01-01
The Systems Special Investigation Group (SIG), formed by the Long Duration Exposure Facility (LDEF) Project Office to perform post flight analysis of systems hardware, was chartered to investigate the effects of the extended LDEF mission on both satellite and experiment systems and to coordinate and integrate all systems analysis performed in post flight investigations. Almost all of the top level functional testing of the active experiments has been completed, but many components are still under investigation by either the Systems SIG or individual experimenters. Results reported to date have been collected and integrated by the Systems SIG and an overview of the current results and the status of the Systems Investigation are presented in this paper.
Lee, Yu-Hao; Hsieh, Ya-Ju; Shiah, Yung-Jong; Lin, Yu-Huei; Chen, Chiao-Yun; Tyan, Yu-Chang; GengQiu, JiaCheng; Hsu, Chung-Yao; Chen, Sharon Chia-Ju
2017-01-01
Quantitating the meditation experience is a subjective and complex issue because it is confounded by many factors, such as emotional state, method of meditation, and personal physical condition. In this study, we propose a strategy with a cross-sectional analysis to evaluate the meditation experience with 2 artificial intelligence techniques: artificial neural network and support vector machine. Within this analysis system, 3 features of the electroencephalography alpha spectrum and variant normalizing scaling are manipulated as the evaluating variables for detection accuracy. Thereafter, by modulating the sliding window (the period of the analyzed data) and the shifting interval of the window (the time interval by which the analyzed data are shifted), the effect of immediate analysis for the 2 methods is compared. This analysis system is applied to 3 meditation groups, categorizing their meditation experiences in 10-year intervals from novice to junior to senior. After exhaustive calculation and cross-validation across all variables, a high accuracy rate (>98%) is achievable under the criterion of a 0.5-minute sliding window and a 2-second shifting interval for both methods. In short, the minimum analyzable data length is 0.5 minute and the minimum recognizable temporal resolution is 2 seconds in the decision of meditative classification. Our proposed classifier of the meditation experience enables a rapid evaluation system to distinguish meditation experience and a beneficial utilization of artificial intelligence techniques for big-data analysis. PMID:28422856
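The sliding-window criterion reported in the abstract (0.5-minute window, 2-second shift) amounts to standard overlapping segmentation of the EEG record before feature extraction. A minimal sketch follows; the 256 Hz sampling rate and 5-minute record length are illustrative assumptions, not parameters from the study.

```python
def sliding_windows(n_samples, fs, window_s=30.0, shift_s=2.0):
    """Start/end sample indices of analysis windows: a 30 s (0.5-minute)
    window advanced by a 2 s shifting interval, as in the reported criterion."""
    w, s = int(window_s * fs), int(shift_s * fs)
    return [(start, start + w) for start in range(0, n_samples - w + 1, s)]

# hypothetical 5 minutes of EEG sampled at 256 Hz
windows = sliding_windows(n_samples=5 * 60 * 256, fs=256)
print(len(windows))  # 136
```

Each window would then yield one alpha-spectrum feature vector for the neural network or support vector machine, so the 2 s shift sets the temporal resolution of the classifier's decisions.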
NASA Astrophysics Data System (ADS)
Choi, D. H.; An, Y. H.; Chung, K. J.; Hwang, Y. S.
2012-01-01
A 94 GHz heterodyne interferometer system was designed to measure the plasma density of VEST (Versatile Experiment Spherical Torus), which was recently built at Seoul National University. Two 94 GHz Gunn oscillators with a frequency difference of 40 MHz were used in the microwave electronics part of a heterodyne interferometer system. A compact beam focusing system utilizing a pair of plano-convex lenses and a concave mirror was designed to maximize the effective beam reception and spatial resolution. Beam path analysis based on Gaussian optics was used in the design of the beam focusing system. The design of the beam focusing system and the beam path analysis were verified with a couple of experiments that were done within an experimental framework that considered the real dimensions of a vacuum vessel. Optimum distances between the optical components and the beam radii along the beam path obtained from the experiments were in good agreement with the beam path analysis using the Gaussian optics. Both experimentation and numerical calculations confirmed that the designed beam focusing system maximized the spatial resolution of the measurement; moreover, the beam waist was located at the center of the plasma to generate a phase shift more effectively in plasmas. The interferometer system presented in this paper is expected to be used in the measurements of line integrated plasma densities during the start-up phase of VEST.
NASA Astrophysics Data System (ADS)
Varghese, Nishad G.
Knowledge management (KM) exists in various forms throughout organizations. Process documentation, training courses, and experience sharing are examples of KM activities performed daily. The goal of KM systems (KMS) is to provide a tool set which serves to standardize the creation, sharing, and acquisition of business-critical information. Existing literature provides numerous examples of targeted evaluations of KMS, focusing on specific system attributes. This research serves to bridge the targeted evaluations with an industry-specific, holistic approach. The user preferences of aerospace employees in engineering and engineering-related fields were compared to profiles of existing aerospace KMS based on three attribute categories: technical features, system administration, and user experience. The results indicated there is a statistically significant difference between aerospace user preferences and existing profiles in the user experience attribute category, but no statistically significant difference in the technical features and system administration attribute categories. Additional analysis indicated in-house developed systems exhibit higher technical features and user experience ratings than commercial-off-the-shelf (COTS) systems.
NASA Astrophysics Data System (ADS)
Azarov, A. V.; Zhukova, N. S.; Kozlovtseva, E. Yu; Dobrinsky, D. R.
2018-05-01
The article considers obtaining mathematical models to assess the efficiency of dust collectors using the integrated analysis and data management system STATISTICA Design of Experiments. The procedure for obtaining mathematical models and processing data is illustrated by the example of laboratory studies on a mounted installation containing a dust collector in counter-swirling flows (CSF), using gypsum dust of various fractions. Experimental studies were planned so as to reduce the number of experiments and the cost of experimental research. A second-order incomplete three-level design (Box-Behnken design) was used, which reduced the number of trials from 81 to 27. The procedure for statistical analysis of the Box-Behnken design data using standard tools of the integrated analysis and data management system STATISTICA Design of Experiments is considered. Results of statistical data processing, with estimates of the significance of the coefficients and the adequacy of the mathematical models, are presented.
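The reported reduction from 81 to 27 trials is consistent with a four-factor, three-level study: a full factorial needs 3^4 = 81 runs, while a Box-Behnken design pairs factors at their extreme levels with the rest at center and adds a few center points. The sketch below generates such a design in coded units; the choice of 3 center points is an assumption that reproduces the 27-run count.

```python
from itertools import combinations

def box_behnken(k, n_center=3):
    """Box-Behnken design for k three-level factors in coded units (-1, 0, +1):
    for each pair of factors, run the four (+/-1, +/-1) combinations with all
    other factors at their center level, then append center-point runs."""
    runs = []
    for i, j in combinations(range(k), 2):
        for a in (-1, 1):
            for b in (-1, 1):
                run = [0] * k
                run[i], run[j] = a, b
                runs.append(run)
    runs.extend([[0] * k for _ in range(n_center)])
    return runs

design = box_behnken(4)   # 6 factor pairs x 4 corner combos + 3 center points
print(len(design))        # 27, versus 3**4 = 81 for the full factorial
```

Each row is one trial of the dust-collector rig, and the resulting data support fitting the second-order (quadratic) efficiency models mentioned in the abstract.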
BATSE spectroscopy analysis system
NASA Technical Reports Server (NTRS)
Schaefer, Bradley E.; Bansal, Sandhia; Basu, Anju; Brisco, Phil; Cline, Thomas L.; Friend, Elliott; Laubenthal, Nancy; Panduranga, E. S.; Parkar, Nuru; Rust, Brad
1992-01-01
The Burst and Transient Source Experiment (BATSE) Spectroscopy Analysis System (BSAS) is the software system which is the primary tool for the analysis of spectral data from BATSE. As such, Guest Investigators and the community as a whole need to know its basic properties and characteristics. Described here are the characteristics of the BATSE spectroscopy detectors and the BSAS.
Spectral Analysis of Forecast Error Investigated with an Observing System Simulation Experiment
NASA Technical Reports Server (NTRS)
Prive, N. C.; Errico, Ronald M.
2015-01-01
The spectra of analysis and forecast error are examined using the observing system simulation experiment (OSSE) framework developed at the National Aeronautics and Space Administration Global Modeling and Assimilation Office (NASA GMAO). A global numerical weather prediction model, the Goddard Earth Observing System version 5 (GEOS-5) with Gridpoint Statistical Interpolation (GSI) data assimilation, is cycled for two months with once-daily forecasts to 336 hours to generate a control case. Verification of forecast errors using the Nature Run as truth is compared with verification of forecast errors using self-analysis; significant underestimation of forecast errors is seen using self-analysis verification for up to 48 hours. Likewise, self-analysis verification significantly overestimates the error growth rates of the early forecast, as well as mischaracterizing the spatial scales at which the strongest growth occurs. The Nature Run-verified error variances exhibit a complicated progression of growth, particularly for low-wavenumber errors. In a second experiment, cycling of the model and data assimilation over the same period is repeated, but using synthetic observations with different explicitly added observation errors having the same error variances as the control experiment, thus creating a different realization of the control. The forecast errors of the two experiments become more correlated during the early forecast period, with correlations increasing for up to 72 hours before beginning to decrease.
Five Years Experience with the CLINFO Data Base Management and Analysis System
Johnston, Howard B.; Higgins, Stanley B.; Harris, Thomas R.; Lacy, William W.
1982-01-01
The CLINFO data base management and analysis system is the result of a project sponsored by the National Institutes of Health (NIH) to identify data management and data analysis activities that are critical to clinical investigation. In February of 1977, one of the three prototype CLINFO systems developed by the RAND Corporation was installed in the Clinical Research Center (CRC) at Vanderbilt University Medical Center. The Vanderbilt experience with this CLINFO system over the past five years is described. Its impact on the way clinical research data has been managed and analyzed is discussed in terms of utilization by more than 100 clinical investigators and their staff. The Vanderbilt evaluation of the system and additional information on its usage since the original evaluation is presented. Factors in the design philosophy of CLINFO which create an environment that enhances the clinical investigator's capabilities to perform computer data management and analysis of his data are discussed.
Data handling and analysis for the 1971 corn blight watch experiment
NASA Technical Reports Server (NTRS)
Anuta, P. E.; Phillips, T. L.
1973-01-01
The overall corn blight watch experiment data flow is described and the organization of the LARS/Purdue data center is discussed. Data analysis techniques are discussed in general, and the use of statistical multispectral pattern recognition methods for automatic computer analysis of aircraft scanner data is described. Some of the results obtained are discussed, as are the implications of the experiment for future data communication requirements for earth resource survey systems.
Compact Microscope Imaging System Developed
NASA Technical Reports Server (NTRS)
McDowell, Mark
2001-01-01
The Compact Microscope Imaging System (CMIS) is a diagnostic tool with intelligent controls for use in space, industrial, medical, and security applications. The CMIS can be used in situ with a minimum amount of user intervention. This system, which was developed at the NASA Glenn Research Center, can scan, find areas of interest, focus, and acquire images automatically. Large numbers of multiple cell experiments require microscopy for in situ observations; this is only feasible with compact microscope systems. CMIS is a miniature machine vision system that combines intelligent image processing with remote control capabilities. The software also has a user-friendly interface that can be used independently of the hardware for post-experiment analysis. CMIS has potential commercial uses in the automated online inspection of precision parts, medical imaging, security industry (examination of currency in automated teller machines and fingerprint identification in secure entry locks), environmental industry (automated examination of soil/water samples), biomedical field (automated blood/cell analysis), and microscopy community. CMIS will improve research in several ways: It will expand the capabilities of MSD experiments utilizing microscope technology. It may be used in lunar and Martian experiments (Rover Robot). Because of its reduced size, it will enable experiments that were not feasible previously. It may be incorporated into existing shuttle orbiter and space station experiments, including glove-box-sized experiments as well as ground-based experiments.
Research program for experiment M133
NASA Technical Reports Server (NTRS)
Frost, J. D., Jr.
1972-01-01
The development of the automatic data-acquisition and sleep-analysis system is reported. The purpose was consultation and evaluation in the transition of the Skylab M133 Sleep-Monitoring Experiment equipment from prototype to flight status; review of problems associated with acquisition and on-line display of data in near-real time via spacecraft telemetry; and development of laboratory facilities and design of equipment to assure reliable playback and analysis of analog data. The existing prototype system was modified, and the changes improve the performance of the analysis circuitry and increase its reliability. These modifications are useful for pre- and postflight analysis but are not now proposed for the inflight system. There were improvements in the EEG recording cap, some of which will be incorporated into the flight hardware.
NASA Technical Reports Server (NTRS)
Torres-Pomales, Wilfredo
2012-01-01
Preliminary data analysis for a physical fault injection experiment of a digital system exposed to High Intensity Radiated Fields (HIRF) in an electromagnetic reverberation chamber suggests a direct causal relation between the time profile of the field strength amplitude in the chamber and the severity of observed effects at the outputs of the radiated system. This report presents an analysis of the field strength modulation induced by the movement of the field stirrers in the reverberation chamber. The analysis is framed as a characterization of the discrete features of the field strength waveform responsible for the faults experienced by a radiated digital system. The results presented here will serve as a basis to refine the approach for a detailed analysis of HIRF-induced upsets observed during the radiation experiment. This work offers a novel perspective into the use of an electromagnetic reverberation chamber to generate upset-inducing stimuli for the study of fault effects in digital systems.
ERIC Educational Resources Information Center
Bermejo, Belen G.; Mateos, Pedro M.; Sanchez-Mateos, Juan Degado
2014-01-01
The present study provides information on the emotional experience of people with intellectual disability. To evaluate this emotional experience, we have used the International Affective Pictures System (IAPS). The most important result from this study is that the emotional reaction of people with intellectual disability to affective stimuli is…
The House System: Evaluating Its Role in the Experience of Business Students
ERIC Educational Resources Information Center
Antoniadou, Marilena
2017-01-01
This paper presents a case study of students' experiences of the House system, an innovative scheme introduced for business students, aiming to enhance student experience. The findings are based on a survey of 350 students and 4 group interviews. Analysis of the findings, both statistical and qualitative, indicated perceived clear benefits for the…
Electric Vehicle Grid Experiments and Analysis
DOT National Transportation Integrated Search
2018-02-02
This project developed a low-cost building energy management system (EMS) and conducted vehicle-to-grid (V2G) experiments on a commercial office building. The V2G effort included the installation and operation of a Princeton Power System CA-30 bi-dire...
ERIC Educational Resources Information Center
Dawson, Shane; Heathcote, Liz; Poole, Gary
2010-01-01
Purpose: This paper aims to examine how effective higher education institutions have been in harnessing the data capture mechanisms from their student information systems, learning management systems and communication tools for improving the student learning experience and informing practitioners of the achievement of specific learning outcomes.…
2009-12-01
Research and Predictability Experiment (THORPEX) Pacific Asian Regional Campaigns (T-PARC). Aircraft dropwindsondes, special ELDORA radar observations...systems within TCS025 at 2030 UTC 24 August 2008. D. ELDORA BACKGROUND For the combined TCS08 and T-PARC field experiment, the ELDORA radar was...SUBJECT TERMS Electra Doppler Radar (ELDORA), Tropical Cyclone Structure (TCS08), TCS08, Tropical Cyclone Formation, Tropical Circulation System
Lin, Weilu; Wang, Zejian; Huang, Mingzhi; Zhuang, Yingping; Zhang, Siliang
2018-06-01
The isotopically non-stationary 13C labelling experiments, as an emerging experimental technique, can estimate the intracellular fluxes of the cell culture under an isotopic transient period. However, to the best of our knowledge, the issue of the structural identifiability analysis of non-stationary isotope experiments is not well addressed in the literature. In this work, the local structural identifiability analysis for non-stationary cumomer balance equations is conducted based on the Taylor series approach. The numerical rank of the Jacobian matrices of the finite extended time derivatives of the measured fractions with respect to the free parameters is taken as the criterion. It turns out that only one single time point is necessary to achieve the structural identifiability analysis of the cascaded linear dynamic system of non-stationary isotope experiments. The equivalence between the local structural identifiability of the cascaded linear dynamic systems and the local optimum condition of the nonlinear least squares problem is elucidated in the work. Optimal measurements sets can then be determined for the metabolic network. Two simulated metabolic networks are adopted to demonstrate the utility of the proposed method. Copyright © 2018 Elsevier Inc. All rights reserved.
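The identifiability criterion described above, the numerical rank of the Jacobian of the measured output's time derivatives with respect to the free parameters, can be illustrated on a toy cascaded linear system. The two-pool cascade, rate constants, and initial state below are hypothetical stand-ins for a cumomer network, not the paper's model.

```python
import numpy as np

def taylor_jacobian(theta, x0, n_deriv=4, eps=1e-6):
    """Stack the first n_deriv time derivatives of the measured output at t=0
    (y^(n)(0) = C A(theta)^n x0 for a linear cascade) and differentiate them
    with respect to the free parameters by central finite differences."""
    def derivs(th):
        k1, k2 = th
        # hypothetical 2-pool cascade: pool 1 -> pool 2, only pool 2 measured
        A = np.array([[-k1, 0.0], [k1, -k2]])
        C = np.array([[0.0, 1.0]])
        out, v = [], x0.copy()
        for _ in range(n_deriv):
            v = A @ v                # each step raises the derivative order
            out.append((C @ v).item())
        return np.array(out)

    J = np.zeros((n_deriv, len(theta)))
    for p in range(len(theta)):
        tp, tm = np.array(theta, float), np.array(theta, float)
        tp[p] += eps
        tm[p] -= eps
        J[:, p] = (derivs(tp) - derivs(tm)) / (2 * eps)
    return J

J = taylor_jacobian(theta=[0.7, 0.3], x0=np.array([1.0, 0.0]))
rank = np.linalg.matrix_rank(J, tol=1e-8)
print(rank)  # 2: both rate constants are locally structurally identifiable
```

Full column rank of this Jacobian at a single time point is exactly the single-time-point criterion the abstract highlights; a rank deficit would flag a locally non-identifiable parameter combination.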
Conceptual Design and Analysis of Orbital Cryogenic Liquid Storage and Supply Systems.
1981-05-01
MCR-79-561, Martin Marietta Corporation, June 1979. 5. Tegart, J. R.: Hydrodynamic Analysis Report - Cryogenic Fluid Management...Experiment, MCR-79-563, Martin Marietta Corporation, June 1979 (Contract NAS3-21591). 6. Gille, J. P.: Thermal Analysis Report - Cryogenic Fluid Management...Analysis Report - Cryogenic Fluid Management Experiment, MCR-79-567, Martin Marietta Corporation, June 1979 (Contract NAS3-21591). 8. "Low
13C-based metabolic flux analysis: fundamentals and practice.
Yang, Tae Hoon
2013-01-01
Isotope-based metabolic flux analysis is one of the emerging technologies applied to system level metabolic phenotype characterization in metabolic engineering. Among the developed approaches, (13)C-based metabolic flux analysis has been established as a standard tool and has been widely applied to quantitative pathway characterization of diverse biological systems. To implement (13)C-based metabolic flux analysis in practice, comprehending the underlying mathematical and computational modeling fundamentals is of importance along with carefully conducted experiments and analytical measurements. Such knowledge is also crucial when designing (13)C-labeling experiments and properly acquiring key data sets essential for in vivo flux analysis implementation. In this regard, the modeling fundamentals of (13)C-labeling systems and analytical data processing are the main topics we will deal with in this chapter. Along with this, the relevant numerical optimization techniques are addressed to help implementation of the entire computational procedures aiming at (13)C-based metabolic flux analysis in vivo.
Applying Real-Time UML: Real-World Experiences
NASA Astrophysics Data System (ADS)
Cooling, Niall; Pachschwoell, Stefan
2004-06-01
This paper presents Austrian Aerospace's experiences of applying UML for the design of an embedded real-time avionics system based on Feabhas' "Pragma Process". It describes the complete lifecycle from adoption of UML, through training, CASE-tool selection, system analysis, and software design and development of the project itself. It concludes by reflecting on the experiences obtained and some lessons learnt.
Observing system simulation experiments with multiple methods
NASA Astrophysics Data System (ADS)
Ishibashi, Toshiyuki
2014-11-01
An Observing System Simulation Experiment (OSSE) is a method to evaluate impacts of hypothetical observing systems on analysis and forecast accuracy in numerical weather prediction (NWP) systems. Since an OSSE requires simulations of hypothetical observations, the uncertainty of OSSE results is generally larger than that of observing system experiments (OSEs). To reduce such uncertainty, OSSEs for existing observing systems are often carried out as calibration of the OSSE system. The purpose of this study is to achieve reliable OSSE results based on the results of OSSEs with multiple methods. There are three types of OSSE methods. The first is the sensitivity observing system experiment (SOSE) based OSSE (SOSE-OSSE). The second is the ensemble of data assimilation cycles (ENDA) based OSSE (ENDA-OSSE). The third is the nature-run (NR) based OSSE (NR-OSSE). These three OSSE methods have very different properties. The NR-OSSE evaluates hypothetical observations in a virtual (hypothetical) world, the NR. The ENDA-OSSE is a very simple method but has a sampling-error problem due to the small ensemble size. The SOSE-OSSE requires a highly accurate analysis field as a pseudo-truth of the real atmosphere. We construct these three types of OSSE methods in the Japan Meteorological Agency (JMA) global 4D-Var experimental system. At the conference, we will present initial results of these OSSE systems and their comparisons.
NASA Technical Reports Server (NTRS)
Prive, N. C.; Errico, R. M.; Tai, K.-S.
2013-01-01
The Global Modeling and Assimilation Office (GMAO) observing system simulation experiment (OSSE) framework is used to explore the response of analysis error and forecast skill to observation quality. In an OSSE, synthetic observations may be created that have much smaller error than real observations, and precisely quantified error may be applied to these synthetic observations. Three experiments are performed in which synthetic observations with magnitudes of applied observation error that vary from zero to twice the estimated realistic error are ingested into the Goddard Earth Observing System Model (GEOS-5) with Gridpoint Statistical Interpolation (GSI) data assimilation for a one-month period representing July. The analysis increment and observation innovation are strongly impacted by observation error, with much larger variances for increased observation error. The analysis quality is degraded by increased observation error, but the change in root-mean-square error of the analysis state is small relative to the total analysis error. Surprisingly, in the 120-hour forecast, increased observation error yields only a slight decline in forecast skill in the extratropics, and no discernible degradation of forecast skill in the tropics.
Zhu, Linlin; Nie, Yaoxin; Chang, Chunqi; Gao, Jia-Hong; Niu, Zhendong
2014-06-01
The neural systems for phonological processing of written language have been well identified, but models based on these neural systems differ across language systems and age groups. Although each such model is largely concordant across experiments, the results are sensitive to experiment design and intersubject variability. Activation likelihood estimation (ALE) meta-analysis can quantitatively synthesize data from multiple studies and minimize interstudy and intersubject differences. In this study, we performed two ALE meta-analysis experiments: one examined the neural activation patterns of phonological processing in two different types of written languages, and the other examined the developmental characteristics of such neural activation patterns based on both alphabetic-language and logographic-language data. The results of our first meta-analysis experiment were consistent with the meta-analysis based on studies published before 2005. There were new findings in our second meta-analysis experiment, where both the adult and child groups showed strong activation in the left frontal lobe, the left superior/middle temporal gyrus, and the bilateral middle/superior occipital gyrus. However, activation of the left middle/inferior frontal gyrus was found to increase with development, and activation was found to decrease in the following areas: the right claustrum and inferior frontal gyrus, the left inferior/medial frontal gyrus, the left middle/superior temporal gyrus, the right cerebellum, and the bilateral fusiform gyrus. It seems that adults involve more phonological areas, whereas children involve more orthographic and semantic areas. Copyright © 2013 Wiley Periodicals, Inc.
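The quantitative synthesis step of ALE can be sketched compactly: each experiment's reported foci are smoothed into a modeled-activation map, and the ALE value at a voxel is the probabilistic union across experiments. The grid size, focus coordinates, and Gaussian width below are arbitrary illustrative choices, and real ALE additionally involves permutation-based thresholding, which is omitted here.

```python
import numpy as np

def modeled_activation(shape, foci, sigma):
    """Per-experiment modeled-activation (MA) map: each reported focus is
    smoothed with an isotropic Gaussian, and a voxel takes the maximum
    value over the experiment's foci."""
    grid = np.indices(shape).reshape(len(shape), -1).T
    ma = np.zeros(shape)
    for f in foci:
        d2 = ((grid - np.asarray(f)) ** 2).sum(axis=1).reshape(shape)
        ma = np.maximum(ma, np.exp(-d2 / (2 * sigma ** 2)))
    return ma

def ale_map(ma_maps):
    """ALE map: voxelwise union of the MA probabilities, 1 - prod(1 - MA_i)."""
    return 1.0 - np.prod(1.0 - np.stack(ma_maps), axis=0)

exp1 = modeled_activation((16, 16, 16), [(4, 4, 4)], sigma=2.0)
exp2 = modeled_activation((16, 16, 16), [(5, 4, 4), (12, 12, 12)], sigma=2.0)
ale = ale_map([exp1, exp2])
print(round(float(ale.max()), 3))  # 1.0 at the voxel where the foci converge
```

High ALE values thus mark voxels where several experiments report nearby foci, which is how the concordant activations summarized in this abstract are identified.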
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ionescu-Bujor, Mihaela; Jin Xuezhou; Cacuci, Dan G.
2005-09-15
The adjoint sensitivity analysis procedure for augmented systems for application to the RELAP5/MOD3.2 code system is illustrated. Specifically, the adjoint sensitivity model corresponding to the heat structure models in RELAP5/MOD3.2 is derived and subsequently augmented to the two-fluid adjoint sensitivity model (ASM-REL/TF). The end product, called ASM-REL/TFH, comprises the complete adjoint sensitivity model for the coupled fluid dynamics/heat structure packages of the large-scale simulation code RELAP5/MOD3.2. The ASM-REL/TFH model is validated by computing sensitivities to the initial conditions for various time-dependent temperatures in the test bundle of the Quench-04 reactor safety experiment. This experiment simulates the reflooding with water of uncovered, degraded fuel rods, clad with material (Zircaloy-4) that has the same composition and size as that used in typical pressurized water reactors. The most important response for the Quench-04 experiment is the time evolution of the cladding temperature of heated fuel rods. The ASM-REL/TFH model is subsequently used to perform an illustrative sensitivity analysis of this and other time-dependent temperatures within the bundle. The results computed by using the augmented adjoint sensitivity system, ASM-REL/TFH, highlight the reliability, efficiency, and usefulness of the adjoint sensitivity analysis procedure for computing time-dependent sensitivities.
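The adjoint idea behind sensitivities to initial conditions can be illustrated on a toy linear system: one backward integration of the adjoint equation yields the gradient of a response with respect to all initial conditions at once. The sketch below uses a hypothetical 2-state system and the discrete adjoint of an explicit-Euler step; it is a generic illustration, not the ASM-REL/TFH model.

```python
import numpy as np

A = np.array([[-0.5, 0.2], [0.1, -0.3]])  # hypothetical 2-state linear system
y0 = np.array([1.0, 2.0])
T, n = 1.0, 1000
dt = T / n

def adjoint_sensitivity():
    """Gradient of R = y_1(T) w.r.t. y0 for dy/dt = A y: propagate the
    discrete adjoint lam <- (I + dt A)^T lam backward from lam(T) = e_1,
    so that lam(0) = dR/dy0."""
    lam = np.array([1.0, 0.0])           # response selects the first state
    for _ in range(n):
        lam = lam + dt * (A.T @ lam)     # discrete adjoint of the Euler step
    return lam

def response(y):
    """Forward explicit-Euler integration; returns R = y_1(T)."""
    y = y.copy()
    for _ in range(n):
        y = y + dt * (A @ y)
    return y[0]

grad = adjoint_sensitivity()
# check the adjoint gradient against forward finite differences
fd = np.array([(response(y0 + e) - response(y0 - e)) / 2e-6
               for e in 1e-6 * np.eye(2)])
print(np.allclose(grad, fd, atol=1e-4))  # True
```

For a model with many initial conditions and few responses, this one-backward-solve-per-response property is what makes the adjoint procedure efficient, as the abstract emphasizes for the time-dependent cladding temperatures.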
Investigation on RGB laser source applied to dynamic photoelastic experiment
NASA Astrophysics Data System (ADS)
Li, Songgang; Yang, Guobiao; Zeng, Weiming
2014-06-01
When an elastomer sustains a shock or blast load, the internal stress state at every point changes rapidly over time. The dynamic photoelasticity method is an experimental stress analysis technique for studying dynamic stress and stress wave propagation. The light source is one of the most important devices in a dynamic photoelastic experiment system, and applying an RGB laser light source to such a system is an innovation that advances it. The RGB laser is synthesized from red, green, and blue lasers and can serve either as a single-wavelength laser source or as a synthesized white laser source. With an RGB laser as the light source, colored isochromatics can be captured in the dynamic photoelastic experiment, the black zero-order fringe can be collected, and the isoclinics can also be collected, which facilitates the analysis and study of transient stress and stress wave propagation. The RGB laser provides highly stable, continuous output, and its power can be adjusted; the three wavelengths can be combined in different power ratios. Used in dynamic photoelastic experiments, the RGB laser light source overcomes a number of deficiencies and shortcomings of other light sources and simplifies the experiment, achieving good results.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bass, B.R.; Bryan, R.H.; Bryson, J.W.
This paper summarizes the capabilities and applications of the general-purpose and special-purpose computer programs that have been developed for use in fracture mechanics analyses of HSST pressure vessel experiments. Emphasis is placed on the OCA/USA code, which is designed for analysis of pressurized-thermal-shock (PTS) conditions, and on the ORMGEN/ADINA/ORVIRT system which is used for more general analysis. Fundamental features of these programs are discussed, along with applications to pressure vessel experiments.
A microprocessor-based automation test system for the experiment of the multi-stage compressor
NASA Astrophysics Data System (ADS)
Zhang, Huisheng; Lin, Chongping
1991-08-01
An automated test system, controlled by a microprocessor and used in multistage compressor experiments, is described. Based on an analysis of the performance requirements of the compressor experiment, a complete hardware system structure is set up. It is composed of an IBM PC/XT computer, a large-scale sampled-data system, a traversing mechanism with three axes of motion, scanners, digital instrumentation, and several output devices. The structure of the real-time software system is described. Test results show that this system can measure many parameters at the blade rows and in the boundary layer under different operating states. The degree of automation and the accuracy of the experiment are increased, and the experimental cost is reduced.
NASA Technical Reports Server (NTRS)
Bremmer, D. A.
1986-01-01
The feasibility of off-the-shelf microprocessors and state-of-the-art software is assessed (1) as a development system for the principal investigator (PI) in the design of the experiment model, (2) as an example of available technology applicable to future PIs' experiments, (3) as a system capable of interacting with the PCTC's simulation of the dedicated experiment processor (DEP), preferably by bringing the PI's DEP software directly into the simulation model, (4) as a system having bus compatibility with the host VAX simulation computers, (5) as a system readily interfaced with mock-up panels and information displays, and (6) as a functional system for post-mission data analysis.
NASA Technical Reports Server (NTRS)
Kuharski, Robert A.; Jongeward, Gary A.; Wilcox, Katherine G.; Rankin, Tom R.; Roche, James C.
1991-01-01
The authors review the Environment Power System Analysis Tool (EPSAT) design and demonstrate its capabilities by using it to address questions that arose in designing the SPEAR III experiment. It is shown that the rocket body cannot be driven to large positive voltages under the constraints of this experiment; hence, attempts to measure the effects of a highly positive rocket body in the plasma environment should not be made in this experiment. It is determined that a hollow cathode needs to draw only about 50 mA to ground the rocket body. It is also shown that a relatively small amount of gas needs to be released to induce a bulk breakdown near the rocket body, and this gas release should not discharge the sphere. The experiment therefore provides an excellent opportunity to study the neutralization of a differential charge.
Experimentation in machine discovery
NASA Technical Reports Server (NTRS)
Kulkarni, Deepak; Simon, Herbert A.
1990-01-01
KEKADA, a system capable of carrying out a complex series of experiments on problems from the history of science, is described. The system incorporates a set of experimentation strategies extracted from traces of scientists' behavior. It focuses on surprises to constrain its search, and uses its strategies to generate hypotheses and to carry out experiments. Some strategies are domain-independent, whereas others incorporate knowledge of a specific domain. The domain-independent strategies include magnification, determining scope, divide and conquer, factor analysis, and relating different anomalous phenomena. KEKADA represents an experiment as a set of independent and dependent entities, with apparatus variables and a goal. It represents a theory either as a sequence of processes or as abstract hypotheses. KEKADA's response to a particular problem in biochemistry is described. On this and other problems, the system is capable of carrying out a complex series of experiments to refine domain theories. Analysis of the system and its behavior on a number of different problems has established its generality, but has also revealed why the system would not be a good experimental scientist.
Convention Center Management: A Systems Analysis & Design Course Project
ERIC Educational Resources Information Center
Guidry, Brandi N.; Totaro, Michael W.
2011-01-01
A challenge faced by many instructors of systems analysis and design courses is the selection or development of projects that provide challenging, yet suitable, learning experiences for the students. Employing a system development project case in undergraduate MIS courses offers students a multitude of opportunities to experientially examine…
Teaching Case: A Systems Analysis Role-Play Exercise and Assignment
ERIC Educational Resources Information Center
Mitri, Michel; Cole, Carey; Atkins, Laura
2017-01-01
This paper presents a role-play exercise and assignment that provides an active learning experience related to the system investigation phase of an SDLC. Whether using waterfall or agile approaches, the first SDLC step usually involves system investigation activities, including problem identification, feasibility study, cost-benefit analysis, and…
String-Coupled Pendulum Oscillators: Theory and Experiment.
ERIC Educational Resources Information Center
Moloney, Michael J.
1978-01-01
A coupled-oscillator system is given which is readily set up, using only household materials. The normal-mode analysis of this system is worked out, and an experiment or demonstration is recommended in which one verifies the theory by measuring two times and four lengths. (Author/GA)
Inter-laboratory comparison of the in vivo comet assay including three image analysis systems.
Plappert-Helbig, Ulla; Guérard, Melanie
2015-12-01
To compare the extent of potential inter-laboratory variability and the influence of different comet image analysis systems, in vivo comet experiments were conducted using the genotoxicants ethyl methanesulfonate and methyl methanesulfonate. Tissue samples from the same animals were processed and analyzed, including independent slide evaluation by image analysis, in two laboratories with extensive experience in performing the comet assay. The analysis revealed low inter-laboratory experimental variability. Neither the use of different image analysis systems nor the DNA staining procedure (propidium iodide vs. SYBR® Gold) considerably impacted the results or the sensitivity of the assay. In addition, the staining intensity of propidium iodide-stained slides remained relatively stable in slides refrigerated for over 3 months. In conclusion, following a thoroughly defined protocol and standardized routine procedures ensures that the comet assay is robust and generates comparable results between different laboratories. © 2015 Wiley Periodicals, Inc.
A system for programming experiments and for recording and analyzing data automatically
Herrick, Robert M.; Denelsbeck, John S.
1963-01-01
A system designed for use in complex operant conditioning experiments is described. Some of its key features are: (a) plugboards that permit the experimenter to change either from one program to another or from one analysis to another in less than a minute, (b) time-sharing of permanently wired electronic logic components, and (c) recordings suitable for automatic analyses. Included are flow diagrams of the system and sample logic diagrams for programming experiments and for analyzing data. PMID:14055967
NASA Technical Reports Server (NTRS)
So, Kenneth T.; Hall, John B., Jr.; Thompson, Clifford D.
1987-01-01
NASA's Langley and Goddard facilities have evaluated the effects of animal science experiments on the Space Station's Environmental Control and Life Support System (ECLSS) by means of computer-aided analysis, assuming an animal colony consisting of 96 rodents and eight squirrel monkeys. Thirteen ECLSS options were established for the reclamation of metabolic oxygen and waste water. Minimum cost and weight impacts on the ECLSS are found to accrue to the system's operation in off-nominal mode, using electrochemical CO2 removal and a static feed electrolyzer for O2 generation.
Dzyubachyk, Oleh; Essers, Jeroen; van Cappellen, Wiggert A; Baldeyron, Céline; Inagaki, Akiko; Niessen, Wiro J; Meijering, Erik
2010-10-01
Complete, accurate and reproducible analysis of intracellular foci from fluorescence microscopy image sequences of live cells requires full automation of all processing steps involved: cell segmentation and tracking followed by foci segmentation and pattern analysis. Integrated systems for this purpose are lacking. Extending our previous work in cell segmentation and tracking, we developed a new system for performing fully automated analysis of fluorescent foci in single cells. The system was validated by applying it to two common tasks: intracellular foci counting (in DNA damage repair experiments) and cell-phase identification based on foci pattern analysis (in DNA replication experiments). Experimental results show that the system performs comparably to expert human observers. Thus, it may replace tedious manual analyses for the considered tasks, and enables high-content screening. The described system was implemented in MATLAB (The MathWorks, Inc., USA) and compiled to run within the MATLAB environment. The routines together with four sample datasets are available at http://celmia.bigr.nl/. The software is planned for public release, free of charge for non-commercial use, after publication of this article.
Flight Experiment Demonstration System (FEDS) analysis report
NASA Technical Reports Server (NTRS)
Shank, D. E.
1986-01-01
The purpose of the Flight Experiment Demonstration System (FEDS) was to show, in a simulated spacecraft environment, the feasibility of using a microprocessor to automate the onboard orbit determination functions. The software and hardware configuration used to support FEDS during the demonstration and the results of the demonstration are discussed.
MELCOR Analysis of OSU Multi-Application Small Light Water Reactor (MASLWR) Experiment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yoon, Dhongik S.; Jo, HangJin; Fu, Wen
A multi-application small light water reactor (MASLWR) conceptual design was developed by Oregon State University (OSU) with emphasis on passive safety systems. The passive containment safety system employs condensation and natural circulation to achieve the necessary heat removal from the containment in case of postulated accidents. Containment condensation experiments at the MASLWR test facility at OSU are modeled and analyzed with MELCOR, a system-level reactor accident analysis computer code. The analysis assesses its ability to predict condensation heat transfer in the presence of noncondensable gas for accidents in which high-energy steam is released into the containment. This work demonstrates MELCOR's ability to predict the pressure-temperature response of the scaled containment. Our analysis indicates that the heat removal rates are underestimated in the experiment because of the limited locations of the thermocouples; corrections are applied to these measurements by conducting integral energy analyses, with CFD simulation for confirmation. With these corrections, the measured heat removal rates and the MELCOR predictions show good agreement.
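An integral energy analysis of the kind described here amounts, at its simplest, to integrating the measured heat-removal rate over the transient; the following minimal sketch (function name and data are illustrative, not taken from the paper) shows a trapezoidal version:

```python
def integral_energy(times, powers):
    """Trapezoidal integration of a heat-removal rate history (W)
    to get cumulative energy removed (J) over the transient."""
    energy = 0.0
    for i in range(1, len(times)):
        dt = times[i] - times[i - 1]
        energy += 0.5 * (powers[i] + powers[i - 1]) * dt
    return energy
```

The cumulative energy can then be compared against the containment's internal-energy change to spot under-measurement by sparsely placed thermocouples.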
The AMIDAS Website: An Online Tool for Direct Dark Matter Detection Experiments
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shan, Chung-Lin
2010-02-10
Following our long-term work on developing model-independent data analysis methods for reconstructing the one-dimensional velocity distribution function of halo WIMPs, and for determining their mass and couplings on nucleons directly from data from direct Dark Matter detection experiments, we have combined the simulation programs into a compact system: AMIDAS (A Model-Independent Data Analysis System). For users' convenience, an online system has also been established. AMIDAS can run full Monte Carlo simulations, perform faster theoretical estimations, and analyze (real) data sets recorded in direct detection experiments without modification of the source code. In this article, I give an overview of the functions of the AMIDAS code based on the use of its website.
A scalable, self-analyzing digital locking system for use on quantum optics experiments.
Sparkes, B M; Chrzanowski, H M; Parrain, D P; Buchler, B C; Lam, P K; Symul, T
2011-07-01
Digital control of optics experiments has many advantages over analog control systems, specifically in terms of scalability, cost, flexibility, and the integration of system information in one location. We present a digital control system, freely available for download online, specifically designed for quantum optics experiments, that allows for automatic and sequential re-locking of optical components. We show how the inbuilt locking analysis tools, including a white-noise network analyzer, can be used to help optimize individual locks and to verify the long-term stability of the digital system. Finally, we present an example of the benefits of digital locking for quantum optics by applying the code to a specific experiment used to characterize optical Schrödinger cat states.
Identification of visual evoked response parameters sensitive to pilot mental state
NASA Technical Reports Server (NTRS)
Zacharias, G. L.
1988-01-01
Systems analysis techniques were developed and demonstrated for modeling the electroencephalographic (EEG) steady-state visual evoked response (ssVER), for use in EEG data compression and as an indicator of mental workload. The study focused on steady-state frequency-domain stimulation and response analysis, implemented with a sum-of-sines (SOS) stimulus generator and an off-line describing-function response analyzer. Three major tasks were conducted: (1) VER-related systems identification material was reviewed; (2) software for experiment control and data analysis was developed and implemented; and (3) ssVER identification and modeling were demonstrated via a mental loading experiment. It was found that a systems approach to ssVER functional modeling can serve as the basis for eventual development of a mental workload indicator. The review showed how transient visual evoked response (tVER) and ssVER research are related at the functional level, the software development showed how systems techniques can be used for ssVER characterization, and the pilot experiment showed how a simple model can capture the basic dynamic response of the ssVER under varying loads.
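A describing-function estimate at one SOS frequency can be sketched as correlating the stimulus and response with sine and cosine at that frequency; this is a generic illustration of the technique, not the study's actual software:

```python
import math

def describing_function(t, u, y, freq):
    """Estimate gain and phase (degrees) of response y to stimulus u
    at a single SOS frequency by Fourier correlation."""
    n = len(t)

    def fourier_coeff(sig):
        # in-phase (cosine) and quadrature (sine) correlations
        c = 2.0 / n * sum(s * math.cos(2 * math.pi * freq * tt)
                          for s, tt in zip(sig, t))
        d = 2.0 / n * sum(s * math.sin(2 * math.pi * freq * tt)
                          for s, tt in zip(sig, t))
        return complex(c, -d)

    g = fourier_coeff(y) / fourier_coeff(u)
    return abs(g), math.degrees(math.atan2(g.imag, g.real))
```

Repeating this at each SOS component frequency yields the frequency-response points from which an ssVER model can be fitted.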
Automated Software Analysis of Fetal Movement Recorded during a Pregnant Woman's Sleep at Home.
Nishihara, Kyoko; Ohki, Noboru; Kamata, Hideo; Ryo, Eiji; Horiuchi, Shigeko
2015-01-01
Fetal movement is an important biological index of fetal well-being. Since 2008, we have been developing an original capacitive acceleration sensor and device that a pregnant woman can easily use to record fetal movement by herself at home during sleep. In this study, we report a newly developed automated software system for analyzing recorded fetal movement. This study will introduce the system and compare its results to those of a manual analysis of the same fetal movement signals (Experiment I). We will also demonstrate an appropriate way to use the system (Experiment II). In Experiment I, fetal movement data reported previously for six pregnant women at 28-38 gestational weeks were used. We evaluated the agreement of the manual and automated analyses for the same 10-sec epochs using prevalence-adjusted bias-adjusted kappa (PABAK) including quantitative indicators for prevalence and bias. The mean PABAK value was 0.83, which can be considered almost perfect. In Experiment II, twelve pregnant women at 24-36 gestational weeks recorded fetal movement at night once every four weeks. Overall, mean fetal movement counts per hour during maternal sleep significantly decreased along with gestational weeks, though individual differences in fetal development were noted. This newly developed automated analysis system can provide important data throughout late pregnancy.
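For two raters classifying the same 10-sec epochs, PABAK reduces to a simple function of the observed agreement, PABAK = 2·p_o − 1; a minimal sketch (function name and data are illustrative, not the study's software):

```python
def pabak(rater_a, rater_b):
    """Prevalence-adjusted bias-adjusted kappa for two paired binary
    classifications: PABAK = 2 * p_o - 1, where p_o is the observed
    proportion of agreeing epochs."""
    agree = sum(a == b for a, b in zip(rater_a, rater_b))
    p_o = agree / len(rater_a)
    return 2.0 * p_o - 1.0
```

For example, 75% raw agreement gives PABAK = 0.5; the paper's mean PABAK of 0.83 corresponds to about 91.5% raw agreement between the manual and automated analyses.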
Hunger, Christina; Bornhäuser, Annette; Link, Leoni; Geigges, Julian; Voss, Andreas; Weinhold, Jan; Schweitzer, Jochen
2017-03-01
This study presents the theoretical background, development, and psychometric properties of the German and English versions of the Experience in Personal Social Systems Questionnaire (EXIS.pers). It assesses how the members of a personal social system experience their situation within that system. It is designed as a research tool for interventions in which only one member of the system participates (e.g., Family Constellation Seminars). The EXIS.pers was created to measure change on the individual level relating to one's own important personal social system. In Study 1, we used exploratory factor analysis (EFA) for latent variable identification of the original German EXIS.pers (n = 179). In Studies 2 and 3, we used confirmatory factor analysis (CFA) to examine the dimensionality of the German (n = 634) and English (n = 310) EXIS.pers. Internal consistencies and cross-cultural structural equivalence were assessed. EFA indicated that a four-factor model provided the best fit for the German EXIS.pers. For both the German and English EXIS.pers, CFA provided the best fit for a five-factor bi-level model that included a general factor (Experience In Personal Social Systems) and four dimensions (Belonging, Autonomy, Accord, Confidence). Good internal consistencies, external associations, and cross-cultural structural equivalence were demonstrated. This study provides the first evidence for the German and English EXIS.pers as an economical and reliable measure of an individual's experience within his or her personal social systems. © 2016 Family Process Institute.
Lv, Yang; Hu, Guangyao; Wang, Chunyang; Yuan, Wenjie; Wei, Shanshan; Gao, Jiaoqi; Wang, Boyuan; Song, Fangchao
2017-01-01
Microbial contamination of central air-conditioning systems is one of the important factors affecting indoor air quality. Measurements and analysis of microbial contamination in the central air-conditioning system of a venue in Dalian, China, were carried out. The Illumina MiSeq method was used, and three fungal samples from two units were analyzed by high-throughput sequencing. Results showed that the predominant fungi in air-conditioning units A and B were Candida spp. and Cladosporium spp., and these two fungi were used in a hygrothermal response experiment. Based on the Cladosporium data from the hygrothermal response experiment, the logistic and Gompertz equations were used to fit predictive growth models for Cladosporium under different temperature and relative humidity conditions, and a square-root model was fitted based on the two environmental factors. The models were then analyzed to verify the accuracy and feasibility of the fitted equations. PMID:28367963
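The growth models named here are often written in the Zwietering reparameterizations; a hedged sketch of the model functions (parameter names are illustrative, and the paper's exact forms may differ):

```python
import math

def gompertz(t, A, mu_m, lam):
    """Modified Gompertz growth curve: asymptote A, maximum specific
    growth rate mu_m, lag time lam (Zwietering form)."""
    return A * math.exp(-math.exp(mu_m * math.e / A * (lam - t) + 1.0))

def logistic(t, A, mu_m, lam):
    """Modified logistic growth curve in the same parameterization."""
    return A / (1.0 + math.exp(4.0 * mu_m / A * (lam - t) + 2.0))

def sqrt_model(T, b, T_min):
    """Ratkowsky square-root model: sqrt(mu_max) = b * (T - T_min),
    so mu_max = (b * (T - T_min))**2."""
    return (b * (T - T_min)) ** 2
```

Fitting either curve to measured growth data (e.g., by least squares) yields A, mu_m, and lam at each temperature/humidity condition; the square-root model then relates mu_max to temperature across conditions.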
ERIC Educational Resources Information Center
Diesel, Vivien; Miná Dias, Marcelo
2016-01-01
Purpose: To analyze the Brazilian experience in designing and implementing a recent extension policy reform based on agroecology, and reflect on its wider theoretical implications for extension reform literature. Design/methodology/approach: Using a critical public analysis we characterize the evolution of Brazilian federal extension policy…
Deployable antenna phase A study
NASA Technical Reports Server (NTRS)
Schultz, J.; Bernstein, J.; Fischer, G.; Jacobson, G.; Kadar, I.; Marshall, R.; Pflugel, G.; Valentine, J.
1979-01-01
Applications for large deployable antennas were re-examined, flight demonstration objectives were defined, the flight article (antenna) was preliminarily designed, and the flight program and ground development program, including the support equipment, were defined for a proposed space transportation system flight experiment to demonstrate a large (50 to 200 meter) deployable antenna system. Tasks described include: (1) performance requirements analysis; (2) system design and definition; (3) orbital operations analysis; and (4) programmatic analysis.
ERIC Educational Resources Information Center
Russell, Jack; Russell, Barbara
2015-01-01
The goal is to provide a robust and challenging problem statement for a capstone, advanced systems analysis and design course for CIS/MIS/CS majors. In addition to the problem narrative, a representative solution for much of the business modeling deliverables is presented using the UML paradigm. A structured analysis deliverable will be the topic…
Application Research of Fault Tree Analysis in Grid Communication System Corrective Maintenance
NASA Astrophysics Data System (ADS)
Wang, Jian; Yang, Zhenwei; Kang, Mei
2018-01-01
This paper applies the fault tree analysis method to corrective maintenance of a grid communication system. A fault tree model of a typical system is established from engineering experience, and fault tree analysis theory is used to analyze the model, covering structure functions, probability importance, and related measures. The results show that fault tree analysis enables fast fault location and effective repair of the system. The analysis method also offers guidance for researching and upgrading the reliability of the system.
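For a small fault tree, the top-event probability and the probability (Birnbaum) importance of each basic event can be computed directly; a minimal sketch with an illustrative two-gate tree (not the paper's actual model):

```python
def and_gate(probs):
    """Probability that all independent basic events occur."""
    p = 1.0
    for q in probs:
        p *= q
    return p

def or_gate(probs):
    """Probability that at least one independent basic event occurs."""
    p = 1.0
    for q in probs:
        p *= (1.0 - q)
    return 1.0 - p

def top_event(p1, p2, p3):
    """Illustrative tree: TOP = (E1 AND E2) OR E3."""
    return or_gate([and_gate([p1, p2]), p3])

def birnbaum_importance(f, i, probs):
    """Birnbaum (probability) importance of basic event i:
    P(top | event i occurs) - P(top | event i cannot occur)."""
    hi = list(probs); hi[i] = 1.0
    lo = list(probs); lo[i] = 0.0
    return f(*hi) - f(*lo)
```

Ranking basic events by importance identifies which component to inspect first during corrective maintenance.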
Wave-Sediment Interaction in Muddy Environments: A Field Experiment
2007-01-01
The goal of this project was to conduct a pilot field experiment to test instrumentation and data analysis procedures for the major field experiment effort scheduled for Years 1 and 2 (2007-2008), with a data analysis and modeling effort in Year 3 (2009). With the exception of liquefaction processes, existing models (Chou et al., 1993; Foda et al., 1993) assume a single, well-defined mud phase.
Modular space station phase B extension preliminary system design. Volume 3: Experiment analyses
NASA Technical Reports Server (NTRS)
Wengrow, G. L.; Lillenas, A. N.
1972-01-01
Experiment analysis tasks performed during the program definition study are described. Experiment accommodation and scheduling, and the definition and implementation of the laboratory evolution, are discussed. The general-purpose laboratory requirements and concepts are defined, and supplemental studies are reported.
Experiments and Analysis on a Computer Interface to an Information-Retrieval Network.
ERIC Educational Resources Information Center
Marcus, Richard S.; Reintjes, J. Francis
A primary goal of this project was to develop an interface that would provide direct access for inexperienced users to existing online bibliographic information retrieval networks. The experiment tested the concept of a virtual-system mode of access to a network of heterogeneous interactive retrieval systems and databases. An experimental…
ERIC Educational Resources Information Center
Anslow, Katharine
2014-01-01
This research aimed to illuminate the experiences of adults with learning disabilities of the reflecting team, in the context of their systemic family therapy. Five adults with learning disabilities were recruited from one community learning disability team. A qualitative design using interpretative phenomenological analysis (IPA) was appropriate…
NASA Technical Reports Server (NTRS)
Bueker, P. A.
1982-01-01
The Nitrogen Washout System measures nitrogen elimination, breath by breath, from the body tissues of a subject breathing pure oxygen. The system serves as a prototype for a Space Shuttle Life Sciences experiment and is used in the Environmental Physiology Laboratory. Typically, a subject washes out body nitrogen for three hours while breathing oxygen from a mask enclosed in a positive-pressure oxygen tent. A nitrogen washout requires one test operator and the test subject. A DEC LSI-11/02 computer is used to (1) control and calibrate the mass spectrometer and Skylab spirometer, (2) gather and store experimental data, and (3) provide limited real-time analysis and more extensive post-experiment analysis. Five programs are used to gather and store the experimental data and perform all the real-time control and analysis.
High frequency vibration characteristics of electric wheel system under in-wheel motor torque ripple
NASA Astrophysics Data System (ADS)
Mao, Yu; Zuo, Shuguang; Wu, Xudong; Duan, Xianglei
2017-07-01
With the introduction of the in-wheel motor, the electric wheel system encounters new vibration problems brought on by motor torque ripple excitation. To analyze the new vibration characteristics of the electric wheel system, the torque ripple of the in-wheel motor is first analyzed based on a motor model and vector control system, and the frequency/order features of the torque ripple are discussed. A quarter vehicle-electric wheel system (QV-EWS) dynamics model based on the rigid ring tire assumption is then established, and the main parameters of the model are identified from a tire free modal test. The modal characteristics of the model are further analyzed. The analysis indicates that the torque excitation of the in-wheel motor readily excites horizontal vibration, in which the in-phase rotational, anti-phase rotational, and horizontal translational modes of the electric wheel system mainly participate. Based on the model, the vibration responses of the QV-EWS under torque ripple are simulated. The results show that, unlike the vertical low frequency (below 20 Hz) vibration excited by road roughness, broadband torque ripple excites horizontal high frequency (50-100 Hz) vibration of the electric wheel system due to the participation of the three aforementioned modes. To verify the theoretical analysis, a bench experiment on the electric wheel system was conducted and vibration responses were acquired. The experiment demonstrates the high frequency vibration phenomenon, and the measured order features as well as the main resonant frequencies agree with the simulation results. Through theoretical modeling, analysis, and experiments, this paper reveals and explains the high frequency vibration characteristics of the electric wheel system, providing a reference for the dynamic analysis and optimal design of the QV-EWS.
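The order features of motor torque ripple follow directly from the electrical frequency; a hedged sketch (the pole-pair count, speed, and dominant orders below are illustrative assumptions, not the paper's motor data):

```python
def ripple_frequencies(speed_rpm, pole_pairs, orders=(6, 12)):
    """Torque-ripple harmonic frequencies (Hz) of a PMSM; the 6th and
    12th electrical orders are typically dominant under vector control."""
    f_mech = speed_rpm / 60.0      # mechanical rotation frequency
    f_elec = f_mech * pole_pairs   # electrical fundamental
    return [k * f_elec for k in orders]
```

Sweeping wheel speed shows how such ripple harmonics can pass through a fixed structural resonance band like the 50-100 Hz horizontal modes discussed above.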
SCALE TSUNAMI Analysis of Critical Experiments for Validation of 233U Systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mueller, Don; Rearden, Bradley T
2009-01-01
Oak Ridge National Laboratory (ORNL) staff used the SCALE TSUNAMI tools to provide a demonstration evaluation of critical experiments considered for use in validation of current and anticipated operations involving 233U at the Radiochemical Development Facility (RDF). This work was reported in ORNL/TM-2008/196, issued in January 2009. This paper presents the analysis of two representative safety analysis models provided by RDF staff.
Experiment study on sediment erosion of Pelton turbine flow passage component material
NASA Astrophysics Data System (ADS)
Liu, J.; Lu, L.; Zhu, L.
2012-11-01
A rotating-jet experiment system with high flow velocity was designed to study the anti-erosion performance of materials; the resultant velocity of the system reaches 120 m/s. The anti-erosion performance of materials used in the needle, nozzle, and bucket of the Pelton turbine, which is widely used in power stations with high head and low discharge, was studied in detail with this system. The experimental studies were carried out at different resultant velocities and sediment concentrations, and multiple linear regression analysis was applied to obtain the exponents of velocity and sediment concentration. The exponents differ between materials: the velocity exponents ranged from 3 to 3.5 for the three materials, and the sediment concentration exponents ranged from 0.97 to 1.03 in this experiment. SEM analysis of the eroded surfaces of the different materials was also carried out. Under conditions of high resultant impact velocity, selective cutting loss of material is the main erosion mechanism for metallic materials.
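When one factor is varied at a time, the exponents in an erosion law of the form E = k · V^a · C^b can be recovered as log-log slopes; a minimal sketch (function name and data are illustrative, not the paper's measurements):

```python
import math

def loglog_slope(xs, ys):
    """Least-squares slope of log(y) versus log(x); for y = k * x**a
    this returns the exponent a."""
    lx = [math.log(x) for x in xs]
    ly = [math.log(y) for y in ys]
    n = len(lx)
    mx = sum(lx) / n
    my = sum(ly) / n
    num = sum((a - mx) * (b - my) for a, b in zip(lx, ly))
    den = sum((a - mx) ** 2 for a in lx)
    return num / den
```

Applying this to erosion rate versus velocity (at fixed concentration) gives the velocity exponent; versus concentration (at fixed velocity) gives the concentration exponent.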
Microgravity isolation system design: A modern control analysis framework
NASA Technical Reports Server (NTRS)
Hampton, R. D.; Knospe, C. R.; Allaire, P. E.; Grodsinsky, C. M.
1994-01-01
Many acceleration-sensitive, microgravity science experiments will require active vibration isolation from the manned orbiters on which they will be mounted. The isolation problem, especially in the case of a tethered payload, is a complex three-dimensional one that is best suited to modern-control design methods. These methods, although more powerful than their classical counterparts, can nonetheless go only so far in meeting the design requirements for practical systems. Once a tentative controller design is available, it must still be evaluated to determine whether or not it is fully acceptable, and to compare it with other possible design candidates. Realistically, such evaluation will be an inherent part of a necessary iterative design process. In this paper, an approach is presented for applying complex mu-analysis methods to a closed-loop vibration isolation system (experiment plus controller). An analysis framework is presented for evaluating nominal stability, nominal performance, robust stability, and robust performance of active microgravity isolation systems, with emphasis on the effective use of mu-analysis methods.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bower, G.
We summarize the current status and future developments of the North American Group's Java-based system for studying physics and detector design issues at a linear collider. The system is built around Java Analysis Studio (JAS), an experiment-independent Java-based utility for data analysis. Although the system is an integrated package running in JAS, many parts of it are also standalone Java utilities.
A System Analysis Approach to Robot Gripper Control Using Phase Lag Compensator Bode Designs
NASA Astrophysics Data System (ADS)
Aye, Khin Muyar; Lin, Htin; Tun, Hla Myo
2008-10-01
In this paper, we present result comparisons for phase lag compensator designs developed using Bode plots. The implementation of classical experiments as MATLAB m-files is described. A robot gripper control system can be designed to gain insight into a variety of concepts, including stabilization of unstable systems, compensation properties, and Bode analysis and design. The analysis has resulted in a number of important conclusions for the design of a new generation of control support systems.
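As a rough illustration of the Bode-domain reasoning applied here (not the paper's MATLAB m-files), the frequency response of a generic phase-lag compensator C(s) = K(s + z)/(s + p) with p < z can be evaluated directly with complex arithmetic; the corner frequencies below are hypothetical:

```python
import cmath
import math

def lag_compensator(z, p, K=1.0):
    """Return C(jw) for the lag compensator C(s) = K (s + z) / (s + p), with p < z."""
    assert p < z, "a lag compensator places its pole below its zero"
    return lambda w: K * (1j * w + z) / (1j * w + p)

C = lag_compensator(z=1.0, p=0.1)  # hypothetical corner frequencies (rad/s)

# Low-frequency gain boost: |C(j0)| = z/p = 10, i.e. 20 dB
dc_gain_db = 20 * math.log10(abs(C(1e-6)))

# Maximum phase lag occurs near the geometric mean of the corner frequencies
phase_deg = math.degrees(cmath.phase(C(math.sqrt(1.0 * 0.1))))
```

The evaluated points are exactly what a Bode magnitude/phase plot of this compensator would show: a 20 dB low-frequency gain increase and a worst-case phase lag of about 55 degrees between the corner frequencies.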
2017-06-01
A designed experiment to model and explore a ship-to-shore logistics process supporting dispersed units via three types of ULSs, which vary primarily in … Keywords: systems, simulation, discrete event simulation, design of experiments, data analysis, simplekit, nearly orthogonal and balanced designs.
Atmosphere Explorer control system software (version 1.0)
NASA Technical Reports Server (NTRS)
Villasenor, A.
1972-01-01
The basic design is described of the Atmosphere Explorer Control System (AECS) software used in the testing, integration, and flight control of the AE spacecraft and experiments. The software performs several vital functions, such as issuing commands to the spacecraft and experiments, receiving and processing telemetry data, and allowing for extensive data processing by experiment analysis programs. The major processing sections are: executive control section, telemetry decommutation section, command generation section, and utility section.
Overview of ICE Project: Integration of Computational Fluid Dynamics and Experiments
NASA Technical Reports Server (NTRS)
Stegeman, James D.; Blech, Richard A.; Babrauckas, Theresa L.; Jones, William H.
2001-01-01
Researchers at the NASA Glenn Research Center have developed a prototype integrated environment for interactively exploring, analyzing, and validating information from computational fluid dynamics (CFD) computations and experiments. The Integrated CFD and Experiments (ICE) project is a first attempt at providing a researcher with a common user interface for control, manipulation, analysis, and data storage for both experiments and simulation. ICE can be used as a live, on-line system that displays and archives data as they are gathered; as a postprocessing system for dataset manipulation and analysis; and as a control interface or "steering mechanism" for simulation codes while visualizing the results. Although the full capabilities of ICE have not been completely demonstrated, this report documents the current system. Various applications of ICE are discussed: a low-speed compressor, a supersonic inlet, real-time data visualization, and a parallel-processing simulation code interface. A detailed data model for the compressor application is included in the appendix.
Borisjuk, Ljudmilla; Hajirezaei, Mohammad-Reza; Klukas, Christian; Rolletschek, Hardy; Schreiber, Falk
2005-01-01
Modern 'omics'-technologies result in huge amounts of data about life processes. For analysis and data mining purposes, this data has to be considered in the context of the underlying biological networks. This work presents an approach for integrating data from biological experiments into metabolic networks by mapping the data onto network elements and visualising the data-enriched networks automatically. This methodology is implemented in DBE, an information system that supports the analysis and visualisation of experimental data in the context of metabolic networks. It consists of five parts: (1) the DBE-Database for consistent data storage, (2) the Excel-Importer application for the data import, (3) the DBE-Website as the interface for the system, (4) the DBE-Pictures application for the up- and download of binary (e.g., image) files, and (5) DBE-Gravisto, a network analysis and graph visualisation system. The usability of this approach is demonstrated in two examples.
Vehicle Signal Analysis Using Artificial Neural Networks for a Bridge Weigh-in-Motion System
Kim, Sungkon; Lee, Jungwhee; Park, Min-Seok; Jo, Byung-Wan
2009-01-01
This paper describes the procedures for development of signal analysis algorithms using artificial neural networks for Bridge Weigh-in-Motion (B-WIM) systems. Through the analysis procedure, the extraction of information concerning heavy traffic vehicles such as weight, speed, and number of axles from the time domain strain data of the B-WIM system was attempted. As one of the several possible pattern recognition techniques, an Artificial Neural Network (ANN) was employed since it could effectively include dynamic effects and bridge-vehicle interactions. A number of vehicle traveling experiments with sufficient load cases were executed on two different types of bridges, a simply supported pre-stressed concrete girder bridge and a cable-stayed bridge. Different types of WIM systems such as high-speed WIM or low-speed WIM were also utilized during the experiments for cross-checking and to validate the performance of the developed algorithms. PMID:22408487
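The paper's ANN extracts axle counts from time-domain strain records; as a deliberately simplified non-neural baseline for the same task (not the authors' method), axles can be counted as thresholded local maxima in the strain signal. All names and numbers below are illustrative:

```python
def count_axle_peaks(strain, threshold):
    """Count axles as local maxima above a threshold in a strain time series.

    Simplified peak-counting baseline, not the paper's ANN approach.
    """
    peaks = 0
    for i in range(1, len(strain) - 1):
        if strain[i] > threshold and strain[i] > strain[i - 1] and strain[i] >= strain[i + 1]:
            peaks += 1
    return peaks

signal = [0, 1, 5, 2, 1, 6, 2, 0, 4, 1, 0]  # hypothetical strain record with three axle peaks
axles = count_axle_peaks(signal, threshold=3)
```

A rule like this fails under the dynamic effects and bridge-vehicle interactions the abstract mentions, which is precisely why the authors turn to a trained network instead.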
A versatile system for the rapid collection, handling and graphics analysis of multidimensional data
NASA Astrophysics Data System (ADS)
O'Brien, P. M.; Moloney, G.; O'Connor, A.; Legge, G. J. F.
1993-05-01
The aim of this work was to provide a versatile system for handling multiparameter data that may arise from a variety of experiments — nuclear, AMS, microprobe elemental analysis, 3D microtomography, etc. Some of the most demanding requirements arise in the application of microprobes to quantitative elemental mapping and to microtomography. A system to handle data from such experiments had been under continuous development and use at MARC for the past 15 years. It has now been made adaptable to the needs of multiparameter (or single parameter) experiments in general. The original system has been rewritten, greatly expanded and made much more powerful and faster, by use of modern computer technology — a VME bus computer with a real time operating system and a RISC workstation running Unix and the X Window system. This provides the necessary (i) power, speed and versatility, (ii) expansion and updating capabilities, (iii) standardisation and adaptability, (iv) coherent modular programming structures, (v) ability to interface to other programs and (vi) transparent operation with several levels, involving the use of menus, programmed function keys and powerful macro programming facilities.
Airborne Visible Laser Optical Communications (AVLOC) experiment
NASA Technical Reports Server (NTRS)
1974-01-01
A series of optical communication experiments between a high altitude aircraft at 18.3 km (60,000 ft) and a ground station were conducted by NASA from summer 1972 through winter 1973. The basic system was an optical tracker and transmitter located in each terminal. The aircraft transceiver consisted of a 5-mW HeNe laser transmitter with a 30-megabit modulator. The ground station beacon was an argon laser operating at 488 nm. A separate pulsed laser radar was used for initial acquisition. The objective of the experiment was to obtain engineering data on the precision tracking and communication system performance at both terminals. Atmospheric effects on system performance were also an experiment objective. The system description, engineering analysis, testing, and flight results are discussed.
Module Fifteen: Special Topics; Basic Electricity and Electronics Individualized Learning System.
ERIC Educational Resources Information Center
Bureau of Naval Personnel, Washington, DC.
The final module emphasizes utilizing the information learned in modules 1-14 to analyze and evaluate the power supply constructed in Module 0. The module contains the following narrative--power supply evaluation; experiment 1--resistance analysis of the half-wave and semiconductor power supply; experiment 2--voltage analysis of the half-wave and…
Analysis of test data film generated by the lunar sounder (S-209)
NASA Technical Reports Server (NTRS)
Massey, N.
1973-01-01
The analysis of test films pertaining to the readiness of the Apollo 17 radar equipment is discussed. Emphasis is placed on the evaluation of the lunar sounder equipment. The lunar sounder experiment was to examine the lunar surface at three different radar wavelengths of 2 meters, 20 meters, and 60 meters. Test films were made on the lunar sounder system to describe the purpose of the test, to describe the experiments used for analysis, and to provide conclusions reached after analysis.
NASA Astrophysics Data System (ADS)
Dewhurst, A.; Legger, F.
2015-12-01
The ATLAS experiment accumulated more than 140 PB of data during the first run of the Large Hadron Collider (LHC) at CERN. The analysis of such an amount of data is a challenging task for the distributed physics community. The Distributed Analysis (DA) system of the ATLAS experiment is an established and stable component of the ATLAS distributed computing operations. About half a million user jobs are running daily on DA resources, submitted by more than 1500 ATLAS physicists. The reliability of the DA system during the first run of the LHC and the following shutdown period has been high thanks to the continuous automatic validation of the distributed analysis sites and the user support provided by a dedicated team of expert shifters. During the LHC shutdown, the ATLAS computing model has undergone several changes to improve the analysis workflows, including the re-design of the production system, a new analysis data format and event model, and the development of common reduction and analysis frameworks. We report on the impact such changes have on the DA infrastructure, describe the new DA components, and include recent performance measurements.
Advanced gamma ray balloon experiment ground checkout and data analysis
NASA Technical Reports Server (NTRS)
Blackstone, M.
1976-01-01
A software programming package to be used in the ground checkout and handling of data from the advanced gamma ray balloon experiment is described. The Operator's Manual permits someone unfamiliar with the inner workings of the software system (called LEO) to operate on the experimental data as it comes from the Pulse Code Modulation interface, converting it to a form for later analysis, and monitoring the program of an experiment. A Programmer's Manual is included.
NASA Astrophysics Data System (ADS)
Kazanskiy, Nikolay; Protsenko, Vladimir; Serafimovich, Pavel
2016-03-01
This research article presents an experiment implementing an image filtering task in the Apache Storm and IBM InfoSphere Streams stream data processing systems. The aim of the presented research is to show that these new technologies can be used effectively for sliding-window filtering of image sequences. The analysis of execution focused on two parameters: throughput and memory consumption. Profiling was performed on the CentOS operating system running on two virtual machines for each system. The experiment results showed that IBM InfoSphere Streams has about 1.5 to 13.5 times lower memory footprint than Apache Storm, but can be about 2.0 to 2.5 times slower on real hardware.
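Neither the Storm nor the InfoSphere Streams topology code appears in the abstract; as an illustration of the underlying operation only, the sliding-window filtering of a single frame can be sketched in plain Python (the window size and frame values are hypothetical):

```python
def sliding_mean_filter(img, k=3):
    """Apply a k x k sliding-window mean filter to a 2-D list, clamping at the borders."""
    h, w, r = len(img), len(img[0]), k // 2
    out = [[0.0] * w for _ in range(h)]
    for i in range(h):
        for j in range(w):
            # Window clipped to the image bounds, so border pixels average fewer neighbours
            vals = [img[ii][jj]
                    for ii in range(max(0, i - r), min(h, i + r + 1))
                    for jj in range(max(0, j - r), min(w, j + r + 1))]
            out[i][j] = sum(vals) / len(vals)
    return out

frame = [[0, 0, 0],
         [0, 9, 0],
         [0, 0, 0]]
smoothed = sliding_mean_filter(frame)
```

In a streaming system this kernel would run inside a processing operator (a Storm bolt or a Streams operator), with frames arriving as tuples; the comparison in the article is about how the two runtimes schedule and buffer such operators, not about the kernel itself.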
Testing of Safety-Critical Software Embedded in an Artificial Heart
NASA Astrophysics Data System (ADS)
Cha, Sungdeok; Jeong, Sehun; Yoo, Junbeom; Kim, Young-Gab
Software is being used more frequently to control medical devices such as artificial hearts or robotic surgery systems. While many of the software safety issues in such systems are similar to those in other safety-critical systems (e.g., nuclear power plants), domain-specific properties may warrant the development of customized techniques to demonstrate fitness of the system for patients. In this paper, we report results of a preliminary analysis of software controlling a Hybrid Ventricular Assist Device (H-VAD) developed by the Korea Artificial Organ Centre (KAOC), a state-of-the-art artificial heart which has completed its animal testing phase. We performed software testing in in-vitro experiments and animal experiments. An abnormal behaviour, never detected during extensive in-vitro analysis and animal testing, was found.
NASA Technical Reports Server (NTRS)
Jeng, Frank F.
2007-01-01
Development of analysis guidelines for Exploration Life Support (ELS) technology tests was completed. The guidelines were developed based on analysis experience gained from supporting Environmental Control and Life Support System (ECLSS) technology development in air revitalization systems and water recovery systems. Analyses are vital during all three phases of an ELS technology test: pre-test, during test, and post test. Pre-test analyses of a test system help define hardware components and predict system and component performance, required test duration, sampling frequencies of operating parameters, etc. Analyses conducted during tests can verify the consistency of all the measurements and the performance of the test system. Post test analyses are an essential part of the test task, and their results are an important factor in judging whether the technology development was successful. In addition, development of a rigorous model for a test system is an important objective of any new technology development. Test data analyses, especially post test data analyses, serve to verify the model. Test analyses have supported development of many ECLSS technologies; some test analysis tasks in ECLSS technology development are listed in the Appendix. To have effective analysis support for ECLSS technology tests, analysis guidelines are a useful tool. These test guidelines were developed based on experience gained through previous analysis support of various ECLSS technology tests. A comment on analysis from an experienced NASA ECLSS manager (1) follows: "Bad analysis was one that bent the test to prove that the analysis was right to begin with. Good analysis was one that directed where the testing should go and also bridged the gap between the reality of the test facility and what was expected on orbit."
Antoniotti, M; Park, F; Policriti, A; Ugel, N; Mishra, B
2003-01-01
The analysis of large amounts of data, produced as (numerical) traces of in vivo, in vitro and in silico experiments, has become a central activity for many biologists and biochemists. Recent advances in the mathematical modeling and computation of biochemical systems have moreover increased the prominence of in silico experiments; such experiments typically involve the simulation of sets of Differential Algebraic Equations (DAE), e.g., Generalized Mass Action systems (GMA) and S-systems. In this paper we reason about the necessary theoretical and pragmatic foundations for a query and simulation system capable of analyzing large amounts of such trace data. To this end, we propose to combine in a novel way several well-known tools from numerical analysis (approximation theory), temporal logic and verification, and visualization. The result is a preliminary prototype system: simpathica/xssys. When dealing with simulation data, simpathica/xssys exploits the special structure of the underlying DAE, and reduces the search space in an efficient way so as to facilitate any queries about the traces. The proposed system is designed to give the user the possibility to systematically analyze and simultaneously query different possible timed evolutions of the modeled system.
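As a toy illustration of the trace data such a system queries (not the simpathica/xssys implementation), a one-variable S-system dx/dt = α·x^g − β·x^h can be integrated with forward Euler; the rate constants below are hypothetical, chosen so the steady state α·x^g = β·x^h works out to x = 4:

```python
def simulate_s_system(alpha, g, beta, h, x0, dt=0.01, steps=2000):
    """Integrate the one-variable S-system dx/dt = alpha*x**g - beta*x**h by forward Euler.

    Returns the full trace, the kind of numerical time series a trace-query
    system would then examine with temporal-logic style predicates.
    """
    x, trace = x0, [x0]
    for _ in range(steps):
        x += dt * (alpha * x**g - beta * x**h)
        trace.append(x)
    return trace

# Hypothetical rate constants: dx/dt = 2*sqrt(x) - x, steady state at x = 4
trace = simulate_s_system(alpha=2.0, g=0.5, beta=1.0, h=1.0, x0=1.0)
```

A query in the spirit of the paper would then be a predicate over the trace, e.g. "the concentration rises monotonically and settles at its steady state," checked against the stored numerical evolution rather than re-derived analytically.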
Atmosphere explorer missions C, D, and E. Spacecraft experiment interface definition study
NASA Technical Reports Server (NTRS)
1972-01-01
The Atmosphere Explorer Missions C, D, & E Spacecraft/Experiment Interface Definition Study is discussed. The objectives of the study included an analysis of the accommodation requirements of the experiments for the three missions, an assessment of the overall effect of these requirements on the spacecraft system design and performance, and the detailed definition of all experiment/spacecraft electrical, mechanical, and environmental interfaces. In addition, the study included the identification and definition of system characteristics required to ensure compatibility with the consolidated STADAN and MSFN communications networks.
Gavrielides, Mike; Furney, Simon J; Yates, Tim; Miller, Crispin J; Marais, Richard
2014-01-01
Whole genomes, whole exomes and transcriptomes of tumour samples are sequenced routinely to identify the drivers of cancer. The systematic sequencing and analysis of tumour samples, as well as other oncogenomic experiments, necessitates the tracking of relevant sample information throughout the investigative process. These meta-data of the sequencing and analysis procedures include information about the samples and projects as well as the sequencing centres, platforms, data locations, results locations, alignments, analysis specifications and further information relevant to the experiments. The current work presents a sample tracking system for oncogenomic studies (Onco-STS) to store these data and make them easily accessible to the researchers who work with the samples. The system is a web application, which includes a database and a front-end web page that allows the remote access, submission and updating of the sample data in the database. The web application development framework Grails was used for the development and implementation of the system. The resulting Onco-STS solution is efficient, secure and easy to use and is intended to replace the manual data handling of text records. Onco-STS allows simultaneous remote access to the system, making collaboration among researchers more effective. The system stores both information on the samples in oncogenomic studies and details of the analyses conducted on the resulting data. Onco-STS is based on open-source software, is easy to develop and can be modified according to a research group's needs. Hence it is suitable for laboratories that do not require a commercial system.
Study on user interface of pathology picture archiving and communication system.
Kim, Dasueran; Kang, Peter; Yun, Jungmin; Park, Sung-Hye; Seo, Jeong-Wook; Park, Peom
2014-01-01
It is necessary to improve the pathology workflow. A workflow task analysis was performed using a pathology picture archiving and communication system (pathology PACS) in order to propose a user interface for the Pathology PACS considering user experience. An interface analysis of the Pathology PACS in Seoul National University Hospital and a task analysis of the pathology workflow were performed by observing recorded video. Based on obtained results, a user interface for the Pathology PACS was proposed. Hierarchical task analysis of Pathology PACS was classified into 17 tasks including 1) pre-operation, 2) text, 3) images, 4) medical record viewer, 5) screen transition, 6) pathology identification number input, 7) admission date input, 8) diagnosis doctor, 9) diagnosis code, 10) diagnosis, 11) pathology identification number check box, 12) presence or absence of images, 13) search, 14) clear, 15) Excel save, 16) search results, and 17) re-search. And frequently used menu items were identified and schematized. A user interface for the Pathology PACS considering user experience could be proposed as a preliminary step, and this study may contribute to the development of medical information systems based on user experience and usability.
NASA Technical Reports Server (NTRS)
Casper, Patricia A.; Kantowitz, Barry H.
1988-01-01
Multiple approaches are necessary for understanding and measuring workload. In particular, physiological systems identifiable by employing cardiac measures are related to cognitive systems. One issue of debate in measuring cardiac output is the grain of analysis used in recording and summarizing data. Various experiments are reviewed, the majority of which were directed at supporting or contradicting Lacey's intake-rejection hypothesis. Two of the experiments observed heart rate in operational environments and found virtually no changes associated with mental load. The major problems facing researchers using heart rate variability, or sinus arrhythmia, as a dependent measure have been associated with valid and sensitive scoring and preventing contamination of observed results by influences unrelated to cognition. Spectral analysis of heart rate variability offers two useful procedures: analysis from the time domain and analysis from the frequency domain. Most recently, data have been collected in a divided attention experiment, the performance measures and cardiac measures of which are detailed.
Marshall Amateur Radio Club experiment (MARCE) post flight data analysis
NASA Technical Reports Server (NTRS)
Rupp, Charles C.
1987-01-01
The Marshall Amateur Radio Club Experiment (MARCE) data system, the data recorded during the flight of STS-61C, the manner in which the data was reduced to engineering units, and the performance of the student experiments determined from the data are briefly described.
ATTDES: An Expert System for Satellite Attitude Determination and Control. 2
NASA Technical Reports Server (NTRS)
Mackison, Donald L.; Gifford, Kevin
1996-01-01
The design, analysis, and flight operations of satellite attitude determination and attitude control systems require extensive mathematical formulations, optimization studies, and computer simulation. This is best done by an analyst with extensive education and experience. The development of programs such as ATTDES permits the use of advanced techniques by those with less experience. Typical tasks include the mission analysis to select stabilization and damping schemes, attitude determination sensors and algorithms, and control system designs to meet program requirements. ATTDES is a system that includes all of these activities, including high fidelity orbit environment models that can be used for preliminary analysis, parameter selection, stabilization schemes, the development of estimators, covariance analyses, and optimization, and can support ongoing orbit activities. The modification of existing simulations to model new configurations for these purposes can be an expensive, time consuming activity that becomes a pacing item in the development and operation of such new systems. The use of an integrated tool such as ATTDES significantly reduces the effort and time required for these tasks.
Parent-identified barriers to pediatric health care: a process-oriented model.
Sobo, Elisa J; Seid, Michael; Reyes Gelhard, Leticia
2006-02-01
To further understand barriers to care as experienced by health care consumers, and to demonstrate the importance of conjoining qualitative and quantitative health services research. Transcripts from focus groups conducted in San Diego with English- and Spanish-speaking parents of children with special health care needs. Participants were asked about the barriers to care they had experienced or perceived, and their strategies for overcoming these barriers. Using elementary anthropological discourse analysis techniques, a process-based conceptual model of the parent experience was devised. The analysis revealed a parent-motivated model of barriers to care that enriched our understanding of quantitative findings regarding the population from which the focus group sample was drawn. Parent-identified barriers were grouped into the following six temporally and spatially sequenced categories: necessary skills and prerequisites for gaining access to the system; realizing access once it is gained; front office experiences; interactions with physicians; system arbitrariness and fragmentation; outcomes that affect future interaction with the system. Key to the successful navigation of the system was parents' functional biomedical acculturation; this construct likens the biomedical health services system to a cultural system within which all parents/patients must learn to function competently. Qualitative analysis of focus group data enabled a deeper understanding of barriers to care--one that went beyond the traditional association of marker variables with poor outcomes ("what") to reveal an understanding of the processes by which parents experience the health care system ("how," "why") and by which disparities may arise. Development of such process-oriented models furthers the provision of patient-centered care and the creation of interventions, programs, and curricula to enhance such care.
Qualitative discourse analysis, for example using this project's widely applicable protocol for generating experientially based models, can enhance our knowledge of the parent/patient experience and aid in the development of more powerful conceptualizations of key health care constructs.
NASA Technical Reports Server (NTRS)
1973-01-01
The proceedings of a conference on NASA Structural Analysis (NASTRAN) to analyze the experiences of users of the program are presented. The subjects discussed include the following: (1) statics and buckling, (2) vibrations and dynamics, (3) substructuring, (4) new capability, (5) user's experience, and (6) system experience. Specific applications of NASTRAN to spacecraft, aircraft, nuclear power plants, and materials tests are reported.
An Empirical Analysis of Negotiation Teaching Methodologies Using a Negotiation Support System
ERIC Educational Resources Information Center
Jones, Beth H.; Jones, Gary H.; Banerjee, Debasish
2005-01-01
This article describes an experiment that compared different methods of teaching undergraduates the fundamentals of negotiation analysis. Using student subjects, we compared three conditions: reading, lecture-only, and lecture accompanied by student use of a computerized negotiation support system (NSS). The authors examined two facets of…
NASA Technical Reports Server (NTRS)
Jules, Kenol; Hrovat, Kenneth; Kelly, Eric; Reckart, Timothy
2006-01-01
This summary report presents the analysis results of some of the processed acceleration data measured aboard the International Space Station during the period of November 2002 to April 2004. Two accelerometer systems were used to measure the acceleration levels for the activities that took place during Increment-6/8. However, not all of the activities during that period were analyzed, in order to keep the size of the report manageable. The National Aeronautics and Space Administration sponsors the Microgravity Acceleration Measurement System and the Space Acceleration Measurement System to support microgravity science experiments that require microgravity acceleration measurements. On April 19, 2001, both the Microgravity Acceleration Measurement System and the Space Acceleration Measurement System units were launched on STS-100 from the Kennedy Space Center for installation on the International Space Station. The Microgravity Acceleration Measurement System unit was flown to the station in support of science experiments requiring quasi-steady acceleration measurements, while the Space Acceleration Measurement System unit was flown to support experiments requiring vibratory acceleration measurement. Both acceleration systems are also used in support of the vehicle microgravity requirements verification as well as in support of the International Space Station support cadre. The International Space Station Increment-6/8 reduced gravity environment analysis presented in this report uses acceleration data collected by both sets of accelerometer systems: 1. The Microgravity Acceleration Measurement System, which consists of two sensors: the Orbital Acceleration Research Experiment Sensor Subsystem, a low frequency range sensor (up to 1 Hz) used to characterize the quasi-steady environment for payloads and vehicle, and the High Resolution Accelerometer Package, which is used to characterize the vibratory environment up to 100 Hz. 2. The Space Acceleration Measurement System, which measures vibratory acceleration data in the range of 0.01 to 400 Hz. This summary report presents analysis of some selected quasi-steady and vibratory activities measured by these accelerometers during Increment-6/8 from November 2002 to April 2004.
1974-07-01
[Garbled scan of a 1974 report; recoverable content: a technoeconomic analysis of electrode materials and associated conductors for advanced energy systems using new fuels, with chapters "VIII Correlation and Analysis of Materials Requirements" and "IX Research Recommendations and Priorities," and acknowledgements to government and industrial organizations for their knowledge and experience.]
Atmosphere Explorer control system software (version 2.0)
NASA Technical Reports Server (NTRS)
Mocarsky, W.; Villasenor, A.
1973-01-01
The Atmosphere Explorer Control System (AECS) was developed to provide automatic computer control of the Atmosphere Explorer spacecraft and experiments. The software performs several vital functions, such as issuing commands to the spacecraft and experiments, receiving and processing telemetry data, and allowing for extensive data processing by experiment analysis programs. The AECS was written for a 48K XEROX Data System Sigma 5 computer, and coexists in core with the XDS Real-time Batch Monitor (RBM) executive system. RBM is a flexible operating system designed for a real-time foreground/background environment, and hence is ideally suited for this application. Existing capabilities of RBM have been used as much as possible by AECS to minimize programming redundancy. The most important functions of the AECS are to send commands to the spacecraft and experiments, and to receive, process, and display telemetry data.
Radiometric and geometric analysis of hyperspectral imagery acquired from an unmanned aerial vehicle
Hruska, Ryan; Mitchell, Jessica; Anderson, Matthew; ...
2012-09-17
During the summer of 2010, an Unmanned Aerial Vehicle (UAV) hyperspectral in-flight calibration and characterization experiment of the Resonon PIKA II imaging spectrometer was conducted at the U.S. Department of Energy's Idaho National Laboratory (INL) UAV Research Park. The purpose of the experiment was to validate the radiometric calibration of the spectrometer and determine the georegistration accuracy achievable from the on-board global positioning system (GPS) and inertial navigation sensors (INS) under operational conditions. In order for low-cost hyperspectral systems to compete with larger systems flown on manned aircraft, they must be able to collect data suitable for quantitative scientific analysis. The results of the in-flight calibration experiment indicate an absolute average agreement of 96.3%, 93.7% and 85.7% for calibration tarps of 56%, 24%, and 2.5% reflectivity, respectively. The achieved planimetric accuracy was 4.6 meters (based on RMSE).
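The two headline figures, average radiometric agreement against reference tarps and planimetric RMSE, can be reproduced from raw measurements with elementary formulas; a sketch with hypothetical sample values (not the experiment's data):

```python
import math

def mean_absolute_agreement(measured, reference):
    """Average absolute agreement (%) between measured and reference reflectance values."""
    ratios = [1 - abs(m - r) / r for m, r in zip(measured, reference)]
    return 100 * sum(ratios) / len(ratios)

def rmse(errors):
    """Root-mean-square error (e.g., planimetric georegistration error in meters)."""
    return math.sqrt(sum(e * e for e in errors) / len(errors))

# Hypothetical sample values, not the experiment's actual measurements
agreement = mean_absolute_agreement(measured=[0.54, 0.55], reference=[0.56, 0.56])
err = rmse([3.0, 4.0, 5.0, 6.0])
```

In the experiment itself, the reference reflectances come from the calibration tarps of known reflectivity and the error vector from ground control point offsets; the formulas are the same.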
Di Stefano, Carlos A.; Malamud, G.; Kuranz, C. C.; ...
2015-10-19
Here, we present experiments observing Richtmyer–Meshkov mode coupling and bubble competition in a system arising from well-characterized initial conditions and driven by a strong (Mach ~ 8) shock. These measurements and the analysis method developed to interpret them provide an important step toward the possibility of observing self-similarity under such conditions, as well as a general platform for performing and analyzing hydrodynamic instability experiments. A key feature of these experiments is that the shock is sustained sufficiently long that this nonlinear behavior occurs without decay of the shock velocity or other hydrodynamic properties of the system, which facilitates analysis and allows the results to be used in the study of analytic models.
ERIC Educational Resources Information Center
Lench, Heather C.; Flores, Sarah A.; Bench, Shane W.
2011-01-01
Our purpose in the present meta-analysis was to examine the extent to which discrete emotions elicit changes in cognition, judgment, experience, behavior, and physiology; whether these changes are correlated as would be expected if emotions organize responses across these systems; and which factors moderate the magnitude of these effects. Studies…
An Overview of ANN Application in the Power Industry
NASA Technical Reports Server (NTRS)
Niebur, D.
1995-01-01
The paper presents a survey on the development and experience with artificial neural net (ANN) applications for electric power systems, with emphasis on operational systems. The organization and constraints of electric utilities are reviewed, motivations for investigating ANN are identified, and a current assessment is given from the experience of 2400 projects using ANN for load forecasting, alarm processing, fault detection, component fault diagnosis, static and dynamic security analysis, system planning, and operation planning.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Newman, Ken
1997-06-01
Experiment designs to estimate the effect of transportation on survival and return rates of Columbia River system salmonids are discussed along with statistical modeling techniques. Besides transportation, river flow and dam spill are necessary components in the design and analysis; otherwise, questions as to the effects of reservoir drawdowns and increased dam spill may never be satisfactorily answered. Four criteria for comparing different experiment designs are: (1) feasibility, (2) clarity of results, (3) scope of inference, and (4) time to learn. In this report, alternative designs are presented for experimental manipulations in smolt tagging studies of the effects of river operations such as flow levels, spill fractions, and transporting outmigrating salmonids around dams in the Columbia River system. The principles of study design discussed in this report have broad implications for the many studies proposed to investigate both smolt and adult survival relationships. The concepts are illustrated for the case of the design and analysis of smolt transportation experiments. The merits of proposed transportation studies should be measured against these principles of proper statistical design and analysis.
Posttest RELAP4 analysis of LOFT experiment L1-3A
DOE Office of Scientific and Technical Information (OSTI.GOV)
White, J.R.; Holmstrom, H.L.O.
This report presents selected results of posttest RELAP4 modeling of LOFT loss-of-coolant experiment L1-3A, a double-ended isothermal cold leg break with lower plenum emergency core coolant injection. Comparisons are presented between the pretest prediction, the posttest analysis, and the experimental data. It is concluded that pressurizer modeling is important for accurately predicting system behavior during the initial portion of saturated blowdown. Using measured initial conditions rather than nominal specified initial conditions did not significantly influence the system model results. Using finer nodalization in the reactor vessel improved the prediction of the system pressure history by minimizing steam condensation effects. Unequal steam condensation between the downcomer and core volumes appears to cause the manometer oscillations observed in both the pretest and posttest RELAP4 analyses.
KNIME for reproducible cross-domain analysis of life science data.
Fillbrunn, Alexander; Dietz, Christian; Pfeuffer, Julianus; Rahn, René; Landrum, Gregory A; Berthold, Michael R
2017-11-10
Experiments in the life sciences often involve tools from a variety of domains such as mass spectrometry, next generation sequencing, or image processing. Passing the data between those tools often involves complex scripts for controlling data flow, data transformation, and statistical analysis. Such scripts are not only prone to be platform dependent, they also tend to grow as the experiment progresses and are seldom well documented, a fact that hinders the reproducibility of the experiment. Workflow systems such as KNIME Analytics Platform aim to solve these problems by providing a platform for connecting tools graphically and guaranteeing the same results on different operating systems. As open-source software, KNIME allows scientists and programmers to provide their own extensions to the scientific community. In this review paper we present selected extensions from the life sciences that simplify data exploration, analysis, and visualization and are interoperable due to KNIME's unified data model. Additionally, we name other workflow systems that are commonly used in the life sciences and highlight their similarities and differences to KNIME. Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lucia, M., E-mail: mlucia@pppl.gov; Kaita, R.; Majeski, R.
The Materials Analysis and Particle Probe (MAPP) is a compact in vacuo surface science diagnostic, designed to provide in situ surface characterization of plasma facing components in a tokamak environment. MAPP has been implemented for operation on the Lithium Tokamak Experiment at Princeton Plasma Physics Laboratory (PPPL), where all control and analysis systems are currently under development for full remote operation. Control systems include vacuum management, instrument power, and translational/rotational probe drive. Analysis systems include onboard Langmuir probes and all components required for x-ray photoelectron spectroscopy, low-energy ion scattering spectroscopy, direct recoil spectroscopy, and thermal desorption spectroscopy surface analysis techniques.
TC-2 post Helios experiment data review. [postflight systems analysis of spacecraft performance
NASA Technical Reports Server (NTRS)
1975-01-01
Data are presented from a systems postflight analysis of the Centaur Launch Vehicle and Helios, along with a comparison with data from preflight analyses. Topics examined are: (1) propellant behavior; (2) helium usage; (3) propellant tank pressurization; (4) propellant tank thermodynamics; (5) component heating, thermal control, and the thermal protection system; (6) the main engine system; (7) H2O2 consumption; (8) boost pump post-MECO performance; and (9) an overview of other systems.
Learning from Trending, Precursor Analysis, and System Failures
DOE Office of Scientific and Technical Information (OSTI.GOV)
Youngblood, R. W.; Duffey, R. B.
2015-11-01
Models of reliability growth relate current system unreliability to currently accumulated experience. But “experience” comes in different forms. Looking back after a major accident, one is sometimes able to identify previous events or measurable performance trends that were, in some sense, signaling the potential for that major accident: potential that could have been recognized and acted upon, but was not recognized until the accident occurred. This could be a previously unrecognized cause of accidents, or underestimation of the likelihood that a recognized potential cause would actually operate. Despite improvements in the state of practice of modeling of risk and reliability, operational experience still has a great deal to teach us, and work has been going on in several industries to try to do a better job of learning from experience before major accidents occur. It is not enough to say that we should review operating experience; there is too much “experience” for such general advice to be considered practical. The paper discusses the following: 1. The challenge of deciding what to focus on in analysis of operating experience. 2. Comparing what different models of learning and reliability growth imply about trending and precursor analysis.
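A standard example of a model relating unreliability to accumulated experience is the Duane reliability-growth relation, in which cumulative MTBF improves as a power law of accumulated operating time. The sketch below uses the classic Duane form with hypothetical parameters K and alpha; it illustrates the general idea only and is not a model taken from the paper.

```python
import math

# Duane reliability-growth sketch: cumulative MTBF grows as a power law
# of accumulated test experience T: MTBF_c(T) = (1/K) * T**alpha.
# K and alpha are hypothetical, not values from the paper.
K, alpha = 0.5, 0.3

def cumulative_mtbf(T):
    """Cumulative mean time between failures after experience T."""
    return (1.0 / K) * T ** alpha

def expected_failures(T):
    """Cumulative failures by experience T: N(T) = K * T**(1 - alpha)."""
    return K * T ** (1.0 - alpha)

# As experience accumulates tenfold, cumulative MTBF grows by 10**alpha.
m1 = cumulative_mtbf(100.0)
m2 = cumulative_mtbf(1000.0)
print(m1, m2)
```

By construction MTBF_c(T) = T / N(T), so the two functions are mutually consistent; a fitted alpha near 0 would indicate little learning from experience, which is the kind of signal precursor analysis looks for.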
Automation of the longwall mining system
NASA Technical Reports Server (NTRS)
Zimmerman, W.; Aster, R. W.; Harris, J.; High, J.
1982-01-01
Cost-effective, safe, and technologically sound applications of automation technology to underground coal mining were identified. The longwall analysis commenced with a general search of government and industry experience with mining automation technology. A brief industry survey was conducted to identify longwall operational, safety, and design problems. The prime automation candidates resulting from the industry experience and survey were: (1) the shearer operation, (2) shield and conveyor pan line advance, (3) a management information system to allow improved mine logistics support, and (4) component fault isolation and diagnostics to reduce untimely maintenance delays. A system network analysis indicated that a 40% improvement in productivity was feasible if the system delays associated with all four of the above areas were removed. A technology assessment and conceptual system design of each of the four automation candidate areas showed that state-of-the-art digital computer, servomechanism, and actuator technologies could be applied to automate the longwall system.
NASA Technical Reports Server (NTRS)
Jules, Kenol; Lin, Paul P.
2006-01-01
One of the responsibilities of the NASA Glenn Principal Investigator Microgravity Services is to support NASA-sponsored investigators in the areas of reduced-gravity acceleration data analysis and interpretation and the monitoring of the reduced-gravity environment on board various carriers. With the International Space Station currently operational, a significant amount of acceleration data is being down-linked and processed on the ground, both for characterization (and verification) of the station's onboard environment and for scientific experiments. Therefore, to help principal investigator teams monitor the acceleration level on board the International Space Station and avoid undesirable impacts on their experiments when possible, the NASA Glenn Principal Investigator Microgravity Services developed an artificial intelligence monitoring system that detects, in near real time, any change in the environment liable to affect onboard experiments. The main objective of the monitoring system is to help research teams identify the vibratory disturbances that are active at any instant of time onboard the International Space Station that might impact the environment in which their experiment is being conducted. The monitoring system allows any space research scientist, at any location and at any time, to see the current acceleration level on board the Space Station via the World Wide Web. From NASA Glenn's Exploration Systems Division web site, research scientists can see in near real time the active disturbances, such as pumps, fans, compressors, crew exercise, re-boosts, extravehicular activity, etc., and decide whether to continue or stop their experiments (or make note of such activity for later correlation with science results) based on the g-level associated with that specific event.
A dynamic graphical display accessible via the World Wide Web shows the status of all the vibratory disturbance activities with their degree of confidence as well as their g-level contribution to the environment. The system can detect both known and unknown vibratory disturbance activities. It can also perform trend analysis and prediction by analyzing past data over many Increments of the space station for selected disturbance activities. This feature can be used to monitor the health of onboard mechanical systems to detect and prevent potential system failures, as well as by research scientists during their science results analysis. Examples of both real-time on-line vibratory disturbance detection and off-line trend analysis are presented in this paper. Several soft computing techniques, such as Kohonen's Self-Organizing Feature Map, Learning Vector Quantization, Back-Propagation Neural Networks, and Fuzzy Logic, were used to design the system.
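Of the soft computing techniques named in this abstract, Learning Vector Quantization is compact enough to sketch. Below is a minimal LVQ1 classifier on synthetic two-dimensional data standing in for vibration features; the class definitions, cluster positions, and learning rate are illustrative assumptions, not details of the NASA Glenn system.

```python
import numpy as np

# Minimal LVQ1 sketch on synthetic 2-D "vibration feature" data.
# Cluster positions, labels, and learning rate are illustrative only.
rng = np.random.default_rng(0)

# Two synthetic disturbance classes (e.g. "pump running" vs. "quiet").
X0 = rng.normal([0.0, 0.0], 0.3, size=(50, 2))
X1 = rng.normal([2.0, 2.0], 0.3, size=(50, 2))
X = np.vstack([X0, X1])
y = np.array([0] * 50 + [1] * 50)

# One prototype per class, initialized near each class mean.
protos = np.array([X0.mean(axis=0), X1.mean(axis=0)])
labels = np.array([0, 1])

lr = 0.05
for _ in range(20):                               # training epochs
    for i in rng.permutation(len(X)):
        d = np.linalg.norm(protos - X[i], axis=1)
        w = int(d.argmin())                       # best-matching prototype
        if labels[w] == y[i]:
            protos[w] += lr * (X[i] - protos[w])  # attract toward sample
        else:
            protos[w] -= lr * (X[i] - protos[w])  # repel from sample

# Classify every sample by its nearest prototype.
pred = np.array([labels[np.linalg.norm(protos - x, axis=1).argmin()] for x in X])
accuracy = (pred == y).mean()
print(accuracy)
```

The attract/repel update is what distinguishes LVQ from an unsupervised Kohonen map: labeled data pull prototypes toward their own class and push them away from others.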
NASA Astrophysics Data System (ADS)
Shi, J. T.; Han, X. T.; Xie, J. F.; Yao, L.; Huang, L. T.; Li, L.
2013-03-01
A Pulsed High Magnetic Field Facility (PHMFF) has been established at the Wuhan National High Magnetic Field Center (WHMFC), and various protection measures are applied in its control system. To improve the reliability and robustness of the control system, a safety analysis of the PHMFF was carried out based on the Fault Tree Analysis (FTA) technique. The function and realization of five protection systems are given: the sequence experiment operation system, safety assistant system, emergency stop system, fault detecting and processing system, and accident isolating protection system. Tests and operation indicate that these measures improve the safety of the facility and ensure the safety of personnel.
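The top-event probability computation at the heart of fault tree analysis can be illustrated with a toy tree. The AND/OR gate formulas are standard FTA for independent basic events; the event names and failure probabilities below are hypothetical and are not taken from the PHMFF safety analysis.

```python
# Toy fault-tree evaluation: AND/OR gates over independent basic events.
# Event names and probabilities are hypothetical, not from the PHMFF analysis.

def and_gate(*probs):
    """Gate fails only if every input fails: product of probabilities."""
    p = 1.0
    for q in probs:
        p *= q
    return p

def or_gate(*probs):
    """Gate fails if any input fails: complement of 'all survive'."""
    p = 1.0
    for q in probs:
        p *= (1.0 - q)
    return 1.0 - p

# Hypothetical basic-event failure probabilities per experiment shot.
p_sensor = 1e-3     # fault-detection sensor misses a fault
p_interlock = 1e-4  # sequence interlock fails to block the shot
p_estop = 1e-5      # emergency-stop channel unavailable

# Top event: protection lost if (sensor AND interlock) both fail,
# OR the emergency-stop channel fails.
p_top = or_gate(and_gate(p_sensor, p_interlock), p_estop)
print(p_top)
```

With these numbers the AND branch contributes only 1e-7, so the top event is dominated by the single-point e-stop failure, which is exactly the kind of weakness an FTA is meant to expose.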
ERIC Educational Resources Information Center
Walczak, Mary M.; Lantz, Juliette M.
2004-01-01
The case of Well Wishes involves students in a thorough examination of the interactions among nitrogen-containing species in septic systems, which help clean household wastewater, and in well water. The case supports the attainment of five goals for students, and can be analyzed through classroom discussions or laboratory experiments.
An Office Automation Needs Assessment Model
1985-08-01
Appendices include a tracking form, CSD office systems analysis worksheets, and AMO evaluations of the proposed model. In deciding who should plan for office automation systems, a checklist of attributes should be evaluated, including experience, expertise, and availability; in evaluating in-house versus outside resources, breadth of knowledge in numerous areas must be weighed.
Detection of soil microorganism in situ by combined gas chromatography mass spectrometry
NASA Technical Reports Server (NTRS)
Alexander, M.; Duxbury, J. M.; Francis, A. J.; Adamson, J.
1972-01-01
Experimental tests were made to determine whether analysis of volatile metabolic products, formed in situ, is a viable procedure for an extraterrestrial life detection system. Laboratory experiments, carried out under anaerobic conditions with the addition of a carbon source, were extended to include a variety of soils and additional substrates. In situ experiments were conducted without amendment using a vacuum sampling system.
NASA Technical Reports Server (NTRS)
Carden, J. L.; Browner, R.
1982-01-01
The preparation and analysis of standardized waste samples for controlled ecological life support systems (CELSS) are considered. Analysis of samples from wet oxidation experiments, the development of ion chromatographic techniques utilizing conventional high pressure liquid chromatography (HPLC) equipment, and an investigation of techniques for interfacing an ion chromatograph (IC) with an inductively coupled plasma optical emission spectrometer (ICPOES) are discussed.
A system architecture for online data interpretation and reduction in fluorescence microscopy
NASA Astrophysics Data System (ADS)
Röder, Thorsten; Geisbauer, Matthias; Chen, Yang; Knoll, Alois; Uhl, Rainer
2010-01-01
In this paper we present a high-throughput sample screening system that enables real-time data analysis and reduction for live-cell analysis using fluorescence microscopy. We propose a novel system architecture capable of analyzing a large number of samples during the experiment, greatly reducing the post-analysis phase that is common practice today. By utilizing data reduction algorithms, relevant information about the target cells is extracted from the online-collected data stream and then used to adjust the experiment parameters in real time, allowing the system to react dynamically to changing sample properties and to control the microscope setup accordingly. The proposed system consists of an integrated DSP-FPGA hybrid solution to meet the required real-time constraints, to execute the underlying computer vision algorithms efficiently, and to close the perception-action loop. We demonstrate our approach by addressing the selective imaging of cells with a particular combination of markers. With this novel closed-loop system the amount of superfluously collected data is minimized, while at the same time the information entropy increases.
NASA Astrophysics Data System (ADS)
Michaelis, A.; Nemani, R. R.; Wang, W.; Votava, P.; Hashimoto, H.
2010-12-01
Given the increasing complexity of climate modeling and analysis tools, it is often difficult and expensive to build or recreate an exact replica of the software compute environment used in past experiments. With the recent development of new technologies for hardware virtualization, an opportunity exists to create full modeling, analysis, and compute environments that are “archivable”, transferable, and easily shared amongst a scientific community or presented to a bureaucratic body if the need arises. By encapsulating an entire modeling and analysis environment in a virtual machine image, others may quickly gain access to the fully built system used in past experiments, potentially easing the task and reducing the costs of reproducing and verifying past results produced by other researchers. Moreover, these virtual machine images may be used as a pedagogical tool by others who are interested in performing an academic exercise but do not yet possess the broad expertise required. We built two virtual machine images, one with the Community Earth System Model (CESM) and one with the Weather Research and Forecasting (WRF) model, then ran several small experiments to assess the feasibility, performance overhead costs, reusability, and transferability. We present a list of the pros and cons as well as lessons learned from utilizing virtualization technology in the climate and earth systems modeling domain.
NASTRAN thermal analyzer status, experience, and new developments
NASA Technical Reports Server (NTRS)
Lee, H. P.
1975-01-01
The unique finite-element-based NASTRAN Thermal Analyzer, originally developed as a general-purpose heat transfer analysis capability incorporated into the NASTRAN system, is described. The current status, experience from field applications, and new developments are included.
Evolution of the ATLAS PanDA workload management system for exascale computational science
NASA Astrophysics Data System (ADS)
Maeno, T.; De, K.; Klimentov, A.; Nilsson, P.; Oleynik, D.; Panitkin, S.; Petrosyan, A.; Schovancova, J.; Vaniachine, A.; Wenaus, T.; Yu, D.; Atlas Collaboration
2014-06-01
An important foundation underlying the impressive success of data processing and analysis in the ATLAS experiment [1] at the LHC [2] is the Production and Distributed Analysis (PanDA) workload management system [3]. PanDA was designed specifically for ATLAS and proved to be highly successful in meeting all the distributed computing needs of the experiment. However, the core design of PanDA is not experiment specific. The PanDA workload management system is capable of meeting the needs of other data intensive scientific applications. Alpha-Magnetic Spectrometer [4], an astro-particle experiment on the International Space Station, and the Compact Muon Solenoid [5], an LHC experiment, have successfully evaluated PanDA and are pursuing its adoption. In this paper, a description of the new program of work to develop a generic version of PanDA will be given, as well as the progress in extending PanDA's capabilities to support supercomputers and clouds and to leverage intelligent networking. PanDA has demonstrated at a very large scale the value of automated dynamic brokering of diverse workloads across distributed computing resources. The next generation of PanDA will allow other data-intensive sciences and a wider exascale community employing a variety of computing platforms to benefit from ATLAS' experience and proven tools.
Neophyte experiences of football (soccer) match analysis: a multiple case study approach.
McKenna, Mark; Cowan, Daryl Thomas; Stevenson, David; Baker, Julien Steven
2018-03-05
Performance analysis is extensively used in sport, but its pedagogical application is little understood. Given its expanding role across football, this study explored the experiences of neophyte performance analysts. Experiences of six analysis interns, across three professional football clubs, were investigated as multiple cases of new match analysis. Each intern was interviewed after their first season, with archival data providing background information. Four themes emerged from qualitative analysis: (1) "building of relationships" was important, along with trust and role clarity; (2) "establishing an analysis system" was difficult due to tacit coach knowledge, but analysis was established; (3) the quality of the "feedback process" hinged on coaching styles, with balance of feedback and athlete engagement considered essential; (4) "establishing effect" was complex with no statistical effects reported; yet enhanced relationships, role clarity, and improved performances were reported. Other emic accounts are required to further understand occupational culture within performance analysis.
The Tri-Services Site Characterization Analysis Penetrometer System (SCAPS) was developed by the U.S. Army (U.S. Army Corps of Engineers, Waterways Experiment Station [WES] and the Army Environmental Center [AEC]), Navy (Naval Command, Control and Ocean Surveillance Center), and ...
BEATBOX v1.0: Background Error Analysis Testbed with Box Models
NASA Astrophysics Data System (ADS)
Knote, Christoph; Barré, Jérôme; Eckl, Max
2018-02-01
The Background Error Analysis Testbed (BEATBOX) is a new data assimilation framework for box models. Based on the BOX Model eXtension (BOXMOX) to the Kinetic Pre-Processor (KPP), this framework allows users to conduct performance evaluations of data assimilation experiments, sensitivity analyses, and detailed chemical scheme diagnostics from an observation simulation system experiment (OSSE) point of view. The BEATBOX framework incorporates an observation simulator and a data assimilation system with the possibility of choosing ensemble, adjoint, or combined sensitivities. A user-friendly, Python-based interface allows for the tuning of many parameters for atmospheric chemistry and data assimilation research as well as for educational purposes, for example observation error, model covariances, ensemble size, perturbation distribution in the initial conditions, and so on. In this work, the testbed is described and two case studies are presented to illustrate the design of a typical OSSE experiment, data assimilation experiments, a sensitivity analysis, and a method for diagnosing model errors. BEATBOX is released as an open source tool for the atmospheric chemistry and data assimilation communities.
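The core idea of an OSSE, assimilating synthetic observations of a known "truth" state so that analysis error can be computed exactly, can be shown in a scalar toy problem. The update below is the textbook optimal scalar analysis weighted by inverse error variances, not BEATBOX's ensemble or adjoint machinery, and the error magnitudes and sample count are illustrative assumptions.

```python
import numpy as np

# Scalar toy OSSE: the truth is known, so analysis error is computable exactly.
# Error standard deviations and sample count are illustrative assumptions.
rng = np.random.default_rng(1)

n = 10_000
truth = rng.normal(0.0, 1.0, n)                   # "nature run"

sig_b, sig_o = 1.0, 0.5                           # background / observation errors
background = truth + rng.normal(0.0, sig_b, n)    # model forecast with error
obs = truth + rng.normal(0.0, sig_o, n)           # simulated observations

# Optimal scalar analysis: weight the innovation by inverse error variances.
w = sig_b**2 / (sig_b**2 + sig_o**2)
analysis = background + w * (obs - background)

rmse_b = float(np.sqrt(np.mean((background - truth) ** 2)))
rmse_a = float(np.sqrt(np.mean((analysis - truth) ** 2)))
print(rmse_b, rmse_a)
```

Because the truth is synthetic, the analysis RMSE can be compared directly against its theoretical value sqrt(sig_b^2 * sig_o^2 / (sig_b^2 + sig_o^2)), which is exactly the kind of diagnostic an OSSE testbed automates.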
Development of Skylab experiment T-013 crew/vehicle disturbances
NASA Technical Reports Server (NTRS)
Conway, B. A.; Woolley, C. T.; Kurzhals, P. R.; Reynolds, R. B.
1972-01-01
A Skylab experiment to determine the characteristics and effects of crew-motion disturbances was developed. The experiment will correlate data from histories of specified astronaut body motions, the disturbance forces and torques produced by these motions, and the resultant spacecraft control system response to the disturbances. Primary application of crew-motion disturbance data will be to the sizing and design of future manned spacecraft control and stabilization systems. The development of the crew/vehicle disturbances experiment is described, and a mathematical model of human body motion which may be used for analysis of a variety of man-motion activities is derived.
NASA Astrophysics Data System (ADS)
Whitney, Dwight E.
The influence of learning in the form of past relevant experience was examined in data collected for strategic ballistic missiles developed by the United States. A total of twenty-four new missiles were developed and entered service between 1954 and 1990. Missile development costs were collected and analyzed by regression analysis using the learning curve model, with factors for past experience and other relevant cost estimating relationships. The purpose of the study was to determine whether prior development experience was a factor in the development cost of these like systems. Of the twenty-four missiles in the population, development costs for twelve were collected from the literature. Since the costs were found to be segmented by military service, a discrete input variable for military service was used as one of the cost estimating relationships. Because there were only two US Navy samples, too few to analyze for segmentation and learning rate, they were excluded from the final analysis. The final analysis was of a sample of ten out of the eighteen US Army and US Air Force missiles in the population. The analysis found past experience to be a statistically significant factor in describing the development cost of the US Army and US Air Force missiles. The influence equated to a 0.86 progress ratio, indicating that prior development experience had a positive (cost-reducing) influence on development cost. Based on this result, it was concluded that prior development experience was a factor in the development cost of these systems.
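The learning-curve relationship behind the reported 0.86 progress ratio can be made concrete: with progress ratio r, each doubling of cumulative experience multiplies cost by r, i.e. Cost(N) = Cost(1) * N**log2(r). In the sketch below, only the 0.86 ratio comes from the study; the first-unit cost is a hypothetical placeholder.

```python
import math

# Learning-curve model: Cost(N) = Cost(1) * N**b, with b = log2(r),
# where r is the progress ratio. r = 0.86 is the study's estimate;
# the first-unit cost of 100.0 is a hypothetical placeholder.
r = 0.86
b = math.log2(r)

def dev_cost(n, first_unit_cost=100.0):
    """Predicted development cost of the n-th comparable program."""
    return first_unit_cost * n ** b

# Each doubling of cumulative experience scales cost by the progress ratio.
ratio_2x = dev_cost(2) / dev_cost(1)
ratio_4x = dev_cost(4) / dev_cost(1)
print(ratio_2x, ratio_4x)
```

A progress ratio of 1.0 (b = 0) would mean no learning at all, so the statistical test in the study amounts to asking whether the fitted b differs significantly from zero.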
Proceedings of the Fourth International Mobile Satellite Conference (IMSC 1995)
NASA Technical Reports Server (NTRS)
Rigley, Jack R. (Compiler); Estabrook, Polly (Compiler); Reekie, D. Hugh M. (Editor)
1995-01-01
The theme of the 1995 International Mobile Satellite Conference was 'Mobile Satcom Comes of Age'. The sessions included: Modulation, Coding, and Multiple Access; Hybrid Networks - 1; Spacecraft Technology; Propagation; Applications and Experiments - 1; Advanced System Concepts and Analysis; Aeronautical Mobile Satellite Communications; Mobile Terminal Antennas; Mobile Terminal Technology; Current and Planned Systems; Direct Broadcast Satellite; The Use of CDMA for LEO and ICO Mobile Satellite Systems; Hybrid Networks - 2; and Applications and Experiments - 2.
NASA Technical Reports Server (NTRS)
Marriott, A.
1980-01-01
The activities of the Point-Focusing Thermal and Electric Applications (PFTEA) Project for fiscal year 1979 are summarized. The main thrust of the PFTEA Project, the small community solar thermal power experiment, was completed. Concept definition studies included a small central receiver approach, a point-focusing distributed receiver system with central power generation, and a point-focusing distributed receiver concept with distributed power generation. The first experiment in the Isolated Application Series was initiated. Planning for the third engineering experiment series, which addresses the industrial market sector, was also initiated. In addition to the experiment-related activities, several contracts to industry were let, and studies were conducted to explore the market potential for point-focusing distributed receiver (PFDR) systems. System analysis studies were completed that compared PFDR technology with other small power system technology candidates for the utility market sector.
The 1985 Army Experience Survey. Data Sourcebook and User’s Manual
1986-01-01
The survey data file produced for the 1985 AES is documented. The survey data are available in Operating System (OS) format as well as in Statistical Analysis System (SAS) format: a SAS version of the survey data files was produced, and the OS data file was designed to make the survey data accessible on any IBM-compatible computer system.
The DAQ system for the AEḡIS experiment
NASA Astrophysics Data System (ADS)
Prelz, F.; Aghion, S.; Amsler, C.; Ariga, T.; Bonomi, G.; Brusa, R. S.; Caccia, M.; Caravita, R.; Castelli, F.; Cerchiari, G.; Comparat, D.; Consolati, G.; Demetrio, A.; Di Noto, L.; Doser, M.; Ereditato, A.; Evans, C.; Ferragut, R.; Fesel, J.; Fontana, A.; Gerber, S.; Giammarchi, M.; Gligorova, A.; Guatieri, F.; Haider, S.; Hinterberger, A.; Holmestad, H.; Kellerbauer, A.; Krasnický, D.; Lagomarsino, V.; Lansonneur, P.; Lebrun, P.; Malbrunot, C.; Mariazzi, S.; Matveev, V.; Mazzotta, Z.; Müller, S. R.; Nebbia, G.; Nedelec, P.; Oberthaler, M.; Pacifico, N.; Pagano, D.; Penasa, L.; Petracek, V.; Prevedelli, M.; Ravelli, L.; Rienaecker, B.; Robert, J.; Røhne, O. M.; Rotondi, A.; Sacerdoti, M.; Sandaker, H.; Santoro, R.; Scampoli, P.; Simon, M.; Smestad, L.; Sorrentino, F.; Testera, G.; Tietje, I. C.; Widmann, E.; Yzombard, P.; Zimmer, C.; Zmeskal, J.; Zurlo, N.
2017-10-01
In the sociology of small- to mid-sized (O(100) collaborators) experiments, the issue of data collection and storage is sometimes treated as a residual problem for which well-established solutions are known. Still, the DAQ system can be one of the few forces that drive toward the integration of otherwise loosely coupled detector systems. As such, it may be hard to complete with off-the-shelf components only. LabVIEW and ROOT are the (only) two software systems that were assumed to be familiar enough to all collaborators of the AEḡIS (AD6) experiment at CERN: starting from the GXML representation of LabVIEW data types, a semantically equivalent representation as ROOT TTrees was developed for permanent storage and analysis. All data in the experiment are cast into this common format and can be produced and consumed on both systems and transferred over TCP and/or multicast over UDP for immediate sharing over the experiment LAN. We describe the setup, which has been able to cater to all run data logging and long-term monitoring needs of the AEḡIS experiment so far.
NASA Technical Reports Server (NTRS)
Prive, Nikki C.; Errico, Ronald M.
2013-01-01
A series of experiments that explore the roles of model and initial condition error in numerical weather prediction are performed using an observing system simulation experiment (OSSE) framework developed at the National Aeronautics and Space Administration Global Modeling and Assimilation Office (NASA/GMAO). The use of an OSSE allows the analysis and forecast errors to be explicitly calculated, and different hypothetical observing networks can be tested with ease. In these experiments, both a full global OSSE framework and an 'identical twin' OSSE setup are utilized to compare the behavior of the data assimilation system and evolution of forecast skill with and without model error. The initial condition error is manipulated by varying the distribution and quality of the observing network and the magnitude of observation errors. The results show that model error has a strong impact on both the quality of the analysis field and the evolution of forecast skill, including both systematic and unsystematic model error components. With a realistic observing network, the analysis state retains a significant quantity of error due to systematic model error. If errors of the analysis state are minimized, model error acts to rapidly degrade forecast skill during the first 24-48 hours of forward integration. In the presence of model error, the impact of observation errors on forecast skill is small, but in the absence of model error, observation errors cause a substantial degradation of the skill of medium range forecasts.
Hanson, Elizabeth R; Finley, Erin P; Petershack, Jean A
2017-04-01
Training in advocacy and community pediatrics often involves the use of community site visits. However, data on the specific knowledge, skills, and attitudes gained from these experiences are limited. In this study we used qualitative analysis of written narratives to explore the response of residents to a juvenile justice experience. Pediatric residents participated in a week-long experience in the juvenile probation department and completed a written narrative. Narratives were analyzed using grounded theory to explore the effects of this experience on residents' views of youth in the juvenile justice system. Analysis of 29 narratives revealed 13 themes relating to 5 core concepts: social determinants of behavior, role of professionals and institutions, achieving future potential, resolving discrepancies, and distancing. A conceptual model was developed to explore the interactions of these concepts in the resident view of youth in the juvenile justice system. Of the themes only 3 (23%) were related to content explicitly covered in the assigned reading materials. Several important concepts emerged as elements of this experience, many of which were not covered in the explicit curriculum. Variability in attitudinal response to the experience raised important questions about the influence of the ideological framework of the learner and the hidden curriculum on the learning that occurs in community settings. We propose a theoretical model that delineates the factors that influence learning in community settings to guide educators in planning these types of experiences. Copyright © 2016 Academic Pediatric Association. All rights reserved.
NASA Technical Reports Server (NTRS)
Hardage, Donna (Technical Monitor); Walters, R. J.; Morton, T. L.; Messenger, S. R.
2004-01-01
The objective is to develop an improved space solar cell radiation response analysis capability and to produce a computer modeling tool which implements the analysis. This was accomplished through analysis of solar cell flight data taken on the Microelectronics and Photonics Test Bed experiment. This effort specifically addresses issues related to rapid technological change in the area of solar cells for space applications in order to enhance system performance, decrease risk, and reduce cost for future missions.
Second LDEF Post-Retrieval Symposium Abstracts
NASA Technical Reports Server (NTRS)
Levine, Arlene S. (Compiler)
1992-01-01
These abstracts from the symposium represent the data analysis of the 57 experiments flown on the LDEF. The experiments include materials, coatings, thermal systems, power and propulsion, science (cosmic rays, interstellar gas, heavy ions, micrometeoroids, etc.), electronics, optics, and life science.
Zenina, L P; Godkov, M A
2013-08-01
The article presents the experience of implementing a quality management system in the multidisciplinary laboratory of an emergency medical care hospital. The analysis of laboratory errors is described, and modes of preventing them are demonstrated. The ratings of the department of laboratory diagnostics of the N. V. Sklifosofskiy research institute of emergency care in the EQAS (USA) Monthly Clinical Chemistry program from 2007 onward are presented. Implementing quality management of laboratory analysis in the department of laboratory diagnostics made it possible to supply physicians of clinical departments with reliable information and increased clinicians' confidence in the results they received. The effectiveness of laboratory diagnostics increased due to lowered costs of analysis without negative impact on the quality of the curative process.
NASA Technical Reports Server (NTRS)
Yan, Jerry C.
1987-01-01
In concurrent systems, a major responsibility of the resource management system is to decide how the application program is mapped onto the multiprocessor. Instead of using abstract program and machine models, a generate-and-test framework known as 'post-game analysis', based on data gathered during program execution, is proposed. Each iteration consists of (1) (a simulation of) an execution of the program; (2) analysis of the gathered data; and (3) the proposal of a new mapping expected to have a smaller execution time. Heuristics are applied to predict execution-time changes in response to small perturbations of the current mapping. An initial experiment was carried out using simple strategies on 'pipeline-like' applications. The results obtained from four simple strategies demonstrated that, for this kind of application, even simple strategies can produce acceptable speed-up within a small number of iterations.
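The generate-and-test loop can be sketched as a hill-climb over task-to-processor mappings: evaluate the current mapping, propose a small perturbation, and keep it only if the predicted execution time shrinks. The cost model below (makespan of a pipeline step) and all parameters are illustrative assumptions, not the paper's simulator:

```python
import random

TASKS = [4, 3, 7, 2, 5, 6, 1, 8]   # per-task work units (illustrative)
N_PROCS = 3

def exec_time(mapping):
    """Makespan of one pipeline step: the most loaded processor bounds it."""
    load = [0] * N_PROCS
    for task, proc in zip(TASKS, mapping):
        load[proc] += task
    return max(load)

def post_game_analysis(iterations=200, seed=1):
    """Each iteration 'executes' (here, evaluates) a mapping, then
    proposes a perturbed mapping and accepts it only on improvement."""
    rng = random.Random(seed)
    mapping = [rng.randrange(N_PROCS) for _ in TASKS]
    best = exec_time(mapping)
    for _ in range(iterations):
        candidate = list(mapping)
        # Small perturbation: move one task to a (possibly new) processor.
        candidate[rng.randrange(len(TASKS))] = rng.randrange(N_PROCS)
        t = exec_time(candidate)
        if t < best:
            mapping, best = candidate, t
    return mapping, best

mapping, best = post_game_analysis()
print(best)
```

The real framework replaces `exec_time` with data gathered from an actual or simulated program execution, which is what distinguishes post-game analysis from purely model-based mapping.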
DOE Office of Scientific and Technical Information (OSTI.GOV)
Uchibori, Akihiro; Kurihara, Akikazu; Ohshima, Hiroyuki
A multiphysics analysis system for sodium-water reaction phenomena in a steam generator of sodium-cooled fast reactors was newly developed. The analysis system consists of the mechanistic numerical analysis codes SERAPHIM, TACT, and RELAP5. The SERAPHIM code calculates the multicomponent multiphase flow and the sodium-water chemical reaction caused by the discharge of pressurized water vapor. The applicability of the SERAPHIM code was confirmed through analyses of an experiment on water vapor discharge into liquid sodium. The TACT code was developed to calculate heat transfer from the reacting jet to the adjacent tube and to predict the occurrence of tube failure. The numerical models integrated into the TACT code were verified against related experiments. The RELAP5 code evaluates the thermal-hydraulic behavior of water inside the tube; its original heat transfer correlations were corrected for a tube rapidly heated by the reacting jet. The developed system enables evaluation of the wastage environment and of the possibility of failure propagation.
Quantitative Schlieren analysis applied to holograms of crystals grown on Spacelab 3
NASA Technical Reports Server (NTRS)
Brooks, Howard L.
1986-01-01
In order to extract additional information about crystals grown in the microgravity environment of Spacelab, a quantitative schlieren analysis technique was developed for use in a Holography Ground System of the Fluid Experiment System. Utilizing the Unidex position controller, it was possible to measure deviation angles produced by refractive index gradients of 0.5 milliradians. Additionally, refractive index gradient maps for any recorded time during the crystal growth were drawn and used to create solute concentration maps for the environment around the crystal. The technique was applied to flight holograms of Cell 204 of the Fluid Experiment System that were recorded during the Spacelab 3 mission on STS 51B. A triglycine sulfate crystal was grown under isothermal conditions in the cell and the data gathered with the quantitative schlieren analysis technique is consistent with a diffusion limited growth process.
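The core reduction in quantitative schlieren work is the small-angle relation ε ≈ (L/n0)·dn/dy, inverted to recover a refractive index gradient from a measured deviation angle. A minimal sketch, with illustrative cell geometry and fluid index rather than the Spacelab 3 values:

```python
# Back out a refractive-index gradient from a measured schlieren
# deviation angle via eps ≈ (L / n0) * dn/dy. Path length and n0 are
# illustrative assumptions, not the Fluid Experiment System geometry.

def index_gradient(eps_rad, path_length_m, n0=1.33):
    """dn/dy (per metre) from deviation angle eps (radians) across an
    optical path of length L through a medium of base index n0."""
    return eps_rad * n0 / path_length_m

# A 0.5 mrad deviation over a 1 cm path, roughly the resolution quoted.
grad = index_gradient(eps_rad=0.5e-3, path_length_m=0.01)
print(grad)
```

Mapping such gradients over the hologram plane, then integrating and applying a concentration-index calibration, yields the solute concentration maps described above.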
ERIC Educational Resources Information Center
Hatun Atas, Amine; Delialioglu, Ömer
2018-01-01
The aim of this study was to explore the opinions, perceptions and evaluations of students about their experiences with a question-answer system used on mobile devices in a lecture-based course. A basic qualitative research method was employed to understand how students made sense of their experiences during the instruction. The…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rechard, Robert P.
This report presents a concise history in tabular form of events leading up to site identification in 1978, site selection in 1987, subsequent characterization, and ongoing analysis through 2008 of the performance of a repository for spent nuclear fuel and high-level radioactive waste at Yucca Mountain in southern Nevada. The tabulated events generally occurred in five periods: (1) commitment to mined geologic disposal and identification of sites; (2) site selection and analysis, based on regional geologic characterization through literature and analogous data; (3) feasibility analysis demonstrating calculation procedures and importance of system components, based on rough measures of performance using surface exploration, waste process knowledge, and general laboratory experiments; (4) suitability analysis demonstrating viability of disposal system, based on environment-specific laboratory experiments, in-situ experiments, and underground disposal system characterization; and (5) compliance analysis, based on completed site-specific characterization. Because the relationship is important to understanding the evolution of the Yucca Mountain Project, the tabulation also shows the interaction between four broad categories of political bodies and government agencies/institutions: (a) technical milestones of the implementing institutions, (b) development of the regulatory requirements and related federal policy in laws and court decisions, (c) Presidential and agency directives and decisions, and (d) critiques of the Yucca Mountain Project and pertinent national and world events related to nuclear energy and radioactive waste.
Bye, Amanda; Aston, Megan
2016-03-01
Children with intellectual disabilities spend more time in the health-care system than mainstream children. Parents have to learn how to navigate the system by coordinating appointments, understanding the referral process, knowing what services are available, and advocating for those services. This places an incredible amount of responsibility on families. This article is one mother's personal story and reflection about her journey through the Canadian health-care system in Nova Scotia, with her daughter who has an intellectual disability. The reflection identifies moments of tension experienced by a mother and how she was expected to be a medical system navigator, doctor-educator, time manager, and care coordinator and the roles that led to feelings of repression, extreme frustration, and fear. A final discussion offers an analysis of her experience, using concepts from feminist post-structuralism. © The Author(s) 2015.
NASA Technical Reports Server (NTRS)
Maggio, Gaspare; Groen, Frank; Hamlin, Teri; Youngblood, Robert
2010-01-01
Accident Precursor Analysis (APA) serves as the bridge between existing risk modeling activities, which are often based on historical or generic failure statistics, and system anomalies, which provide crucial information about the failure mechanisms that are actually operative in the system. APA does more than simply track experience: it systematically evaluates experience, looking for under-appreciated risks that may warrant changes to design or operational practice. This paper presents the pilot application of the NASA APA process to Space Shuttle Orbiter systems. In this effort, the working sessions conducted at Johnson Space Center (JSC) piloted the APA process developed by Information Systems Laboratories (ISL) over the last two years under the auspices of NASA's Office of Safety & Mission Assurance, with the assistance of the Safety & Mission Assurance (S&MA) Shuttle & Exploration Analysis Branch. This process is built around facilitated working sessions involving diverse system experts. One important aspect of this particular APA process is its focus on understanding the physical mechanism responsible for an operational anomaly, followed by evaluation of the risk significance of the observed anomaly as well as consideration of generalizations of the underlying mechanism to other contexts. Model completeness will probably always be an issue, but this process tries to leverage operating experience to the extent possible in order to address completeness issues before a catastrophe occurs.
Performance evaluation of infrared imaging system in field test
NASA Astrophysics Data System (ADS)
Wang, Chensheng; Guo, Xiaodong; Ren, Tingting; Zhang, Zhi-jie
2014-11-01
Infrared imaging systems are applied widely in both military and civilian fields. Because infrared imagers come in many types with differing parameters, system manufacturers and customers need a standard tool or platform for evaluating the performance of IR imaging systems. Since the first-generation IR imagers were developed, the standard assessment method has been the MRTD and its improved variants, which are not well suited to current linear-scanning imagers or 2D staring imagers based on FPA detectors. To address this problem, this paper describes an evaluation method based on the triangular orientation discrimination (TOD) metric, an emerging and effective way to evaluate the overall performance of EO systems. To realize the evaluation in field tests, an experimental instrument was developed, and, given the importance of the operational environment, the field test was carried out in a practical atmospheric environment. The tested imagers include a panoramic imaging system and staring imaging systems with different optics and detector parameters (both cooled and uncooled). After describing the instrument and experiment setup, the experimental results are shown and the target range performance is analyzed and discussed. The data analysis compares the range prediction values obtained from the TOD method, the MRTD method, and the practical experiment, and discusses the results. The experimental results demonstrate the effectiveness of this evaluation tool, which can serve as a platform providing a uniform performance prediction reference.
Skylab materials processing facility experiment developer's report
NASA Technical Reports Server (NTRS)
Parks, P. G.
1975-01-01
The development of the Skylab M512 Materials Processing Facility is traced from the design of a portable, self-contained electron beam welding system for terrestrial applications to the highly complex experiment system ultimately developed for three Skylab missions. The M512 experiment facility was designed to support six in-space experiments intended to explore the advantages of manufacturing materials in the near-zero-gravity environment of Earth orbit. Detailed descriptions of the M512 facility and related experiment hardware are provided, with discussions of hardware verification and man-machine interfaces included. An analysis of the operation of the facility and experiments during the three Skylab missions is presented, including discussions of the hardware performance, anomalies, and data returned to earth.
TSTA Piping and Flame Arrestor Operating Experience Data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cadwallader, Lee C.; Willms, R. Scott
The Tritium Systems Test Assembly (TSTA) was a facility dedicated to tritium handling technology and experimental research at the Los Alamos National Laboratory. The facility operated from 1984 to 2001, running a prototype fusion fuel processing loop with ~100 grams of tritium as well as small experiments. Several operating experience reports have been written on this facility's operation and maintenance experience. This paper describes the analysis of two additional components from TSTA: the small-diameter gas piping that handled small amounts of tritium in a nitrogen carrier gas, and the flame arrestor used in this piping system. The operating experiences and component failure rates for these components are discussed, and comparison data from other applications are also presented.
Analysis of Spatial Autocorrelation for Optimal Observation Network in Korea
NASA Astrophysics Data System (ADS)
Park, S.; Lee, S.; Lee, E.; Park, S. K.
2016-12-01
Many studies aimed at improving the prediction of high-impact weather have been implemented, such as THORPEX (The Observing System Research and Predictability Experiment), FASTEX (Fronts and Atlantic Storm-Track Experiment), NORPEX (North Pacific Experiment), WSR/NOAA (Winter Storm Reconnaissance), and DOTSTAR (Dropwindsonde Observations for Typhoon Surveillance near the TAiwan Region). One of the most important objectives of these studies is to determine the effects of observations on forecasts and to establish an optimal observation network. However, such studies are lacking for Korea, even though the Korean Peninsula exhibits highly complex terrain that makes its weather phenomena difficult to predict. Building an optimal future observation network is necessary to increase the utilization of numerical weather prediction and to improve the monitoring, tracking, and prediction of high-impact weather in Korea. We therefore perform a preliminary study to understand the spatial scale appropriate for an expansion of the observation system through Spatial Autocorrelation (SAC) analysis. In addition, we will develop a testbed system for designing an optimal observation network. The analysis uses Automatic Weather System (AWS) rainfall data, global upper-air gridded observations (i.e., temperature, pressure, humidity), and Himawari satellite data (i.e., water vapor) over Korea during 2013-2015. This study will provide a guideline for constructing an observation network that improves weather prediction skill cost-effectively.
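Spatial autocorrelation analysis of a station network typically starts from Moran's I, which measures whether nearby stations record similar values. A minimal sketch with synthetic values and binary contiguity weights (not AWS rainfall data):

```python
# Moran's I for a small station network. Values and weights below are
# synthetic illustrations, not observational data.

def morans_i(values, weights):
    """Moran's I: values is a list of n observations, weights an n x n
    spatial-weight matrix with zero diagonal."""
    n = len(values)
    mean = sum(values) / n
    dev = [v - mean for v in values]
    w_sum = sum(sum(row) for row in weights)
    num = sum(weights[i][j] * dev[i] * dev[j]
              for i in range(n) for j in range(n))
    den = sum(d * d for d in dev)
    return (n / w_sum) * (num / den)

# Four stations on a line, binary contiguity weights (neighbours only).
vals = [1.0, 2.0, 3.0, 4.0]          # smoothly varying field
w = [[0, 1, 0, 0],
     [1, 0, 1, 0],
     [0, 1, 0, 1],
     [0, 0, 1, 0]]
print(morans_i(vals, w))             # positive: neighbours are similar
```

The decay of such a statistic with station separation indicates the spatial scale beyond which an additional observation adds independent information, which is the quantity an optimal-network design needs.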
Development progress of the Materials Analysis and Particle Probe
NASA Astrophysics Data System (ADS)
Lucia, M.; Kaita, R.; Majeski, R.; Bedoya, F.; Allain, J. P.; Boyle, D. P.; Schmitt, J. C.; Onge, D. A. St.
2014-11-01
The Materials Analysis and Particle Probe (MAPP) is a compact in vacuo surface science diagnostic, designed to provide in situ surface characterization of plasma facing components in a tokamak environment. MAPP has been implemented for operation on the Lithium Tokamak Experiment at Princeton Plasma Physics Laboratory (PPPL), where all control and analysis systems are currently under development for full remote operation. Control systems include vacuum management, instrument power, and translational/rotational probe drive. Analysis systems include onboard Langmuir probes and all components required for x-ray photoelectron spectroscopy, low-energy ion scattering spectroscopy, direct recoil spectroscopy, and thermal desorption spectroscopy surface analysis techniques.
NASA Astrophysics Data System (ADS)
Nolan, G.; Pinardi, N.; Vukicevic, T.; Le Traon, P. Y.; Fernandez, V.
2016-02-01
Ocean observations are critical to providing accurate ocean forecasts that support operational decision making in European open and coastal seas. Observations are available in many forms: fixed platforms (e.g., moored buoys and tide gauges), underway measurements from FerryBox systems, high-frequency (HF) radars, and, more recently, underwater gliders and profiling floats. Observing System Simulation Experiments have been conducted to examine the relative contribution of each type of platform to our ability to accurately forecast the future state of the ocean, with HF radar and gliders showing particular promise in improving model skill. There is considerable demand for ecosystem products and services from today's ocean observing system, yet biogeochemical observations remain relatively sparse, particularly in coastal and shelf seas. The techniques used to assess the fitness for purpose of, and gaps in, the ocean observing system need to be widened. In addition to Observing System Simulation Experiments that quantify the effect of observations on overall model skill, we present a gap analysis based on (1) examining where high model skill is required, using a marine spatial planning analysis of European seas (i.e., where does activity take place that requires more accurate forecasts?), and (2) assessing gaps in the capacity of the observing system to answer key societal challenges, e.g., site suitability for aquaculture and ocean energy, oil spill response, and contextual oceanographic products for fisheries and ecosystems. This broad-based analysis will inform the development of the proposed European Ocean Observing System as a contribution to the Global Ocean Observing System (GOOS).
Stratospheric General Circulation with Chemistry Model (SGCCM)
NASA Technical Reports Server (NTRS)
Rood, Richard B.; Douglass, Anne R.; Geller, Marvin A.; Kaye, Jack A.; Nielsen, J. Eric; Rosenfield, Joan E.; Stolarski, Richard S.
1990-01-01
In the past two years constituent transport and chemistry experiments have been performed using both simple single constituent models and more complex reservoir species models. Winds for these experiments have been taken from the data assimilation effort, Stratospheric Data Analysis System (STRATAN).
Test bed experiments for various telerobotic system characteristics and configurations
NASA Technical Reports Server (NTRS)
Duffie, Neil A.; Wiker, Steven F.; Zik, John J.
1990-01-01
Dexterous manipulation and grasping in telerobotic systems depends on the integration of high-performance sensors, displays, actuators and controls into systems in which careful consideration has been given to human perception and tolerance. Research underway at the Wisconsin Center for Space Automation and Robotics (WCSAR) has the objective of enhancing the performance of these systems and their components, and quantifying the effects of the many electrical, mechanical, control, and human factors that affect their performance. This will lead to a fundamental understanding of performance issues which will in turn allow designers to evaluate sensor, actuator, display, and control technologies with respect to generic measures of dexterous performance. As part of this effort, an experimental test bed was developed which has telerobotic components with exceptionally high fidelity in master/slave operation. A Telerobotic Performance Analysis System has also been developed which allows performance to be determined for various system configurations and electro-mechanical characteristics. Both this performance analysis system and test bed experiments are described.
Tabletop Molecular Communication: Text Messages through Chemical Signals
Farsad, Nariman; Guo, Weisi; Eckford, Andrew W.
2013-01-01
In this work, we describe the first modular and programmable platform capable of transmitting a text message using chemical signalling, a method also known as molecular communication. This form of communication is attractive for applications where conventional wireless systems perform poorly, from nanotechnology to urban health monitoring. Using examples, we demonstrate the use of our platform as a testbed for molecular communication, and illustrate the features of these communication systems using experiments. By providing a simple and inexpensive means of performing experiments, our system fills an important gap in the molecular communication literature, where much current work is done in simulation with simplified system models. A key finding in this paper is that these systems are often nonlinear in practice, whereas current simulations and analysis often assume that the system is linear. However, as we show in this work, despite the nonlinearity, reliable communication is still possible. Furthermore, this work motivates future studies on more realistic modelling, analysis, and design of theoretical models and algorithms for these systems. PMID:24367571
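A text-to-chemical pipeline of this kind can be illustrated with on-off keying: each bit occupies a time slot, a '1' releases chemical while a '0' stays silent, and a threshold detector recovers the bits. The sketch below idealizes the channel as noiseless and linear, which is precisely the simplification the paper warns real hardware violates:

```python
# Illustrative on-off-keyed text transmission over an idealized
# chemical channel. The channel and detector models are assumptions
# for demonstration, not the platform's measured behaviour.

def text_to_bits(msg):
    return [int(b) for ch in msg.encode() for b in format(ch, "08b")]

def bits_to_text(bits):
    chars = [int("".join(map(str, bits[i:i + 8])), 2)
             for i in range(0, len(bits), 8)]
    return bytes(chars).decode()

def channel(bits, spray=1.0, baseline=0.0):
    # Received concentration per slot: spray amplitude for a '1',
    # baseline for a '0'. Real channels add memory and nonlinearity.
    return [spray * b + baseline for b in bits]

def detect(samples, threshold=0.5):
    return [1 if s > threshold else 0 for s in samples]

msg = "HELLO"
received = bits_to_text(detect(channel(text_to_bits(msg))))
print(received)                       # → HELLO
```

Replacing `channel` with a model that carries residual chemical between slots is the simplest way to reproduce the intersymbol interference and nonlinearity the paper reports.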
Role of the ATLAS Grid Information System (AGIS) in Distributed Data Analysis and Simulation
NASA Astrophysics Data System (ADS)
Anisenkov, A. V.
2018-03-01
In modern high-energy physics experiments, particular attention is paid to the global integration of information and computing resources into a unified system for efficient storage and processing of experimental data. Annually, the ATLAS experiment at the Large Hadron Collider at the European Organization for Nuclear Research (CERN) produces tens of petabytes of raw data from the recording electronics and several petabytes of data from the simulation system. For processing and storing such extremely large volumes of data, the computing model of the ATLAS experiment is based on a heterogeneous, geographically distributed computing environment, which includes the worldwide LHC computing grid (WLCG) infrastructure and is able to meet the experiment's requirements for processing huge data sets while providing a high degree of accessibility (hundreds of petabytes). The paper considers the ATLAS grid information system (AGIS), used by the ATLAS collaboration to describe the topology and resources of the computing infrastructure; to configure and connect the high-level software systems of computer centers; and to describe and store the parameters, control, configuration, and other auxiliary information required for the effective operation of ATLAS distributed computing applications and services. The role of the AGIS system in unifying the description of the computing resources provided by grid sites, supercomputer centers, and cloud computing into a consistent information model for the ATLAS experiment is outlined. This approach has allowed the collaboration to extend the computing capabilities of the WLCG project and integrate supercomputers and cloud computing platforms into the software components of the production and distributed analysis workload management system (PanDA).
Review of stochastic hybrid systems with applications in biological systems modeling and analysis.
Li, Xiangfang; Omotere, Oluwaseyi; Qian, Lijun; Dougherty, Edward R
2017-12-01
Stochastic hybrid systems (SHS) have attracted much research interest in recent years. In this paper, we review some recent applications of SHS to biological systems modeling and analysis. Due to the nature of molecular interactions, many biological processes can be conveniently described as a mixture of continuous and discrete phenomena using SHS models. With the advancement of SHS theory, it is expected that insights can be obtained into biological processes such as drug effects on gene regulation. Furthermore, combined with advanced experimental methods, in silico simulations using SHS modeling techniques can be carried out for massive and rapid verification or falsification of biological hypotheses. The hope is to substitute for costly and time-consuming in vitro or in vivo experiments, or to provide guidance for those experiments and generate better hypotheses.
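A minimal SHS example in the gene-regulation spirit is a gene that switches on and off at random (the discrete dynamics) while the protein level follows a different ODE in each mode (the continuous dynamics). The rates and parameters below are illustrative, not drawn from a specific published model:

```python
import random

def simulate(t_end=10.0, dt=0.01, k_on=0.5, k_off=0.5,
             production=2.0, decay=0.3, seed=7):
    """Toy stochastic hybrid system: a two-state gene (discrete) driving
    protein level x (continuous), integrated with forward Euler."""
    rng = random.Random(seed)
    mode, x, traj = 0, 0.0, []
    for _ in range(int(t_end / dt)):
        # Discrete transition: the mode flips with probability rate*dt.
        rate = k_on if mode == 0 else k_off
        if rng.random() < rate * dt:
            mode = 1 - mode
        # Continuous flow: production only while the gene is ON,
        # first-order decay always.
        x += (production * mode - decay * x) * dt
        traj.append(x)
    return traj

traj = simulate()
print(max(traj))
```

Even this toy model exhibits the characteristic SHS behaviour: the protein trajectory is piecewise smooth, with kinks at random mode-switching times, and its stationary statistics depend jointly on the switching rates and the continuous dynamics.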
Model-OA wind turbine generator - Failure modes and effects analysis
NASA Technical Reports Server (NTRS)
Klein, William E.; Lali, Vincent R.
1990-01-01
The results of a failure modes and effects analysis (FMEA) conducted for wind turbine generators are presented. The FMEA was performed for the functional modes of each system, subsystem, or component. Single-point failures were eliminated for most of the systems; the blade system was the only exception. The qualitative probability of a blade separating was estimated at level D (remote). Many changes were made to the hardware as a result of this analysis, the most significant being the addition of the safety system. Operational experience and the need to improve machine availability have resulted in subsequent changes to the various systems, which are also reflected in this FMEA.
LDEF electronic systems: Successes, failures, and lessons
NASA Technical Reports Server (NTRS)
Miller, Emmett; Porter, Dave; Smith, Dave; Brooks, Larry; Levorsen, Joe; Mulkey, Owen
1991-01-01
Following the Long Duration Exposure Facility (LDEF) retrieval, the Systems Special Investigation Group (SIG) participated in an extensive series of tests of various electronic systems, including the NASA-provided data and initiate systems and some experiment systems. Overall, these were found to have performed remarkably well, even though most were designed and tested under limited budgets and used at least some non-space-qualified components. However, several anomalies were observed, including a few that resulted in some loss of data. The postflight test program objectives, observations, and lessons learned from these examinations are discussed. Not all analyses are yet complete, but observations to date are summarized, including the Boeing experiment component studies and failure analysis results related to the Interstellar Gas Experiment. Based upon these observations, suggestions for avoiding similar problems on future programs are presented.
Search for the lepton-family-number nonconserving decay μ+ → e+γ
NASA Astrophysics Data System (ADS)
Ahmed, M.; Amann, J. F.; Barlow, D.; Black, K.; Bolton, R. D.; Brooks, M. L.; Carius, S.; Chen, Y. K.; Chernyshev, A.; Concannon, H. M.; Cooper, M. D.; Cooper, P. S.; Crocker, J.; Dittmann, J. R.; Dzemidzic, M.; Empl, A.; Fisk, R. J.; Fleet, E.; Foreman, W.; Gagliardi, C. A.; Haim, D.; Hallin, A.; Hoffman, C. M.; Hogan, G. E.; Hughes, E. B.; Hungerford, E. V.; Jui, C. C.; Kim, G. J.; Knott, J. E.; Koetke, D. D.; Kozlowski, T.; Kroupa, M. A.; Kunselman, A. R.; Lan, K. A.; Laptev, V.; Lee, D.; Liu, F.; Manweiler, R. W.; Marshall, R.; Mayes, B. W.; Mischke, R. E.; Nefkens, B. M.; Nickerson, L. M.; Nord, P. M.; Oothoudt, M. A.; Otis, J. N.; Phelps, R.; Piilonen, L. E.; Pillai, C.; Pinsky, L.; Ritter, M. W.; Smith, C.; Stanislaus, T. D.; Stantz, K. M.; Szymanski, J. J.; Tang, L.; Tippens, W. B.; Tribble, R. E.; Tu, X. L.; van Ausdeln, L. A.; von Witch, W. H.; Whitehouse, D.; Wilkinson, C.; Wright, B.; Wright, S. C.; Zhang, Y.; Ziock, K. O.
2002-06-01
The MEGA experiment, which searched for the muon- and electron-number violating decay μ+ → e+γ, is described. The spectrometer system, the calibrations, the data taking procedures, the data analysis, and the sensitivity of the experiment are discussed. The most stringent upper limit on the branching ratio, B(μ+ → e+γ) < 1.2×10⁻¹¹ with 90% confidence, is derived from a likelihood analysis.
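The counting-experiment logic behind such a limit can be sketched as follows: with zero candidate events, the classical 90% CL Poisson upper limit on the signal mean is -ln(0.10) ≈ 2.30 events, which, divided by the single-event sensitivity, yields a branching-ratio limit. The normalization below is illustrative; MEGA's published limit comes from a full likelihood analysis, not this simple counting argument:

```python
import math

def poisson_upper_limit(n_observed, cl=0.90):
    """Classical upper limit on a Poisson mean given n observed events:
    the mean mu at which P(k <= n_observed; mu) = 1 - cl."""
    if n_observed == 0:
        return -math.log(1.0 - cl)
    def cdf(mu):
        return sum(math.exp(-mu) * mu**k / math.factorial(k)
                   for k in range(n_observed + 1))
    lo, hi = 0.0, 100.0
    for _ in range(200):                 # bisection on the decreasing CDF
        mid = (lo + hi) / 2
        lo, hi = (mid, hi) if cdf(mid) > 1.0 - cl else (lo, mid)
    return (lo + hi) / 2

n_muons = 1.0e12                         # illustrative normalization only
limit = poisson_upper_limit(0) / n_muons
print(limit)                             # ≈ 2.3e-12
```

The same `poisson_upper_limit` evaluated at one or two observed events (≈ 3.89 and ≈ 5.32) shows how quickly the limit degrades when candidate events survive the selection.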
Post flight system analysis of FRECOPA (AO 138)
NASA Technical Reports Server (NTRS)
Durin, Christian
1991-01-01
The unexpected duration of the flight of the Long Duration Exposure Facility (LDEF) led CNES to create a special investigation group to analyze all the materials and systems composing the French Cooperative Payload (FRECOPA), except the experiments specifically prepared for the flight. The FRECOPA tray was on the trailing face (V-) of the LDEF and was protected from the atomic oxygen flux throughout the flight. The solar irradiation, however, was significant, with the solar flux nearly perpendicular to the experiment once per orbit, and the vacuum environment was good. The objectives were to test the effects of the combined space environment on materials and components such as the structure, thermal control coatings and blankets, the electronic unit, motors, and mechanical fixtures. When the LDEF returned to Kennedy Space Center, a visual inspection showed the very good condition of the materials used, and all three mechanisms for opening and closing the experiment canisters were found to have worked fully. Many impacts of micrometeoroids or space debris on the structure and on the thermal protections were observed. After FRECOPA was brought back to Toulouse, many tests were performed, including working-order tests, mechanical (tension) tests, optical and electron microscopy (SEM), surface analysis (ESCA, SIMS, RBS, Auger, etc.), thermal analysis, pressure measurements, and gas analysis (outgassing tests). The results of these tests are discussed.
ISE structural dynamic experiments
NASA Technical Reports Server (NTRS)
Lock, Malcolm H.; Clark, S. Y.
1988-01-01
The topics are presented in viewgraph form and include the following: directed energy systems - vibration issue; Neutral Particle Beam Integrated Space Experiment (NPB-ISE) opportunity/study objective; vibration sources/study plan; NPB-ISE spacecraft configuration; baseline slew analysis and results; modal contributions; fundamental pitch mode; vibration reduction approaches; peak residual vibration; NPB-ISE spacecraft slew experiment; goodbye ISE - hello Zenith Star Program.
NASA Astrophysics Data System (ADS)
Bonacorsi, D.; Gutsche, O.
The Worldwide LHC Computing Grid (WLCG) project decided in March 2009 to perform scale tests of parts of its overall Grid infrastructure before the start of LHC data taking. The "Scale Test for the Experiment Program" (STEP'09) was performed mainly in June 2009, with further selected tests in September and October 2009, and emphasized the simultaneous testing of the computing systems of all four LHC experiments. CMS tested its Tier-0 tape-writing and processing capabilities. The Tier-1 tape systems were stress-tested using the complete range of Tier-1 workflows: transfer from Tier-0 and custody of data on tape, processing and subsequent archival, redistribution of datasets amongst all Tier-1 sites, as well as burst transfers of datasets to Tier-2 sites. The Tier-2 analysis capacity was tested using bulk analysis job submissions to backfill normal user activity. In this talk, we report on the different tests performed and present their post-mortem analysis.
Multivariate Analysis of Schools and Educational Policy.
ERIC Educational Resources Information Center
Kiesling, Herbert J.
This report describes a multivariate analysis technique that approaches the problems of educational production function analysis by (1) using comparable measures of output across large experiments, (2) accounting systematically for differences in socioeconomic background, and (3) treating the school as a complete system in which different…
Communication Challenges in Requirements Definition: A Classroom Simulation
ERIC Educational Resources Information Center
Ramiller, Neil C.; Wagner, Erica L.
2011-01-01
Systems analysis and design is a standard course offering within information systems programs and often an important lecture topic in Information Systems core courses. Given the persistent difficulty that organizations experience in implementing systems that meet their requirements, it is important to help students in these courses get a tangible…
Technology for large space systems: A special bibliography with indexes (supplement 03)
NASA Technical Reports Server (NTRS)
1980-01-01
A bibliography containing 217 abstracts addressing the technology for large space systems is presented. State of the art and advanced concepts concerning interactive analysis and design, structural concepts, control systems, electronics, advanced materials, assembly concepts, propulsion, solar power satellite systems, and flight experiments are represented.
Causal inference in nonlinear systems: Granger causality versus time-delayed mutual information
NASA Astrophysics Data System (ADS)
Li, Songting; Xiao, Yanyang; Zhou, Douglas; Cai, David
2018-05-01
The Granger causality (GC) analysis has been extensively applied to infer causal interactions in dynamical systems arising from economy and finance, physics, bioinformatics, neuroscience, social science, and many other fields. In the presence of potential nonlinearity in these systems, the validity of the GC analysis in general is questionable. To illustrate this, here we first construct minimal nonlinear systems and show that the GC analysis fails to infer causal relations in these systems—it gives rise to all types of incorrect causal directions. In contrast, we show that the time-delayed mutual information (TDMI) analysis is able to successfully identify the direction of interactions underlying these nonlinear systems. We then apply both methods to neuroscience data collected from experiments and demonstrate that the TDMI analysis but not the GC analysis can identify the direction of interactions among neuronal signals. Our work exemplifies inference hazards in the GC analysis in nonlinear systems and suggests that the TDMI analysis can be an appropriate tool in such a case.
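The hazard described above can be reproduced in a few lines. In the sketch below, the signal model, sample size, and histogram estimator are illustrative choices, not those of the paper: x drives y through a purely quadratic, delayed coupling, so the linear correlation on which a GC-style statistic rests is essentially zero, while a histogram TDMI estimate clearly identifies the x→y direction.

```python
import numpy as np

def mutual_information(a, b, bins=16):
    """Histogram estimate of the mutual information (nats) between two signals."""
    joint, _, _ = np.histogram2d(a, b, bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)   # marginal of a
    py = pxy.sum(axis=0, keepdims=True)   # marginal of b
    nz = pxy > 0
    return float((pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])).sum())

rng = np.random.default_rng(0)
n = 20000
x = rng.normal(size=n)                               # driver: i.i.d. Gaussian
y = np.empty(n)
y[0] = 0.0
y[1:] = x[:-1]**2 + 0.1 * rng.normal(size=n - 1)     # quadratic, delayed coupling

tdmi_xy = mutual_information(x[:-1], y[1:])          # true direction: x_t -> y_{t+1}
tdmi_yx = mutual_information(y[:-1], x[1:])          # reverse direction
lin_corr = abs(np.corrcoef(x[:-1], y[1:])[0, 1])     # what a linear test "sees"
```

Because E[x·x²] = 0 for a symmetric driver, any method built on linear prediction residuals is blind to this coupling, whereas mutual information is not.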
NASA Technical Reports Server (NTRS)
Drake, R. L.; Duvoisin, P. F.; Asthana, A.; Mather, T. W.
1971-01-01
High-speed automated identification and design of dynamic systems, both linear and nonlinear, are discussed. Special emphasis is placed on developing hardware and techniques which are applicable to practical problems. The basic modeling experiment and new results are described. Using the improvements developed, successful identification of several systems, including a physical example as well as simulated systems, was obtained. The advantages of parameter signature analysis over signal signature analysis in go/no-go testing of operational systems were demonstrated. The feasibility of using these ideas for failure-mode prediction in operating systems was also investigated. An improved digitally controlled nonlinear function generator was developed, debugged, and completely documented.
Brockhauser, Sandor; Svensson, Olof; Bowler, Matthew W; Nanao, Max; Gordon, Elspeth; Leal, Ricardo M F; Popov, Alexander; Gerring, Matthew; McCarthy, Andrew A; Gotz, Andy
2012-08-01
The automation of beam delivery, sample handling and data analysis, together with increasing photon flux, diminishing focal spot size and the appearance of fast-readout detectors on synchrotron beamlines, have changed the way that many macromolecular crystallography experiments are planned and executed. Screening for the best diffracting crystal, or even the best diffracting part of a selected crystal, has been enabled by the development of microfocus beams, precise goniometers and fast-readout detectors that all require rapid feedback from the initial processing of images in order to be effective. All of these advances require the coupling of data feedback to the experimental control system and depend on immediate online data-analysis results during the experiment. To facilitate this, a Data Analysis WorkBench (DAWB) for the flexible creation of complex automated protocols has been developed. Here, example workflows designed and implemented using DAWB are presented for enhanced multi-step crystal characterizations, experiments involving crystal reorientation with kappa goniometers, crystal-burning experiments for empirically determining the radiation sensitivity of a crystal system and the application of mesh scans to find the best location of a crystal to obtain the highest diffraction quality. Beamline users interact with the prepared workflows through a specific brick within the beamline-control GUI MXCuBE.
NASA Astrophysics Data System (ADS)
Drusch, M.
2007-02-01
Satellite-derived surface soil moisture data sets are readily available and have been used successfully in hydrological applications. In many operational numerical weather prediction systems the initial soil moisture conditions are analyzed from the modeled background and 2 m temperature and relative humidity. This approach has proven effective in improving surface latent and sensible heat fluxes and consequently the forecast over large geographical domains. However, since soil moisture is not always related to screen-level variables, model errors and uncertainties in the forcing data can accumulate in root zone soil moisture. Remotely sensed surface soil moisture is directly linked to the model's uppermost soil layer and is therefore a stronger constraint for the soil moisture analysis. For this study, three data assimilation experiments with the Integrated Forecast System (IFS) of the European Centre for Medium-Range Weather Forecasts (ECMWF) were performed for the 2-month period of June and July 2002: a control run based on the operational soil moisture analysis, an open loop run with freely evolving soil moisture, and an experimental run incorporating TMI (TRMM Microwave Imager) derived soil moisture over the southern United States. In this experimental run the satellite-derived soil moisture product is introduced through a nudging scheme using 6-hourly increments. Apart from the soil moisture analysis, the system setup reflects the operational forecast configuration including the atmospheric 4D-Var analysis. Soil moisture analyzed in the nudging experiment is the most accurate estimate when compared against in situ observations from the Oklahoma Mesonet. The corresponding forecast for 2 m temperature and relative humidity is almost as accurate as in the control experiment. Furthermore, it is shown that the soil moisture analysis influences local weather parameters including the planetary boundary layer height and cloud coverage.
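The nudging idea described above can be illustrated schematically: at each 6-hourly analysis time the model soil moisture is relaxed toward the satellite retrieval by a fraction of the innovation. The drift, bias, and gain values below are invented for illustration and are not the operational IFS formulation.

```python
# Schematic 6-hourly nudging of a single-layer soil moisture state toward a
# satellite retrieval (all numbers are invented for illustration).
truth = 0.30            # "true" volumetric soil moisture (m3/m3)
sm_open = 0.30          # freely evolving (open loop) state
sm_nudged = 0.30        # state receiving nudging increments
drift = -0.002          # hypothetical model dry bias per 6 h cycle
gain = 0.5              # fraction of the innovation applied each cycle

for _ in range(40):     # 40 six-hour cycles = 10 days
    sm_open += drift
    sm_nudged += drift
    obs = truth + 0.01                     # retrieval with a small wet bias
    sm_nudged += gain * (obs - sm_nudged)  # nudging increment

err_open = abs(sm_open - truth)
err_nudged = abs(sm_nudged - truth)
```

The open-loop state accumulates the model bias, while the nudged state stays within the (biased) observation's error of the truth, mirroring the Mesonet comparison described above.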
Small expendable deployer system measurement analysis
NASA Technical Reports Server (NTRS)
Carrington, Connie K.
1988-01-01
The first on-orbit experiment of the Small Expendable Deployer System (SEDS) for tethered satellites will collect telemetry data for tether length, rate of deployment, and tether tension. The post-flight analysis will use this data to reconstruct the deployment history and determine dynamic characteristics such as tether shape and payload position. Linearized observability analysis has determined that these measurements are adequate to define states for a two-mass tether model, and two state estimators were written.
Uncertainty Quantification Techniques of SCALE/TSUNAMI
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rearden, Bradley T; Mueller, Don
2011-01-01
The Standardized Computer Analysis for Licensing Evaluation (SCALE) code system developed at Oak Ridge National Laboratory (ORNL) includes Tools for Sensitivity and Uncertainty Analysis Methodology Implementation (TSUNAMI). The TSUNAMI code suite can quantify the predicted change in system responses, such as k_eff, reactivity differences, or ratios of fluxes or reaction rates, due to changes in the energy-dependent, nuclide-reaction-specific cross-section data. Where uncertainties in the neutron cross-section data are available, the sensitivity of the system to the cross-section data can be applied to propagate the uncertainties in the cross-section data to an uncertainty in the system response. Uncertainty quantification is useful for identifying potential sources of computational biases and highlighting parameters important to code validation. Traditional validation techniques often examine one or more average physical parameters to characterize a system and identify applicable benchmark experiments. However, with TSUNAMI, correlation coefficients are developed by propagating the uncertainties in neutron cross-section data to uncertainties in the computed responses for experiments and safety applications through sensitivity coefficients. The bias in the experiments, as a function of their correlation coefficient with the intended application, is extrapolated to predict the bias and bias uncertainty in the application through trending analysis or generalized linear least squares techniques, often referred to as 'data adjustment.' Even with advanced tools to identify benchmark experiments, analysts occasionally find that the application models include some feature or material for which adequately similar benchmark experiments do not exist to support validation. For example, a criticality safety analyst may want to take credit for the presence of fission products in spent nuclear fuel.
In such cases, analysts sometimes rely on 'expert judgment' to select an additional administrative margin to account for the gap in the validation data or to conclude that the impact on the calculated bias and bias uncertainty is negligible. As a result of advances in computer programs and the evolution of cross-section covariance data, analysts can use the sensitivity and uncertainty analysis tools in the TSUNAMI codes to estimate the potential impact on the application-specific bias and bias uncertainty resulting from nuclides not represented in available benchmark experiments. This paper presents the application of methods described in a companion paper.
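The propagation step described above (applying the sensitivity of the system to the cross-section data to turn data uncertainties into a response uncertainty) is, at its core, the "sandwich rule" var(R) = S·C·Sᵀ in relative units. A minimal numeric sketch follows; the sensitivities and covariances are invented for illustration, not actual nuclear data.

```python
import numpy as np

# Sandwich rule: relative variance of a response R (e.g. k_eff) from a
# sensitivity vector S (percent change in R per percent change in each
# cross section) and a relative covariance matrix C of the nuclear data.
# All numbers here are invented for illustration.
S = np.array([0.35, -0.12, 0.08])          # 3 nuclide-reaction sensitivities
C = np.array([[4.0e-4, 1.0e-4, 0.0],
              [1.0e-4, 9.0e-4, 0.0],
              [0.0,    0.0,    2.5e-3]])   # relative covariance data
rel_var = S @ C @ S
unc_pct = 100.0 * np.sqrt(rel_var)         # 1-sigma response uncertainty in %
```

With these numbers the data-induced uncertainty in the response is a little under 1%; in practice S comes from an adjoint or perturbation calculation and C from an evaluated covariance library.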
Development and Integration of Control System Models
NASA Technical Reports Server (NTRS)
Kim, Young K.
1998-01-01
The computer simulation tool TREETOPS has been upgraded and used at NASA/MSFC to model various complicated mechanical systems and to perform their dynamics and control analysis with pointing control systems. A TREETOPS model of the Advanced X-ray Astrophysics Facility - Imaging (AXAF-I) dynamics and control system was developed to evaluate the AXAF-I pointing performance for the Normal Pointing Mode. An optical model of the Shooting Star Experiment (SSE) was also developed, and its optical performance analysis was done using the MACOS software.
Expert system for web based collaborative CAE
NASA Astrophysics Data System (ADS)
Hou, Liang; Lin, Zusheng
2006-11-01
An expert system for web-based collaborative CAE was developed based on knowledge engineering, a relational database, and commercial FEA (finite element analysis) software. The architecture of the system is illustrated. In this system, the experts' experiences, theories, typical examples, and other related knowledge, which are used in the pre-processing stage of FEA, were categorized into analysis-process and object knowledge. Then, the integrated knowledge model based on object-oriented and rule-based methods is described. The integrated reasoning process based on CBR (case-based reasoning) and rule-based reasoning is presented. Finally, the analysis process of this expert system in a web-based CAE application is illustrated, and an analysis example of a machine tool's column is presented to demonstrate the validity of the system.
Analysis and Comparison of Some Automatic Vehicle Monitoring Systems
DOT National Transportation Integrated Search
1973-07-01
In 1970 UMTA solicited proposals and selected four companies to develop systems to demonstrate the feasibility of different automatic vehicle monitoring techniques. The demonstrations culminated in experiments in Philadelphia to assess the performanc...
An Integrated Analysis of the Physiological Effects of Space Flight: Executive Summary
NASA Technical Reports Server (NTRS)
Leonard, J. I.
1985-01-01
A large array of models were applied in a unified manner to solve problems in space flight physiology. Mathematical simulation was used as an alternative way of looking at physiological systems and maximizing the yield from previous space flight experiments. A medical data analysis system was created which consists of an automated data base, a computerized biostatistical and data analysis system, and a set of simulation models of physiological systems. Five basic models were employed: (1) a pulsatile cardiovascular model; (2) a respiratory model; (3) a thermoregulatory model; (4) a circulatory, fluid, and electrolyte balance model; and (5) an erythropoiesis regulatory model. Algorithms were provided to perform routine statistical tests, multivariate analysis, nonlinear regression analysis, and autocorrelation analysis. Special purpose programs were prepared for rank correlation, factor analysis, and the integration of the metabolic balance data.
Handbook of experiences in the design and installation of solar heating and cooling systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ward, D.S.; Oberoi, H.S.
1980-07-01
A large array of problems encountered are detailed, including design errors, installation mistakes, cases of inadequate durability of materials and unacceptable reliability of components, and wide variations in the performance and operation of different solar systems. Durability, reliability, and design problems are reviewed for solar collector subsystems, heat transfer fluids, thermal storage, passive solar components, piping/ducting, and reliability/operational problems. The following performance topics are covered: criteria for design and performance analysis, domestic hot water systems, passive space heating systems, active space heating systems, space cooling systems, analysis of systems performance, and performance evaluations. (MHR)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stevens, Garrison N.; Atamturktur, Sez; Brown, D. Andrew
2018-04-16
Rapid advancements in parallel computing over the last two decades have enabled simulations of complex, coupled systems through partitioning. In partitioned analysis, independently developed constituent models communicate, representing dependencies between multiple physical phenomena that occur in the full system. Figure 1 schematically demonstrates a coupled system with two constituent models, each resolving different physical behavior. In this figure, the constituent model denoted as the "consumer" relies upon some input parameter that is provided by the constituent model acting as a "feeder". The role of the feeder model is to map operating conditions (i.e., those that are stimulating the process) to consumer inputs, thus providing functional inputs to the consumer model*. Problems arise if the feeder model cannot be built, a challenge that is prevalent for highly complex systems in extreme operational conditions that push the limits of our understanding of the underlying physical behavior. Often, these are also the situations where separate-effect experiments isolating the physical phenomena are not available, meaning that experimentally determining the unknown constituent behavior is not possible (Bauer and Holland, 1995; Unal et al., 2013), and that integral-effect experiments that reflect the behavior of the complete system tend to be the only available observations. In this paper, the authors advocate for the usefulness of integral-effect experiments in furthering a model developer's knowledge of the physics principles governing the system behavior of interest.
Solution to the indexing problem of frequency domain simulation experiments
NASA Technical Reports Server (NTRS)
Mitra, Mousumi; Park, Stephen K.
1991-01-01
A frequency domain simulation experiment is one in which selected system parameters are oscillated sinusoidally to induce oscillations in one or more system statistics of interest. A spectral (Fourier) analysis of these induced oscillations is then performed. To perform this spectral analysis, all oscillation frequencies must be referenced to a common, independent variable - an oscillation index. In a discrete-event simulation, the global simulation clock is the most natural choice for the oscillation index. However, past efforts to reference all frequencies to the simulation clock generally yielded unsatisfactory results. The reason for these unsatisfactory results is explained in this paper and a new methodology which uses the simulation clock as the oscillation index is presented. Techniques for implementing this new methodology are demonstrated by performing a frequency domain simulation experiment for a network of queues.
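The methodology can be sketched deterministically: oscillate a parameter sinusoidally against the simulation clock, sample an output statistic at uniform clock times, and recover the induced frequencies by Fourier analysis referenced to that same clock. The quadratic response below is an illustrative stand-in for a simulated queue statistic, not the paper's network-of-queues experiment.

```python
import numpy as np

fs, T = 64.0, 32.0                 # samples per clock unit, run length
t = np.arange(0.0, T, 1.0 / fs)    # the oscillation index: the simulation clock
f_drive = 2.0                      # driving frequency (cycles per clock unit)
param = 1.0 + 0.2 * np.sin(2 * np.pi * f_drive * t)  # oscillated parameter
y = param**2                       # nonlinear response puts power at f and 2f

Y = np.abs(np.fft.rfft(y - y.mean()))
freqs = np.fft.rfftfreq(len(t), 1.0 / fs)
peak = freqs[np.argmax(Y)]         # dominant induced frequency
```

Because every frequency is referenced to the common clock, the spectral line lands exactly on the driving frequency; the weaker second harmonic at 2f reveals the nonlinearity of the response.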
An opportunity analysis system for space surveillance experiments with the MSX
NASA Technical Reports Server (NTRS)
Sridharan, Ramaswamy; Duff, Gary; Hayes, Tony; Wiseman, Andy
1994-01-01
The Midcourse Space Experiment (MSX) consists of a set of payloads on a satellite being designed and built under the sponsorship of the Ballistic Missile Defense Office. The MSX satellite will conduct a series of measurements of the phenomenology of backgrounds, missile targets, plumes, and resident space objects (RSOs), and will engage in functional demonstrations in support of detection, acquisition, and tracking for ballistic missile defense and space-based space surveillance missions. A complex satellite like the MSX has several constraints imposed on its operation by the sensors, the supporting instrumentation, power resources, data recording capability, communications, and the environment in which all these operate. This paper describes the implementation of an opportunity and feasibility analysis system, developed at Lincoln Laboratory, Massachusetts Institute of Technology, specifically to support the experiments of the Principal Investigator for space-based surveillance.
Validating a Geographical Image Retrieval System.
ERIC Educational Resources Information Center
Zhu, Bin; Chen, Hsinchun
2000-01-01
Summarizes a prototype geographical image retrieval system that demonstrates how to integrate image processing and information analysis techniques to support large-scale content-based image retrieval. Describes an experiment to validate the performance of this image retrieval system against that of human subjects by examining similarity analysis…
Reichert, Matthew D.; Alvarez, Nicolas J.; Brooks, Carlton F.; ...
2014-09-24
Pendant bubble and drop devices are invaluable tools in understanding surfactant behavior at fluid–fluid interfaces. The simple instrumentation and analysis are used widely to determine adsorption isotherms, transport parameters, and interfacial rheology. However, much of the analysis performed is developed for planar interfaces. Moreover, the application of a planar analysis to drops and bubbles (curved interfaces) can lead to erroneous and unphysical results. We revisit this analysis for a well-studied surfactant system at air–water interfaces over a wide range of curvatures as applied to both expansion/contraction experiments and interfacial elasticity measurements. The impact of curvature and transport on measured properties is quantified and compared to other scaling relationships in the literature. Our results provide tools to design interfacial experiments for accurate determination of isotherm, transport, and elastic properties.
ERIC Educational Resources Information Center
Pelin, Nicolae; Mironov, Vladimir
2008-01-01
This article considers the development of functioning algorithms for a system for the automated analysis of the rhythm of the educational process in a higher educational institution. Using experiment-planning techniques for conducting scientific research, the authors adapted methodologies obtained in their dissertation work at the…
Spatial-temporal discriminant analysis for ERP-based brain-computer interface.
Zhang, Yu; Zhou, Guoxu; Zhao, Qibin; Jin, Jing; Wang, Xingyu; Cichocki, Andrzej
2013-03-01
Linear discriminant analysis (LDA) has been widely adopted to classify event-related potentials (ERPs) in brain-computer interfaces (BCIs). Good classification performance of an ERP-based BCI usually requires sufficient data recordings for effective training of the LDA classifier, and hence a long system calibration time, which may reduce the system's practicability and cause users' resistance to the BCI system. In this study, we introduce a spatial-temporal discriminant analysis (STDA) for ERP classification. As a multiway extension of the LDA, the STDA method tries to maximize the discriminant information between target and nontarget classes by finding two projection matrices from the spatial and temporal dimensions collaboratively, which effectively reduces the feature dimensionality in the discriminant analysis and hence significantly decreases the number of required training samples. The proposed STDA method was validated on dataset II of BCI Competition III and on a dataset recorded in our own experiments, and compared to state-of-the-art algorithms for ERP classification. Online experiments were additionally implemented for validation. The superior classification performance with few training samples shows that the STDA is effective in reducing the system calibration time and improving the classification accuracy, thereby enhancing the practicability of ERP-based BCIs.
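For context, the plain (non-multiway) LDA step that STDA extends can be sketched on synthetic two-class "ERP feature" vectors. This is ordinary Fisher LDA on simulated data, not the STDA algorithm or the competition dataset.

```python
import numpy as np

rng = np.random.default_rng(1)
d = 8                                        # feature dimensionality
mu_t, mu_nt = np.full(d, 0.5), np.zeros(d)   # target / nontarget class means
Xt = rng.normal(mu_t, 1.0, size=(200, d))    # simulated target epochs
Xnt = rng.normal(mu_nt, 1.0, size=(200, d))  # simulated nontarget epochs

Sw = np.cov(Xt, rowvar=False) + np.cov(Xnt, rowvar=False)  # within-class scatter
w = np.linalg.solve(Sw, Xt.mean(0) - Xnt.mean(0))          # Fisher direction
b = -0.5 * w @ (Xt.mean(0) + Xnt.mean(0))                  # midpoint threshold

X = np.vstack([Xt, Xnt])
labels = np.array([True] * 200 + [False] * 200)
acc = float(((X @ w + b > 0) == labels).mean())
```

The small-sample weakness motivating STDA shows up here in Sw: with few epochs the scatter estimate becomes ill-conditioned, which is exactly what constraining the projections to separate spatial and temporal factors alleviates.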
Verification of Space Station Secondary Power System Stability Using Design of Experiment
NASA Technical Reports Server (NTRS)
Karimi, Kamiar J.; Booker, Andrew J.; Mong, Alvin C.; Manners, Bruce
1998-01-01
This paper describes analytical methods used in the verification of large DC power systems, with applications to the International Space Station (ISS). Large DC power systems contain many switching power converters with negative-resistance characteristics. The ISS power system presents numerous challenges with respect to system stability, such as complex sources and undefined loads. The Space Station program has developed impedance specifications for sources and loads. The overall approach to system stability consists of specific hardware requirements coupled with extensive system analysis and testing. Testing of large, complex distributed power systems is not practical due to the size and complexity of the system. Computer modeling has been used extensively to develop hardware specifications as well as to identify system configurations for lab testing. The statistical method of Design of Experiments (DoE) is used as an analysis tool for verification of these large systems. DoE reduces the number of computer runs which are necessary to analyze the performance of a complex power system consisting of hundreds of DC/DC converters. DoE also provides valuable information about the effect of changes in system parameters on the performance of the system. DoE provides information about various operating scenarios and identifies those with potential for instability. In this paper we describe how we have used computer modeling to analyze a large DC power system. A brief description of DoE is given. Examples applying DoE to the analysis and verification of the ISS power system are provided.
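The economy that DoE buys can be illustrated with the most basic design, a two-level full factorial: three hypothetical converter parameters are screened in 2³ = 8 runs, and the main effects identify which parameters drive the response. The response function below is an invented stand-in for a stability-margin model, not ISS hardware behavior.

```python
from itertools import product

def response(gain, filt, load):
    """Hypothetical stability-margin model for one converter (illustrative)."""
    return 10.0 - 4.0 * gain + 0.5 * filt + 0.1 * gain * load

levels = (-1, 1)                              # coded low/high settings
runs = list(product(levels, repeat=3))        # 2^3 = 8 runs instead of a sweep
y = [response(*r) for r in runs]

def main_effect(i):
    """Average response at the high level minus average at the low level."""
    hi = sum(v for r, v in zip(runs, y) if r[i] == 1) / 4.0
    lo = sum(v for r, v in zip(runs, y) if r[i] == -1) / 4.0
    return hi - lo

effects = [main_effect(i) for i in range(3)]  # gain dominates; load is inert
```

The gain parameter shows a large negative effect, the filter a modest one, and the load essentially none, so subsequent detailed simulation runs can concentrate on the parameters and scenarios that actually threaten stability.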
Crown, Scott B; Long, Christopher P; Antoniewicz, Maciek R
2016-11-01
13C-Metabolic flux analysis (13C-MFA) is a widely used approach in metabolic engineering for quantifying intracellular metabolic fluxes. The precision of fluxes determined by 13C-MFA depends largely on the choice of isotopic tracers and the specific set of labeling measurements. A recent advance in the field is the use of parallel labeling experiments for improved flux precision and accuracy. However, as of today, no systematic methods exist for identifying optimal tracers for parallel labeling experiments. In this contribution, we have addressed this problem by introducing a new scoring system and evaluating thousands of different isotopic tracer schemes. Based on this extensive analysis we have identified optimal tracers for 13C-MFA. The best single tracers were doubly 13C-labeled glucose tracers, including [1,6-13C]glucose, [5,6-13C]glucose and [1,2-13C]glucose, which consistently produced the highest flux precision independent of the metabolic flux map (here, 100 random flux maps were evaluated). Moreover, we demonstrate that pure glucose tracers perform better overall than mixtures of glucose tracers. For parallel labeling experiments the optimal isotopic tracers were [1,6-13C]glucose and [1,2-13C]glucose. Combined analysis of [1,6-13C]glucose and [1,2-13C]glucose labeling data improved the flux precision score by nearly 20-fold compared to the widely used tracer mixture 80% [1-13C]glucose + 20% [U-13C]glucose.
Parallel labeling experiments and metabolic flux analysis: Past, present and future methodologies.
Crown, Scott B; Antoniewicz, Maciek R
2013-03-01
Radioactive and stable isotopes have been applied for decades to elucidate metabolic pathways and quantify carbon flow in cellular systems using mass and isotope balancing approaches. Isotope-labeling experiments can be conducted as a single tracer experiment or as parallel labeling experiments. In the latter case, several experiments are performed under identical conditions except for the choice of substrate labeling. In this review, we highlight robust approaches for probing metabolism and addressing metabolically related questions through parallel labeling experiments. In the first part, we provide a brief historical perspective on parallel labeling experiments, from the early metabolic studies when radioisotopes were predominant to present-day applications based on stable isotopes. We also elaborate on important technical and theoretical advances that have facilitated the transition from radioisotopes to stable isotopes. In the second part of the review, we focus on parallel labeling experiments for 13C-metabolic flux analysis (13C-MFA). Parallel experiments offer several advantages that include: tailoring experiments to resolve specific fluxes with high precision; reducing the length of labeling experiments by introducing multiple entry points of isotopes; validating biochemical network models; and improving the performance of 13C-MFA in systems where the number of measurements is limited. We conclude by discussing some challenges facing the use of parallel labeling experiments for 13C-MFA and highlight the need to address issues related to biological variability, data integration, and rational tracer selection.
A 3D Human-Machine Integrated Design and Analysis Framework for Squat Exercises with a Smith Machine
Lee, Haerin; Jung, Moonki; Lee, Ki-Kwang; Lee, Sang Hun
2017-01-01
In this paper, we propose a three-dimensional design and evaluation framework and process based on a probabilistic-based motion synthesis algorithm and biomechanical analysis system for the design of the Smith machine and squat training programs. Moreover, we implemented a prototype system to validate the proposed framework. The framework consists of an integrated human–machine–environment model as well as a squat motion synthesis system and biomechanical analysis system. In the design and evaluation process, we created an integrated model in which interactions between a human body and machine or the ground are modeled as joints with constraints at contact points. Next, we generated Smith squat motion using the motion synthesis program based on a Gaussian process regression algorithm with a set of given values for independent variables. Then, using the biomechanical analysis system, we simulated joint moments and muscle activities from the input of the integrated model and squat motion. We validated the model and algorithm through physical experiments measuring the electromyography (EMG) signals, ground forces, and squat motions as well as through a biomechanical simulation of muscle forces. The proposed approach enables the incorporation of biomechanics in the design process and reduces the need for physical experiments and prototypes in the development of training programs and new Smith machines. PMID:28178184
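The Gaussian process regression step at the core of the motion synthesis described above can be illustrated with a minimal sketch; the kernel choice, the depth/knee-angle variables, and all training values below are assumptions for illustration, not taken from the paper.

```python
import numpy as np

def rbf_kernel(a, b, length_scale=1.0, variance=1.0):
    """Squared-exponential covariance between two 1-D input arrays."""
    d = a[:, None] - b[None, :]
    return variance * np.exp(-0.5 * (d / length_scale) ** 2)

def gp_predict(x_train, y_train, x_test, noise=1e-6):
    """Posterior mean of a zero-mean Gaussian process regression."""
    K = rbf_kernel(x_train, x_train) + noise * np.eye(len(x_train))
    K_star = rbf_kernel(x_test, x_train)
    return K_star @ np.linalg.solve(K, y_train)

# Hypothetical training data: knee flexion angle (deg) observed at
# normalized squat depths (illustrative values only).
depth = np.array([0.0, 0.25, 0.5, 0.75, 1.0])
knee_angle = np.array([5.0, 40.0, 80.0, 110.0, 130.0])

pred = gp_predict(depth, knee_angle, np.array([0.5]))
```

With a small noise term the posterior mean interpolates the training motions, which is the property that lets such a model synthesize new squat trajectories between recorded ones.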
The Peroxidase-Glucose Oxidase Enzyme System in the Undergraduate Laboratory.
ERIC Educational Resources Information Center
Woolridge, Elisa; And Others
1986-01-01
Offers a series of experiments which introduce students to the general principles of enzymology. The experiment demonstrates several basic enzyme properties and the chromatographic exercises provide an analysis of each enzymatic activity. Questions are also presented for extending discussion on the activities. (ML)
Zhao, Ming; Rattanatamrong, Prapaporn; DiGiovanna, Jack; Mahmoudi, Babak; Figueiredo, Renato J; Sanchez, Justin C; Príncipe, José C; Fortes, José A B
2008-01-01
Dynamic data-driven brain-machine interfaces (DDDBMI) have great potential to advance the understanding of neural systems and improve the design of brain-inspired rehabilitative systems. This paper presents a novel cyberinfrastructure that couples in vivo neurophysiology experimentation with massive computational resources to provide seamless and efficient support of DDDBMI research. Closed-loop experiments can be conducted with in vivo data acquisition, reliable network transfer, parallel model computation, and real-time robot control. Behavioral experiments with live animals are supported with real-time guarantees. Offline studies can be performed with various configurations for extensive analysis and training. A Web-based portal is also provided to allow users to conveniently interact with the cyberinfrastructure, conducting both experimentation and analysis. New motor control models are developed based on this approach, which include recursive least-squares-based (RLS) and reinforcement-learning-based (RLBMI) algorithms. The results from an online RLBMI experiment show that the cyberinfrastructure can successfully support DDDBMI experiments and meet the desired real-time requirements.
In-Flight Thermal Performance of the Lidar In-Space Technology Experiment
NASA Technical Reports Server (NTRS)
Roettker, William
1995-01-01
The Lidar In-Space Technology Experiment (LITE) was developed at NASA's Langley Research Center to explore the applications of lidar operated from an orbital platform. As a technology demonstration experiment, LITE was developed to gain experience designing and building future operational orbiting lidar systems. Since LITE was the first lidar system to be flown in space, an important objective was to validate instrument design principles in such areas as thermal control, laser performance, instrument alignment and control, and autonomous operations. Thermal and structural analysis models of the instrument were developed during the design process to predict the behavior of the instrument during its mission. In order to validate those mathematical models, extensive engineering data was recorded during all phases of LITE's mission. This inflight engineering data was compared with preflight predictions and, when required, adjustments to the thermal and structural models were made to more accurately match the instrument's actual behavior. The results of this process for the thermal analysis and design of LITE are presented in this paper.
Optimal subinterval selection approach for power system transient stability simulation
Kim, Soobae; Overbye, Thomas J.
2015-10-21
Power system transient stability analysis requires an appropriate integration time step to avoid numerical instability as well as to reduce computational demands. For fast system dynamics, which vary more rapidly than what the time step covers, a fraction of the time step, called a subinterval, is used. However, the optimal value of this subinterval is not easily determined because the analysis of the system dynamics might be required. This selection is usually made from engineering experiences, and perhaps trial and error. This paper proposes an optimal subinterval selection approach for power system transient stability analysis, which is based on modal analysis using a single machine infinite bus (SMIB) system. Fast system dynamics are identified with the modal analysis and the SMIB system is used focusing on fast local modes. An appropriate subinterval time step from the proposed approach can reduce computational burden and achieve accurate simulation responses as well. As a result, the performance of the proposed method is demonstrated with the GSO 37-bus system.
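The modal-analysis reasoning above can be sketched for a linearized SMIB swing equation: the eigenvalues of the state matrix give the fastest local mode, and the subinterval is then chosen to resolve it. All parameter values below are assumed for illustration, and the 20-steps-per-period rule is one common heuristic, not necessarily the paper's criterion.

```python
import numpy as np

# Linearized single-machine infinite-bus (SMIB) swing dynamics:
#   M * delta'' + D * delta' + Ks * delta = 0
# written in state-space form x' = A x with x = [delta, delta'].
# Per-unit parameter values are assumed for illustration.
M, D, Ks = 0.1, 0.02, 1.5

A = np.array([[0.0, 1.0],
              [-Ks / M, -D / M]])
eigvals = np.linalg.eigvals(A)

# The fastest oscillatory mode sets the integration subinterval;
# roughly 20 steps per period of that mode is a common rule of thumb.
f_max = np.max(np.abs(eigvals.imag)) / (2.0 * np.pi)
dt_sub = 1.0 / (20.0 * f_max)
```

For a multi-machine case the same rank of reasoning applies mode by mode, which is why reducing the system to an SMIB equivalent focused on fast local modes keeps the selection tractable.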
The ALICE analysis train system
NASA Astrophysics Data System (ADS)
Zimmermann, Markus; ALICE Collaboration
2015-05-01
In the ALICE experiment hundreds of users are analyzing big datasets on a Grid system. High throughput and short turn-around times are achieved by a centralized system called the LEGO trains. This system combines analysis from different users in so-called analysis trains which are then executed within the same Grid jobs thereby reducing the number of times the data needs to be read from the storage systems. The centralized trains improve the performance, the usability for users and the bookkeeping in comparison to single user analysis. The train system builds upon the already existing ALICE tools, i.e. the analysis framework as well as the Grid submission and monitoring infrastructure. The entry point to the train system is a web interface which is used to configure the analysis and the desired datasets as well as to test and submit the train. Several measures have been implemented to reduce the time a train needs to finish and to increase the CPU efficiency.
NASA Bioculture System: From Experiment Definition to Flight Payload
NASA Technical Reports Server (NTRS)
Sato, Kevin Y.; Almeida, Eduardo; Austin, Edward M.
2014-01-01
Starting in 2015, the NASA Bioculture System will be available to the science community to conduct cell biology and microbiology experiments on ISS. The Bioculture System carries ten environmentally independent Cassettes, which house the experiments. The closed-loop fluids flow path subsystem in each Cassette provides a perfusion-based method for maintaining specimen cultures in a shear-free environment by using a biochamber based on porous hollow fiber bioreactor technology. Each Cassette contains an incubator and a separate insulated refrigerator compartment for storage of media, samples, nutrients, and additives. The hardware is capable of fully automated or manual specimen culturing and processing, including in-flight experiment initiation, sampling and fixation, culturing of specimens up to BSL-2, and the ability to run up to 10 independent cultures in parallel for statistical analysis. The incubation and culturing of specimens in the Bioculture System is a departure from standard laboratory culturing methods. Therefore, it is critical that the PI understand the pre-flight testing required to successfully use the Bioculture System to conduct an on-orbit experiment. Overall, the PI will conduct a series of ground tests to define flight experiment and on-orbit implementation requirements, verify biocompatibility, and determine base bioreactor conditions. The ground test processes for the utilization of the Bioculture System, from experiment selection to flight, will be reviewed. Also, pre-flight test schedules and the use of COTS ground test equipment (CellMax and FiberCell systems) and the Bioculture System will be discussed.
ERIC Educational Resources Information Center
Wriedt, Mario; Sculley, Julian P.; Aulakh, Darpandeep; Zhou, Hong-Cai
2016-01-01
A simple and straightforward synthesis of an ultrastable porous metal-organic framework (MOF) based on copper(II) and a mixed N donor ligand system is described as a laboratory experiment for chemistry undergraduate students. These experiments and the resulting analysis are designed to teach students basic research tools and procedures while…
Long Duration Exposure Facility (LDEF) optical systems SIG summary and database
NASA Astrophysics Data System (ADS)
Bohnhoff-Hlavacek, Gail
1992-09-01
The main objectives of the Long Duration Exposure Facility (LDEF) Optical Systems Special Investigative Group (SIG) Discipline are to develop a database of experimental findings on LDEF optical systems and elements hardware, and provide an optical system overview. Unlike the electrical and mechanical disciplines, the optics effort relies primarily on the testing of hardware at the various principal investigator's laboratories, since minimal testing of optical hardware was done at Boeing. This is because all space-exposed optics hardware are part of other individual experiments. At this time, all optical systems and elements testing by experiment investigator teams is not complete, and in some cases has hardly begun. Most experiment results to date, document observations and measurements that 'show what happened'. Still to come from many principal investigators is a critical analysis to explain 'why it happened' and future design implications. The original optical system related concerns and the lessons learned at a preliminary stage in the Optical Systems Investigations are summarized. The design of the Optical Experiments Database and how to acquire and use the database to review the LDEF results are described.
Technology for large space systems: A bibliography with indexes (supplement 07)
NASA Technical Reports Server (NTRS)
1983-01-01
This bibliography lists 366 reports, articles and other documents introduced into the NASA scientific and technical information system between January 1, 1982 and June 30, 1982. Subject matter is grouped according to systems, interactive analysis and design, structural concepts, control systems, electronics, advanced materials, assembly concepts, propulsion, solar power satellite systems, and flight experiments.
A study analysis of cable-body systems totally immersed in a fluid stream
NASA Technical Reports Server (NTRS)
Delaurier, J. D.
1972-01-01
A general stability analysis of a cable-body system immersed in a fluid stream is presented. The analytical portion of this analysis treats the system as being essentially a cable problem, with the body dynamics giving the end conditions. The mathematical form of the analysis consists of partial differential wave equations, with the end and auxiliary conditions being determined from the body equations of motion. The equations uncouple to give a lateral problem and a longitudinal problem as in first order airplane dynamics. A series of tests on a tethered wind tunnel model provide a comparison of the theory with experiment.
Probabilistic structural analysis methods for select space propulsion system components
NASA Technical Reports Server (NTRS)
Millwater, H. R.; Cruse, T. A.
1989-01-01
The Probabilistic Structural Analysis Methods (PSAM) project developed at the Southwest Research Institute integrates state-of-the-art structural analysis techniques with probability theory for the design and analysis of complex large-scale engineering structures. An advanced efficient software system (NESSUS) capable of performing complex probabilistic analysis has been developed. NESSUS contains a number of software components to perform probabilistic analysis of structures. These components include: an expert system, a probabilistic finite element code, a probabilistic boundary element code and a fast probability integrator. The NESSUS software system is shown. An expert system is included to capture and utilize PSAM knowledge and experience. NESSUS/EXPERT is an interactive menu-driven expert system that provides information to assist in the use of the probabilistic finite element code NESSUS/FEM and the fast probability integrator (FPI). The expert system menu structure is summarized. The NESSUS system contains a state-of-the-art nonlinear probabilistic finite element code, NESSUS/FEM, to determine the structural response and sensitivities. A broad range of analysis capabilities and an extensive element library are present.
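The task the fast probability integrator accelerates, computing a small probability of failure from random loads and strengths, can be approximated with plain Monte Carlo sampling; the distributions and values below are assumed for illustration and are not NESSUS data.

```python
import numpy as np

rng = np.random.default_rng(1)

# Probabilistic structural margin: random applied stress (demand) versus
# random material strength (capacity). Illustrative distributions only.
n = 200_000
capacity = rng.normal(500.0, 25.0, n)   # e.g. yield strength, MPa
demand = rng.normal(400.0, 30.0, n)     # e.g. peak applied stress, MPa

p_fail = np.mean(demand >= capacity)    # sampled probability of failure
```

Brute-force sampling like this needs many samples to resolve rare failures, which is precisely why a dedicated fast probability integrator is valuable in a production code.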
The Crew Earth Observations Experiment: Earth System Science from the ISS
NASA Technical Reports Server (NTRS)
Stefanov, William L.; Evans, Cynthia A.; Robinson, Julie A.; Wilkinson, M. Justin
2007-01-01
This viewgraph presentation reviews the use of Astronaut Photography (AP) as taken from the International Space Station (ISS) in Earth System Science (ESS). Included are slides showing basic remote sensing theory, data characteristics of astronaut photography, astronaut training and operations, crew Earth observations group, targeting sites and acquisition, cataloging and database, analysis and applications for ESS, image analysis of particular interest urban areas, megafans, deltas, coral reefs. There are examples of the photographs and the analysis.
E-Learning System Using Segmentation-Based MR Technique for Learning Circuit Construction
ERIC Educational Resources Information Center
Takemura, Atsushi
2016-01-01
This paper proposes a novel e-Learning system using the mixed reality (MR) technique for technical experiments involving the construction of electronic circuits. The proposed system comprises experimenters' mobile computers and a remote analysis system. When constructing circuits, each learner uses a mobile computer to transmit image data from the…
An Overview of NASA's SubsoniC Research Aircraft Testbed (SCRAT)
NASA Technical Reports Server (NTRS)
Baumann, Ethan; Hernandez, Joe; Ruhf, John
2013-01-01
National Aeronautics and Space Administration Dryden Flight Research Center acquired a Gulfstream III (GIII) aircraft to serve as a testbed for aeronautics flight research experiments. The aircraft is referred to as SCRAT, which stands for SubsoniC Research Aircraft Testbed. The aircraft's mission is to perform aeronautics research; more specifically raising the Technology Readiness Level (TRL) of advanced technologies through flight demonstrations and gathering high-quality research data suitable for verifying the technologies, and validating design and analysis tools. The SCRAT has the ability to conduct a range of flight research experiments throughout a transport class aircraft's flight envelope. Experiments ranging from flight-testing of a new aircraft system or sensor to those requiring structural and aerodynamic modifications to the aircraft can be accomplished. The aircraft has been modified to include an instrumentation system and sensors necessary to conduct flight research experiments along with a telemetry capability. An instrumentation power distribution system was installed to accommodate the instrumentation system and future experiments. An engineering simulation of the SCRAT has been developed to aid in integrating research experiments. A series of baseline aircraft characterization flights has been flown that gathered flight data to aid in developing and integrating future research experiments. This paper describes the SCRAT's research systems and capabilities.
NASA Technical Reports Server (NTRS)
Frew, A. M.; Eisenhut, D. F.; Farrenkopf, R. L.; Gates, R. F.; Iwens, R. P.; Kirby, D. K.; Mann, R. J.; Spencer, D. J.; Tsou, H. S.; Zaremba, J. G.
1972-01-01
The precision pointing control system (PPCS) is an integrated system for precision attitude determination and orientation of gimbaled experiment platforms. The PPCS concept configures the system to perform orientation of up to six independent gimbaled experiment platforms to design goal accuracy of 0.001 degrees, and to operate in conjunction with a three-axis stabilized earth-oriented spacecraft in orbits ranging from low altitude (200-2500 n.m., sun synchronous) to 24 hour geosynchronous, with a design goal life of 3 to 5 years. The system comprises two complementary functions: (1) attitude determination where the attitude of a defined set of body-fixed reference axes is determined relative to a known set of reference axes fixed in inertial space; and (2) pointing control where gimbal orientation is controlled, open-loop (without use of payload error/feedback) with respect to a defined set of body-fixed reference axes to produce pointing to a desired target.
GEOPACK, a comprehensive user-friendly geostatistical software system, was developed to help in the analysis of spatially correlated data. The software system was developed to be used by scientists, engineers, regulators, etc., with little experience in geostatistical techniques...
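The core quantity a geostatistical package like this estimates is the empirical semivariogram, which measures how dissimilarity between samples grows with separation distance. A minimal 1-D sketch follows; the data and binning scheme are toy assumptions, not GEOPACK's algorithm.

```python
import numpy as np

def empirical_semivariogram(x, z, lags, tol):
    """Empirical semivariance gamma(h) for 1-D sample locations x, values z."""
    gamma = []
    for h in lags:
        pairs = [
            0.5 * (z[i] - z[j]) ** 2
            for i in range(len(x)) for j in range(i + 1, len(x))
            if abs(abs(x[i] - x[j]) - h) <= tol
        ]
        gamma.append(np.mean(pairs) if pairs else np.nan)
    return np.array(gamma)

# Toy spatially correlated data along a transect: for a smooth field,
# semivariance should grow with lag distance.
x = np.arange(10.0)
z = np.sin(0.5 * x)
g = empirical_semivariogram(x, z, lags=[1.0, 2.0, 3.0], tol=0.1)
```

A fitted model of gamma(h) then feeds kriging, the interpolation step such software automates for users without geostatistical training.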
An Isotopic Dilution Experiment Using Liquid Scintillation: A Simple Two-System, Two-Phase Analysis.
ERIC Educational Resources Information Center
Moehs, Peter J.; Levine, Samuel
1982-01-01
A simple isotopic dilution analysis whose principles apply to methods of more complex radioanalyses is described. Suitable for clinical and instrumental analysis chemistry students, the experiment keeps manipulations to a minimum, involving only aqueous extraction before counting. Background information, procedures, and results are discussed.…
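The arithmetic behind isotope dilution is simple: adding tracer of known specific activity to an unknown mass of unlabeled analyte dilutes the specific activity in proportion to the total mass. A sketch with hypothetical numbers (not the article's data):

```python
def isotope_dilution_mass(m_tracer, s0, s_mixed):
    """Mass of unlabeled analyte inferred from the drop in specific activity.

    m_tracer : mass of labeled tracer added (mg)
    s0       : specific activity of the pure tracer (cpm/mg)
    s_mixed  : specific activity measured after isotopic equilibration (cpm/mg)
    """
    return m_tracer * (s0 / s_mixed - 1.0)

# Hypothetical numbers: 2 mg of tracer at 1000 cpm/mg is diluted by the
# unknown down to 200 cpm/mg.
unknown_mass = isotope_dilution_mass(2.0, 1000.0, 200.0)
```

The method's appeal, and the reason it suits a teaching laboratory, is that only a ratio of specific activities is needed, so quantitative recovery of the analyte is not required.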
Performance analysis of a coherent free space optical communication system based on experiment.
Cao, Jingtai; Zhao, Xiaohui; Liu, Wei; Gu, Haijun
2017-06-26
Based on our previous study and an experimental adaptive optics (AO) system designed with a 97-element continuous-surface deformable mirror, we analyze the performance of a coherent free-space optical communication (FSOC) system in terms of mixing efficiency (ME), bit error rate (BER), and outage probability under different Greenwood frequencies and atmospheric coherence lengths. The results show that the influence of the atmospheric temporal characteristics on the performance is slightly stronger than that of the spatial characteristics when the receiving aperture and the number of sub-apertures are given. This analysis provides a reference for the design of coherent FSOC systems.
"Once when I was on call...," theory versus reality in training for professionalism.
Eggly, Susan; Brennan, Simone; Wiese-Rometsch, Wilhelmine
2005-04-01
To identify the degree to which interns' reported experiences with professional and unprofessional behavior converge and/or diverge with ideal professional behavior proposed by the physician community. Interns at Wayne State University's residency programs in internal medicine, family medicine, and transitional medicine responded to essay questions about their experience with professional and unprofessional behavior as part of a curriculum on professionalism. Responses were coded for whether they reflected each of the principles and responsibilities outlined in a major publication on physician professionalism. Content analysis included the frequencies with which the interns' essays reflected each principle or responsibility. Additionally, a thematic analysis revealed themes of professional behavior that emerged from the essays. Interns' experiences with professional and unprofessional behavior most frequently converged with ideal behavior proposed by the physician community in categories involving interpersonal interactions with patients. Interns infrequently reported experiences involving behavior related to systems or sociopolitical issues. Interns' essays reflect their concern with interpersonal interactions with patients, but they are either less exposed to or less interested in describing behavior regarding systems or sociopolitical issues. This may be due to their stage of training or to the emphasis placed on interpersonal rather than systems or sociopolitical issues during training. The authors recommend future proposals of ideal professional behavior be revised periodically to reflect current experiences of practicing physicians, trainees, other health care providers and patients. Greater educational emphasis should be placed on the systems and sociopolitical environment in which trainees practice.
Microcracking of cross-ply composites under static and fatigue loads. Ph.D. Thesis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liu, S.
1994-12-31
Recently, a variational mechanics analysis approach has been used to determine the thermoelastic stress state in cracked, cross-ply laminates. The analysis included a calculation of the energy release rate due to the formation of a microcrack in the 90 deg plies. A wide variety of composite material systems and cross-ply layups of generic type (0{sub m}/90{sub n}) sub s were tested during static loading. The variational mechanics energy release rate analysis can be used to predict all features of the experimental results and to draw some new conclusions about the progression of damage in cross-ply laminates. The recommended experiments are to measure the density of microcracks as a function of applied stress. Such results can be fit with the energy release rate expression and used to measure the microcracking or intralaminar fracture toughness. Experiments that measure only the stress to initiate microcracking are specifically not recommended because they do not give an accurate measure of the microcracking fracture toughness. Static fatigue, thermal cycling, and combined thermal and mechanical fatigue experiments were run on several material systems and many cross-ply layups. A modified Paris-law was used and the data from all layups of a single material system were found to fall on a single master Paris-law plot. The authors claim that the master Paris-law plot gives a good characterization of a given material system's resistance to microcrack formation during fatigue loading.
Materials experiment carrier concepts definition study. Volume 2: Technical report, part 2
NASA Technical Reports Server (NTRS)
1981-01-01
A materials experiment carrier (MEC) that provides effective accommodation of the given baseline materials processing in space (MPS) payloads and demonstration of the MPS platform concept for high priority materials processing science, multidiscipline MPS investigations, host carrier for commercial MPS payloads, and system economy of orbital operations is defined. The study flow of task work is shown. Study tasks featured analysis and trades to identify the MEC system concept options.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Scott, P.; Olson, R.; Wilkowski, O.G.
1997-06-01
This report presents the results from Subtask 1.3 of the International Piping Integrity Research Group (IPIRG) program. The objective of Subtask 1.3 is to develop data to assess analysis methodologies for characterizing the fracture behavior of circumferentially cracked pipe in a representative piping system under combined inertial and displacement-controlled stresses. A unique experimental facility was designed and constructed. The piping system evaluated is an expansion loop with over 30 meters of 16-inch diameter Schedule 100 pipe. The experimental facility is equipped with special hardware to ensure system boundary conditions could be appropriately modeled. The test matrix involved one uncracked and five cracked dynamic pipe-system experiments. The uncracked experiment was conducted to evaluate piping system damping and natural frequency characteristics. The cracked-pipe experiments evaluated the fracture behavior, pipe system response, and stability characteristics of five different materials. All cracked-pipe experiments were conducted at PWR conditions. Material characterization efforts provided tensile and fracture toughness properties of the different pipe materials at various strain rates and temperatures. Results from all pipe-system experiments and material characterization efforts are presented. Results of fracture mechanics analyses, dynamic finite element stress analyses, and stability analyses are presented and compared with experimental results.
NASA Technical Reports Server (NTRS)
1985-01-01
Topics covered include: data systems and quality; analysis and assimilation techniques; impacts on forecasts; tropical forecasts; analysis intercomparisons; improvements in predictability; and heat sources and sinks.
Design criteria for payload workstation accommodations
NASA Technical Reports Server (NTRS)
Watters, H. H.; Stokes, J. W.
1975-01-01
Anticipated shuttle sortie payload man-system design criteria needs are investigated. Man-system interactions for the scientific disciplines are listed, and the extent to which documented Skylab experience is expected to provide system design guidance for each of the identified interactions is assessed. Where the analysis revealed that the reduced Skylab data do not answer the anticipated needs, candidate criteria are provided, based on unreduced Skylab data, available prior research, original analysis, or related requirements derived from previous space programs.
Analysis of shadowing effects on spacecraft power systems
NASA Technical Reports Server (NTRS)
Fincannon, H. J.
1995-01-01
This paper describes the Orbiting Spacecraft Shadowing Analysis (OSSA) computer program that was developed at NASA Lewis Research Center in order to assess the shadowing effects on various power systems. The algorithms, inputs and outputs are discussed. Examples of typical shadowing analyses that have been performed for the International Space Station Freedom, International Space Station Alpha and the joint United States/Russian Mir Solar Dynamic Flight Experiment Project are covered. Effects of shadowing on power systems are demonstrated.
GenSSI 2.0: multi-experiment structural identifiability analysis of SBML models.
Ligon, Thomas S; Fröhlich, Fabian; Chis, Oana T; Banga, Julio R; Balsa-Canto, Eva; Hasenauer, Jan
2018-04-15
Mathematical modeling using ordinary differential equations is used in systems biology to improve the understanding of dynamic biological processes. The parameters of ordinary differential equation models are usually estimated from experimental data. To analyze a priori the uniqueness of the solution of the estimation problem, structural identifiability analysis methods have been developed. We introduce GenSSI 2.0, an advancement of the software toolbox GenSSI (Generating Series for testing Structural Identifiability). GenSSI 2.0 is the first toolbox for structural identifiability analysis to implement Systems Biology Markup Language import, state/parameter transformations and multi-experiment structural identifiability analysis. In addition, GenSSI 2.0 supports a range of MATLAB versions and is computationally more efficient than its previous version, enabling the analysis of more complex models. GenSSI 2.0 is an open-source MATLAB toolbox and available at https://github.com/genssi-developer/GenSSI. thomas.ligon@physik.uni-muenchen.de or jan.hasenauer@helmholtz-muenchen.de. Supplementary data are available at Bioinformatics online.
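The idea behind structural identifiability can be illustrated numerically with a toy model in which two parameters enter the output only through their sum, so the output sensitivities are linearly dependent and the parameters cannot be estimated separately. This finite-difference rank test is a simplification for illustration, not GenSSI's generating-series method.

```python
import numpy as np

def output(params, t, x0=1.0):
    """Toy model dx/dt = -(p1 + p2) x, y = x: parameters enter only as a sum."""
    p1, p2 = params
    return x0 * np.exp(-(p1 + p2) * t)

t = np.linspace(0.0, 2.0, 20)
p = np.array([0.3, 0.7])

# Central-difference sensitivities of the output to each parameter.
eps = 1e-6
cols = []
for i in range(2):
    dp = eps * np.eye(2)[i]
    cols.append((output(p + dp, t) - output(p - dp, t)) / (2.0 * eps))
S = np.column_stack(cols)

# Rank 1 with 2 parameters: p1 and p2 are structurally non-identifiable
# from this single experiment; only their sum is identifiable.
rank = np.linalg.matrix_rank(S, tol=1e-6)
```

Multi-experiment analysis, the feature GenSSI 2.0 adds, asks whether combining experiments with different conditions raises this rank; for this particular model it cannot, since every experiment sees only p1 + p2.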
Markov Jump-Linear Performance Models for Recoverable Flight Control Computers
NASA Technical Reports Server (NTRS)
Zhang, Hong; Gray, W. Steven; Gonzalez, Oscar R.
2004-01-01
Single event upsets in digital flight control hardware induced by atmospheric neutrons can reduce system performance and possibly introduce a safety hazard. One method currently under investigation to help mitigate the effects of these upsets is NASA Langley's Recoverable Computer System. In this paper, a Markov jump-linear model is developed for a recoverable flight control system, which will be validated using data from future experiments with simulated and real neutron environments. The method of tracking error analysis and the plan for the experiments are also described.
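A Markov jump-linear performance model of the kind described can be sketched as a scalar tracking-error system whose closed-loop dynamics switch between a nominal mode and a recovery mode according to a Markov chain; all dynamics and transition probabilities below are assumed for illustration and are not the paper's model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two operating modes: 0 = nominal control, 1 = recovering from an upset.
a = {0: 0.5, 1: 0.9}          # per-mode error decay factors (assumed)
P = np.array([[0.99, 0.01],   # P[i, j]: probability of moving from mode i to j
              [0.20, 0.80]])

def mean_square_error(steps=20000):
    """Monte Carlo mean-square tracking error of the jump-linear system."""
    mode, x, acc = 0, 0.0, 0.0
    for _ in range(steps):
        x = a[mode] * x + rng.normal(0.0, 0.1)   # scalar tracking error
        acc += x * x
        mode = int(rng.choice(2, p=P[mode]))
    return acc / steps

mse = mean_square_error()
```

Because the recovery mode decays error more slowly, more frequent upsets raise the mean-square tracking error, which is the performance degradation the jump-linear framework quantifies analytically.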
Reference earth orbital research and applications investigations (blue book). Volume 7: Technology
NASA Technical Reports Server (NTRS)
1971-01-01
The candidate experiment program for manned space stations with specific application to technology disciplines is presented. The five functional program elements are devoted to the development of new technology for application to future generation spacecraft and experiments. The functional program elements are as follows: (1) monitor and trace movement of external contaminants to determine methods for controlling contamination, (2) analysis of fundamentals of fluid systems management, (3) extravehicular activity, (4) advanced spacecraft systems tests, and (5) development of teleoperator system for use with space activities.
Multi-hole pressure probes to wind tunnel experiments and air data systems
NASA Astrophysics Data System (ADS)
Shevchenko, A. M.; Shmakov, A. S.
2017-10-01
The problems of developing a multi-hole pressure system to measure flow angularity, Mach number and dynamic head for wind tunnel experiments or air data systems are discussed. A simple analytical model with separation of variables is derived for the multi-hole spherical pressure probe. The proposed model is uniform for small subsonic and supersonic speeds. An error analysis was performed. The error functions are obtained, allowing estimation of the influence of the Mach number, the pitch angle, and the location of the pressure ports on the uncertainty of determining the flow parameters.
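The spherical-probe idea above can be sketched with the classical potential-flow pressure distribution on a sphere, Cp(θ) = 1 − (9/4) sin²θ, and a first-order flow-angle estimate from two opposing ports. This is an illustrative sketch only, not necessarily the separation-of-variables model derived in the paper; the port angle and sensitivities are assumptions.

```python
import math

def sphere_cp(theta_rad):
    """Pressure coefficient at angle theta from the stagnation point for
    incompressible potential flow over a sphere: Cp = 1 - (9/4) sin^2(theta)."""
    return 1.0 - 2.25 * math.sin(theta_rad) ** 2

def small_angle_pitch(p_up, p_down, q, port_angle_rad):
    """First-order pitch-angle estimate (radians) from the pressure difference
    between two ports placed symmetrically at +/- port_angle from the probe axis.
    Linearizing Cp about the port angle gives d(delta Cp)/d(alpha) =
    4.5 * sin(2 * port_angle); q is the dynamic head."""
    sensitivity = 4.5 * math.sin(2.0 * port_angle_rad)  # per radian of flow angle
    return (p_up - p_down) / (q * sensitivity)
```

A usage check: with ports at 45° and a true flow angle of 0.05 rad, feeding the exact sphere pressures back through `small_angle_pitch` recovers the angle to within the linearization error.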
A study of the portability of an Ada system in the software engineering laboratory (SEL)
NASA Technical Reports Server (NTRS)
Jun, Linda O.; Valett, Susan Ray
1990-01-01
A particular porting effort is discussed, and various statistics on analyzing the portability of Ada and the total staff months (overall and by phase) required to accomplish the rehost are given. This effort is compared to past experiments on the rehosting of FORTRAN systems. The discussion includes an analysis of the types of errors encountered during the rehosting, the changes required to rehost the system, experiences with the Alsys IBM Ada compiler, the impediments encountered, and the lessons learned during this study.
Portable computing - A fielded interactive scientific application in a small off-the-shelf package
NASA Technical Reports Server (NTRS)
Groleau, Nicolas; Hazelton, Lyman; Frainier, Rich; Compton, Michael; Colombano, Silvano; Szolovits, Peter
1993-01-01
Experience with the design and implementation of a portable computing system for STS crew-conducted science is discussed. Principal-Investigator-in-a-Box (PI) will help the SLS-2 astronauts perform vestibular (human orientation system) experiments in flight. PI is an interactive system that provides data acquisition and analysis, experiment step rescheduling, and various other forms of reasoning to astronaut users. The hardware architecture of PI consists of a computer and an analog interface box. 'Off-the-shelf' equipment is employed in the system wherever possible in an effort to use widely available tools and then to add custom functionality and application codes to them. Other projects which can help prospective teams to learn more about portable computing in space are also discussed.
NASA Technical Reports Server (NTRS)
Rozendaal, Rodger A.; Behbehani, Roxanna
1990-01-01
NASA initiated the Variable Sweep Transition Flight Experiment (VSTFE) to establish a boundary layer transition database for laminar flow wing design. For this experiment, full-span upper surface gloves were fitted to a variable sweep F-14 aircraft. The development of an improved laminar boundary layer stability analysis system called the Unified Stability System (USS) is documented, and results of its use on the VSTFE flight data are shown. The USS consists of eight computer codes. The theoretical background of the system is described, as are its inputs, outputs, and usage hints. The USS is capable of analyzing boundary layer stability over a wide range of disturbance frequencies and orientations, making it possible to use different philosophies in calculating the growth of disturbances on swept wings.
Principals' Experiences of Being Evaluated: A Phenomenological Study
ERIC Educational Resources Information Center
Parylo, Oksana; Zepeda, Sally J.; Bengtson, Ed
2012-01-01
This phenomenological study sought to understand principals' lived experiences of being evaluated with reliance on the principles of developmental supervision and adult learning theory. Analysis of interview data from 16 principals revealed 3 major constructs in principal evaluation: evaluation is a complex, constantly changing system; principal…
Water Chemistry Laboratory Manual.
ERIC Educational Resources Information Center
Jenkins, David; And Others
This manual of laboratory experiments in water chemistry serves a dual function of illustrating fundamental chemical principles of dilute aqueous systems and of providing the student with some familiarity with the chemical measurements commonly used in water and wastewater analysis. Experiments are grouped in categories on the basis of similar…
Probing the magnetosphere with artificial electron beams
NASA Technical Reports Server (NTRS)
Winckler, J. R.
1981-01-01
An analysis is conducted of the University of Minnesota Electron Echo experiments, which so far have included five sounding rocket experiments. The concept of the Echo experiment is to inject electron beam pulses from a rocket into the ionosphere at altitudes in the range from 100 to 300 km. The electrons move to the conjugate hemisphere following magnetic field lines and return on neighboring field lines to the neighborhood of the rocket, where the pulses may be detected and analyzed. Attention is given to the detection and analysis of echoes, the structure of echoes, and the Echo V experiment. The Echo V experiment showed clearly that detection of remote echo beams by atmospheric fluorescence using a low-light-level TV system is not a viable technique. A future experiment is to use throw-away detectors for direct remote echo detection.
MOD-0A 200 kW wind turbine generator design and analysis report
NASA Astrophysics Data System (ADS)
Anderson, T. S.; Bodenschatz, C. A.; Eggers, A. G.; Hughes, P. S.; Lampe, R. F.; Lipner, M. H.; Schornhorst, J. R.
1980-08-01
The design, analysis, and initial performance of the MOD-OA 200 kW wind turbine generator at Clayton, NM is documented. The MOD-OA was designed and built to obtain operation and performance data and experience in utility environments. The project requirements, approach, system description, design requirements, design, analysis, system tests, installation, safety considerations, failure modes and effects analysis, data acquisition, and initial performance for the wind turbine are discussed. The design and analysis of the rotor, drive train, nacelle equipment, yaw drive mechanism and brake, tower, foundation, electrical system, and control systems are presented. The rotor includes the blades, hub, and pitch change mechanism. The drive train includes the low speed shaft, speed increaser, high speed shaft, and rotor brake. The electrical system includes the generator, switchgear, transformer, and utility connection. The control systems are the blade pitch, yaw, and generator control, and the safety system. Manual, automatic, and remote control are discussed. Systems analyses on dynamic loads and fatigue are presented.
NASA Technical Reports Server (NTRS)
Grindeland, R.; Vale, W.; Hymer, W.; Sawchenko, P.; Vasques, M.; Krasnov, I.; Kaplanski, A.; Victorov, I.
1990-01-01
The objectives of the Cosmos 1887 mission were: (1) to determine if the results of the SL-3 pituitary gland experiment (1) were repeatable; and (2) to determine what effect a longer mission would have on the rat pituitary gland growth hormone (GH) system. In the 1887 experiment, two issues were considered especially important. First, it was recognized that cells prepared from individual rat pituitary glands should be considered separately so that the data from the 5 glands could be analyzed in a statistically meaningful way. Second, results of the SL-3 flight involving the hollow fiber implant and HPLC GH-variant experiments suggested that the biological activity of the hormone had been negatively affected by flight. The results of the 1887 experiment documented the wisdom of addressing both issues in the protocol. Thus, the reduction in secretory capacity of flight cells during subsequent extended cell culture on Earth was documented statistically, thereby establishing the validity of the SL-3 result. The results of both flight experiments thus support the contention that there is a secretory lesion in pituitary GH cells of flight animals. The primary objective of both missions was a clear definition of the effect of spaceflight on the GH cell system. There can no longer be any reasonable doubt that this system is affected in microgravity. One explanation for the better-known effects of spaceflight on organisms, viz. changes in the bone, muscle, and immune systems, may very well rest with such changes in bGH. Although the rats in the Cosmos 1887 flight were on Earth for two days after flight, the data show that the GH system had still not recovered from the effects of flight. Many questions remain. One of the more important concerns the GRF responsiveness of somatotrophs after flight. This will be tested in an upcoming experiment.
Liu, Yan; Yang, Dong; Xiong, Fen; Yu, Lan; Ji, Fei; Wang, Qiu-Ju
2015-09-01
Hearing loss affects more than 27 million people in mainland China. It would be helpful to develop a portable, self-testing audiometer for the timely detection of hearing loss so that the optimal clinical therapeutic schedule can be determined. The objective of this study was to develop a software-based hearing self-testing system. The software-based self-testing system consisted of a notebook computer, an external sound card, and a pair of 10-Ω insert earphones. The system could be used by individuals to test their own hearing thresholds in an interactive manner using software. The reliability and validity of the system at octave frequencies of 0.25 to 8.0 kHz were analyzed in three series of experiments. Thirty-seven normal-hearing participants (74 ears) were enrolled in experiment 1. Forty individuals (80 ears) with sensorineural hearing loss (SNHL) participated in experiment 2. Thirteen normal-hearing participants (26 ears) and 37 participants (74 ears) with SNHL were enrolled in experiment 3. Each participant was enrolled in only one of the three experiments. In all experiments, pure-tone audiometry in a sound insulation room (standard test) was regarded as the gold standard. SPSS for Windows, version 17.0, was used for statistical analysis. The paired t-test was used to compare the hearing thresholds between the standard test and software-based self-testing (self-test) in experiments 1 and 2. In experiment 3 (main study), one-way analysis of variance and post hoc comparisons were used to compare the hearing thresholds among the standard test and two rounds of the self-test. Linear correlation analysis was carried out for the self-tests performed twice. The concordance between the standard test and the self-test was analyzed using the kappa method. p < 0.05 was considered statistically significant.
Experiments 1 and 2: The hearing thresholds determined by the two methods were not significantly different at frequencies of 250, 500, or 8000 Hz (p > 0.05) but were significantly different at frequencies of 1000, 2000, and 4000 Hz (p < 0.05), except for 1000 Hz in the right ear in experiment 2. Experiment 3: The hearing thresholds determined by the standard test and self-tests repeated twice were not significantly different at any frequency (p > 0.05). The overall sensitivity of the self-test method was 97.6%, and the specificity was 98.3%. The sensitivity was 97.6% and the specificity was 97% for the patients with SNHL. The self-test had significant concordance with the standard test (kappa value = 0.848, p < 0.001). This portable hearing self-testing system based on a notebook personal computer is a reliable and sensitive method for hearing threshold assessment and monitoring. American Academy of Audiology.
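The two key comparisons in this study, a paired t-test on per-ear thresholds and kappa agreement on the hearing-loss classification, can be sketched as follows. This is an illustrative sketch with made-up arrays; the study itself used SPSS 17.0, and the function names here are assumptions.

```python
import numpy as np

def paired_t_statistic(standard, self_test):
    """Paired t statistic for threshold differences (dB HL) between the
    standard audiometry test and the software self-test."""
    d = np.asarray(standard, float) - np.asarray(self_test, float)
    return d.mean() / (d.std(ddof=1) / np.sqrt(d.size))

def cohens_kappa(a, b):
    """Cohen's kappa agreement between two classifications of the same ears
    (e.g., hearing loss present / absent per ear)."""
    a, b = np.asarray(a), np.asarray(b)
    po = np.mean(a == b)                                          # observed agreement
    pe = sum(np.mean(a == c) * np.mean(b == c) for c in np.union1d(a, b))  # chance agreement
    return (po - pe) / (1.0 - pe)
```

Perfect agreement yields kappa = 1 and complete disagreement yields kappa = −1; the study's reported kappa of 0.848 falls in the range conventionally read as near-perfect concordance.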
1988-06-01
AMERICAN POWER JET COMPANY RIDGEFIELD, NJ FALLS CHURCH...The logic is applied to each reparable item in the system/equipment. When the components have been analyzed, an overall system/equipment analysis is...in the AMSDL as applicable to the referenced DIDs of interest. 5. Apply staff experience in logistics support analysis to assure that the intent of the
Second generation experiments in fault tolerant software
NASA Technical Reports Server (NTRS)
Knight, J. C.
1987-01-01
The purpose of the Multi-Version Software (MVS) experiment is to obtain empirical measurements of the performance of multi-version systems. Twenty versions of a program were prepared under reasonably realistic development conditions from the same specifications. The overall structure of the testing environment for the MVS experiment and its status are described. A preliminary version of the control system is described that was implemented for the MVS experiment to allow the experimenter to have control over the details of the testing. The results of an empirical study of error detection using self-checks are also presented. The analysis of the checks revealed that there are great differences in the ability of individual programmers to design effective checks.
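At run time, multi-version software depends on adjudicating the outputs of the independently developed versions; the standard mechanism is a majority voter. The sketch below is illustrative only and is not the experiment's actual control system; the class and function names are assumptions.

```python
from collections import Counter

class NoMajorityError(Exception):
    """Raised when the version outputs contain no strict majority value."""

def majority_vote(outputs):
    """Return the value produced by a strict majority of the N versions.

    With independent failures, a wrong answer wins the vote only if more
    than half the versions fail identically on the same input; the cited
    experiment studied how realistic that independence assumption is.
    """
    value, count = Counter(outputs).most_common(1)[0]
    if count * 2 > len(outputs):
        return value
    raise NoMajorityError(f"no strict majority among {len(outputs)} versions")
```

With twenty versions, as in the experiment, the voter needs at least eleven agreeing outputs to adjudicate.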
Lee, Hyunyoung; Cheon, Byungsik; Hwang, Minho; Kang, Donghoon; Kwon, Dong-Soo
2018-02-01
In robotic surgical systems, commercial master devices have limitations owing to insufficient workspace and lack of intuitiveness. To overcome these limitations, a remote-center-of-motion (RCM) master manipulator was proposed. The feasibility of the proposed RCM structure was evaluated through kinematic analysis using a conventional serial structure. Two performance comparison experiments (peg transfer task and objective transfer task) were conducted for the developed master and Phantom Omni. The kinematic analysis results showed that compared with the serial structure, the proposed RCM structure has better performance in terms of design efficiency (19%) and workspace quality (59.08%). Further, in comparison with Phantom Omni, the developed master significantly increased task efficiency and significantly decreased workload in both experiments. The comparatively better performance in terms of intuitiveness, design efficiency, and operability of the proposed master for a robotic system for minimally invasive surgery was confirmed through kinematic and experimental analysis. Copyright © 2017 John Wiley & Sons, Ltd.
STS-107 Microgravity Environment Summary Report
NASA Technical Reports Server (NTRS)
Jules, Kenol; Hrovat, Kenneth; Kelly, Eric; Reckhart, Timothy
2005-01-01
This summary report presents the results of the processed acceleration data measured aboard the Columbia orbiter during the STS-107 microgravity mission from January 16 to February 1, 2003. Two accelerometer systems were used to measure the acceleration levels due to vehicle and science operations activities that took place during the 16-day mission. Due to a lack of precise timeline information regarding some payloads' operations, not all of the activities were analyzed for this report. However, a general characterization of the microgravity environment of the Columbia Space Shuttle during the 16-day mission is presented, followed by a more specific characterization of the environment for some designated payloads during their operations. Some specific quasi-steady and vibratory microgravity environment characterization analyses were performed for the following payloads: Structure of Flame Balls at Low Lewis-number-2, Laminar Soot Processes-2, Mechanics of Granular Materials-3 and Water Mist Fire-Suppression Experiment. The Physical Science Division of the National Aeronautics and Space Administration sponsors the Orbital Acceleration Research Experiment and the Space Acceleration Measurement System for Free Flyer to support microgravity science experiments, which require microgravity acceleration measurements. On January 16, 2003, both the Orbital Acceleration Research Experiment and the Space Acceleration Measurement System for Free Flyer accelerometer systems were launched on the Columbia Space Transportation System-107 from the Kennedy Space Center. The Orbital Acceleration Research Experiment supported science experiments requiring quasi-steady acceleration measurements, while the Space Acceleration Measurement System for Free Flyer unit supported experiments requiring vibratory acceleration measurement.
The Columbia reduced gravity environment analysis presented in this report uses acceleration data collected by these two sets of accelerometer systems: The Orbital Acceleration Research Experiment is a low frequency sensor, which measures acceleration up to 1 Hz, but the 1 Hz acceleration data is trimmean filtered to yield much lower frequency acceleration data up to 0.01 Hz. This filtered data can be mapped to other locations for characterizing the quasi-steady environment for payloads and the vehicle. The Space Acceleration Measurement System for Free Flyer measures vibratory acceleration in the range of 0.01 to 200 Hz at multiple measurement locations. The vibratory acceleration data measured by this system is used to assess the local vibratory environment for payloads as well as to measure the disturbances caused by vehicle systems, crew exercise devices, and payload operations. This summary report presents analysis of selected quasi-steady and vibratory activities measured by these two accelerometers during the Columbia 16-day microgravity mission from January 16 to February 1, 2003.
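The trimmean filtering mentioned above discards the most extreme samples in each window before averaging, so vibratory transients are rejected and the quasi-steady component survives. A minimal sketch follows; the window length and trim fraction are illustrative assumptions, not the actual OARE processing parameters.

```python
import numpy as np

def trimmean_filter(signal, window, trim_fraction=0.25):
    """Trimmed mean over non-overlapping windows of `window` samples.

    Within each window the samples are sorted, a fraction of the most
    extreme values is dropped from each end, and the remainder is
    averaged, yielding one low-rate output sample per window.
    """
    signal = np.asarray(signal, float)
    n_win = signal.size // window
    k = int(window * trim_fraction)          # samples to drop at each end
    out = np.empty(n_win)
    for i in range(n_win):
        chunk = np.sort(signal[i * window:(i + 1) * window])
        out[i] = chunk[k:window - k].mean()  # mean of the central samples
    return out
```

A constant signal passes through unchanged, while an isolated spike in a window is sorted to an end of the chunk and trimmed away, which is exactly the behavior needed to extract a quasi-steady level from vibratory data.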
Analyses and forecasts of a tornadic supercell outbreak using a 3DVAR system ensemble
NASA Astrophysics Data System (ADS)
Zhuang, Zhaorong; Yussouf, Nusrat; Gao, Jidong
2016-05-01
As part of NOAA's "Warn-On-Forecast" initiative, a convective-scale data assimilation and prediction system was developed using the WRF-ARW model and ARPS 3DVAR data assimilation technique. The system was then evaluated using retrospective short-range ensemble analyses and probabilistic forecasts of the tornadic supercell outbreak event that occurred on 24 May 2011 in Oklahoma, USA. A 36-member multi-physics ensemble system provided the initial and boundary conditions for a 3-km convective-scale ensemble system. Radial velocity and reflectivity observations from four WSR-88Ds were assimilated into the ensemble using the ARPS 3DVAR technique. Five data assimilation and forecast experiments were conducted to evaluate the sensitivity of the system to data assimilation frequencies, in-cloud temperature adjustment schemes, and fixed- and mixed-microphysics ensembles. The results indicated that the experiment with 5-min assimilation frequency quickly built up the storm and produced a more accurate analysis compared with the 10-min assimilation frequency experiment. The predicted vertical vorticity from the moist-adiabatic in-cloud temperature adjustment scheme was larger in magnitude than that from the latent heat scheme. Cycled data assimilation yielded good forecasts, where the ensemble probability of high vertical vorticity matched reasonably well with the observed tornado damage path. Overall, the results of the study suggest that the 3DVAR analysis and forecast system can provide reasonable forecasts of tornadic supercell storms.
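The ensemble probability product described above, e.g. the probability of high vertical vorticity that was compared against the tornado damage path, is typically estimated as the fraction of members exceeding a threshold at each grid point. A minimal sketch, with an illustrative threshold rather than the study's value:

```python
import numpy as np

def exceedance_probability(member_fields, threshold):
    """Probability (0-1) that a forecast field exceeds `threshold`,
    estimated as the fraction of ensemble members exceeding it at each
    grid point. `member_fields` has shape (n_members, ...grid...)."""
    member_fields = np.asarray(member_fields, float)
    return np.mean(member_fields > threshold, axis=0)
```

For a 36-member ensemble such as the one in this study, each grid-point probability is therefore quantized in steps of 1/36.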
2015-12-01
Army Training and Doctrine Command (TRADOC) Analysis Center (TRAC) to the Philippines for Operation ENDURING FREEDOM – Philippines (OEF-P). PROJECT...management, doctrine and force development, training management, system testing, system acquisition, decision analysis, and resource management, as...influenced procurement decisions and reshaped Army doctrine. Additionally, CAA itself has benefited in numerous ways. Combat experience provides analysts
High-Speed Automatic Microscopy for Real Time Tracks Reconstruction in Nuclear Emulsion
NASA Astrophysics Data System (ADS)
D'Ambrosio, N.
2006-06-01
The Oscillation Project with Emulsion-tRacking Apparatus (OPERA) experiment will use a massive nuclear emulsion detector to search for νμ → ντ oscillation by identifying τ leptons through the direct detection of their decay topology. The feasibility of experiments using a large mass emulsion detector is linked to the impressive progress under way in the development of automatic emulsion analysis. A new generation of scanning systems requires the development of fast automatic microscopes for emulsion scanning and image analysis to reconstruct tracks of elementary particles. The paper presents the European Scanning System (ESS) developed in the framework of the OPERA collaboration.
New results from FRECOPA analysis
NASA Technical Reports Server (NTRS)
Durin, Christian
1993-01-01
New results from the ongoing analysis of the FRECOPA's (FREnch COoperative PAssive payload) system hardware are discussed. FRECOPA (AO138) was one of the 57 experiments flown on the LDEF satellite. The experiment was located on the trailing edge (Tray B3) and was exposed to UV radiation (11,100 equivalent sun hours), approximately equal to 34,000 thermal cycles, higher vacuum levels than the leading edge, a low atomic oxygen flux, and minor doses of protons and electrons. Due to LDEF's extended mission (5.8 years), CNES decided to set up a team to analyze the FRECOPA system. Initial results were presented at the First Post-Retrieval Conference, June, 1991. The results obtained since then are summarized.
NASA Technical Reports Server (NTRS)
Filpus, J. W.; Hawley, M. C.
1984-01-01
A theoretical investigation of the effect of the microscopic energetics of the recombination reaction on the performance of a microwave-plasma electrothermal propulsion system is described, and the results of the analysis are presented. A series of experiments to test the concept is described and analyzed by comparison with a computer model of the recombination reaction. It is concluded that internal energy considerations are not likely to significantly affect the design of a microwave-plasma electrothermal rocket. The experimental results indicate that the microwave power is far higher than the capacity of the gas to absorb it; the cooling needed to control the energy dominates the experimental results.
An Instructional Systems Technology Model for Institutional Change.
ERIC Educational Resources Information Center
Dudgeon, Paul J.
A program based on instructional systems technology was developed at Canadore College as a means of devising the optimal learning experience for each individual student. The systems approach is used to solve educational problems through a process of analysis, synthesis, modeling, and simulation, based on the LOGOS (Language for Optimizing…
NASA Astrophysics Data System (ADS)
Ström, Petter; Petersson, Per; Rubel, Marek; Possnert, Göran
2016-10-01
A dedicated detector system for heavy ion elastic recoil detection analysis at the Tandem Laboratory of Uppsala University is presented. Benefits of combining a time-of-flight measurement with a segmented-anode gas ionization chamber are demonstrated. The capability of ion species identification is improved with the present system, compared to that obtained when using a single solid state silicon detector for the full ion energy signal. The system enables separation of light elements, up to neon, based on atomic number, while signals from heavy elements such as molybdenum and tungsten are separated based on mass, to a sample depth on the order of 1 μm. The performance of the system is discussed and a selection of material analysis applications is given. Plasma-facing materials from fusion experiments, in particular metal mirrors, are used as a main example for the discussion. Marker experiments using nitrogen-15 or oxygen-18 are specific cases for which the described improved species separation and sensitivity are required. Resilience to radiation damage and significantly improved energy resolution for heavy elements at low energies are additional benefits of the gas ionization chamber over a solid state detector based system.
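The reason a time-of-flight leg plus an energy detector identifies the recoil mass is elementary kinematics: nonrelativistically E = mv²/2 and v = L/t, so m = 2E(t/L)². A minimal sketch in SI units, with illustrative values rather than the instrument's actual flight path:

```python
def recoil_mass_kg(energy_joule, tof_seconds, flight_path_m):
    """Ion mass from measured kinetic energy and time of flight,
    using the nonrelativistic relation m = 2 E (t / L)^2."""
    velocity = flight_path_m / tof_seconds   # v = L / t
    return 2.0 * energy_joule / velocity ** 2
```

Because mass scales with t², two species of equal energy but different mass arrive with distinctly different flight times, which is what separates molybdenum from tungsten signals in the combined ToF-E spectrum.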
Atmospheric lidar multi-user instrument system definition study
NASA Technical Reports Server (NTRS)
Greco, R. V. (Editor)
1980-01-01
A spaceborne lidar system for atmospheric studies was defined. The primary input was the Science Objectives Experiment Description and Evolutionary Flow Document. The first task of the study was to perform an experiment evolutionary analysis of the SEED. The second task was the system definition effort of the instrument system. The third task was the generation of a program plan for the hardware phase. The fourth task was the supporting studies, which included a Shuttle deficiency analysis, a preliminary safety hazard analysis, the identification of long lead items, and the development studies required. As a result of the study, an evolutionary Lidar Multi-User Instrument System (MUIS) was defined. The MUIS occupies a full Spacelab pallet and has a weight of 1300 kg. The Lidar MUIS laser provides a 2 joule frequency-doubled Nd:YAG laser that can also pump a tunable dye laser over a wide frequency range and bandwidth. The MUIS includes a 1.25 meter diameter aperture Cassegrain receiver, with a movable secondary mirror to provide precise alignment with the laser. The receiver can transmit the return signal to three single and multiple photomultiplier tube detectors by use of a rotating fold mirror. It is concluded that the Lidar MUIS should proceed to program implementation.
A framework of knowledge creation processes in participatory simulation of hospital work systems.
Andersen, Simone Nyholm; Broberg, Ole
2017-04-01
Participatory simulation (PS) is a method to involve workers in simulating and designing their own future work system. Existing PS studies have focused on analysing the outcome, and minimal attention has been devoted to the process of creating this outcome. In order to study this process, we suggest applying a knowledge creation perspective. The aim of this study was to develop a framework describing the process of how ergonomics knowledge is created in PS. Video recordings from three projects applying PS of hospital work systems constituted the foundation of process mining analysis. The analysis resulted in a framework revealing the sources of ergonomics knowledge creation as sequential relationships between the activities of simulation participants sharing work experiences; experimenting with scenarios; and reflecting on ergonomics consequences. We argue that this framework reveals the hidden steps of PS that are essential when planning and facilitating PS that aims at designing work systems. Practitioner Summary: When facilitating participatory simulation (PS) in work system design, achieving an understanding of the PS process is essential. By applying a knowledge creation perspective and process mining, we investigated the knowledge-creating activities constituting the PS process. The analysis resulted in a framework of the knowledge-creating process in PS.
HEP Data Grid Applications in Korea
NASA Astrophysics Data System (ADS)
Cho, Kihyeon; Oh, Youngdo; Son, Dongchul; Kim, Bockjoo; Lee, Sangsan
2003-04-01
We will introduce the national HEP Data Grid applications in Korea. Through a five-year HEP Data Grid project (2002-2006) for the CMS, AMS, CDF, PHENIX, K2K and Belle experiments in Korea, the Center for High Energy Physics, Kyungpook National University in Korea will construct a 1,000-PC cluster and a related storage system, including a 1,200 TByte RAID disk system. The project includes a master plan to construct an Asia Regional Data Center by 2006 for the CMS and AMS experiments and a DCAF (DeCentralized Analysis Farm) for the CDF experiment. During the first year of the project, we have constructed a cluster of around 200 CPUs with 50 TBytes of storage. We will present our first year's experience with the software and hardware applications for the HEP Data Grid on the EDG and SAM Grid testbeds.
LDEF: 69 Months in Space. Third Post-Retrieval Symposium, part 3
NASA Technical Reports Server (NTRS)
Levine, Arlene S. (Editor)
1995-01-01
This volume is a compilation of papers presented at the Third Long Duration Exposure Facility (LDEF) Post-Retrieval Symposium. The papers represent the data analysis of the 57 experiments flown on the LDEF. The experiments include materials, coatings, thermal systems, power and propulsion, science (cosmic ray, interstellar gas, heavy ions, micrometeoroid, etc.), electronics, optics, and life science. In addition, papers on preliminary data analysis of EURECA, EOIM-3, and other spacecraft are included.
An Intelligent Automation Platform for Rapid Bioprocess Design.
Wu, Tianyi; Zhou, Yuhong
2014-08-01
Bioprocess development is very labor intensive, requiring many experiments to characterize each unit operation in the process sequence to achieve product safety and process efficiency. Recent advances in microscale biochemical engineering have led to automated experimentation. A process design workflow is implemented sequentially in which (1) a liquid-handling system performs high-throughput wet lab experiments, (2) standalone analysis devices detect the data, and (3) specific software is used for data analysis and experiment design given the user's inputs. We report an intelligent automation platform that integrates these three activities to enhance the efficiency of such a workflow. A multiagent intelligent architecture has been developed incorporating agent communication to perform the tasks automatically. The key contribution of this work is the automation of data analysis and experiment design and also the ability to generate scripts to run the experiments automatically, allowing the elimination of human involvement. A first-generation prototype has been established and demonstrated through lysozyme precipitation process design. All procedures in the case study have been fully automated through an intelligent automation platform. The realization of automated data analysis and experiment design, and automated script programming for experimental procedures has the potential to increase lab productivity. © 2013 Society for Laboratory Automation and Screening.
Thermal control surfaces experiment: Initial flight data analysis
NASA Technical Reports Server (NTRS)
Wilkes, Donald R.; Hummer, Leigh L.
1991-01-01
The behavior of materials in the space environment continues to be a limiting technology for spacecraft and experiments. The Thermal Control Surfaces Experiment (TCSE) aboard the Long Duration Exposure Facility (LDEF) is the most comprehensive experiment flown to study the effects of the space environment on thermal control surfaces. Selected thermal control surfaces were exposed to the LDEF orbital environment, and the effects of this exposure were measured. The TCSE combined in-space orbital measurements with pre- and post-flight analyses of flight materials to determine the effects of long-term space exposure. The TCSE experiment objective, method, and measurements are described along with the results of the initial materials analysis. The TCSE flight system and its excellent performance on the LDEF mission are described. A few operational anomalies were encountered and are discussed.
Research and implementation on improving I/O performance of streaming media storage system
NASA Astrophysics Data System (ADS)
Lu, Zheng-wu; Wang, Yu-de; Jiang, Guo-song
2008-12-01
In this paper, we study the particular requirements of streaming media servers and propose a solution to improve the I/O performance of a RAID storage system that is suitable for streaming media applications. A streaming media storage subsystem includes the I/O interfaces, RAID arrays, I/O scheduling, and device drivers; the solution is implemented on top of the storage subsystem's I/O interface. The storage subsystem is the performance bottleneck of a streaming media system, and the I/O interface directly affects the subsystem's performance. Theoretical analysis indicates that a 64 KB block size is most appropriate for streaming media applications, and our detailed experiments verify that the proper block size really is 64 KB, in accordance with the analysis. The experimental results also show that by using a DMA controller, efficient memory management technology, and a mailbox interface design mechanism, the streaming media storage system achieves high-speed data throughput.
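The sequential, fixed-block read pattern behind the 64 KB block-size analysis can be illustrated with a minimal sketch (the file path and sizes here are hypothetical stand-ins for a media segment, not part of the original system):

```python
import os
import tempfile
import time

BLOCK_SIZE = 64 * 1024  # 64 KB, the block size the analysis identifies as optimal

def sequential_read(path, block_size=BLOCK_SIZE):
    """Read a file sequentially in fixed-size blocks; return (bytes read, seconds)."""
    total = 0
    start = time.perf_counter()
    with open(path, "rb", buffering=0) as f:  # unbuffered, so block size matters
        while True:
            chunk = f.read(block_size)
            if not chunk:
                break
            total += len(chunk)
    return total, time.perf_counter() - start

# Demo on a temporary 1 MB file standing in for a media segment.
with tempfile.NamedTemporaryFile(delete=False) as tmp:
    tmp.write(os.urandom(1024 * 1024))
    path = tmp.name
nbytes, elapsed = sequential_read(path)
print(nbytes)  # 1048576
os.unlink(path)
```

Timing such reads while sweeping `block_size` is how one would reproduce the block-size comparison experimentally.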
All Source Analysis System (ASAS): Migration from VAX to Alpha AXP computer systems
NASA Technical Reports Server (NTRS)
Sjoholm-Sierchio, Michael J.; Friedman, Steven Z. (Editor)
1994-01-01
The Jet Propulsion Laboratory's (JPL's) experience migrating existing VAX applications to Digital Equipment Corporation's new Alpha AXP processor is covered, as is the rapid development approach used during the 10-month period required to migrate the All Source Analysis System (ASAS), 1.5 million lines of FORTRAN, C, and Ada code. ASAS, an automated tactical intelligence system, was developed by the Jet Propulsion Laboratory for the U.S. Army. Other benefits achieved as a result of the significant performance improvements provided by the Alpha AXP platform are also described.
NASA Astrophysics Data System (ADS)
Nagaoka, Kenji; Yano, Hajime; Yoshimitsu, Tetsuo; Yoshida, Kazuya; Kubota, Takashi; Adachi, Tadashi; Kurisu, Masamitsu; Yatsunami, Hiroyuki; Kuroda, Yoji
This presentation introduces the analysis and evaluation of a deployment mechanism for a tiny rover through ZARM drop tower experiments. The mechanism is installed on the MINERVA-II2 system in the Hayabusa-2 project performed by JAXA. The MINERVA-II2 system includes a small exploration rover that will be released from the Hayabusa-2 spacecraft to the asteroid surface. After the rover lands, it will move over the surface and conduct scientific measurements. To achieve such a challenging mission, the deployment mechanism is one of the most significant components; in particular, the rover's landing velocity relative to the asteroid surface must be controlled by a highly reliable mechanism. The MINERVA-II2 system therefore uses a simple, reliable deployment mechanism based on a metal spring, which keeps the rover's release velocity within a required value. Although performance evaluation and analysis are necessary before launch, it is difficult to test the deployment performance three-dimensionally on the ground. In the MINERVA-II2 project, with the cooperation of ZARM, DLR, and JAXA, we conducted microgravity experiments in a ZARM drop tower to examine the deployment performance in three-dimensional microgravity. During the experiments, the motion of the deployment mechanism and the rover was captured by an external camera mounted on the drop chamber, and after each drop we analyzed the rover's release velocity by image processing of the camera data. The experimental results confirmed that the deployment mechanism is feasible and reliable for controlling the rover's release velocity. In addition, we analyzed the mechanical friction resistance of the mechanism from a theoretical viewpoint. These results contribute to the design of the spring stiffness and feed back into the development of the MINERVA-II2 flight model.
Finally, the drop tower experiments were accomplished under the DLR-JAXA agreement on the Hayabusa-2 project, using a chamber developed by the Hayabusa-2 project. During the experiments we received technical and operational support from ZARM. We sincerely express our gratitude to ZARM, DLR, and JAXA.
Apollo experience report: S-band system signal design and analysis
NASA Technical Reports Server (NTRS)
Rosenberg, H. R. (Editor)
1972-01-01
A description is given of the Apollo communications-system engineering-analysis effort that ensured the adequacy, performance, and interface compatibility of the unified S-band system elements for a successful lunar-landing mission. The evolution and conceptual design of the unified S-band system are briefly reviewed from a historical viewpoint. A comprehensive discussion of the unified S-band elements includes the salient design features of the system and serves as a basis for a better understanding of the design decisions and analyses. The significant design decisions concerning the Apollo communications-system signal design are discussed providing an insight into the role of systems analysis in arriving at the current configuration of the Apollo communications system. Analyses are presented concerning performance estimation (mathematical-model development through real-time mission support) and system deficiencies, modifications, and improvements.
Optical Performance Of The Gemini Carbon Dioxide Laser Fusion System
NASA Astrophysics Data System (ADS)
Viswanathan, V. K.; Hayden, J. J.; Liberman, I.
1980-11-01
The performance of the Gemini two-beam carbon dioxide laser fusion system was recently upgraded by installation of optical components with improved quality in the final amplifier. A theoretical analysis was conducted in conjunction with measurements of the new performance. The analysis, experimental procedures, and results obtained are reported and compared. Good agreement was found, within the uncertainties of the analysis and the inaccuracies of the experiments. The focal spot Strehl ratio was between 0.24 and 0.3 for both beams.
A Real Time System for Multi-Sensor Image Analysis through Pyramidal Segmentation
1992-01-30
L. Rudin, S. Osher, G. Koepfler, J. M. Morel. Experiments with reconnaissance photography, multi-sensor satellite imagery, and medical CT and MRI multi-band data have shown great practical potential.
NASA Astrophysics Data System (ADS)
Stoker, C. R.; Lemke, L. G.; Cannon, H.; Glass, B.; Dunagan, S.; Zavaleta, J.; Miller, D.; Gomez-Elvira, J.
2006-03-01
The Mars Analog Research and Technology (MARTE) experiment has developed an automated drilling system on a simulated Mars lander platform including drilling, sample handling, core analysis and down-hole instruments relevant to searching for life in the Martian subsurface.
The Costs of Experience Corps[R] in Public Schools
ERIC Educational Resources Information Center
Frick, Kevin D.; McGill, Sylvia; Tan, Erwin J.; Rebok, George W.; Carson, Michelle C.; Tanner, Elizabeth K.; Fried, Linda P.
2012-01-01
Objective: To describe the annual operational costs of a mature Experience Corps[R] program in elementary schools in the Baltimore City Public School System. Methods: Systematic records of expenditures kept by the community partner, Greater Homewood Community Corporation, to be reported to funders were made available for analysis. Expenditures…
Race and Assessment Practice in South Africa: Understanding Black Academic Experience
ERIC Educational Resources Information Center
Jawitz, Jeff
2012-01-01
Despite efforts to transform the racialised system of higher education in South Africa inherited from apartheid, there has been little research published that interrogates the relationship between race and the experience of academic staff within the South African higher education environment. Drawing on critical discourse analysis and critical…
LDEF: 69 Months in Space. First Post-Retrieval Symposium, part 2
NASA Technical Reports Server (NTRS)
Levine, Arlene S. (Editor)
1992-01-01
A compilation of papers from the symposium is presented. The preliminary data analysis is presented of the 57 experiments flown on the LDEF. The experiments include materials, coatings, thermal systems, power and propulsion, science (cosmic ray, interstellar gas, heavy ions, and micrometeoroid), electronics, optics, and life science.
Observing System Evaluations Using GODAE Systems
2009-09-01
Approved for public release; distribution is unlimited. Global ocean forecast systems, developed under the Global Ocean Data Assimilation Experiment (GODAE), are a powerful means of assessing the impact of different components of the Global Ocean Observing System (GOOS). Using a range of analysis tools and approaches, GODAE systems are useful for quantifying the…
NASA Astrophysics Data System (ADS)
Sutton, M. A.; Gilat, A.; Seidt, J.; Rajan, S.; Kidane, A.
2018-01-01
The very early stages of high rate tensile loading are important when attempting to characterize the response of materials during the transient loading time. To improve understanding of the conditions imposed on the specimen during the transient stage, a series of high rate loading experiments are performed using a Kolsky tensile bar system. Specimen forces and velocities during the high rate loading experiment are obtained by performing a thorough method of characteristics analysis of the system employed in the experiments. The in-situ full-field specimen displacements, velocities and accelerations during the loading process are quantified using modern ultra-high-speed imaging systems to provide detailed measurements of specimen response, with emphasis on the earliest stages of loading. Detailed analysis of the image-based measurements confirms that conditions are nominally consistent with those necessary for use of the one-dimensional wave equation within the relatively thin, dog-bone shaped tensile specimen. Specifically, measurements and use of the one-dimensional wave equation show clearly that the specimen has low inertial stresses in comparison to the applied transmitted force. Though the accelerations of the specimen continue for up to 50 μs, measurements show that the specimen is essentially in force equilibrium beginning a few microseconds after initial loading. These local measurements contrast with predictions based on comparison of the wave-based incident force measurements, which suggest that equilibrium occurs much later, on the order of 40-50 μs.
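The force-equilibrium check described above follows from the classical one-dimensional wave analysis of a Kolsky bar: the incident-side force is proportional to the sum of incident and reflected strains, the transmission-side force to the transmitted strain. A minimal sketch (bar modulus, area, and the synthetic strain signals are illustrative assumptions, not the paper's data):

```python
import numpy as np

# Classical 1-D wave relations for a Kolsky (split-Hopkinson) tensile bar.
# Hypothetical bar properties: E in Pa, A in m^2 (steel bar, ~10 mm diameter).
E, A = 200e9, 7.85e-5

def bar_forces(eps_i, eps_r, eps_t):
    """Forces on the two specimen faces from incident/reflected/transmitted strains."""
    f_front = E * A * (eps_i + eps_r)  # incident-bar side
    f_back = E * A * eps_t             # transmission-bar side
    return f_front, f_back

def equilibrium_ratio(f_front, f_back):
    """Relative force imbalance; near zero means the specimen is in equilibrium."""
    return np.abs(f_front - f_back) / (0.5 * np.abs(f_front + f_back))

# Synthetic signals: during the transient the faces disagree, then converge.
eps_i = np.full(5, 1.0e-3)
eps_r = np.array([-0.9e-3, -0.7e-3, -0.55e-3, -0.51e-3, -0.50e-3])
eps_t = np.full(5, 0.5e-3)
f1, f2 = bar_forces(eps_i, eps_r, eps_t)
print(equilibrium_ratio(f1, f2)[-1] < 0.05)  # True once the transient has died out
```

Applying the same ratio to measured strain records is the standard way to decide when the wave-based force estimates can be trusted.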
NASA Technical Reports Server (NTRS)
Fay, Stanley; Gates, Stephen; Henderson, Timothy; Sackett, Lester; Kirchwey, Kim; Stoddard, Isaac; Storch, Joel
1988-01-01
The second Control Of Flexible Structures Flight Experiment (COFS-2) includes a long mast as in the first flight experiment, but with the Langley 15-m hoop column antenna attached via a gimbal system to the top of the mast. The mast is to be mounted in the Space Shuttle cargo bay. The servo-driven gimbal system could be used to point the antenna relative to the mast. The dynamic interaction of the Shuttle Orbiter/COFS-2 system with the Orbiter on-orbit Flight Control System (FCS) and the gimbal pointing control system has been studied using analysis and simulation. The Orbiter pointing requirements have been assessed for their impact on allowable free drift time for COFS experiments. Three fixed antenna configurations were investigated. Also simulated was Orbiter attitude control behavior with active vernier jets during antenna slewing. The effect of experiment mast dampers was included. Control system stability and performance and loads on various portions of the COFS-2 structure were investigated. The study indicates possible undesirable interaction between the Orbiter FCS and the flexible, articulated COFS-2 mast/antenna system, even when restricted to vernier reaction jets.
Providing Nuclear Criticality Safety Analysis Education through Benchmark Experiment Evaluation
DOE Office of Scientific and Technical Information (OSTI.GOV)
John D. Bess; J. Blair Briggs; David W. Nigg
2009-11-01
One of the challenges that today's new workforce of nuclear criticality safety engineers face is the opportunity to provide assessment of nuclear systems and establish safety guidelines without having received significant experience or hands-on training prior to graduation. Participation in the International Criticality Safety Benchmark Evaluation Project (ICSBEP) and/or the International Reactor Physics Experiment Evaluation Project (IRPhEP) provides students and young professionals the opportunity to gain experience and enhance critical engineering skills.
Fighting the Whole System: Dissociative Identity Disorder, Labeling Theory, and Iatrogenic Doubting.
Floris, Jessica; McPherson, Susan
2015-01-01
This research examines how individuals diagnosed with dissociative identity disorder construe their experiences of being labeled with a contested diagnosis. Semistructured interviews were conducted in the United Kingdom with 5 women and 2 men diagnosed with dissociative identity disorder. A framework analysis was conducted. The analysis identified 2 overarching themes: diagnosis cross-examined and navigating care systems. The diagnosis appeared to be continually assessed by participants for its fit with symptoms, and the doubt among professionals seemed to be unhelpfully reflected in participants' attempts to understand and come to terms with their experiences. The findings are considered in light of labeling theory, the iatrogenic effects of professional doubt, and current debates concerning the reliability and validity of psychiatric diagnostic systems that have been reinvigorated by the publication of the Diagnostic and Statistical Manual of Mental Disorders, Fifth Edition.
Teleoperation experiments with a Utah/MIT hand and a VPL DataGlove
NASA Technical Reports Server (NTRS)
Clark, D.; Demmel, J.; Hong, J.; Lafferriere, Gerardo; Salkind, L.; Tan, X.
1989-01-01
A teleoperation system capable of controlling a Utah/MIT Dextrous Hand using a VPL DataGlove as a master is presented. Additionally the system is capable of running the dextrous hand in robotic (autonomous) mode as new programs are developed. The software and hardware architecture used is presented and the experiments performed are described. The communication and calibration issues involved are analyzed and applications to the analysis and development of automated dextrous manipulations are investigated.
Experiments in Knowledge Refinement for a Large Rule-Based System
1993-08-01
… empirical analysis to refine expert system knowledge bases (Artificial Intelligence, 22:23-48, 1984) … knowledge refinement for classification systems (Artificial Intelligence, 35:197-226, 1988) … Overall, we believe that it will be possible to build a heuristic system …
Ovejero, M C; Pérez Vega-Leal, A; Gallardo, M I; Espino, J M; Selva, A; Cortés-Giraldo, M A; Arráns, R
2017-02-01
The aim of this work is to present a new data acquisition, control, and analysis software system written in LabVIEW. The system has been designed to obtain the dosimetry of a silicon strip detector in polyethylene. It allows full automation of the experiments and data analysis required for the dosimetric characterization of silicon detectors, and it is a useful tool that can be applied in daily routine checks of a beam accelerator.
An Interactive System For Fourier Analysis Of Artichoke Flower Shape.
NASA Astrophysics Data System (ADS)
Impedovo, Sebastiano; Fanelli, Anna M.; Ligouras, Panagiotis
1984-06-01
In this paper we present an interactive system that allows Fourier analysis of the artichoke flower-head profile. The system consists of a DEC PDP 11/34 computer with both a track-following device and a Tektronix 4010/1 graphic and alphanumeric display on-line. Experiments have been carried out taking into account several different parental types of artichoke flower-head samples. It is shown that a narrow band of only eight harmonics is sufficient to classify different artichoke flower shapes.
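The eight-harmonic descriptor idea can be sketched with a radial Fourier decomposition of a closed profile (the synthetic "flower-head" below is an invented example, not the paper's data):

```python
import numpy as np

def fourier_descriptors(radii, n_harmonics=8):
    """Magnitudes of the first n_harmonics Fourier coefficients of a closed
    radial profile r(theta), normalized by the mean radius (harmonic 0)."""
    mags = np.abs(np.fft.rfft(radii))
    return mags[1:n_harmonics + 1] / mags[0]

# Synthetic flower-head profile: mean radius 1 with a dominant 5-fold lobe.
theta = np.linspace(0, 2 * np.pi, 256, endpoint=False)
radii = 1.0 + 0.2 * np.cos(5 * theta)
desc = fourier_descriptors(radii)
print(np.argmax(desc) + 1)  # 5: the strongest harmonic matches the lobe count
```

An eight-element vector like `desc` is the kind of compact, rotation-robust shape signature the abstract says suffices to separate flower-head types.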
Laser Safety and Hazardous Analysis for the ARES (Big Sky) Laser System
DOE Office of Scientific and Technical Information (OSTI.GOV)
AUGUSTONI, ARNOLD L.
A laser safety and hazard analysis was performed for the ARES laser system based on the 2000 version of the American National Standards Institute's (ANSI) Standard Z136.1 for Safe Use of Lasers and the 2000 version of ANSI Standard Z136.6 for Safe Use of Lasers Outdoors. The ARES laser system is a van/truck-based mobile platform that is used to perform laser interaction experiments and tests at various national test sites.
The weakest t-norm based intuitionistic fuzzy fault-tree analysis to evaluate system reliability.
Kumar, Mohit; Yadav, Shiv Prasad
2012-07-01
In this paper, a new approach to intuitionistic fuzzy fault-tree analysis is proposed to evaluate system reliability and to find the most critical system component affecting it. A weakest t-norm based intuitionistic fuzzy fault-tree analysis is presented that calculates the fault interval of system components by integrating experts' knowledge and experience, expressed as the possibility of failure of bottom events. It applies fault-tree analysis, the α-cut of an intuitionistic fuzzy set, and Tω (the weakest t-norm) based arithmetic operations on triangular intuitionistic fuzzy sets to obtain the fault interval and reliability interval of the system. This paper also modifies Tanaka et al.'s fuzzy fault-tree definition. As numerical verification, a malfunction of the weapon system "automatic gun" is presented as an example, and the result of the proposed method is compared with existing reliability analysis approaches. Copyright © 2012 ISA. Published by Elsevier Ltd. All rights reserved.
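The weakest t-norm Tω (the drastic product) and its effect on fuzzy arithmetic can be sketched as follows. This is a minimal illustration of the general Tω-based arithmetic on triangular fuzzy numbers in (center, left spread, right spread) form, not the authors' code, and the numbers are invented:

```python
# Weakest t-norm (drastic product) and a Tw-based sum of triangular fuzzy numbers.

def t_weakest(a, b):
    """Drastic product: the smallest t-norm. Nonzero only when one argument is 1."""
    if a == 1.0:
        return b
    if b == 1.0:
        return a
    return 0.0

def tw_add(x, y):
    """Tw-based sum of triangular fuzzy numbers (center, left, right):
    centers add, but spreads take the maximum instead of summing, so
    uncertainty accumulates far more slowly than under min-norm arithmetic."""
    (m1, l1, r1), (m2, l2, r2) = x, y
    return (m1 + m2, max(l1, l2), max(r1, r2))

print(t_weakest(0.4, 0.7))                            # 0.0
print(tw_add((0.25, 0.05, 0.1), (0.25, 0.08, 0.04)))  # (0.5, 0.08, 0.1)
```

The slow spread growth is why Tω-based fault-tree arithmetic yields tighter fault intervals than conventional fuzzy arithmetic.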
A low-cost, computer-controlled robotic flower system for behavioral experiments.
Kuusela, Erno; Lämsä, Juho
2016-04-01
Human observations during behavioral studies are expensive, time-consuming, and error prone. For this reason, automation of experiments is highly desirable, as it reduces both the risk of human error and the workload. The robotic system we developed is simple and cheap to build and handles feeding and data collection automatically. The system was built mostly from off-the-shelf components and has a novel feeding mechanism that uses servos to perform refill operations. We used the robotic system in two separate behavioral studies with bumblebees (Bombus terrestris), both for training the bees and for collecting the experimental data. The system was reliable, with no flight in our studies failing due to a technical malfunction, and the recorded data were easy to use for further analysis. The software and the hardware design are open source. The development of cheap open-source prototyping platforms in recent years has opened up many possibilities in the design of experiments. Automation not only reduces workload but also potentially allows experimental designs never tried before, such as dynamic experiments in which the system responds to, for example, the animal's learning. We present a complete system with hardware and software that can be used as-is in various experiments requiring feeders and the collection of visitation data. Use of the system is not limited to any particular experimental setup or even species.
Statistical Analysis Tools for Learning in Engineering Laboratories.
ERIC Educational Resources Information Center
Maher, Carolyn A.
1990-01-01
Described are engineering programs that have used automated data acquisition systems to implement data collection and analyze experiments. Applications include a biochemical engineering laboratory, heat transfer performance, engineering materials testing, mechanical system reliability, statistical control laboratory, thermo-fluid laboratory, and a…
2011-01-01
Background: Orthopaedic research projects focusing on small displacements in a small measurement volume require a radiation-free, three-dimensional motion analysis system. A stereophotogrammetric motion analysis system can track wireless, small, lightweight markers attached to the objects, so disturbance of the measured objects by the marker tracking can be kept to a minimum. The purpose of this study was to develop and evaluate a compact, non-position-fixed motion analysis system configured for a small measurement volume and able to zoom while tracking small round flat markers with respect to a fiducial marker used for camera pose estimation. Methods: The system consisted of two web cameras and the fiducial marker placed in front of them. The markers to track were black circles on a white background. The algorithm to detect the centre of the projected circle on the image plane is described and applied. To evaluate the accuracy (mean measurement error) and precision (standard deviation of the measurement error) of the optical measurement system, two experiments were performed: 1) inter-marker distance measurement and 2) marker displacement measurement. Results: The first experiment, measuring 10 mm distances, showed a total accuracy of 0.0086 mm and a precision of ± 0.1002 mm. In the second experiment, translations from 0.5 mm to 5 mm were measured with a total accuracy of 0.0038 mm and a precision of ± 0.0461 mm, and rotations of 2.25° were measured with an overall accuracy of 0.058° and a precision of ± 0.172°. Conclusions: This description of a non-proprietary measurement device with very good accuracy and precision may open opportunities for new, cost-effective applications of stereophotogrammetric analysis in musculoskeletal research projects focusing on the kinematics of small displacements in a small measurement volume. PMID:21284867
Propellant Management in Microgravity- Further Analysis of an Experiment Flown on REXUS-14
NASA Astrophysics Data System (ADS)
Strobino, D.; Zumbrunen, E.; Putzu, R.; Pontelandolfo, P.
2015-09-01
This paper presents further analysis of an experiment named CAESAR (Capillarity-based Experiment for Spatial Advanced Research): a sounding rocket experiment carried out by students of hepia within the REXUS program. The authors launched a capillarity-based propellant management experiment on REXUS-14 to reliably confirm other ground-based experiments. In the framework of the present work, the authors compare the CAESAR experimental data with theoretical profiles provided in the literature. The objective of this flight was to place several Propellant Management Devices (PMD) in a microgravity environment and acquire images of the fluid distribution around them. The main element of the experiment, called a sponge, is a PMD for space vehicles, often used in satellites. This radial-panel-shaped device can be used at the bottom of a satellite tank to keep the propellant near the outlet. It is designed to work even if the vehicle undergoes small accelerations, for example during station-keeping maneuvers: the fluid is eccentric but stays on the sponge and near the outlet, so the injection system of the motor is continuously supplied with propellant. As previously published, the authors have created a buoyancy test bench and have designed another system based on magnetic levitation to perform the same experiment on Earth. These systems are easier to use and less expensive than a sounding rocket, a parabolic flight, or a drop tower (i.e., other means of obtaining microgravity), so they will be very useful for making progress in this particular domain of science and will allow universities with limited funding to work in this field. A previous publication showed, from a qualitative point of view, good agreement between experiments and theory; in this paper quantitative comparisons are given. With this demonstrated, hepia can validate its buoyancy test facility against real flight tests.
Digital controller design: Analysis of the annular suspension pointing system
NASA Technical Reports Server (NTRS)
Kuo, B. C.
1979-01-01
The Annular Suspension and Pointing System (ASPS) is a payload auxiliary pointing device of the Space Shuttle. The ASPS is comprised of two major subassemblies, a vernier and a coarse pointing subsystem. The experiment is attached to a mounting plate/rim combination which is suspended on magnetic bearing/actuators (MBA) strategically located about the rim. Fine pointing is achieved by gimballing the plate/rim within the MBA gaps. Control about the experiment line-of-sight is obtained through the use of a non-contacting rim drive and positioning torquer. All sensors used to close the servo loops on the vernier system are noncontacting elements. Therefore, the experiment is a free-flyer constrained only by the magnetic forces generated by the control loops.
Vinnakota, Kalyan C; Beard, Daniel A; Dash, Ranjan K
2009-01-01
Identification of a complex biochemical system model requires appropriate experimental data. Models constructed on the basis of data from the literature often contain parameters that are not identifiable with high sensitivity and therefore require additional experimental data to identify those parameters. Here we report the application of a local sensitivity analysis to design experiments that will improve the identifiability of previously unidentifiable model parameters in a model of mitochondrial oxidative phosphorylation and the tricarboxylic acid cycle. Experiments were designed based on measurable biochemical reactants in a dilute suspension of purified cardiac mitochondria with experimentally feasible perturbations to this system. The experimental perturbations and variables yielding the largest number of parameters above a 5% sensitivity level are presented and discussed.
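A local sensitivity analysis of the kind described can be sketched with normalized finite-difference sensitivities, S_i = (p_i / y) · dy/dp_i: parameters whose |S_i| stay below a threshold (here 5%) under a given perturbation are poorly identifiable from that measurement. The toy Michaelis-Menten rate law below is an invented stand-in for the model's flux expressions:

```python
import numpy as np

def local_sensitivity(model, params, rel_step=0.01):
    """Normalized local sensitivities |(p_i / y) * dy/dp_i| by central
    finite differences. 'model' maps a parameter vector to a scalar output."""
    y0 = model(params)
    sens = []
    for i, p in enumerate(params):
        hi, lo = params.copy(), params.copy()
        hi[i] = p * (1 + rel_step)
        lo[i] = p * (1 - rel_step)
        dy_dp = (model(hi) - model(lo)) / (2 * p * rel_step)
        sens.append(abs(p * dy_dp / y0))
    return np.array(sens)

# Toy Michaelis-Menten flux as a stand-in for a mitochondrial rate law.
def flux(p):  # p = [Vmax, Km], substrate fixed at S = 1.0
    vmax, km = p
    return vmax * 1.0 / (km + 1.0)

s = local_sensitivity(flux, np.array([10.0, 0.5]))
# Vmax sensitivity is exactly 1; Km sensitivity is Km/(Km+S), here ~0.33.
print(s[0] > 0.99, s[1] < 0.5)  # True True
```

Ranking candidate perturbations by how many parameters they push above the 5% level is the experiment-selection step the abstract describes.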
NASA Astrophysics Data System (ADS)
Peebles, D. E.; Peebles, H. C.; Ohlhausen, J. A.; Hurst, M. J.
1996-02-01
A specially designed ultrahigh vacuum in situ surface analysis and wetting system has been constructed to study the spreading of liquid metal solders on carefully prepared and well-characterized solid substrates. The system consists of a standard ultrahigh vacuum surface analysis chamber linked to a reaction chamber for wetting or other experiments at pressures up to atmospheric. A sophisticated video system allows real-time monitoring of the spreading of the liquid metal through both side and top views. An infrared imaging system allows accurate remote temperature measurements. Sample surfaces are prepared and spreading experiments performed without intermediate exposure of the surfaces to the contaminating atmospheres. Solder spreading is performed under 50 Torr of highly purified helium gas to allow for adequate thermal coupling between the solder and the substrate. Initial studies have been completed for the spreading of pure tin solder on copper substrates in the absence of any fluxing agent. Three types of copper substrate surfaces were investigated in these experiments: the sputter-cleaned, air-exposed, and the as-received surface. Surface chemical analysis by x-ray photoelectron spectroscopy showed the air-exposed surface to consist of about 3 nm of Cu2O, while the as-received surface consisted of about 8 nm of Cu2O. The sputter-cleaned surface contained less than one monolayer (0.3 nm) of Cu2O. Spreading experiments utilizing a linear temperature ramp show that pure tin solder spreads readily on oxidized copper surfaces at elevated temperatures. The initiation temperature for rapid tin spreading on the as-received copper surface was 325 °C. Decreasing the thickness of the oxide on the surface lowered the observed temperature for the initiation of spreading and increased the rate of spreading. On the sputter-cleaned copper surface, rapid solder spreading was observed immediately upon melting of the solder.
Analyzing high energy physics data using database computing: Preliminary report
NASA Technical Reports Server (NTRS)
Baden, Andrew; Day, Chris; Grossman, Robert; Lifka, Dave; Lusk, Ewing; May, Edward; Price, Larry
1991-01-01
A proof of concept system is described for analyzing high energy physics (HEP) data using data base computing. The system is designed to scale up to the size required for HEP experiments at the Superconducting SuperCollider (SSC) lab. These experiments will require collecting and analyzing approximately 10 to 100 million 'events' per year during proton colliding beam collisions. Each 'event' consists of a set of vectors with a total length of approx. one megabyte. This represents an increase of approx. 2 to 3 orders of magnitude in the amount of data accumulated by present HEP experiments. The system is called the HEPDBC System (High Energy Physics Database Computing System). At present, the Mark 0 HEPDBC System is completed, and can produce analysis of HEP experimental data approx. an order of magnitude faster than current production software on data sets of approx. 1 GB. The Mark 1 HEPDBC System is currently undergoing testing and is designed to analyze data sets 10 to 100 times larger.
RADC SCAT automated sneak circuit analysis tool
NASA Astrophysics Data System (ADS)
Depalma, Edward L.
The sneak circuit analysis tool (SCAT) provides a PC-based system for real-time identification (during the design phase) of sneak paths and design concerns. The tool utilizes an expert system shell to assist the analyst so that prior experience with sneak analysis is not necessary for performance. Both sneak circuits and design concerns are targeted by this tool, with both digital and analog circuits being examined. SCAT focuses the analysis at the assembly level, rather than the entire system, so that most sneak problems can be identified and corrected by the responsible design engineer in a timely manner. The SCAT program identifies the sneak circuits to the designer, who then decides what course of action is necessary.
Analysis of sensor network observations during some simulated landslide experiments
NASA Astrophysics Data System (ADS)
Scaioni, M.; Lu, P.; Feng, T.; Chen, W.; Wu, H.; Qiao, G.; Liu, C.; Tong, X.; Li, R.
2012-12-01
A multi-sensor network was tested during experiments on a landslide simulation platform established at Tongji University (Shanghai, P.R. China), where landslides were triggered by artificial rainfall (see Figure 1). The sensor network currently incorporates contact sensors and two imaging systems. This represents a novel solution, because the spatial sensor network combines contact sensors with remote sensors (video cameras). In the future, these sensors will be installed on two real slopes in Sichuan province (South-West China), where the Wenchuan earthquake occurred in 2008. This earthquake caused the immediate activation of several landslides, while other areas became unstable and remain a menace to people and property. The platform incorporates the reconstructed scale slope, the sensor network, a communication system, a database, and a visualization system. Several landslide simulation experiments allowed ascertaining which sensors would be most suitable for deployment in the Wenchuan area. The poster focuses on the analysis of results from the down-scaled simulations, in which the different stages of the landslide evolution can be followed on the basis of sensor observations. The sensors include underground sensors to detect the water table level and the pressure in the ground, a set of accelerometers, and two inclinometers. In the first part of the analysis, the full data series are investigated to look for correlations and common patterns, and to link them to the physical processes. In the second part, four subsets of sensors located in neighboring positions are analyzed. The analysis of low- and high-speed image sequences allowed tracking of a dense displacement field on the slope surface. These outcomes were compared with those obtained from the accelerometers for cross-validation. Images were also used for the photogrammetric reconstruction of the slope topography during the experiment.
Consequently, volume computation and mass movements could be evaluated on the basis of the processed images. Figure 1 - The landslide simulation platform at Tongji University at the end of an experiment. The picture shows the body of the simulated landslide.
NASA DOE POD NDE Capabilities Data Book
NASA Technical Reports Server (NTRS)
Generazio, Edward R.
2015-01-01
This data book contains the Directed Design of Experiments for Validating Probability of Detection (POD) Capability of NDE Systems (DOEPOD) analyses of the nondestructive inspection data presented in the NTIAC, Nondestructive Evaluation (NDE) Capabilities Data Book, 3rd ed., NTIAC DB-97-02. DOEPOD is designed as a decision support system to validate inspection system, personnel, and protocol demonstrating 0.90 POD with 95% confidence at critical flaw sizes, a90/95. The test methodology used in DOEPOD is based on the field of statistical sequential analysis founded by Abraham Wald. Sequential analysis is a method of statistical inference whose characteristic feature is that the number of observations required by the procedure is not determined in advance of the experiment. The decision to terminate the experiment depends, at each stage, on the results of the observations previously made. A merit of the sequential method, as applied to testing statistical hypotheses, is that test procedures can be constructed which require, on average, a substantially smaller number of observations than equally reliable test procedures based on a predetermined number of observations.
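DOEPOD's Wald-style sequential tests are beyond a short sketch, but the fixed-sample binomial version of the 0.90 POD / 95% confidence criterion shows where the numbers come from. If every one of n inspections finds the flaw, the probability of that outcome when the true POD is only 0.90 is 0.9**n, so demonstrating 95% confidence with zero misses requires the smallest n with 0.9**n <= 0.05:

```python
def min_trials(pod=0.90, confidence=0.95):
    """Smallest number of all-hit trials demonstrating `pod` at `confidence`."""
    n = 1
    while pod ** n > 1.0 - confidence:
        n += 1
    return n

print(min_trials())  # 29, the well-known "29 of 29" demonstration
```

Wald's sequential approach improves on this by letting the required sample size depend on the observations seen so far, which is exactly the property the abstract highlights.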
Complete scanpaths analysis toolbox.
Augustyniak, Piotr; Mikrut, Zbigniew
2006-01-01
This paper presents a complete open software environment for the control, data processing and assessment of visual experiments. Visual experiments are widely used in research on the physiology of human perception, and the results are applicable to various visual information-based man-machine interfacing, human-emulated automatic visual systems and scanpath-based learning of perceptual habits. The toolbox is designed for the Matlab platform and supports an infrared reflection-based eyetracker in calibration and scanpath analysis modes. Toolbox procedures are organized in three layers: the lower one, communicating with the eyetracker output file; the middle one, detecting scanpath events on a physiological background; and the upper one, consisting of experiment schedule scripts, statistics and summaries. Several examples of visual experiments carried out using the presented toolbox complete the paper.
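The middle layer's job, detecting scanpath events such as fixations, is commonly done with a dispersion threshold. The paper does not specify the toolbox's detector, so the following is only a minimal sketch of the standard I-DT (dispersion-threshold) fixation algorithm, with units and thresholds chosen arbitrarily:

```python
def dispersion(window):
    """Bounding-box dispersion (width + height) of (x, y) gaze samples."""
    xs = [p[0] for p in window]
    ys = [p[1] for p in window]
    return (max(xs) - min(xs)) + (max(ys) - min(ys))

def fixations_idt(samples, max_dispersion=1.0, min_length=3):
    """Return (start, end) index ranges of fixations: maximal runs of at
    least min_length consecutive samples whose dispersion stays low."""
    fixations = []
    i = 0
    while i + min_length <= len(samples):
        j = i + min_length
        if dispersion(samples[i:j]) <= max_dispersion:
            # Grow the window while the dispersion stays under threshold.
            while j < len(samples) and dispersion(samples[i:j + 1]) <= max_dispersion:
                j += 1
            fixations.append((i, j))
            i = j
        else:
            i += 1
    return fixations
```

Two tight clusters of samples separated by a large jump (a saccade) come out as two fixations; everything between them is discarded as transition.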
artdaq: DAQ software development made simple
NASA Astrophysics Data System (ADS)
Biery, Kurt; Flumerfelt, Eric; Freeman, John; Ketchum, Wesley; Lukhanin, Gennadiy; Rechenmacher, Ron
2017-10-01
For a few years now, the artdaq data acquisition software toolkit has provided numerous experiments with ready-to-use components which allow for rapid development and deployment of DAQ systems. Developed within the Fermilab Scientific Computing Division, artdaq provides data transfer, event building, run control, and event analysis functionality. This latter feature includes built-in support for the art event analysis framework, allowing experiments to run art modules for real-time filtering, compression, disk writing and online monitoring. As art, likewise developed at Fermilab, is also used for offline analysis, a major advantage of artdaq is that it allows developers to easily switch between developing online and offline software. artdaq continues to be improved. Support for an alternate mode of running whereby data from some subdetector components are only streamed if requested has been added; this option will reduce unnecessary DAQ throughput. Real-time reporting of DAQ metrics has been implemented, along with the flexibility to choose the format through which experiments receive the reports; these formats include the Ganglia, Graphite and syslog software packages, along with flat ASCII files. Additionally, work has been performed investigating more flexible modes of online monitoring, including the capability to run multiple online monitoring processes on different hosts, each running its own set of art modules. Finally, a web-based GUI interface through which users can configure details of their DAQ system has been implemented, increasing the ease of use of the system. Already successfully deployed on the LArIAT, DarkSide-50, DUNE 35ton and Mu2e experiments, artdaq will be employed for SBND and is a strong candidate for use on ICARUS and protoDUNE. With each experiment come new ideas for how artdaq can be made more flexible and powerful. The above improvements will be described, along with potential ideas for the future.
Studies of satellite support to weather modification in the western US region
NASA Technical Reports Server (NTRS)
Cotton, W. R.; Grant, L. O.; Vonderhaar, T. H.
1978-01-01
The applications of meteorological satellite data to both summer and winter weather modification programs are addressed. Appraisals of the capability of satellites to assess seedability, to provide real-time operational support, and to assist in the post-experiment analysis of a seeding experiment led to the incorporation of satellite observing systems as a major component in the Bureau of Reclamations weather modification activities. Satellite observations are an integral part of the South Park Area cumulus experiment (SPACE) which aims to formulate a quantitative hypothesis for enhancing precipitation from orographically induced summertime mesoscale convective systems (orogenic mesoscale systems). Progress is reported in using satellite observations to assist in classifying the important mesoscale systems, and in defining their frequency and coverage, and potential area of effect. Satellite studies of severe storms are also covered.
LDEF: 69 Months in Space. Third Post-Retrieval Symposium, part 1
NASA Technical Reports Server (NTRS)
Levine, Arlene S. (Editor)
1995-01-01
This volume (Part 1 of 3) is a compilation of papers presented at the Third Long Duration Exposure Facility (LDEF) Post-Retrieval Symposium. The papers represent the data analysis of the 57 experiments flown on the LDEF. The experiments include materials, coatings, thermal systems, power and propulsion, science (cosmic ray, interstellar gas, heavy ions, micrometeoroid, etc.), electronics, optics, and life science. In addition, papers on preliminary data analysis of EURECA, EOIM-3, and other spacecraft are included.
The Electrocatalytic Reduction of Carbon Dioxide Using Macrocycles of Nickel and Cobalt.
1980-10-24
...water only. Carbon monoxide was found to comprise at least 50% of the total reduced products in all cases; H2 was also produced in most cases. ...experiments performed in a gas-tight electrolysis cell followed by g.c. analysis. The solvents used were either CH3CN-H2O or water only. ...these experiments, and the solvent systems used were either acetonitrile/water or water only. Gas chromatographic analysis was used to determine...
LDEF: 69 Months in Space. Third Post-Retrieval Symposium, part 2
NASA Technical Reports Server (NTRS)
Levine, Arlene S. (Editor)
1995-01-01
This volume is a compilation of papers presented at the Third Long Duration Exposure Facility (LDEF) Post-Retrieval Symposium. The papers represent the data analysis of the 57 experiments flown on the LDEF. The experiments include materials, coatings, thermal systems, power and propulsion, science (cosmic ray, interstellar gas, heavy ions, micrometeoroid, etc.), electronics, optics, and life science. In addition, papers on preliminary data analysis of EURECA, EOIM-3, and other spacecraft are included. This second of three parts covers spacecraft construction materials.
International Space Station Increment-2 Microgravity Environment Summary Report
NASA Technical Reports Server (NTRS)
Jules, Kenol; Hrovat, Kenneth; Kelly, Eric; McPherson, Kevin; Reckart, Timothy
2002-01-01
This summary report presents the results of some of the processed acceleration data, collected aboard the International Space Station during the period of May to August 2001, the Increment-2 phase of the station. Two accelerometer systems were used to measure the acceleration levels during activities that took place during the Increment-2 segment. However, not all of the activities were analyzed for this report due to time constraints, lack of precise information regarding some payload operations and other station activities. The National Aeronautics and Space Administration sponsors the Microgravity Acceleration Measurement System and the Space Acceleration Microgravity System to support microgravity science experiments, which require microgravity acceleration measurements. On April 19, 2001, both the Microgravity Acceleration Measurement System and the Space Acceleration Measurement System units were launched on STS-100 from the Kennedy Space Center for installation on the International Space Station. The Microgravity Acceleration Measurement System unit was flown to the station in support of science experiments requiring quasi-steady acceleration measurements, while the Space Acceleration Measurement System unit was flown to support experiments requiring vibratory acceleration measurement. Both acceleration systems are also used in support of vehicle microgravity requirements verification. The International Space Station Increment-2 reduced gravity environment analysis presented in this report uses acceleration data collected by both sets of accelerometer systems: 1) The Microgravity Acceleration Measurement System, which consists of two sensors: the Orbital Acceleration Research Experiment Sensor Subsystem, a low frequency range sensor (up to 1 Hz), is used to characterize the quasi-steady environment for payloads and the vehicle, and the High Resolution Accelerometer Package, which is used to characterize the vibratory environment up to 100 Hz. 
2) The Space Acceleration Measurement System, which is a high frequency sensor, measures vibratory acceleration data in the range of 0.01 to 300 Hz. This summary report presents analysis of some selected quasi-steady and vibratory activities measured by these accelerometers during Increment-2 from May to August 20, 2001.
International Space Station Increment-3 Microgravity Environment Summary Report
NASA Technical Reports Server (NTRS)
Jules, Kenol; Hrovat, Kenneth; Kelly, Eric; McPherson, Kevin; Reckart, Timothy; Grodsinksy, Carlos
2002-01-01
This summary report presents the results of some of the processed acceleration data measured aboard the International Space Station during the period of August to December 2001. Two accelerometer systems were used to measure the acceleration levels for the activities that took place during Increment-3. However, not all of the activities were analyzed for this report due to time constraint and lack of precise timeline information regarding some payload operations and station activities. The National Aeronautics and Space Administration sponsors the Microgravity Acceleration Measurement System and the Space Acceleration Microgravity System to support microgravity science experiments which require microgravity acceleration measurements. On April 19, 2001, both the Microgravity Acceleration Measurement System and the Space Acceleration Measurement System units were launched on STS-100 from the Kennedy Space Center for installation on the International Space Station. The Microgravity Acceleration Measurement System unit was flown to the station in support of science experiments requiring quasi-steady acceleration measurements, while the Space Acceleration Measurement System unit was flown to support experiments requiring vibratory acceleration measurement. Both acceleration systems are also used in support of the vehicle microgravity requirements verification. The International Space Station Increment-3 reduced gravity environment analysis presented in this report uses acceleration data collected by both sets of accelerometer systems: (1) The Microgravity Acceleration Measurement System, which consists of two sensors: the Orbital Acceleration Research Experiment Sensor Subsystem, a low frequency range sensor (up to 1 Hz), is used to characterize the quasi-steady environment for payloads and vehicle, and the High Resolution Accelerometer Package, which is used to characterize the vibratory environment up to 100 Hz. 
(2) The Space Acceleration Measurement System, which is a high frequency sensor, measures vibratory acceleration data in the range of 0.01 to 400 Hz. This summary report presents analysis of some selected quasi-steady and vibratory activities measured by these accelerometers during Increment-3 from August to December, 2001.
Pan, Cong-Yuan; Du, Xue-Wei; An, Ning; Han, Zhen-Yu; Wang, Sheng-Bo; Wei, Wei; Wang, Qiu-Ping
2013-12-01
Laser-induced breakdown spectroscopy (LIBS) is one of the most promising technologies for online monitoring of metallurgical composition. In order to study the spectral characteristics of LIBS spectra and to investigate quantitative analysis methods for material composition under vacuum and high-temperature environments, a LIBS measurement system was designed and built that can be used for experiments on high-temperature or molten samples in different vacuum environments. The system consists of a Q-switched Nd:YAG laser used as the light source, lenses with different focal lengths for laser focusing and spectrum signal collection, a spectrometer for detecting the LIBS spectra, and a vacuum system for holding and heating the samples while supplying a vacuum environment. The vacuum was achieved and maintained by a vacuum pump, and an electric induction furnace was used for heating the system. The induction coil was integrated into the vacuum system by attaching it to a ceramic sealing flange. The system was installed and tested, and the results indicate that the vacuum of the system can reach 1×10⁻⁴ Pa without heating, while the heating temperature can reach about 1600 °C; the system can be used for melting metal samples such as steel and aluminum and acquiring the LIBS spectra of the samples at the same time. Using this system, LIBS experiments were conducted on standard steel samples under different vacuum and high-temperature conditions. Comparisons were obtained between the LIBS spectra of solid steel samples under different vacuum levels, and between the spectra of molten and solid steel samples in a vacuum environment.
Through data processing and theoretical analysis of these spectra, the initial results of the experiments are in good agreement with presently reported results, which indicates that the whole system functions well and is suitable for molten-metal LIBS experiments under vacuum.
Internal Versus External DSLs for Trace Analysis: Extended Abstract
NASA Technical Reports Server (NTRS)
Barringer, Howard; Havelund, Klaus
2011-01-01
This tutorial explores the design and implementation issues arising in the development of domain-specific languages for trace analysis. It introduces the audience to the general concepts underlying such special-purpose languages, building upon the authors' own experiences in developing both external domain-specific languages and systems, such as EAGLE, HAWK, RULER and LOGSCOPE, and the more recent internal domain-specific language and system TRACECONTRACT within the Scala language.
An Optoelectronic Equivalent Narrowband Filter for High Resolution Optical Spectrum Analysis
Feng, Kunpeng; Cui, Jiwen; Dang, Hong; Wu, Weidong; Sun, Xun; Jiang, Xuelin; Tan, Jiubin
2017-01-01
To achieve a narrow bandwidth optical filter with a wide swept range for new generation optical spectrum analysis (OSA) of high performance optical sensors, an optoelectronic equivalent narrowband filter (OENF) was investigated and a swept optical filter with a bandwidth of several MHz and a sweep range of several tens of nanometers was built using electric filters and a sweep laser as local oscillator (LO). The principle of OENF is introduced and an analysis of the OENF system is presented. Two electric filters are optimized to be RBW filters for high and medium spectral resolution applications. Both simulations and experiments are conducted to verify the OENF principle, and the results show that the power uncertainty is less than 1.2% and the spectral resolution can reach 6 MHz. Then, a real-time wavelength calibration system consisting of an HCN gas cell and a Fabry–Pérot etalon is proposed to guarantee a wavelength accuracy of ±0.4 pm in the C-band and to reduce the influence of phase noise and nonlinear velocity of the LO sweep. Finally, OSA experiments on actual spectra of various optical sensors are conducted using the OENF system. These experimental results indicate that the OENF system has an excellent capacity for the analysis of fine spectrum structures. PMID:28208624
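The electric filters can resolve MHz-scale detail only because heterodyning with the swept LO translates each optical component down to a fixed difference frequency. The sketch below illustrates that mixing principle numerically at audio frequencies (the tone frequencies and the single-bin DFT probe are illustrative, not the instrument's actual signal chain):

```python
import math

def component_amplitude(f_sig, f_lo, f_probe, n=2000, fs=10000.0):
    """Mix a signal tone with a local-oscillator tone and measure, by
    single-bin DFT correlation, the amplitude at f_probe in the product."""
    re = im = 0.0
    for k in range(n):
        t = k / fs
        product = math.cos(2 * math.pi * f_sig * t) * math.cos(2 * math.pi * f_lo * t)
        re += product * math.cos(2 * math.pi * f_probe * t)
        im += product * math.sin(2 * math.pi * f_probe * t)
    return 2.0 * math.hypot(re, im) / n

# Mixing a 1300 Hz tone with a 1000 Hz LO puts half the energy at the
# 300 Hz difference frequency, which a fixed narrowband filter can select.
```

Sweeping the LO then walks successive signal components through the fixed filter, which is the swept-filter behavior the abstract describes at optical frequencies.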
NASA Technical Reports Server (NTRS)
Casas, J. C.; Koziana, J. V.; Saylor, M. S.; Kindle, E. C.
1982-01-01
Problems associated with the development of the measurement of air pollution from satellites (MAPS) experiment program are addressed. The primary thrust of this research was the utilization of the MAPS experiment data in three application areas: low altitude aircraft flights (one to six km); mid altitude aircraft flights (eight to 12 km); and orbiting space platforms. Extensive research work in four major areas of data management was the framework for implementation of the MAPS experiment technique. These areas are: (1) data acquisition; (2) data processing, analysis and interpretation algorithms; (3) data display techniques; and (4) information production.
Man-machine interface analysis of the flight design system
NASA Technical Reports Server (NTRS)
Ramsey, H. R.; Atwood, M. E.; Willoughby, J. K.
1978-01-01
The objective of the current effort was to perform a broad analysis of the human factors issues involved in the design of the Flight Design System (FDS). The analysis was intended to include characteristics of the system itself, such as: (1) basic structure and functional capabilities of FDS; (2) user backgrounds, capabilities, and possible modes of use; (3) FDS interactive dialogue, problem solving aids; (4) system data management capabilities; and to include, as well, such system related matters as: (1) flight design team structure; (2) roles of technicians; (3) user training; and (4) methods of evaluating system performance. Wherever possible, specific recommendations are made. In other cases, the issues which seem most important are identified. In some cases, additional analyses or experiments which might provide resolution are suggested.
Performance analysis of wireless sensor networks in geophysical sensing applications
NASA Astrophysics Data System (ADS)
Uligere Narasimhamurthy, Adithya
Performance is an important criterion to consider before switching from a wired network to a wireless sensing network. Performance is especially important in geophysical sensing, where the quality of the sensing system is measured by the precision of the acquired signal. Can a wireless sensing network maintain the same reliability and quality metrics that a wired system provides? Our work focuses on evaluating the wireless GeoMote sensor motes that were developed by previous computer science graduate students at Mines. Specifically, we conducted a set of experiments, namely WalkAway and Linear Array experiments, to characterize the performance of the wireless motes. The motes were also equipped with the Sticking Heartbeat Aperture Resynchronization Protocol (SHARP), a time synchronization protocol developed by a previous computer science graduate student at Mines. This protocol should automatically synchronize the motes' internal clocks and reduce time synchronization errors. We also collected passive data to evaluate the response of GeoMotes to the various frequency components associated with seismic waves. With the data collected from these experiments, we evaluated the performance of the SHARP protocol and compared the performance of our GeoMote wireless system against an industry-standard wired seismograph system (Geometrics Geode). Using arrival time analysis and seismic velocity calculations, we set out to answer the following question: can our wireless sensing system (GeoMotes) perform similarly to a traditional wired system in a realistic scenario?
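The arrival-time analysis mentioned above boils down to fitting first-arrival picks against geophone offset: the slope of that line is the slowness, and its reciprocal is the apparent velocity. A minimal sketch with hypothetical picks (not data from these experiments):

```python
def apparent_velocity(offsets_m, picks_s):
    """Least-squares slope of first-arrival time versus geophone offset;
    the apparent velocity is the reciprocal of that slope, in m/s."""
    n = len(offsets_m)
    mx = sum(offsets_m) / n
    mt = sum(picks_s) / n
    sxt = sum((x - mx) * (t - mt) for x, t in zip(offsets_m, picks_s))
    sxx = sum((x - mx) ** 2 for x in offsets_m)
    return sxx / sxt  # 1 / slope

# Picks 5 ms apart at 10 m geophone spacing imply a 2000 m/s medium:
v = apparent_velocity([0, 10, 20, 30], [0.000, 0.005, 0.010, 0.015])
```

Comparing velocities fitted from wireless picks against those from the wired reference line is one direct way to quantify the effect of residual time-synchronization error.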
NASA Technical Reports Server (NTRS)
Norton, H. N.
1979-01-01
An earth-orbiting molecular shield that offers a unique opportunity for conducting physics, chemistry, and material processing experiments under a combination of environmental conditions that are not available in terrestrial laboratories is equipped with apparatus for forming a molecular beam from the freestream. Experiments are carried out using a moderate energy, high flux density, high purity atomic oxygen beam in the very low density environment within the molecular shield. As a minimum, the following instruments are required for the molecular shield: (1) a mass spectrometer; (2) a multifunction material analysis instrumentation system; and (3) optical spectrometry equipment. The design is given of a furlable molecular shield that allows deployment and retrieval of the system (including instrumentation and experiments) to be performed without contamination. Interfaces between the molecular shield system and the associated spacecraft are given. An in-flight deployment sequence is discussed that minimizes the spacecraft-induced contamination in the vicinity of the shield. Design approaches toward a precursor molecular shield system are shown.
Franke, O. Lehn; Reilly, Thomas E.
1987-01-01
The most critical and difficult aspect of defining a groundwater system or problem for conceptual analysis or numerical simulation is the selection of boundary conditions. This report demonstrates the effects of different boundary conditions on the steady-state response of otherwise similar groundwater systems to a pumping stress. Three series of numerical experiments illustrate the behavior of three hypothetical groundwater systems that are rectangular sand prisms with the same dimensions but with different combinations of constant-head, specified-head, no-flow, and constant-flux boundary conditions. In the first series of numerical experiments, the heads and flows in all three systems are identical, as are the hydraulic conductivity and system geometry. However, when the systems are subjected to an equal stress by a pumping well in the third series, each differs significantly in its response. The highest heads (smallest drawdowns) and flows occur in the systems most constrained by constant- or specified-head boundaries. These and other observations described herein are important in steady-state calibration, which is an integral part of simulating many groundwater systems. Because the effects of boundary conditions on model response often become evident only when the system is stressed, a close match between the potential distribution in the model and that in the unstressed natural system does not guarantee that the model boundary conditions correctly represent those in the natural system. In conclusion, the boundary conditions that are selected for simulation of a groundwater system are fundamentally important to groundwater systems analysis and warrant continual reevaluation and modification as investigation proceeds and new information and understanding are acquired.
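The boundary-condition effect described above can be reproduced in a few lines. The following sketch is a hypothetical 1-D finite-difference model with unit transmissivity and grid spacing (not the report's actual prisms): head is fixed at zero on the left, a pumping sink sits mid-domain, and the right boundary is either constant-head or no-flow.

```python
def steady_heads(n=21, pump_cell=10, q=0.1, right="constant_head", iters=20000):
    """Steady 1-D heads by Jacobi iteration on h[i-1] - 2h[i] + h[i+1] = q_i."""
    h = [0.0] * n
    for _ in range(iters):
        new = h[:]
        for i in range(1, n - 1):
            source = -q if i == pump_cell else 0.0
            new[i] = 0.5 * (h[i - 1] + h[i + 1] + source)
        # Right boundary: fixed head, or mirrored head for zero gradient (no flow).
        new[-1] = 0.0 if right == "constant_head" else new[-2]
        h = new
    return h

drawdown_ch = -steady_heads(right="constant_head")[10]  # ~0.5 head units
drawdown_nf = -steady_heads(right="no_flow")[10]        # ~1.0, twice as deep
```

Closing the right boundary doubles the drawdown at the well, because all pumped water must then come from the left boundary alone; this mirrors the report's observation that constant- and specified-head boundaries limit drawdown.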
A data storage, retrieval and analysis system for endocrine research. [for Skylab
NASA Technical Reports Server (NTRS)
Newton, L. E.; Johnston, D. A.
1975-01-01
This retrieval system builds, updates, retrieves, and performs basic statistical analyses on blood, urine, and diet parameters for the M071 and M073 Skylab and Apollo experiments. This system permits data entry from cards to build an indexed sequential file. Programs are easily modified for specialized analyses.
Coder Drift: A Reliability Problem for Teacher Observations.
ERIC Educational Resources Information Center
Marston, Paul T.; And Others
The results of two experiments support the hypothesis of "coder drift" which is defined as change that takes place while trained coders are using a system for a number of classroom observation sessions. The coding system used was a modification of the low-inference Flanders System of Interaction Analysis which calls for assigning…
Simple gas chromatographic system for analysis of microbial respiratory gases
NASA Technical Reports Server (NTRS)
Carle, G. C.
1972-01-01
Dual column ambient temperature system, consisting of pair of capillary columns, microbead thermistor detector and micro gas-sampling valve, is used in remote life-detection equipment for space experiments. Performance outweighs advantage gained by utilizing single-column systems to reduce weight, conserve carrier gas and operate at lower power levels.
Education Reform and School Funding: An Analysis of the Georgian Experience
ERIC Educational Resources Information Center
Maglakelidze, Shorena
2011-01-01
Alteration of the direct state funding system and transition to a voucher system commenced in 2005. Establishment of a voucher funding system for secondary schools aimed at ensuring more transparency and conscientiousness of allocating the sums for schools, as well as effective expenditure of money. Voucher funding has had to ensure financial…
RF-based power distribution system for optogenetic experiments
NASA Astrophysics Data System (ADS)
Filipek, Tomasz A.; Kasprowicz, Grzegorz H.
2017-08-01
In this paper, a wireless power distribution system for optogenetic experiments is demonstrated. The design and analysis of the power transfer system are described in detail. The architecture is outlined in the context of the performance requirements that had to be met. We show how to design a wireless power transfer system using resonantly coupled circuits consisting of one transmitter and a number of receivers, covering the entire cage area with a specified power density. The transmitter design, with a fully automated protection stage, is described with detailed consideration of the specification and construction of the transmitting loop antenna. In addition, the design of the receiver is described, including simplification of the implementation and minimization of the impact of component tolerances on the performance of the distribution system. The analysis is confirmed by calculations and measurement results. The presented distribution system was designed to provide a 100 mW power supply to each of up to ten receivers within a 490 x 350 mm cage space while using a single transmitter working at a coupling resonant frequency of 27 MHz.
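At its core, the resonant-coupling design reduces to choosing L and C so that each loop resonates at the 27 MHz working frequency, f = 1/(2π√(LC)). A minimal sketch (the 1 µH loop inductance below is a hypothetical value, not taken from the paper):

```python
import math

def tuning_capacitance(f_hz, inductance_h):
    """Capacitance that resonates a given inductance at f_hz,
    from f = 1 / (2 * pi * sqrt(L * C))."""
    return 1.0 / ((2.0 * math.pi * f_hz) ** 2 * inductance_h)

# A hypothetical 1 uH loop antenna needs roughly 35 pF to resonate at 27 MHz.
c_farads = tuning_capacitance(27e6, 1e-6)
```

The sensitivity of this product LC to component tolerances is exactly why the receiver design pays attention to minimizing their impact: a few percent of capacitance error detunes the resonance and cuts the transferred power.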
Wiebrands, Michael; Malajczuk, Chris J; Woods, Andrew J; Rohl, Andrew L; Mancera, Ricardo L
2018-06-21
Molecular graphics systems are visualization tools which, upon integration into a 3D immersive environment, provide a unique virtual reality experience for research and teaching of biomolecular structure, function and interactions. We have developed a molecular structure and dynamics application, the Molecular Dynamics Visualization tool, that uses the Unity game engine combined with large scale, multi-user, stereoscopic visualization systems to deliver an immersive display experience, particularly with a large cylindrical projection display. The application is structured to separate the biomolecular modeling and visualization systems. The biomolecular model loading and analysis system was developed as a stand-alone C# library and provides the foundation for the custom visualization system built in Unity. All visual models displayed within the tool are generated using Unity-based procedural mesh building routines. A 3D user interface was built to allow seamless dynamic interaction with the model while being viewed in 3D space. Biomolecular structure analysis and display capabilities are exemplified with a range of complex systems involving cell membranes, protein folding and lipid droplets.
NASA Astrophysics Data System (ADS)
Lyu, Bai-cheng; Wu, Wen-hua; Yao, Wei-an; Du, Yu
2017-06-01
The mooring system is key equipment for safe FPSO operation. The soft yoke mooring system is regarded as one of the best shallow-water mooring strategies and is widely applied to oil exploitation in the Bohai Bay in China and the Gulf of Mexico. Based on the analysis of numerous monitoring data obtained by the prototype monitoring system of one FPSO in the Bohai Bay, the lateral vibration behaviors of the soft yoke under wave load observed on site were analyzed. ADAMS simulation and a model experiment were utilized to analyze the soft yoke lateral vibration, and it was determined that the lateral vibration was a resonance behavior caused by wave excitation. While preserving the longitudinal restoring force of the soft yoke, a TLD-based vibration damper system was constructed, and vibration reduction experiments covering multiple tank spacings and load conditions were carried out. The experimental results demonstrated that the proposed TLD vibration reduction system can effectively reduce the lateral vibration of soft yoke structures.
Synthesis and analysis of precise spaceborne laser ranging systems, volume 2. [Spacelab payload
NASA Technical Reports Server (NTRS)
Paddon, E. A.
1978-01-01
The performance capabilities of specific shuttle-based laser ranging systems were evaluated, and interface and support requirements were determined. The preliminary design of a shuttle-borne laser ranging experiment developed as part of the Spacelab program is discussed.
Visual Computing Environment Workshop
NASA Technical Reports Server (NTRS)
Lawrence, Charles (Compiler)
1998-01-01
The Visual Computing Environment (VCE) is a framework for intercomponent and multidisciplinary computational simulations. Many current engineering analysis codes simulate various aspects of aircraft engine operation. For example, existing computational fluid dynamics (CFD) codes can model the airflow through individual engine components such as the inlet, compressor, combustor, turbine, or nozzle. Currently, these codes are run in isolation, making intercomponent and complete system simulations very difficult to perform. In addition, management and utilization of these engineering codes for coupled component simulations is a complex, laborious task, requiring substantial experience and effort. To facilitate multicomponent aircraft engine analysis, the CFD Research Corporation (CFDRC) is developing the VCE system. This system, which is part of NASA's Numerical Propulsion Simulation System (NPSS) program, can couple various engineering disciplines, such as CFD, structural analysis, and thermal analysis.
Modeling and design for a new ionospheric modification experiment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sales, G.S.; Platt, I.G.; Haines, D.M.
1990-10-01
Plans are now underway to carry out new HF oblique ionospheric modification experiments with increased radiated power using a new high-gain antenna system and a 1 MW transmitter. The output of this large transmitting system will approach 90 dBW. An important part of this program is to determine the existence of a threshold for non-linear effects by varying the transmitter output. For these experiments we are introducing a new HF probe system, a low power oblique sounder, to be used along the same propagation path as the high power disturbing transmitter. This concept was first used by Soviet researchers to ensure that this diagnostic signal always passes through the modified region of the ionosphere. The HF probe system will use a low power (150 W) CW signal shifted by approximately 40 kHz from the frequency used by the high power system. The transmitter for the probe system will be at the same location as the high power transmitter, while the probe receiver will be 2400 km down range. The probe receiving system uses multiple antennas to measure the vertical and azimuthal angles of arrival as well as the Doppler frequency shift of the arriving probe signal. The three-antenna array will be in an L configuration to measure the phase differences between the antennas. At the midpath point a vertical sounder will provide the ionospheric information necessary for the frequency management of the experiment. Real-time signal processing will permit the site operators to evaluate the performance of the system and make adjustments during the experiment. A special ray tracing computer will be used to provide real-time frequencies and elevation beam steering during the experiment. A description of the system and the analysis used in the design of the experiment are presented.
NASA Technical Reports Server (NTRS)
Conway, B. A.
1974-01-01
Astronaut crew motions can produce some of the largest disturbances acting on a manned spacecraft that can affect vehicle attitude and pointing. Skylab Experiment T-013 was developed to investigate the magnitude and effects of some of these disturbances on the Skylab spacecraft. The methods and techniques used to carry out this experiment are discussed, and preliminary results of the data analysis are presented. Initial findings indicate that forces on the order of 300 N were exerted during vigorous soaring activities, and that certain experiment activities produced spacecraft angular rate excursions of 0.03 to 0.07 deg/sec. Results of Experiment T-013 will be incorporated into mathematical models of crew-motion disturbances, and are expected to be of significant aid in the sizing, design, and analysis of stabilization and control systems for future manned spacecraft.
Parallel processing of genomics data
NASA Astrophysics Data System (ADS)
Agapito, Giuseppe; Guzzi, Pietro Hiram; Cannataro, Mario
2016-10-01
The availability of high-throughput experimental platforms for the analysis of biological samples, such as mass spectrometry, microarrays and Next Generation Sequencing, has made it possible to analyze a whole genome in a single experiment. Such platforms produce an enormous volume of data per experiment, and the analysis of this flow of data poses several challenges in terms of data storage, preprocessing, and analysis. To face these issues, efficient, possibly parallel, bioinformatics software needs to be used to preprocess and analyze data, for instance to highlight genetic variation associated with complex diseases. In this paper we present an algorithm for the parallel preprocessing and statistical analysis of genomics data, able to handle high-dimensional data with good response times. The proposed system is able to find statistically significant biological markers that discriminate classes of patients who respond to drugs in different ways. Experiments performed on real and synthetic genomic datasets show good speed-up and scalability.
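The chunk-and-combine pattern behind such parallel preprocessing can be sketched in a few lines. This is not the authors' algorithm: the per-chunk statistic (an allele frequency computed from 0/1/2 genotype codes), the function names, and the equal-size chunking are illustrative assumptions.

```python
from multiprocessing import get_context
from statistics import mean

def chunk_allele_freq(genotypes):
    """Per-chunk statistic: alternate-allele frequency from 0/1/2 genotype
    codes (alternate-allele count over total allele count)."""
    return sum(genotypes) / (2 * len(genotypes))

def parallel_allele_freq(genotypes, n_chunks=4):
    """Split one marker's genotype vector into equal chunks, score the chunks
    in parallel worker processes, and combine the partial results by
    averaging (exact only for equal-size chunks, as here)."""
    size = max(1, len(genotypes) // n_chunks)
    chunks = [genotypes[i:i + size] for i in range(0, len(genotypes), size)]
    # The "fork" start method keeps the example usable from a plain script
    # on POSIX systems.
    with get_context("fork").Pool(processes=len(chunks)) as pool:
        partial = pool.map(chunk_allele_freq, chunks)
    return mean(partial)
```

A real pipeline would stream chunks from disk and weight the combination by chunk size, but the map-then-reduce structure is the same.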
NASA Astrophysics Data System (ADS)
Fiore, S.; Płóciennik, M.; Doutriaux, C.; Blanquer, I.; Barbera, R.; Williams, D. N.; Anantharaj, V. G.; Evans, B. J. K.; Salomoni, D.; Aloisio, G.
2017-12-01
The increasing model resolution in the development of comprehensive Earth System Models is rapidly leading to very large climate simulation output that poses significant scientific data management challenges in terms of data sharing, processing, analysis, visualization, preservation, curation, and archiving. Large scale global experiments for Climate Model Intercomparison Projects (CMIP) have led to the development of the Earth System Grid Federation (ESGF), a federated data infrastructure which has been serving the CMIP5 experiment, providing access to 2 PB of data for the IPCC Assessment Reports. In such a context, running a multi-model data analysis experiment is very challenging, as it requires the availability of a large amount of data related to multiple climate model simulations and scientific data management tools for large-scale data analytics. To address these challenges, a case study on climate model intercomparison data analysis has been defined and implemented in the context of the EU H2020 INDIGO-DataCloud project. The case study has been tested and validated on CMIP5 datasets, in the context of a large scale, international testbed involving several ESGF sites (LLNL, ORNL and CMCC), one orchestrator site (PSNC) and one more hosting INDIGO PaaS services (UPV). Additional ESGF sites, such as NCI (Australia) and a couple more in Europe, are also joining the testbed. The added value of the proposed solution is summarized in the following: it implements a server-side paradigm which limits data movement; it relies on a High-Performance Data Analytics (HPDA) stack to address performance; it exploits the INDIGO PaaS layer to support flexible, dynamic and automated deployment of software components; it provides user-friendly web access based on the INDIGO Future Gateway; and finally it integrates, complements and extends the support currently available through ESGF. Overall it provides a new "tool" for climate scientists to run multi-model experiments.
At the time this contribution is being written, the proposed testbed represents the first implementation of a distributed large-scale, multi-model experiment in the ESGF/CMIP context, joining together server-side approaches for scientific data analysis, HPDA frameworks, end-to-end workflow management, and cloud computing.
CSM digital autopilot testing in support of ASTP experiments control requirements
NASA Technical Reports Server (NTRS)
Rue, D. L.
1975-01-01
Results are presented of CSM digital autopilot (DAP) testing. The testing was performed to demonstrate and evaluate control modes which are currently planned or could be considered for use in support of experiments on the ASTP mission. The testing was performed on the Lockheed Guidance, Navigation, and Control System Functional Simulator (GNCFS). This simulator, which was designed to test the Apollo and Skylab DAP control system, has been used extensively and is a proven tool for CSM DAP analysis.
Ultrahigh temperature vapor core reactor-MHD system for space nuclear electric power
NASA Technical Reports Server (NTRS)
Maya, Isaac; Anghaie, Samim; Diaz, Nils J.; Dugan, Edward T.
1991-01-01
The conceptual design of a nuclear space power system based on the ultrahigh temperature vapor core reactor with MHD energy conversion is presented. This UF4-fueled gas core cavity reactor operates at a 4000 K maximum core temperature and 40 atm. Materials experiments, conducted with UF4 up to 2200 K, demonstrate acceptable compatibility with tungsten-, molybdenum-, and carbon-based materials. The supporting nuclear, heat transfer, fluid flow and MHD analysis, and fissioning plasma physics experiments are also discussed.
Hagen, R. W.; Ambos, H. D.; Browder, M. W.; Roloff, W. R.; Thomas, L. J.
1979-01-01
The Clinical Physiologic Research System (CPRS) developed from our experience in applying computers to medical instrumentation problems. This experience revealed a set of applications with a commonality in data acquisition, analysis, input/output, and control needs that could be met by a portable system. The CPRS demonstrates a practical methodology for integrating commercial instruments with distributed modular elements of local design in order to make facile responses to changing instrumentation needs in clinical environments.
NASA Astrophysics Data System (ADS)
Serafini, L.; Viganò, W.; Donati, A.; Porciani, M.; Zolesi, V.; Schulze-Varnholt, D.; Manieri, P.; El-Din Sallam, A.; Schmäh, M.; Horn, E. R.
2007-02-01
The study of internal clock systems of scorpions in weightless conditions is the goal of the SCORPI experiment. SCORPI was selected for flight on the International Space Station (ISS) and will be mounted in the European facility BIOLAB, the European Space Agency (ESA) laboratory designed to support biological experiments on micro-organisms, cells, tissue cultures, small plants and small invertebrates. This paper outlines the main features of a breadboard designed and developed in order to allow the analysis of critical aspects of the experiment. It is a complete tool to simulate the experiment mission on ground and it can be customised, adapted and tuned to the scientific requirements. The paper introduces the SCORPI-T experiment, which represents an important precursor for the success of SCORPI on BIOLAB. The capabilities of the hardware developed show its potential use for future similar experiments in space.
NASA Astrophysics Data System (ADS)
Salinas Barrios, Ivan Eduardo
I investigated linguistic patterns in middle school students' writing to understand their relevant embodied experiences for learning science. Embodied experiences are those limited by the perceptual and motor constraints of the human body. Recent research indicates student understanding of science needs embodied experiences. Recent emphases of science education researchers on the practices of science suggest that students' understanding of systems and their structure, scale, size, representations, and causality are crosscutting concepts that unify all scientific disciplinary areas. To discern the relationship between linguistic patterns and embodied experiences, I relied on Cognitive Linguistics, a field within the cognitive sciences that pays attention to language organization and use, assuming that language reflects the human cognitive system. In particular, I investigated the embodied experiences that 268 middle school students learning about water brought to understanding: i) systems and system structure; ii) scale, size and representations; and iii) causality. Using content analysis, I explored students' language in search of patterns regarding linguistic phenomena described within cognitive linguistics: image schemas, conceptual metaphors, event schemas, semantic roles, and force-dynamics. I found several common embodied experiences organizing students' understanding of crosscutting concepts. Perception of boundaries and change in location and perception of spatial organization in the vertical axis are relevant embodied experiences for students' understanding of systems and system structure. Direct object manipulation and perception of size with and without locomotion are relevant for understanding scale, size and representations. Direct applications of force and consequential perception of movement or change in form are relevant for understanding of causality. I discuss implications of these findings for research and science teaching.
Third LDEF Post-Retrieval Symposium Abstracts
NASA Technical Reports Server (NTRS)
Levine, Arlene S. (Compiler)
1993-01-01
This volume is a compilation of abstracts submitted to the Third Long Duration Exposure Facility (LDEF) Post-Retrieval Symposium. The abstracts represent the data analysis of the 57 experiments flown on the LDEF. The experiments include materials, coatings, thermal systems, power and propulsion, science (cosmic ray, interstellar gas, heavy ions, micrometeoroid, etc.), electronics, optics, and life science.
LDEF: 69 Months in Space. First Post-Retrieval Symposium, part 1
NASA Technical Reports Server (NTRS)
Levine, Arlene S. (Editor)
1992-01-01
A compilation of papers from the symposium is presented. The papers represent the preliminary data analysis of the 57 experiments flown on the Long Duration Exposure Facility (LDEF). The experiments include materials, coatings, thermal systems, power and propulsion, science (cosmic ray, interstellar gas, heavy ions, and micrometeoroids), electronics, optics, and life sciences.
The Kinetics and Inhibition of Gamma-Glutamyl Transpeptidase: A Biochemistry Laboratory Experiment.
ERIC Educational Resources Information Center
Splittgerber, A. G.; Sohl, Julie
1988-01-01
Discusses an enzyme kinetics laboratory experiment involving a two substrate system for undergraduate biochemistry. Uses the enzyme gamma-glutamyl transpeptidase as this enzyme in blood serum is of clinical significance. Notes elevated levels are seen in liver disease, alcoholism, and epilepsy. Uses a spectrophotometer for the analysis. (MVL)
An Interpretative Phenomenological Analysis of Stress and Coping in First Year Undergraduates
ERIC Educational Resources Information Center
Denovan, Andrew; Macaskill, Ann
2013-01-01
In the UK, changes to the higher education system have increased the range of stressors experienced by students above those traditionally associated with the transition to university. Despite this, there is little qualitative research examining how students experience and cope with the adjustment to university. The experience of the transition was…
The role of man in flight experiment payload missions. Volume 2: Appendices
NASA Technical Reports Server (NTRS)
Malone, T. B.
1973-01-01
In the study to determine the role of man in Sortie Lab operations, a functional model of a generalized experiment system was developed. The results are presented of a requirements analysis which was conducted to identify performance requirements, information requirements, and interface requirements associated with each function in the model.
Using Computer-Based "Experiments" in the Analysis of Chemical Reaction Equilibria
ERIC Educational Resources Information Center
Li, Zhao; Corti, David S.
2018-01-01
The application of the Reaction Monte Carlo (RxMC) algorithm to standard textbook problems in chemical reaction equilibria is discussed. The RxMC method is a molecular simulation algorithm for studying the equilibrium properties of reactive systems, and therefore provides the opportunity to develop computer-based "experiments" for the…
Rotational Dynamics with Tracker
ERIC Educational Resources Information Center
Eadkhong, T.; Rajsadorn, R.; Jannual, P.; Danworaphong, S.
2012-01-01
We propose the use of Tracker, freeware for video analysis, to analyse the moment of inertia ("I") of a cylindrical plate. Three experiments are performed to validate the proposed method. The first experiment is dedicated to find the linear coefficient of rotational friction ("b") for our system. By omitting the effect of such friction, we derive…
Phenomenological Analysis of Professional Identity Crisis Experience by Teachers
ERIC Educational Resources Information Center
Sadovnikova, Nadezhda O.; Sergeeva, Tamara B.; Suraeva, Maria O.
2016-01-01
The topicality of the problem under research is predetermined by the need of psychology and pedagogy for the study of the process of professional identity crisis experience by teachers and development of a system of measures for support of teachers' pedagogical activity and professional development. The objective of the study is to describe the…
ERIC Educational Resources Information Center
Olsson, Ingrid; Roll-Pettersson, Lise
2012-01-01
Using semi-structured interviews this study investigated the personal experiences of parents of pre-school children with intellectual disabilities within the Swedish social support system. Thirteen parents of 10 children participated. Interview transcripts were qualitatively analysed using interpretative phenomenological analysis. Three themes…
ERIC Educational Resources Information Center
Erskine, Steven R.; And Others
1986-01-01
Describes a laboratory experiment that is designed to aid in the understanding of the fundamental process involved in gas chromatographic separations. Introduces the Kovats retention index system for use by chemistry students to establish criteria for the optimal selection of gas chromatographic stationary phases. (TW)
Surface electrical properties experiment study phase, volume 3
NASA Technical Reports Server (NTRS)
1973-01-01
The reliability and quality assurance system and procedures used in developing test equipment for the Lunar Experiment projects are described. The subjects discussed include the following: (1) documentation control, (2) design review, (3) parts and materials selection, (4) material procurement, (5) inspection procedures, (6) qualification and special testing, and (7) failure modes and effects analysis.
Analyses of space environment effects on active fiber optic links orbited aboard the LDEF
NASA Technical Reports Server (NTRS)
Taylor, Edward W.; Monarski, T. W.; Berry, J. N.; Sanchez, A. D.; Padden, R. J.; Chapman, S. P.
1993-01-01
The results of the 'Preliminary Analysis of WL Experiment no. 701, Space Environment Effects on Operating Fiber Optic Systems' are correlated with space-simulated post-retrieval terrestrial studies performed on the M0004 experiment. Temperature cycling measurements were performed on the active optical data links for the purpose of assessing link signal-to-noise ratio and bit error rate performance some 69 months following the experiment deployment in low Earth orbit. The early results indicate a high correlation between pre-orbit, orbit, and post-orbit functionality of the first known and longest space demonstration of operating fiber optic systems.
NASA Technical Reports Server (NTRS)
Frost, J. D., Jr.; Salamy, J. G.
1973-01-01
The Skylab sleep-monitoring experiment simulated the timelines and environment expected during a 56-day Skylab mission. Two crewmembers utilized the data acquisition and analysis hardware, and their sleep characteristics were studied in an online fashion during a number of all night recording sessions. Comparison of the results of online automatic analysis with those of postmission visual data analysis was favorable, confirming the feasibility of obtaining reliable objective information concerning sleep characteristics during the Skylab missions. One crewmember exhibited definite changes in certain sleep characteristics (e.g., increased sleep latency, increased time Awake during first third of night, and decreased total sleep time) during the mission.
A High Performance Pulsatile Pump for Aortic Flow Experiments in 3-Dimensional Models.
Chaudhury, Rafeed A; Atlasman, Victor; Pathangey, Girish; Pracht, Nicholas; Adrian, Ronald J; Frakes, David H
2016-06-01
Aortic pathologies such as coarctation, dissection, and aneurysm represent a particularly emergent class of cardiovascular diseases. Computational simulations of aortic flows are growing increasingly important as tools for gaining understanding of these pathologies, as well as for planning their surgical repair. In vitro experiments are required to validate the simulations against real-world data, and the experiments require a pulsatile flow pump system that can provide physiologic flow conditions characteristic of the aorta. We designed a new, highly capable piston-based pulsatile flow pump system that can generate high volume flow rates (850 mL/s), replicate physiologic waveforms, and pump high viscosity fluids against large impedances. The system is also compatible with a broad range of fluid types, and is operable in magnetic resonance imaging environments. Performance of the system was validated using image processing-based analysis of piston motion as well as particle image velocimetry. The new system represents a more capable pumping solution for aortic flow experiments than other available designs, and can be manufactured at a relatively low cost.
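For a piston pump, the instantaneous volume flow rate follows directly from piston kinematics, Q = A·v. A minimal sketch of that relation, using a hypothetical 10 cm piston bore (the paper's actual bore dimension is not given in the abstract) to gauge the piston speed needed for the reported 850 mL/s peak:

```python
import math

def flow_rate_ml_per_s(piston_diameter_cm: float, piston_velocity_cm_s: float) -> float:
    """Volumetric flow from piston kinematics: Q = A * v.
    With cm and cm/s, A*v comes out directly in mL/s (1 cm^3 = 1 mL)."""
    area_cm2 = math.pi * (piston_diameter_cm / 2.0) ** 2
    return area_cm2 * piston_velocity_cm_s

# Hypothetical 10 cm bore: piston speed required for an 850 mL/s peak flow.
area = math.pi * 25.0          # ~78.5 cm^2 piston face area
v_peak = 850.0 / area          # ~10.8 cm/s peak piston velocity
```

Driving the piston through a prescribed velocity waveform then reproduces an arbitrary physiologic flow waveform, which is the basis of image-based validation of piston motion.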
Gas-phase kinetics during diamond growth: CH4 as a growth species
NASA Astrophysics Data System (ADS)
Harris, Stephen J.
1989-04-01
We have used a one-dimensional kinetic analysis to model the gas-phase chemistry that occurred during the diamond growth experiments of Chauhan, Angus, and Gardner [J. Appl. Phys. 47, 4746 (1976)]. In those experiments the weight of diamond seed crystals heated by lamps in a CH4/H2 environment was monitored by a microbalance. No filament or electric discharge was present. Our analysis shows that diamond growth occurred in this system by direct reaction of CH4 on the diamond surface. C2H2 and CH3, which have been proposed as diamond growth species, played no significant role there, although our results do not address their possible contributions in other systems such as filament- or plasma-assisted diamond growth.
Geometric error analysis for shuttle imaging spectrometer experiment
NASA Technical Reports Server (NTRS)
Wang, S. J.; Ih, C. H.
1984-01-01
The demand for more powerful tools for remote sensing and management of earth resources has steadily increased over the last decade. With the recent advancement of area array detectors, high resolution multichannel imaging spectrometers can be realistically constructed. The error analysis study for the Shuttle Imaging Spectrometer Experiment system is documented for the purpose of providing information for design, tradeoff, and performance prediction. Error sources including the Shuttle attitude determination and control system, instrument pointing and misalignment, disturbances, ephemeris, Earth rotation, etc., were investigated. Geometric error mapping functions were developed, characterized, and illustrated extensively with tables and charts. Selected ground patterns and the corresponding image distortions were generated for direct visual inspection of how the various error sources affect the appearance of the ground object images.
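The scale of such geometric errors can be gauged with a first-order mapping from attitude error to ground displacement. A sketch under stated assumptions: nadir viewing, a 300 km orbit altitude, and a 0.01 deg attitude-knowledge error are hypothetical values chosen for illustration, not figures from the report:

```python
import math

def ground_displacement_m(altitude_m: float, pointing_error_rad: float) -> float:
    """First-order geometric error mapping for nadir viewing: a small
    attitude error shifts the ground footprint by roughly h * tan(error)."""
    return altitude_m * math.tan(pointing_error_rad)

# Hypothetical numbers: 300 km altitude, 0.01 deg attitude-knowledge error.
err_rad = math.radians(0.01)
shift = ground_displacement_m(300.0e3, err_rad)   # ~52 m on the ground
```

Off-nadir viewing and Earth rotation add further terms, which is why the full study develops complete error mapping functions rather than this single-angle approximation.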
NASA Technical Reports Server (NTRS)
Keuper, H. R.; Peplies, R. W.; Gillooly, R. P.
1977-01-01
The use of machine scanning and/or computer-based techniques to provide greater objectivity in the photomorphic approach was investigated. Photomorphic analysis and its application in regional planning are discussed. Topics included: delineation of photomorphic regions; inadequacies of existing classification systems; tonal and textural characteristics and signature analysis techniques; pattern recognition and Fourier transform analysis; and optical experiments. A bibliography is included.
NASA Technical Reports Server (NTRS)
Knupp, Kevin R.
1988-01-01
Described is work performed under NASA Grant NAG8-654 for the period 15 March to 15 September 1988. This work entails primarily data analysis and numerical modeling efforts related to the 1986 Satellite Precipitation and Cloud Experiment (SPACE). In the following, the SPACE acronym is used along with the acronym COHMEX, which represents the encompassing Cooperative Huntsville Meteorological Experiment. Progress made during the second half of the first year of the study included: (1) installation and testing of the RAMS numerical modeling system on the Alabama CRAY X-MP/24; (2) the start of an analysis of the 13 July 1986 COHMEX mesoscale convective system (MCS) case; and (3) a cursory examination of a small MCS that formed over the COHMEX region on 15 July 1986. Details of each of these individual tasks are given.
A unified approach to computer analysis and modeling of spacecraft environmental interactions
NASA Technical Reports Server (NTRS)
Katz, I.; Mandell, M. J.; Cassidy, J. J.
1986-01-01
A new, coordinated, unified approach to the development of spacecraft plasma interaction models is proposed. The objective is to eliminate unnecessary duplicative work in order to allow researchers to concentrate on the scientific aspects. By streamlining the developmental process, the interchange between theorists and experimentalists is enhanced, and the transfer of technology to the spacecraft engineering community is faster. This approach is called the UNIfied Spacecraft Interaction Model (UNISIM). UNISIM is a coordinated system of software, hardware, and specifications. It is a tool for modeling and analyzing spacecraft interactions. It will be used to design experiments, to interpret results of experiments, and to aid in future spacecraft design. It breaks a spacecraft interaction analysis into several modules. Each module will perform an analysis for some physical process, using phenomenology and algorithms which are well documented and have been subject to review. This system and its characteristics are discussed.
Research in interactive scene analysis
NASA Technical Reports Server (NTRS)
Tenenbaum, J. M.; Garvey, T. D.; Weyl, S. A.; Wolf, H. C.
1975-01-01
An interactive scene interpretation system (ISIS) was developed as a tool for constructing and experimenting with man-machine and automatic scene analysis methods tailored for particular image domains. A recently developed region analysis subsystem based on the paradigm of Brice and Fennema is described. Using this subsystem a series of experiments was conducted to determine good criteria for initially partitioning a scene into atomic regions and for merging these regions into a final partition of the scene along object boundaries. Semantic (problem-dependent) knowledge is essential for complete, correct partitions of complex real-world scenes. An interactive approach to semantic scene segmentation was developed and demonstrated on both landscape and indoor scenes. This approach provides a reasonable methodology for segmenting scenes that cannot be processed completely automatically, and is a promising basis for a future automatic system. A program is described that can automatically generate strategies for finding specific objects in a scene based on manually designated pictorial examples.
NASA Technical Reports Server (NTRS)
Thesken, John C.; Bowman, Cheryl L.; Arnold, Steven M.
2003-01-01
Successful spaceflight operations require onboard power management systems that reliably achieve mission objectives for a minimal launch weight. Because of their high specific energies and potential for reduced maintenance and logistics, composite flywheels are an attractive alternative to electrochemical batteries. The Rotor Durability Team, which comprises members from the Ohio Aerospace Institute (OAI) and the NASA Glenn Research Center, completed a program of elevated temperature testing at Glenn's Life Prediction Branch's Fatigue Laboratory. The experiments provided unique design data essential to the safety and durability of flywheel energy storage systems for the International Space Station and other manned spaceflight applications. Analysis of the experimental data (ref. 1) demonstrated that the compressive stress relaxation of composite flywheel rotor material is significantly greater than the commonly available tensile stress relaxation data. Durability analysis of compression-preloaded flywheel rotors is required for accurate safe-life predictions for use in the International Space Station.
NASA Astrophysics Data System (ADS)
Tao, Weijun; Zhang, Jianyun; Li, Guangyi; Liu, Tao; Liu, Fengping; Yi, Jingang; Wang, Hesheng; Inoue, Yoshio
2016-02-01
Wearable sensors are attractive for gait analysis because these systems can measure and obtain real-time human gait and motion information outside of the laboratory for a longer duration. In this paper, we present a new wearable ground reaction force (GRF) sensing system for ambulatory gait measurement. The GRF sensor system is also used to quantify patients' lower-limb gait rehabilitation. We conducted a validation experiment for the sensor system on seven volunteer subjects (weight 62.39 +/- 9.69 kg and height 169.13 +/- 5.64 cm). The experiments include the use of the GRF sensing system by the subjects in the following conditions: (1) normal walking; (2) walking with the rehabilitation training device; and (3) walking with a knee brace and the rehabilitation training device. The experimental results support the hypothesis that the wearable GRF sensor system is capable of quantifying patients' lower-limb rehabilitation. The proposed GRF sensing system can also be used for assessing the effectiveness of a gait rehabilitation system and for providing bio-feedback information to the subjects.
Systems-Oriented Workplace Learning Experiences for Early Learners: Three Models.
O'Brien, Bridget C; Bachhuber, Melissa R; Teherani, Arianne; Iker, Theresa M; Batt, Joanne; O'Sullivan, Patricia S
2017-05-01
Early workplace learning experiences may be effective for learning systems-based practice. This study explores systems-oriented workplace learning experiences (SOWLEs) for early learners to suggest a framework for their development. The authors used a two-phase qualitative case study design. In Phase 1 (spring 2014), they prepared case write-ups based on transcribed interviews from 10 SOWLE leaders at the authors' institution and, through comparative analysis of cases, identified three SOWLE models. In Phase 2 (summer 2014), studying seven 8-week SOWLE pilots, the authors used interview and observational data collected from the seven participating medical students, two pharmacy students, and site leaders to construct case write-ups of each pilot and to verify and elaborate the models. In Model 1, students performed specific patient care activities that addressed a system gap. Some site leaders helped students connect the activities to larger systems problems and potential improvements. In Model 2, students participated in predetermined systems improvement (SI) projects, gaining experience in the improvement process. Site leaders had experience in SI and often had significant roles in the projects. In Model 3, students worked with key stakeholders to develop a project and conduct a small test of change. They experienced most elements of an improvement cycle. Site leaders often had experience with SI and knew how to guide and support students' learning. Each model could offer systems-oriented learning opportunities provided that key elements are in place including site leaders facile in SI concepts and able to guide students in SOWLE activities.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cai, B; Sun, B; Yaddanapudi, S
Purpose: To describe the clinical use of a linear accelerator (Linac) DailyQA system with only EPID and OBI, to assess its reliability over an 18-month period, and to improve the robustness of this system based on QA failure analysis. Methods: A DailyQA solution utilizing an in-house designed phantom, combined EPID and OBI image acquisitions, and a web-based data analysis and reporting system was commissioned and used in our clinic to measure geometric, dosimetry and imaging components of a Varian TrueBeam Linac. During an 18-month period (335 working days), the DailyQA results, including output constancy, beam flatness and symmetry, uniformity, TPR20/10, and MV and kV imaging quality, were collected and analyzed. For output constancy, an independent monthly QA system with an ionization chamber (IC) and annual/incidental TG-51 measurements with an ADCL IC were performed and cross-compared with the DailyQA system. Thorough analyses were performed on the recorded QA failures to evaluate machine performance, optimize the data analysis algorithm, adjust the tolerance settings and improve the training procedure to prevent future failures. Results: A clinical workflow including beam delivery, data analysis, QA report generation and physics approval was established and optimized to suit daily clinical operation. The output tests over the 335 working days agreed with the monthly QA system within 1.3% and with TG-51 results within 1%. QA passed on the first attempt on 236 of the 335 days. Based on the QA failure analysis, the gamma criterion was revised from (1%, 1 mm) to (2%, 1 mm), considering both QA accuracy and efficiency, and the data analysis algorithm was improved to handle multiple entries for a repeated test. Conclusion: We described our 18-month clinical experience with a novel DailyQA system using only EPID and OBI. The long-term data presented demonstrate that the system is suitable and reliable for Linac daily QA.
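The (2%, 1 mm) gamma criterion mentioned above combines a dose-difference test with a distance-to-agreement test. A minimal 1-D sketch of a gamma-index computation is shown below; this is not the vendor's implementation, and the choice of global normalization is an assumption.

```python
import numpy as np

def gamma_1d(dose_ref, dose_eval, x_mm, dose_tol=0.02, dist_tol_mm=1.0):
    """1-D gamma index with global dose normalization.

    dose_ref, dose_eval: reference and evaluated dose profiles on positions x_mm
    dose_tol: fractional dose criterion (0.02 -> the 2% criterion)
    dist_tol_mm: distance-to-agreement criterion [mm]
    """
    dose_ref = np.asarray(dose_ref, float)
    dose_eval = np.asarray(dose_eval, float)
    x_mm = np.asarray(x_mm, float)
    d_norm = dose_tol * dose_ref.max()            # global normalization dose
    gamma = np.empty_like(dose_ref)
    for i, (xr, dr) in enumerate(zip(x_mm, dose_ref)):
        dd = (dose_eval - dr) / d_norm            # dose-difference term
        dx = (x_mm - xr) / dist_tol_mm            # distance term
        gamma[i] = np.sqrt(dd**2 + dx**2).min()   # best agreement anywhere
    return gamma
```

A point passes when gamma <= 1; the daily pass rate is then the fraction of passing points.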
System parameter identification from projection of inverse analysis
NASA Astrophysics Data System (ADS)
Liu, K.; Law, S. S.; Zhu, X. Q.
2017-05-01
The output of a system due to a change of its parameters is often approximated with the sensitivity matrix from the first-order Taylor series. The system output can be measured in practice, but the perturbation in the system parameters is usually not available. Inverse sensitivity analysis can be adopted to estimate the unknown parameter perturbation from the difference between the observed output data and the corresponding analytical output data calculated from the original system model. The inverse sensitivity analysis is revisited in this paper with improvements based on principal component analysis of the analytical data calculated from the known system model. The identification equation is projected into a subspace of principal components of the system output, and the sensitivity of the inverse analysis is improved with an iterative model-updating procedure. The proposed method is numerically validated with a planar truss structure and with dynamic experiments on a seven-storey planar steel frame. Results show that it is robust to measurement noise, and that the location and extent of stiffness perturbation can be identified with better accuracy than with the conventional response sensitivity-based method.
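The identification step described above can be sketched as follows, assuming the sensitivity matrix S and the output residual are available. The component-selection rule (keeping 99.9% of the singular-value energy) is an illustrative assumption, and the outer model-updating iteration that recomputes S is omitted.

```python
import numpy as np

def identify_perturbation(S, dy, n_pc=None):
    """Solve dy ~= S @ dtheta in a principal-component subspace.

    S: (m, p) first-order sensitivity matrix of outputs w.r.t. parameters
    dy: (m,) measured-minus-analytical output difference
    n_pc: number of principal components kept; if None, keep 99.9% of the
          singular-value energy (an illustrative rule, not from the paper)
    """
    U, s, _ = np.linalg.svd(S, full_matrices=False)
    if n_pc is None:
        energy = np.cumsum(s**2) / np.sum(s**2)
        n_pc = int(np.searchsorted(energy, 0.999)) + 1
    P = U[:, :n_pc]                    # principal directions of the output space
    # projected identification equation, solved by least squares
    dtheta, *_ = np.linalg.lstsq(P.T @ S, P.T @ dy, rcond=None)
    return dtheta
```

In the iterative form, the model is updated with dtheta, S is recomputed from the updated model, and the solve is repeated until the residual stops decreasing.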
A LabVIEW-Based Virtual Instrument System for Laser-Induced Fluorescence Spectroscopy.
Wu, Qijun; Wang, Lufei; Zu, Lily
2011-01-01
We report the design and operation of a Virtual Instrument (VI) system based on LabVIEW 2009 for laser-induced fluorescence experiments. This system achieves synchronous control of equipment and acquisition of real-time fluorescence data communicating with a single computer via GPIB, USB, RS232, and parallel ports. The reported VI system can also accomplish data display, saving, and analysis, and printing the results. The VI system performs sequences of operations automatically, and this system has been successfully applied to obtain the excitation and dispersion spectra of α-methylnaphthalene. The reported VI system opens up new possibilities for researchers and increases the efficiency and precision of experiments. The design and operation of the VI system are described in detail in this paper, and the advantages that this system can provide are highlighted.
Video Analysis of Granular Gases in a Low-Gravity Environment
NASA Astrophysics Data System (ADS)
Lewallen, Erin
2004-10-01
Granular Agglomeration in Non-Gravitating Systems is a research project undertaken by the University of Tulsa Granular Dynamics Group. The project investigates the effects of weightlessness on granular systems by studying the dynamics of a "gas" of 1-mm diameter brass ball bearings driven at various amplitudes and frequencies in low-gravity. Models predict that particles in systems subjected to these conditions should exhibit clustering behavior due to energy loss through multiple inelastic collisions. Observation and study of clustering in our experiment could shed light on this phenomenon as a possible mechanism by which particles in space coalesce to form stable objects such as planetesimals and planetary ring systems. Our experiment has flown on NASA's KC-135 low gravity aircraft. Data analysis techniques for video data collected during these flights include modification of images using Adobe Photoshop and development of ball identification and tracking programs written in Interactive Data Language. By tracking individual balls, we aim to establish speed distributions for granular gases and thereby obtain values for granular temperature.
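Once ball positions have been tracked frame by frame, the speed distribution and a granular temperature follow directly from frame-to-frame displacements. A minimal sketch in Python (the original analysis used Interactive Data Language; the array layout and bin count here are assumptions):

```python
import numpy as np

def speed_distribution(tracks, dt, bins=30):
    """Speed histogram and granular temperature from tracked ball positions.

    tracks: list of (T_i, 2) arrays of positions over frames, one per ball
    dt: time between video frames [s]
    """
    speeds = []
    for xy in tracks:
        v = np.diff(np.asarray(xy, float), axis=0) / dt   # frame-to-frame velocity
        speeds.append(np.linalg.norm(v, axis=1))
    speeds = np.concatenate(speeds)
    counts, edges = np.histogram(speeds, bins=bins)
    granular_temperature = np.mean(speeds**2)   # ~ mean kinetic energy per unit mass
    return counts, edges, granular_temperature
```

Clustering would show up as the speed distribution narrowing and the granular temperature dropping as collisions dissipate energy.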
Pion correlations in relativistic heavy ion collisions at Heavy Ion Spectrometer Systems (HISS)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Christie, W.B. Jr.
This thesis contains the setup, analysis and results of experiment E684H, "Multi-Pion Correlations in Relativistic Heavy Ion Collisions". The goals of the original proposal were: (1) to initiate the use of the HISS facility in the study of central relativistic heavy ion collisions (RHIC); (2) to perform a second-generation experiment for the detailed study of the pion source in RHIC. The first-generation experiments, implied by the second goal above, refer to pion correlation studies which the Riverside group had performed at the LBL streamer chamber. The major advantage offered by moving the pion correlation studies to HISS is that, being an electronic detector system, as opposed to the streamer chamber, which is a visual detector, one can greatly increase the statistics for a study of this sort. An additional advantage is that once one has written the necessary detector and physics analysis code for a particular type of study, the study may be extended to investigate the systematics with much less effort and in a relatively short time. This paper discusses the physics motivation for this experiment, the experimental setup and detectors used, the pion correlation analysis, the results, and the conclusions and possible future directions for pion studies at HISS. If one is not interested in all the details of the experiment, I believe that by reading the sections on intensity interferometry, the section on the fitting of the correlation function and the systematic corrections applied, and the results section, one will get a fairly complete synopsis of the experiment.
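Intensity interferometry of this kind builds the two-pion correlation function C2(q) as the ratio of the relative-momentum distribution of same-event pairs to that of mixed-event (uncorrelated) pairs. A schematic sketch, leaving out the Coulomb and acceptance corrections such an analysis applies:

```python
import numpy as np

def correlation_function(q_same, q_mixed, bins=20):
    """Two-pion correlation function C2(q) by the mixed-event technique.

    q_same: relative momenta of pion pairs from the same event
    q_mixed: relative momenta of pairs built across different events
    """
    same, edges = np.histogram(q_same, bins=bins)
    mixed, _ = np.histogram(q_mixed, bins=edges)
    # normalize each distribution to unit area, then take the ratio
    with np.errstate(divide="ignore", invalid="ignore"):
        c2 = (same / same.sum()) / (mixed / mixed.sum())
    q = 0.5 * (edges[:-1] + edges[1:])   # bin centers
    return q, c2
```

Fitting a form such as C2(q) = 1 + lambda * exp(-q^2 R^2) to the result then yields the source radius R and the chaoticity parameter lambda.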
NASA-Langley Research Center's Aircraft Condition Analysis and Management System Implementation
NASA Technical Reports Server (NTRS)
Frye, Mark W.; Bailey, Roger M.; Jessup, Artie D.
2004-01-01
This document describes the hardware implementation design and architecture of Aeronautical Radio Incorporated (ARINC)'s Aircraft Condition Analysis and Management System (ACAMS), which was developed at NASA-Langley Research Center (LaRC) for use in its Airborne Research Integrated Experiments System (ARIES) Laboratory. This activity is part of NASA's Aviation Safety Program (AvSP), the Single Aircraft Accident Prevention (SAAP) project to develop safety-enabling technologies for aircraft and airborne systems. The fundamental intent of these technologies is to allow timely intervention or remediation to improve unsafe conditions before they become life threatening.
NASA Technical Reports Server (NTRS)
Deloach, R.; Morris, A. L.; Mcbeth, R. B.
1976-01-01
A portable boundary-layer meteorological data-acquisition and analysis system is described which employs a small tethered balloon and a programmable calculator. The system is capable of measuring pressure, wet- and dry-bulb temperature, wind speed, and temperature fluctuations as a function of height and time. Other quantities, which can be calculated in terms of these, can also be made available in real time. All quantities, measured and calculated, can be printed, plotted, and stored on magnetic tape in the field during the data-acquisition phase of an experiment.
Heart Sound Biometric System Based on Marginal Spectrum Analysis
Zhao, Zhidong; Shen, Qinqin; Ren, Fangqin
2013-01-01
This work presents a heart sound biometric system based on marginal spectrum analysis, a new feature extraction technique for identification purposes. The heart sound identification system comprises signal acquisition, pre-processing, feature extraction, training, and identification. Experiments on the selection of the optimal values for the system parameters are conducted. The results indicate that the new spectrum coefficients achieve a recognition rate of 94.40%, a significant increase over the traditional Fourier spectrum (84.32%), on a database of 280 heart sounds from 40 participants.
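The marginal spectrum integrates the Hilbert time-frequency amplitude over time. A simplified sketch is shown below: a full Hilbert-Huang marginal spectrum would first split the signal into intrinsic mode functions by empirical mode decomposition, which is omitted here, and the bin count is an arbitrary choice.

```python
import numpy as np

def _analytic(x):
    """Analytic signal via FFT (zero the negative frequencies, double the positive)."""
    N = len(x)
    X = np.fft.fft(x)
    h = np.zeros(N)
    h[0] = 1.0
    if N % 2 == 0:
        h[N // 2] = 1.0
        h[1:N // 2] = 2.0
    else:
        h[1:(N + 1) // 2] = 2.0
    return np.fft.ifft(X * h)

def marginal_spectrum(components, fs, n_bins=64):
    """Crude marginal (Hilbert) spectrum: instantaneous amplitude accumulated
    into instantaneous-frequency bins, integrated over time."""
    components = np.atleast_2d(components)      # each row: one oscillatory component
    freqs = np.linspace(0.0, fs / 2.0, n_bins + 1)
    h = np.zeros(n_bins)
    for comp in components:
        z = _analytic(comp)
        amp = np.abs(z)[1:]                     # instantaneous amplitude
        inst_f = np.diff(np.unwrap(np.angle(z))) * fs / (2 * np.pi)
        idx = np.clip(np.digitize(inst_f, freqs) - 1, 0, n_bins - 1)
        np.add.at(h, idx, amp)                  # accumulate amplitude per bin
    return freqs[:-1], h / fs                   # time-integrated amplitude
```

For identification, coefficients derived from such a spectrum would be fed to the training and matching stages described above.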
Building a Propulsion Experiment Project Management Environment
NASA Technical Reports Server (NTRS)
Keiser, Ken; Tanner, Steve; Hatcher, Danny; Graves, Sara
2004-01-01
What do you get when you cross rocket scientists with computer geeks? An interactive, distributed computing web of tools and services providing a more productive environment for propulsion research and development. The Rocket Engine Advancement Program 2 (REAP2) project involves researchers at several institutions collaborating on propulsion experiments and modeling. In an effort to facilitate these collaborations among researchers at different locations and with different specializations, researchers at the Information Technology and Systems Center, University of Alabama in Huntsville, are creating a prototype web-based interactive information system in support of propulsion research. This system, based on experience gained in creating similar systems for NASA Earth science field experiment campaigns such as the Convection and Moisture Experiments (CAMEX), will assist in the planning and analysis of model and experiment results across REAP2 participants. The initial version of the Propulsion Experiment Project Management Environment (PExPM) consists of a controlled-access web portal facilitating the drafting and sharing of working documents and publications. Interactive tools for building and searching an annotated bibliography of publications related to REAP2 research topics have been created to help organize and maintain the results of literature searches. Work is also underway, with some initial prototypes in place, on interactive project management tools that allow project managers to schedule experiment activities, track status, and report on results. This paper describes current successes, plans, and expected challenges for this project.
Zhang, Lei; Zhao, Haiyu; Liu, Yang; Dong, Honghuan; Lv, Beiran; Fang, Min; Zhao, Huihui
2016-06-01
This study was conducted to establish the multicomponent sequential metabolism (MSM) method based on comparative analysis along the digestive system following oral administration of licorice (Glycyrrhiza uralensis Fisch., Leguminosae), a traditional Chinese medicine widely used for harmonizing other ingredients in a formula. The licorice water extract (LWE) dissolved in Krebs-Ringer buffer solution (1 g/mL) was used to carry out the experiments, and the comparative analysis was performed using HPLC and LC-MS/MS methods. In vitro incubation, in situ closed-loop and in vivo blood sampling were used to measure the LWE metabolic profile along the digestive system. The incubation experiment showed that the LWE was essentially stable in digestive juice. A comparative analysis then presented the metabolic profile of each prototype component and its corresponding metabolites. The liver was the major metabolic organ for LWE, and metabolism by the intestinal flora and gut wall was also an important part of the process. The MSM method is practical and could be a potential method to describe the metabolic routes of multiple components before absorption into the systemic blood stream. Copyright © 2015 John Wiley & Sons, Ltd.
Experiences with Text Mining Large Collections of Unstructured Systems Development Artifacts at JPL
NASA Technical Reports Server (NTRS)
Port, Dan; Nikora, Allen; Hihn, Jairus; Huang, LiGuo
2011-01-01
Often repositories of systems engineering artifacts at NASA's Jet Propulsion Laboratory (JPL) are so large and poorly structured that they have outgrown our capability to effectively process their contents manually to extract useful information. Sophisticated text mining methods and tools seem a quick, low-effort approach to automating our limited manual efforts. Our experience exploring such methods in three areas (historical risk analysis, defect identification based on requirements analysis, and over-time analysis of system anomalies at JPL) has shown that obtaining useful results requires substantial unanticipated effort, from preprocessing the data to transforming the output for practical applications. We have not observed any quick 'wins' or realized benefit from short-term effort avoidance through automation in this area. Surprisingly, we have realized a number of unexpected long-term benefits from the process of applying text mining to our repositories. This paper elaborates some of these benefits and our important lessons learned from the process of preparing and applying text mining to large unstructured system artifacts at JPL, aiming to benefit future text mining applications in similar problem domains and, we hope, in broader areas of application.
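As a concrete example of the kind of preprocessing such efforts involve, a toy TF-IDF weighting over tokenized artifacts can be sketched as follows; this is a generic illustration, not the specific tooling used at JPL.

```python
import math
from collections import Counter

def tfidf(docs):
    """Toy TF-IDF weights for tokenized documents (each a list of tokens)."""
    n = len(docs)
    df = Counter()                          # document frequency per token
    for doc in docs:
        df.update(set(doc))
    idf = {tok: math.log(n / df[tok]) for tok in df}
    weights = []
    for doc in docs:
        tf = Counter(doc)
        # term frequency (normalized by document length) times inverse document frequency
        weights.append({tok: (cnt / len(doc)) * idf[tok] for tok, cnt in tf.items()})
    return weights
```

Tokens that appear in every artifact get zero weight, which is one reason raw repositories need so much cleaning: boilerplate swamps the discriminative vocabulary.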
Model 0A wind turbine generator FMEA
NASA Technical Reports Server (NTRS)
Klein, William E.; Lalli, Vincent R.
1989-01-01
The results of Failure Modes and Effects Analysis (FMEA) conducted for the Wind Turbine Generators are presented. The FMEA was performed for the functional modes of each system, subsystem, or component. The single-point failures were eliminated for most of the systems. The blade system was the only exception. The qualitative probability of a blade separating was estimated at level D-remote. Many changes were made to the hardware as a result of this analysis. The most significant change was the addition of the safety system. Operational experience and need to improve machine availability have resulted in subsequent changes to the various systems which are also reflected in this FMEA.
[The reference pricing of pharmaceuticals in European countries].
Gildeyeva, G N; Starykh, D A
2013-01-01
The article presents an analysis of various approaches to pharmaceutical pricing under existing systems of pharmaceutical coverage. Pricing is considered in relation to existing coverage systems based on the principles of insurance and co-financing. A detailed analysis is presented of the methodology for setting reference prices for pharmaceuticals in different European countries. The experience of European countries in evaluating the interchangeability of pharmaceuticals is also discussed.
Analysis of defects of overhead facade systems and other light thin-walled structures
NASA Astrophysics Data System (ADS)
Endzhievskiy, L.; Frolovskaia, A.; Petrova, Y.
2017-04-01
This paper analyzes the defects, and the causes of those defects, in contemporary design solutions, using the example of overhead facade systems with ventilated air gaps and light steel thin-walled structures, on the basis of field experiments. The analysis is performed at all stages of work: design, manufacture (including quality control), construction, and operation. Practical examples are given. The main causes of accidents and the prediction of accident rates are examined and discussed.
Development and approach to low-frequency microgravity isolation systems
NASA Technical Reports Server (NTRS)
Grodsinsky, Carlos M.
1990-01-01
The low-gravity environment provided by space flight has afforded the science community a unique arena for the study of fundamental and technological sciences. However, the dynamic environment observed on space shuttle flights and predicted for Space Station Freedom has complicated the analysis of prior microgravity experiments and prompted concern for the viability of proposed space experiments requiring long-term, low-gravity environments. Thus, isolation systems capable of providing significant improvements to this random environment are being developed. The design constraints imposed by acceleration-sensitive, microgravity experiment payloads in the unique environment of space and a theoretical background for active isolation are discussed. A design is presented for a six-degree-of-freedom, active, inertial isolation system based on the baseline relative and inertial isolation techniques described.
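The inertial ("skyhook") damping idea behind such active isolators can be illustrated with a one-degree-of-freedom simulation: the damping force is referenced to inertial space rather than to the vibrating base, so base motion above the suspension frequency is strongly attenuated. All numerical values below are illustrative, not taken from the flight design.

```python
import numpy as np

def skyhook_transmissibility(f_base_hz, c_sky, m=1.0, k=4.0, fs=1000.0, t_end=20.0):
    """Steady-state motion transmissibility of a 1-DOF 'skyhook' isolator.

    The payload (mass m) rides on a soft spring (stiffness k) attached to a
    sinusoidally moving base; the active damper force -c_sky * v is referenced
    to inertial space. All parameter values are illustrative.
    """
    dt = 1.0 / fs
    n = int(t_end * fs)
    t = np.arange(n) * dt
    xb = np.sin(2.0 * np.pi * f_base_hz * t)   # unit-amplitude base motion
    x = v = 0.0
    xs = np.empty(n)
    for i in range(n):                         # semi-implicit Euler integration
        a = (-k * (x - xb[i]) - c_sky * v) / m
        v += a * dt
        x += v * dt
        xs[i] = x
    tail = xs[n // 2:]                         # discard the start-up transient
    return (tail.max() - tail.min()) / 2.0     # payload amplitude / base amplitude
```

With k = 4 N/m and m = 1 kg the suspension frequency is about 0.32 Hz, so disturbances at a few hertz, typical of crew activity and machinery, are attenuated by orders of magnitude.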
NASA Technical Reports Server (NTRS)
Moore, N. R.; Ebbeler, D. H.; Newlin, L. E.; Sutharshana, S.; Creager, M.
1992-01-01
An improved methodology for quantitatively evaluating failure risk of spaceflight systems to assess flight readiness and identify risk control measures is presented. This methodology, called Probabilistic Failure Assessment (PFA), combines operating experience from tests and flights with engineering analysis to estimate failure risk. The PFA methodology is of particular value when information on which to base an assessment of failure risk, including test experience and knowledge of parameters used in engineering analyses of failure phenomena, is expensive or difficult to acquire. The PFA methodology is a prescribed statistical structure in which engineering analysis models that characterize failure phenomena are used conjointly with uncertainties about analysis parameters and/or modeling accuracy to estimate failure probability distributions for specific failure modes. These distributions can then be modified, by means of statistical procedures of the PFA methodology, to reflect any test or flight experience. Conventional engineering analysis models currently employed for design or failure prediction are used in this methodology. The PFA methodology is described and examples of its application are presented. Conventional approaches to failure risk evaluation for spaceflight systems are discussed, and the rationale for the approach taken in the PFA methodology is presented. The statistical methods, engineering models, and computer software used in fatigue failure mode applications are thoroughly documented.
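The core of such an assessment, propagating parameter uncertainty through an engineering limit-state model to a failure probability, can be sketched with a simple Monte Carlo loop. This illustrates the general idea only, not the PFA methodology's specific statistical structure.

```python
import random

def failure_probability(limit_state, param_samplers, n=20000, seed=0):
    """Monte Carlo failure-probability estimate for one failure mode.

    limit_state: engineering model; failure is declared when it returns <= 0
    param_samplers: dict name -> callable(rng) drawing one sample, encoding
                    uncertainty about the analysis parameters
    """
    rng = random.Random(seed)
    failures = 0
    for _ in range(n):
        params = {name: draw(rng) for name, draw in param_samplers.items()}
        if limit_state(**params) <= 0:
            failures += 1
    return failures / n
```

For a stress-strength failure mode, for example, the limit state is strength minus stress, with both drawn from distributions expressing test scatter and modeling uncertainty; flight and test experience would then update these distributions.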
Accelerometer Data Analysis and Presentation Techniques
NASA Technical Reports Server (NTRS)
Rogers, Melissa J. B.; Hrovat, Kenneth; McPherson, Kevin; Moskowitz, Milton E.; Reckart, Timothy
1997-01-01
The NASA Lewis Research Center's Principal Investigator Microgravity Services project analyzes Orbital Acceleration Research Experiment and Space Acceleration Measurement System data for principal investigators of microgravity experiments. Principal investigators need a thorough understanding of data analysis techniques so that they can request appropriate analyses to best interpret accelerometer data. Accelerometer data sampling and filtering is introduced along with the related topics of resolution and aliasing. Specific information about the Orbital Acceleration Research Experiment and Space Acceleration Measurement System data sampling and filtering is given. Time domain data analysis techniques are discussed and example environment interpretations are made using plots of acceleration versus time, interval average acceleration versus time, interval root-mean-square acceleration versus time, trimmean acceleration versus time, quasi-steady three dimensional histograms, and prediction of quasi-steady levels at different locations. An introduction to Fourier transform theory and windowing is provided along with specific analysis techniques and data interpretations. The frequency domain analyses discussed are power spectral density versus frequency, cumulative root-mean-square acceleration versus frequency, root-mean-square acceleration versus frequency, one-third octave band root-mean-square acceleration versus frequency, and power spectral density versus frequency versus time (spectrogram). Instructions for accessing NASA Lewis Research Center accelerometer data and related information using the internet are provided.
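Two of the time- and frequency-domain quantities listed above, interval RMS acceleration and the power spectral density, can be sketched directly; the window lengths and the plain periodogram estimator are illustrative choices, not the project's exact processing chain.

```python
import numpy as np

def interval_rms(accel, fs, interval_s=1.0):
    """Interval root-mean-square acceleration versus time."""
    a = np.asarray(accel, float)
    n = int(fs * interval_s)
    m = len(a) // n
    blocks = a[: m * n].reshape(m, n)
    t_mid = (np.arange(m) + 0.5) * interval_s       # interval mid-times [s]
    return t_mid, np.sqrt(np.mean(blocks**2, axis=1))

def psd(accel, fs):
    """One-sided power spectral density via a plain periodogram."""
    a = np.asarray(accel, float)
    a = a - a.mean()                                 # remove the DC offset
    n = len(a)
    spec = np.abs(np.fft.rfft(a))**2 / (fs * n)
    spec[1:-1] *= 2.0                                # fold in negative frequencies
    return np.fft.rfftfreq(n, 1.0 / fs), spec
```

Stacking such periodograms over successive windows gives the PSD-versus-frequency-versus-time spectrogram mentioned above; note that the sampling and anti-alias filtering discussed in the text bound which frequencies these estimates can represent.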
The ATLAS PanDA Monitoring System and its Evolution
NASA Astrophysics Data System (ADS)
Klimentov, A.; Nevski, P.; Potekhin, M.; Wenaus, T.
2011-12-01
The PanDA (Production and Distributed Analysis) Workload Management System is used for ATLAS distributed production and analysis worldwide. The needs of ATLAS global computing imposed challenging requirements on the design of PanDA in areas such as scalability, robustness, automation, diagnostics, and usability for both production shifters and analysis users. Through a system-wide job database, the PanDA monitor provides a comprehensive and coherent view of the system and job execution, from high level summaries to detailed drill-down job diagnostics. It is (like the rest of PanDA) an Apache-based Python application backed by Oracle. The presentation layer is HTML code generated on the fly in the Python application which is also responsible for managing database queries. However, this approach is lacking in user interface flexibility, simplicity of communication with external systems, and ease of maintenance. A decision was therefore made to migrate the PanDA monitor server to Django Web Application Framework and apply JSON/AJAX technology in the browser front end. This allows us to greatly reduce the amount of application code, separate data preparation from presentation, leverage open source for tools such as authentication and authorization mechanisms, and provide a richer and more dynamic user experience. We describe our approach, design and initial experience with the migration process.
Madduri, Ravi K.; Sulakhe, Dinanath; Lacinski, Lukasz; Liu, Bo; Rodriguez, Alex; Chard, Kyle; Dave, Utpal J.; Foster, Ian T.
2014-01-01
We describe Globus Genomics, a system that we have developed for rapid analysis of large quantities of next-generation sequencing (NGS) genomic data. This system achieves a high degree of end-to-end automation that encompasses every stage of data analysis including initial data retrieval from remote sequencing centers or storage (via the Globus file transfer system); specification, configuration, and reuse of multi-step processing pipelines (via the Galaxy workflow system); creation of custom Amazon Machine Images and on-demand resource acquisition via a specialized elastic provisioner (on Amazon EC2); and efficient scheduling of these pipelines over many processors (via the HTCondor scheduler). The system allows biomedical researchers to perform rapid analysis of large NGS datasets in a fully automated manner, without software installation or a need for any local computing infrastructure. We report performance and cost results for some representative workloads.
A Monitoring System for the LHCb Data Flow
NASA Astrophysics Data System (ADS)
Barbosa, João; Gaspar, Clara; Jost, Beat; Frank, Markus; Cardoso, Luis G.
2017-06-01
The LHCb experiment uses the LHC accelerator for the collisions that produce the physics data necessary for analysis. The data produced by the detector, which measures the results of the collisions at a rate of 40 MHz, are read out by a complex data acquisition (DAQ) system, which is summarily described in this paper. Distributed systems of such dimensions rely on monitoring and control systems that account for the numerous faults that can happen throughout the whole operation. With this in mind, a new system was created to extend the monitoring of the readout system, in this case by providing an overview of what is happening in each stage of the DAQ process, starting in the hardware trigger performed right after the detector measurements and ending in the local storage of the experiment. This system, a complement to the current run control (experimental control system), intends to shorten reaction times when a problem occurs by providing the operators with detailed information about where a certain fault is occurring. The architecture of the tool and its utilization by the experiment operators are described in this paper.
Detecting Disease Specific Pathway Substructures through an Integrated Systems Biology Approach
Alaimo, Salvatore; Marceca, Gioacchino Paolo; Ferro, Alfredo; Pulvirenti, Alfredo
2017-01-01
In the era of network medicine, pathway analysis methods play a central role in the prediction of phenotype from high throughput experiments. In this paper, we present a network-based systems biology approach capable of extracting disease-perturbed subpathways within pathway networks in connection with expression data taken from The Cancer Genome Atlas (TCGA). Our system extends pathways with missing regulatory elements, such as microRNAs, and their interactions with genes. The framework enables the extraction, visualization, and analysis of statistically significant disease-specific subpathways through an easy to use web interface. Our analysis shows that the methodology is able to fill the gap in current techniques, allowing a more comprehensive analysis of the phenomena underlying disease states.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ishkov, A.; Akopova, Gretta; Evans, Meredydd
This article compares the natural gas transmission systems in the U.S. and Russia and reviews experience with methane mitigation technologies in the two countries. Russia and the United States (U.S.) are the world's largest consumers and producers of natural gas and, consequently, have some of the largest natural gas infrastructure. Given the scale of the two systems, many international oil and natural gas companies have expressed interest in better understanding methane emission volumes and trends as well as methane mitigation options. The systems are inherently different: while the U.S. natural gas transmission system is operated by many companies with pipelines of various characteristics, in Russia predominantly one company, Gazprom, operates the gas transmission system. However, companies in both countries found that reducing methane emissions can be feasible and profitable. Examples of technologies in use include replacing wet seals with dry seals, implementing Directed Inspection and Maintenance (DI&M) programs, performing pipeline pump-down, applying composite wrap for non-leaking pipeline defects and installing low-bleed pneumatics. The research methodology for this paper involved a review of information on methane emission trends and mitigation measures; analytical and statistical data collection; accumulation and analysis of operational data on compressor seals and other emission sources; and analysis of technologies used in both countries to mitigate methane emissions in the transmission sector.
Operators of natural gas transmission systems have many options to reduce natural gas losses. Depending on the value of gas, measures ranging from simple, low-cost fixes, such as adjusting leaking equipment components, to larger-scale measures, such as installing dry seals on compressors, can be applied.
Experience with a sophisticated computer based authoring system
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gardner, P.R.
1984-04-01
In the November 1982 issue of the ADCIS SIG CBT Newsletter, the editor arrives at two conclusions regarding Computer Based Authoring Systems (CBAS): (1) a CBAS drastically reduces programming time and the need for expert programmers, and (2) a CBAS appears to have minimal impact on initial lesson design. Both of these observations bear significantly on any cost-benefit analysis for Computer-Based Training. The first tends to improve cost-effectiveness, but only toward the limits imposed by the second. Westinghouse Hanford Company (WHC) recently purchased a sophisticated CBAS, the WISE/SMART system from Wicat (Orem, UT), for use in the Nuclear Power Industry. This report details our experience with this system relative to items (1) and (2) above; lesson design time is compared with lesson input time. Also provided is the WHC experience in using subject matter experts (though computer neophytes) to design and input CBT materials.
Experience of creating a multifunctional safety system at the coal mining enterprise
NASA Astrophysics Data System (ADS)
Reshetnikov, V. V.; Davkaev, K. S.; Korolkov, M. V.; Lyakhovets, M. V.
2018-05-01
The principles of creating multifunctional safety systems (MFSS) based on mathematical models with Markov properties are considered. The applicability of such models to analyzing the safety and effectiveness of the created systems is substantiated. The method of this analysis and the results of its testing are discussed. A variant of MFSS implementation under the conditions of an operating coal-mining enterprise is given. The functional scheme, data scheme and operating modes of the MFSS are presented. The automated workplace of the industrial safety controller is described.
New space sensor and mesoscale data analysis
NASA Technical Reports Server (NTRS)
Hickey, John S.
1987-01-01
The developed Earth Science and Application Division (ESAD) system/software provides the research scientist with the following capabilities: an extensive data base management capability to convert various experiment data types into a standard format; an interactive analysis and display package (AVE80); an interactive imaging/color graphics capability utilizing the Apple III and IBM PC workstations integrated into the ESAD computer system; and a local and remote smart-terminal capability which provides color video, graphics, and Laserjet output. Recommendations for updating and enhancing the performance of the ESAD computer system are listed.
Friedman, David B
2012-01-01
All quantitative proteomics experiments measure variation between samples. When performing large-scale experiments that involve multiple conditions or treatments, the experimental design should include the appropriate number of individual biological replicates from each condition to enable the distinction between a relevant biological signal from technical noise. Multivariate statistical analyses, such as principal component analysis (PCA), provide a global perspective on experimental variation, thereby enabling the assessment of whether the variation describes the expected biological signal or the unanticipated technical/biological noise inherent in the system. Examples will be shown from high-resolution multivariable DIGE experiments where PCA was instrumental in demonstrating biologically significant variation as well as sample outliers, fouled samples, and overriding technical variation that would not be readily observed using standard univariate tests.
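The role PCA plays in the abstract above, flagging sample outliers and overriding technical variation across biological replicates, can be sketched with a minimal, self-contained example. The data below are synthetic (a hypothetical 6-replicate, 50-feature experiment with one deliberately "fouled" sample); this illustrates the general technique, not the authors' DIGE analysis pipeline.

```python
import numpy as np

# Hypothetical experiment: 6 biological replicates x 50 protein features,
# with one deliberately "fouled" sample (index 5) shifted off the others.
rng = np.random.default_rng(0)
X = rng.normal(0.0, 1.0, size=(6, 50))
X[5] += 5.0  # simulate a sample outlier / overriding technical variation

# PCA via SVD on the mean-centered data matrix
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = U * S                    # sample coordinates in PC space
explained = S**2 / np.sum(S**2)   # fraction of variance per component

# The fouled sample should sit far from the others along PC1
pc1 = scores[:, 0]
outlier = int(np.argmax(np.abs(pc1 - np.median(pc1))))
```

Plotting the first two columns of `scores` gives the global perspective described above: a tight cluster of replicates with the outlier clearly separated.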
NASA Technical Reports Server (NTRS)
Staveland, Lowell
1994-01-01
This is the experimental and software detailed design report for the prototype task loading model (TLM) developed as part of the man-machine integration design and analysis system (MIDAS), as implemented and tested in phase 6 of the Army-NASA Aircrew/Aircraft Integration (A3I) Program. The A3I program is an exploratory development effort to advance the capabilities and use of computational representations of human performance and behavior in the design, synthesis, and analysis of manned systems. The MIDAS TLM computationally models the demands that designs impose on operators, to aid engineers in the conceptual design of aircraft crewstations. This report describes the TLM and the results of a series of experiments run this phase to test its capabilities as a predictive task demand modeling tool. Specifically, it includes discussions of: the inputs and outputs of the TLM, the theories underlying it, the results of the test experiments, the use of the TLM as both a stand-alone tool and part of a complete human operator simulation, and a brief introduction to the TLM software design.
Wear and breakage monitoring of cutting tools by an optical method: theory
NASA Astrophysics Data System (ADS)
Li, Jianfeng; Zhang, Yongqing; Chen, Fangrong; Tian, Zhiren; Wang, Yao
1996-10-01
An essential part of a machining system in an unmanned flexible manufacturing system is the ability to automatically change out tools that are worn or damaged. An optoelectronic method for in situ monitoring of the flank wear and breakage of cutting tools is presented. A flank wear estimation system is implemented in a laboratory environment, and its performance is evaluated through turning experiments. The flank wear model parameters that need to be known a priori are determined through several preliminary experiments, or from data available in the literature. The resulting cutting conditions are typical of those used in finishing cutting operations. Through time- and amplitude-domain analysis of the cutting tool wear and breakage states, it is found that the signal variance σ²x and the autocorrelation coefficient ρ(m) can reflect the regular changes associated with cutting tool wear and breakage, but these measures alone are not sufficient given the complexity of the wear and breakage processes of cutting tools. Time series analysis and frequency spectrum analysis will be carried out and described in later papers.
NASA Technical Reports Server (NTRS)
Coho, William K.; Weiland, Karen J.; VanZandt, David M.
1998-01-01
A space experiment designed to study the behavior of combustion without the gravitational effects of buoyancy was launched aboard the Space Shuttle Columbia on July 1, 1997. The space experiment, designated Combustion Module-1 (CM-1), was one of several manifested on the Microgravity Sciences Laboratory-1 (MSL-1) mission. The launch, designated STS-94, carried the Spacelab Module as the payload, in which the MSL-1 experiments were conducted by the Shuttle crewmembers. CM-1 was designed to accommodate two different combustion experiments during MSL-1. One experiment, the Structure of Flame Balls at Low Lewis-number experiment (SOFBALL), required gas chromatography analysis to verify the composition of the known, premixed gases prior to combustion, and to determine the remaining reactant and the products resulting from the combustion process in microgravity. A commercial, off-the-shelf, dual-channel micro gas chromatograph was procured and modified to interface with the CM-1 Fluids Supply Package and the CM-1 Combustion Chamber, to accommodate two different carrier gases (each flowing through its own independent column module), to withstand the launch environment of the Space Shuttle, to accept Spacelab electrical power, and to meet the Spacelab flight requirements for electromagnetic interference (EMI) and offgassing. The GC data were downlinked to the Marshall Space Flight Center for near-real-time analysis, and stored on-orbit for post-flight analysis. The gas chromatograph operated successfully during the entire SOFBALL experiment and collected 309 runs. Because of the constraints imposed upon the gas chromatograph by the CM-1 hardware, system and operations, it was unable to measure the gases to the required accuracy. Future improvements to the system for a re-flight of the SOFBALL experiment are expected to enable the gas chromatograph to meet all the requirements.
NASA Technical Reports Server (NTRS)
Leonard, J. I.; White, R. J.; Rummel, J. A.
1980-01-01
An approach was developed to aid in the integration of many of the biomedical findings of space flight, using systems analysis. The mathematical tools used in accomplishing this task include an automated data base, a biostatistical and data analysis system, and a wide variety of mathematical simulation models of physiological systems. A keystone of this effort was the evaluation of physiological hypotheses using the simulation models and the prediction of the consequences of these hypotheses on many physiological quantities, some of which were not amenable to direct measurement. This approach led to improvements in the model, refinements of the hypotheses, a tentative integrated hypothesis for adaptation to weightlessness, and specific recommendations for new flight experiments.
DOT National Transportation Integrated Search
1974-04-01
A unifying wake vortex transport model is developed and applied to a wake vortex predictive system concept. The fundamentals of vortex motion underlying the predictive model are discussed including vortex decay, bursting and instability phenomena. A ...
Wright, Alexander I.; Magee, Derek R.; Quirke, Philip; Treanor, Darren E.
2015-01-01
Background: Obtaining ground truth for pathological images is essential for various experiments, especially for training and testing image analysis algorithms. However, obtaining pathologist input is often difficult, time consuming and expensive. This leads to algorithms being over-fitted to small datasets, and inappropriate validation, which causes poor performance on real world data. There is a great need to gather data from pathologists in a simple and efficient manner, in order to maximise the amount of data obtained. Methods: We present a lightweight, web-based HTML5 system for administering and participating in data collection experiments. The system is designed for rapid input with minimal effort, and can be accessed from anywhere in the world with a reliable internet connection. Results: We present two case studies that use the system to assess how limitations on fields of view affect pathologist agreement, and to what extent poorly stained slides affect judgement. In both cases, the system collects pathologist scores at a rate of less than two seconds per image. Conclusions: The system has multiple potential applications in pathology and other domains. PMID:26110089
Entropy production in a box: Analysis of instabilities in confined hydrothermal systems
NASA Astrophysics Data System (ADS)
Börsing, N.; Wellmann, J. F.; Niederau, J.; Regenauer-Lieb, K.
2017-09-01
We evaluate whether the concept of thermal entropy production can be used as a measure to characterize hydrothermal convection in a confined porous medium, as a valuable, thermodynamically motivated addition to the standard Rayleigh number analysis. Entropy production has been used widely in the fields of mechanical and chemical engineering as a way to characterize the thermodynamic state and irreversibility of an investigated system. Pioneering studies have since adapted these concepts to natural systems, and we apply this measure here to investigate the specific case of hydrothermal convection in a "box-shaped" confined porous medium, as a simplified analog for, e.g., hydrothermal convection in deep geothermal aquifers. We perform detailed numerical experiments to assess the response of the convective system to changing boundary conditions or domain aspect ratios, and then determine the resulting entropy production for each experiment. In systems close to the critical Rayleigh number, we derive results that are in accordance with the analytically derived predictions. At higher Rayleigh numbers, however, we observe multiple possible convection modes, and the analysis of the integrated entropy production reveals distinct curves of entropy production that provide insight into the hydrothermal behavior of the system, both for homogeneous materials and for heterogeneous spatial material distributions. We conclude that the average thermal entropy production characterizes the internal behavior of hydrothermal systems with a meaningful thermodynamic measure, and we expect that it can be useful for the investigation of convection systems in many similar hydrogeological and geophysical settings.
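The standard Rayleigh number analysis that the entropy-production measure supplements can be illustrated with a short sketch of the porous-medium (Horton-Rogers-Lapwood) Rayleigh number and its classical critical value of 4π². All property values below are generic assumptions for a water-saturated layer, not parameters from the study.

```python
import math

def porous_rayleigh(rho, g, beta, dT, K, H, mu, kappa):
    """Porous-medium Rayleigh number Ra = rho*g*beta*dT*K*H / (mu*kappa)."""
    return rho * g * beta * dT * K * H / (mu * kappa)

# Classical onset criterion for a layer heated from below with
# impermeable, isothermal top and bottom boundaries
RA_CRIT = 4 * math.pi**2  # ~39.48

ra = porous_rayleigh(
    rho=1000.0,    # water density, kg/m^3          (assumed)
    g=9.81,        # gravity, m/s^2
    beta=2.07e-4,  # thermal expansivity, 1/K       (assumed)
    dT=30.0,       # temperature difference, K      (assumed)
    K=1e-12,       # permeability, m^2              (assumed)
    H=1000.0,      # layer thickness, m             (assumed)
    mu=1e-3,       # dynamic viscosity, Pa*s        (assumed)
    kappa=1e-6,    # thermal diffusivity, m^2/s     (assumed)
)
convects = ra > RA_CRIT  # supercritical: convection expected
```

With these illustrative values Ra lands modestly above the critical value, i.e. the "close to the critical Rayleigh number" regime where the abstract reports agreement with analytical predictions.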
Agricultural science in the wild: a social network analysis of farmer knowledge exchange.
Wood, Brennon A; Blair, Hugh T; Gray, David I; Kemp, Peter D; Kenyon, Paul R; Morris, Steve T; Sewell, Alison M
2014-01-01
Responding to demands for transformed farming practices requires new forms of knowledge. Given their scale and complexity, agricultural problems can no longer be solved by linear transfers in which technology developed by specialists passes to farmers by way of extension intermediaries. Recent research on alternative approaches has focused on the innovation systems formed by interactions between heterogeneous actors. Rather than linear transfer, systems theory highlights network facilitation as a specialized function. This paper contributes to our understanding of such facilitation by investigating the networks in which farmers discuss science. We report findings based on the study of a pastoral farming experiment collaboratively undertaken by a group of 17 farmers and five scientists. Analysis of prior contact and alter sharing between the group's members indicates strongly tied and decentralized networks. Farmer knowledge exchanges about the experiment have been investigated using a mix of quantitative and qualitative methods. Network surveys identified who the farmers contacted for knowledge before the study began and who they had talked to about the experiment by 18 months later. Open-ended interviews collected farmer statements about their most valuable contacts and these statements have been thematically analysed. The network analysis shows that farmers talked about the experiment with 192 people, most of whom were fellow farmers. Farmers with densely tied and occupationally homogeneous contacts grew their networks more than did farmers with contacts that are loosely tied and diverse. Thematic analysis reveals three general principles: farmers value knowledge delivered by persons rather than roles, privilege farming experience, and develop knowledge with empiricist rather than rationalist techniques. 
Taken together, these findings suggest that farmers deliberate about science in intensive and durable networks that have significant implications for theorizing agricultural innovation. The paper thus concludes by considering the findings' significance for current efforts to rethink agricultural extension.
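A minimal sketch of the kind of whole-network measure used in such an analysis is the density of an undirected contact network (actual ties divided by possible ties). The node names and ties below are invented for illustration; they are not from the study's survey data.

```python
from itertools import combinations

# Hypothetical undirected contact network among four actors
edges = {("farmer_a", "farmer_b"), ("farmer_a", "farmer_c"),
         ("farmer_b", "farmer_c"), ("farmer_c", "scientist_x")}
nodes = {n for e in edges for n in e}

# density = actual ties / possible ties for an undirected simple graph
possible = len(list(combinations(nodes, 2)))
density = len(edges) / possible
```

A density near 1 would indicate the kind of strongly tied network the abstract describes; comparing densities before and after the experiment is one way to quantify network growth.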
The Episodic Nature of Experience: A Dynamical Systems Analysis.
Sreekumar, Vishnu; Dennis, Simon; Doxas, Isidoros
2017-07-01
Context is an important construct in many domains of cognition, including learning, memory, and emotion. We used dynamical systems methods to demonstrate the episodic nature of experience by showing a natural separation between the scales over which within-context and between-context relationships operate. To do this, we represented an individual's emails extending over about 5 years in a high-dimensional semantic space and computed the dimensionalities of the subspaces occupied by these emails. Personal discourse has a two-scaled geometry with smaller within-context dimensionalities than between-context dimensionalities. Prior studies have shown that reading experience (Doxas, Dennis, & Oliver, 2010) and visual experience (Sreekumar, Dennis, Doxas, Zhuang, & Belkin, 2014) have a similar two-scaled structure. Furthermore, the recurrence plot of the emails revealed that experience is predictable and hierarchical, supporting the constructs of some influential theories of memory. The results demonstrate that experience is not scale-free and provide an important target for accounts of how experience shapes cognition. Copyright © 2016 Cognitive Science Society, Inc.
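The recurrence plot mentioned above can be sketched as a thresholded pairwise-distance matrix over a time-ordered sequence of vectors. The data here are random low-dimensional vectors standing in for the study's high-dimensional semantic-space representations, and the 10th-percentile threshold is an arbitrary illustrative choice.

```python
import numpy as np

# 20 time-ordered items, each an 8-dimensional vector (synthetic stand-ins
# for documents embedded in a semantic space)
rng = np.random.default_rng(1)
series = rng.normal(size=(20, 8))

# pairwise Euclidean distances between all time points
diff = series[:, None, :] - series[None, :, :]
dist = np.sqrt((diff ** 2).sum(axis=-1))

# recurrence matrix: 1 where two time points are closer than a threshold
threshold = np.percentile(dist, 10)
recurrence = (dist <= threshold).astype(int)
```

Block structure along the diagonal of such a matrix is what signals episodic, hierarchical organization of the sequence.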
Measurement of nitrogen in the body using a commercial PGNAA system--phantom experiments.
Chichester, D L; Empey, E
2004-01-01
An industrial prompt-gamma neutron activation analysis (PGNAA) system, originally designed for real-time elemental analysis of bulk coal on a conveyor belt, has been studied to examine the feasibility of using such a system for body composition analysis. Experiments were conducted to measure nitrogen in a simple tissue-equivalent phantom containing 2.7 wt% nitrogen. The neutron source for these experiments was 365 MBq (18.38 microg) of 252Cf located within an engineered low-Z moderator; it yielded a dose rate in the measurement position of 3.91 mSv/h. Data were collected using a 2780 cm(3) cylindrical NaI(Tl) detector with a digital signal processor and a 512-channel MCA. Source, moderator and detector geometries were unaltered from the system's standard configuration, in which they had been optimized for considerations such as neutron thermalization, measurement sensitivity and uniformity, background radiation and external dose minimization. Based on net counts in the 10.8 MeV PGNAA nitrogen photopeak and its escape peaks, the dose-dependent nitrogen count rate was 11,600 counts/mSv, with an uncertainty of 3.0% after 0.32 mSv (4.9 min), 2.0% after 0.74 mSv (11.4 min) and 1.0% after 3.02 mSv (46.4 min).
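Assuming the photopeak counts behave as pure Poisson statistics (a simplification: the reported uncertainties also reflect background handling), the scaling of relative uncertainty with delivered dose can be sketched from the quoted count rate of 11,600 counts/mSv.

```python
import math

RATE = 11_600  # net nitrogen counts per mSv, from the abstract

def rel_uncertainty(dose_msv):
    """Poisson relative uncertainty 1/sqrt(N) for N = RATE * dose."""
    n = RATE * dose_msv
    return 1.0 / math.sqrt(n)

# Uncertainty shrinks as the square root of the accumulated dose
u_low = rel_uncertainty(0.32)   # shortest measurement in the abstract
u_high = rel_uncertainty(3.02)  # longest measurement in the abstract
ratio = u_low / u_high          # equals sqrt(3.02 / 0.32)
```

The counting-statistics floor computed this way is below the uncertainties actually reported, consistent with additional contributions beyond pure Poisson noise.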
NASA Astrophysics Data System (ADS)
Wyche, K. P.; Monks, P. S.; Smallbone, K. L.; Hamilton, J. F.; Alfarra, M. R.; Rickard, A. R.; McFiggans, G. B.; Jenkin, M. E.; Bloss, W. J.; Ryan, A. C.; Hewitt, C. N.; MacKenzie, A. R.
2015-07-01
Highly non-linear dynamical systems, such as those found in atmospheric chemistry, necessitate hierarchical approaches to both experiment and modelling in order to ultimately identify and achieve fundamental process-understanding in the full open system. Atmospheric simulation chambers are intermediate in complexity between a classical laboratory experiment and the full ambient system. As such, they can generate large volumes of difficult-to-interpret data. Here we describe and implement a chemometric dimension reduction methodology for the deconvolution and interpretation of complex gas- and particle-phase composition spectra. The methodology comprises principal component analysis (PCA), hierarchical cluster analysis (HCA) and partial least squares discriminant analysis (PLS-DA). These methods are, for the first time, applied to simultaneous gas- and particle-phase composition data obtained from a comprehensive series of environmental simulation chamber experiments focused on biogenic volatile organic compound (BVOC) photooxidation and associated secondary organic aerosol (SOA) formation. We primarily investigated the biogenic SOA precursors isoprene, α-pinene, limonene, myrcene, linalool and β-caryophyllene. The chemometric analysis is used to classify the oxidation systems and resultant SOA according to the controlling chemistry and the products formed. Results show that "model" biogenic oxidative systems can be successfully separated and classified according to their oxidation products. Furthermore, a holistic view of the results obtained across both the gas and particle phases shows how the different SOA formation chemistry, initiated in the gas phase, governs the differences between the various BVOC SOA compositions. The results obtained are used to describe the particle composition in the context of the oxidised gas-phase matrix. An extension of the technique, which incorporates into the statistical models data from anthropogenic (i.e. toluene) oxidation and "more realistic" plant mesocosm systems, demonstrates that such an ensemble of chemometric mapping has the potential to be used for the classification of more complex spectra of unknown origin. More specifically, the addition of mesocosm data from fig and birch tree experiments shows that isoprene- and monoterpene-emitting sources, respectively, can be mapped onto the statistical model structure, and their positional vectors can provide insight into their biological sources and controlling oxidative chemistry. The potential to extend the methodology to the analysis of ambient air is discussed using results obtained from a zero-dimensional box model incorporating mechanistic data obtained from the Master Chemical Mechanism (MCMv3.2). Such an extension to analysing ambient air would prove a powerful asset in assisting with the identification of SOA sources and the elucidation of the underlying chemical mechanisms involved.
Compact Video Microscope Imaging System Implemented in Colloid Studies
NASA Technical Reports Server (NTRS)
McDowell, Mark
2002-01-01
[Figure: fiber-optic light source; microscope and charge-coupled device (CCD) camera head connected to the camera body; CCD camera body feeding data to an image acquisition board in a PC; and a Cartesian robot controlled via a PC board.] The Compact Microscope Imaging System (CMIS) is a diagnostic tool with intelligent controls for use in space, industrial, medical, and security applications. CMIS can be used in situ with a minimum amount of user intervention. The system can scan, find areas of interest in, focus on, and acquire images automatically. Many multiple-cell experiments require microscopy for in situ observations; this is feasible only with compact microscope systems. CMIS is a miniature machine vision system that combines intelligent image processing with remote control. The software also has a user-friendly interface, which can be used independently of the hardware for further post-experiment analysis. CMIS has been successfully developed in the SML Laboratory at the NASA Glenn Research Center, adapted for colloid studies, and made available for telescience experiments. The main innovations this year are an improved interface, optimized algorithms, and the ability to control conventional full-sized microscopes in addition to compact microscopes. The CMIS software-hardware interface is being integrated into our SML Analysis package, which will be a robust general-purpose image-processing package that can handle over 100 space and industrial applications.
Adaptable data management for systems biology investigations.
Boyle, John; Rovira, Hector; Cavnor, Chris; Burdick, David; Killcoyne, Sarah; Shmulevich, Ilya
2009-03-06
Within research, each experiment is different: the focus changes, and the data are generated by a continually evolving barrage of technologies. There is a continual introduction of new techniques whose usage ranges from in-house protocols through to high-throughput instrumentation. To support these requirements, data management systems are needed that can be rapidly built and readily adapted to new usage. The adaptable data management system discussed here is designed to support the seamless mining and analysis of biological experiment data commonly used in systems biology (e.g. ChIP-chip, gene expression, proteomics, imaging, flow cytometry). We use different content graphs to represent different views upon the data. These views are designed for different roles: equipment-specific views are used to gather instrumentation information; data-processing-oriented views are provided to enable the rapid development of analysis applications; and research-project-specific views are used to organize information for individual research experiments. This management system allows for both the rapid introduction of new types of information and the evolution of the knowledge it represents. Data management is an important aspect of any research enterprise. It is the foundation on which most applications are built, and must be easily extended to serve new functionality for new scientific areas. We have found that adopting a three-tier architecture for data management, built around distributed standardized content repositories, allows us to rapidly develop new applications to support a diverse user community.
BioSig3D: High Content Screening of Three-Dimensional Cell Culture Models
Bilgin, Cemal Cagatay; Fontenay, Gerald; Cheng, Qingsu; Chang, Hang; Han, Ju; Parvin, Bahram
2016-01-01
BioSig3D is a computational platform for high-content screening of three-dimensional (3D) cell culture models that are imaged in full 3D volume. It provides an end-to-end solution for designing high content screening assays, based on colony organization that is derived from segmentation of nuclei in each colony. BioSig3D also enables visualization of raw and processed 3D volumetric data for quality control, and integrates advanced bioinformatics analysis. The system consists of multiple computational and annotation modules that are coupled together with a strong use of controlled vocabularies to reduce ambiguities between different users. It is a web-based system that allows users to: design an experiment by defining experimental variables, upload a large set of volumetric images into the system, analyze and visualize the dataset, and either display computed indices as a heatmap, or phenotypic subtypes for heterogeneity analysis, or download computed indices for statistical analysis or integrative biology. BioSig3D has been used to profile baseline colony formations with two experiments: (i) morphogenesis of a panel of human mammary epithelial cell lines (HMEC), and (ii) heterogeneity in colony formation using an immortalized non-transformed cell line. These experiments reveal intrinsic growth properties of well-characterized cell lines that are routinely used for biological studies. BioSig3D is being released with seed datasets and video-based documentation. PMID:26978075
Automated Analysis of siRNA Screens of Virus Infected Cells Based on Immunofluorescence Microscopy
NASA Astrophysics Data System (ADS)
Matula, Petr; Kumar, Anil; Wörz, Ilka; Harder, Nathalie; Erfle, Holger; Bartenschlager, Ralf; Eils, Roland; Rohr, Karl
We present an image analysis approach as part of a high-throughput microscopy screening system based on cell arrays for the identification of genes involved in Hepatitis C and Dengue virus replication. Our approach comprises: cell nucleus segmentation, quantification of virus replication level in cells, localization of regions with transfected cells, cell classification by infection status, and quality assessment of an experiment. The approach is fully automatic and has been successfully applied to a large number of cell array images from screening experiments. The experimental results show a good agreement with the expected behavior of positive as well as negative controls and encourage the application to screens from further high-throughput experiments.
Validation of Skills, Knowledge and Experience in Lifelong Learning in Europe
ERIC Educational Resources Information Center
Ogunleye, James
2012-01-01
The paper examines systems of validation of skills and experience as well as the main methods/tools currently used for validating skills and knowledge in lifelong learning. The paper uses mixed methods--a case study research and content analysis of European Union policy documents and frameworks--as a basis for this research. The selection of the…
Shedding More Light and Less Heat on the Results of School Integration. The Georgia Experience.
ERIC Educational Resources Information Center
Christison, Milton; Sida, Donald
One hundred and eighty-eight Georgia school system superintendents were polled in the Spring of 1976 as to their perceptions and experiences concerning the effects of school integration. This paper presents the results of this investigation. Three broad areas were selected for analysis: (1) integration outcomes affecting the public schools, (2)…
LDEF: 69 Months in Space. Part 4: Second Post-Retrieval Symposium
NASA Technical Reports Server (NTRS)
Levine, Arlene S. (Editor)
1993-01-01
A compilation of papers presented at the Second Long Duration Exposure Facility (LDEF) Post-Retrieval Symposium are presented. The papers represent the data analysis of the 57 experiments flown on the LDEF. The experiments include materials, coatings, thermal systems, power and propulsion, science (cosmic ray, interstellar gas, heavy ions, micrometeoroid, etc.), electronics, optics, and life sciences.
LDEF: 69 Months in Space. Part 1: Second Post-Retrieval Symposium
NASA Technical Reports Server (NTRS)
Levine, Arlene S. (Editor)
1993-01-01
A compilation of papers presented at the Second Long Duration Exposure Facility (LDEF) Post-Retrieval Symposium is included. The papers represent the data analysis of the 57 experiments flown on the LDEF. The experiments include materials, coatings, thermal systems, power and propulsion, science (cosmic ray, interstellar gas, heavy ions, micrometeoroid, etc.), electronics, optics, and life sciences.
Experiences of People with Learning Disabilities in the Criminal Justice System
ERIC Educational Resources Information Center
Hyun, Elly; Hahn, Lyndsey; McConnell, David
2014-01-01
The aim of this review is to synthesise findings from research about the experiences of people with learning disabilities who have faced arrest and jail time. After an extensive search of the literature, four relevant articles were found. The first-person accounts presented in these four studies were pooled, and a thematic analysis was undertaken.…
ERIC Educational Resources Information Center
Davis, Charles E.; And Others
A coherent system of decision making is described that may be incorporated into an instructional sequence to provide a supplement to the experience-based judgment of the classroom teacher. The elements of this decision process incorporate prior information such as a teacher's past experience, experimental results such as a test score, and…
French Experience Before 1968. Case Studies on Innovation in Higher Education.
ERIC Educational Resources Information Center
Grignon, C.; Passeron, J. C.
This volume is the fourth in a series of case studies published by the Organization for Economic Cooperation and Development. Chapter 1 discusses the aims of the study, including the concepts of innovation and change, the role of pilot experiments in the development of the university system, and the methods used for this critical analysis of…
Early results from the ultra heavy cosmic ray experiment
NASA Technical Reports Server (NTRS)
Osullivan, D.; Thompson, A.; Bosch, J.; Keegan, R.; Wenzel, K.-P.; Jansen, F.; Domingo, C.
1995-01-01
Data extraction and analysis of the LDEF Ultra Heavy Cosmic Ray Experiment are continuing. Almost twice the pre-LDEF world sample has been investigated, and some details of the charge spectrum in the region from Z of approximately 70 up to and including the actinides are presented. The early results indicate r-process enhancement over solar system source abundances.
NASA Cold Land Processes Experiment (CLPX 2002/03): Atmospheric analyses datasets
Glen E. Liston; Daniel L. Birkenheuer; Christopher A. Hiemstra; Donald W. Cline; Kelly Elder
2008-01-01
This paper describes the Local Analysis and Prediction System (LAPS) and the 20-km horizontal grid version of the Rapid Update Cycle (RUC20) atmospheric analyses datasets, which are available as part of the Cold Land Processes Field Experiment (CLPX) data archive. The LAPS dataset contains spatially and temporally continuous atmospheric and surface variables over...
LDEF: 69 Months in Space. Part 3: Second Post-Retrieval Symposium
NASA Technical Reports Server (NTRS)
Levine, Arlene S. (Editor)
1993-01-01
Papers presented at the Second Long Duration Exposure Facility (LDEF) Post-Retrieval Symposium are included. The papers represent the data analysis of the 57 experiments flown on the LDEF. The experiments include materials, coatings, thermal systems, power and propulsion, science (cosmic ray, interstellar gas, heavy ions, micrometeoroid, etc.), electronics, optics, and life science.
Liberia's Experiment with Privatising Education: A Critical Analysis of the RCT Study
ERIC Educational Resources Information Center
Klees, Steven J.
2018-01-01
To experiment with the possible privatisation of its primary education system, Liberia initiated the Partnership Schools of Liberia (PSL), which turned over the management of 93 public schools to eight private contractors. A randomised controlled trial (RCT) study was initiated comparing the PSL schools with matched public schools and the results…
NASA Astrophysics Data System (ADS)
Jules, Kenol; Lin, Paul P.
2007-06-01
With the International Space Station currently operational, a significant amount of acceleration data is being down-linked, processed, and analyzed on the ground daily to characterize the station's reduced gravity environment, verify vehicle design requirements, and collect science data. To help understand the impact of the unique spacecraft environment on the science data, an artificial intelligence monitoring system was developed, which detects in near real time any change in the reduced gravity environment that could affect the on-going experiments. Using a dynamic graphical display, the monitoring system allows science teams, at any time and any location, to see the active vibration disturbances, such as pumps, fans, compressors, crew exercise, re-boost and extra-vehicular activities, that might impact the reduced gravity environment the experiments are exposed to. The monitoring system can detect both known and unknown vibratory disturbance activities. It can also perform trend analysis and prediction by analyzing past data over many increments (an increment usually lasts 6 months) collected onboard the station for selected disturbances. This feature can be used to monitor the health of onboard mechanical systems and to detect and prevent potential system failures. The monitoring system has two operating modes: online and offline. Both near real-time on-line vibratory disturbance detection and off-line detection and trend analysis are discussed in this paper.
STS payload data collection and accommodations analysis study. Volume 3: Accommodations analysis
NASA Technical Reports Server (NTRS)
1978-01-01
Payload requirements were compared to launch site accommodations and flight accommodations for a number of Spacelab payloads. Experiment computer operating system accommodations were also considered. A summary of accommodations in terms of resources available for payload discretionary use and recommendations for Spacelab/STS accommodation improvements are presented.
Computer-Generated, Three-Dimensional Character Animation: A Report and Analysis.
ERIC Educational Resources Information Center
Kingsbury, Douglas Lee
This master's thesis details the experience gathered in the production "Snoot and Muttly," a short character animation with 3-D computer generated images, and provides an analysis of the computer-generated 3-D character animation system capabilities. Descriptions are provided of the animation environment at the Ohio State University…
On the assimilation of satellite derived soil moisture in numerical weather prediction models
NASA Astrophysics Data System (ADS)
Drusch, M.
2006-12-01
Satellite derived surface soil moisture data sets are readily available and have been used successfully in hydrological applications. In many operational numerical weather prediction systems the initial soil moisture conditions are analysed from the modelled background and 2 m temperature and relative humidity. This approach has proven effective in improving surface latent and sensible heat fluxes and consequently the forecast over large geographical domains. However, since soil moisture is not always related to screen level variables, model errors and uncertainties in the forcing data can accumulate in root zone soil moisture. Remotely sensed surface soil moisture is directly linked to the model's uppermost soil layer and is therefore a stronger constraint for the soil moisture analysis. Three data assimilation experiments with the Integrated Forecast System (IFS) of the European Centre for Medium-Range Weather Forecasts (ECMWF) have been performed for the two-month period of June and July 2002: a control run based on the operational soil moisture analysis, an open loop run with freely evolving soil moisture, and an experimental run incorporating bias corrected TMI (TRMM Microwave Imager) derived soil moisture over the southern United States through a nudging scheme using 6-hourly departures. Apart from the soil moisture analysis, the system setup reflects the operational forecast configuration including the atmospheric 4D-Var analysis. Soil moisture analysed in the nudging experiment is the most accurate estimate when compared against in-situ observations from the Oklahoma Mesonet. The corresponding forecast for 2 m temperature and relative humidity is almost as accurate as in the control experiment. Furthermore, it is shown that the soil moisture analysis influences local weather parameters including the planetary boundary layer height and cloud coverage. The transferability of the results to other satellite derived soil moisture data sets will be discussed.
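The nudging step described above can be sketched in a toy scalar form: each analysis cycle, the model value is relaxed toward the observation by a fraction of the 6-hourly departure. The gain of 0.5, the drift rate, and all numbers below are invented for illustration; the actual IFS scheme operates on full bias-corrected model fields, not a single scalar.

```python
def nudge(model_sm, obs_sm, gain=0.5):
    # One nudging update: relax the model's surface soil moisture toward
    # the (bias-corrected) satellite estimate using the departure.
    departure = obs_sm - model_sm
    return model_sm + gain * departure

# Toy drift: the unconstrained model dries out each cycle while the
# observed value holds steady; nudging keeps the analysis close to it.
sm = 0.30          # initial volumetric soil moisture (hypothetical)
obs = 0.25         # satellite-derived estimate (hypothetical)
for _ in range(8):          # eight 6-hourly cycles (two days)
    sm -= 0.01              # model drift between analyses
    sm = nudge(sm, obs)
```

The update converges to a fixed point slightly below the observed value because the drift is reapplied every cycle; a larger gain tracks the observation more tightly at the cost of drawing observation noise into the analysis.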
Cooper, Lauren; Balandin, Susan; Trembath, David
2009-01-01
Young adults with cerebral palsy who use augmentative and alternative communication (AAC) systems may be at increased risk of loneliness due to the additional challenges they experience with communication. Six young adults, aged 24-30 years, who used AAC and had cerebral palsy, participated in in-depth interviews to explore their experiences of loneliness as they made the transition into adulthood. A total of five major themes in the data were identified using the constant comparative method of analysis. Three of these themes were discussed by all participants: (a) Support Networks, (b) AAC System Use, and (c) Technology. The authors concluded that these three themes were most important in understanding the experiences of loneliness of the young adults with cerebral palsy who participated in this study.
Design of an automated imaging system for use in a space experiment
NASA Technical Reports Server (NTRS)
Hartz, William G.; Bozzolo, Nora G.; Lewis, Catherine C.; Pestak, Christopher J.
1991-01-01
An experiment, occurring on an orbiting platform, examines the mass transfer across gas-liquid and liquid-liquid interfaces. It employs an imaging system with real-time image analysis. The design includes optical design, imager selection and integration, positioner control, image recording, software development for processing, and interfaces to telemetry. It addresses the constraints of weight, volume, and electric power associated with placing the experiment in the Space Shuttle cargo bay. Challenging elements of the design are: imaging and recording of a 200-micron-diameter bubble with a resolution of 2 microns to serve as a primary source of data; frame rates varying from 500 frames per second to 1 frame per second, depending on the experiment phase; and providing three-dimensional information to determine the shape of the bubble.
NASA Technical Reports Server (NTRS)
Bertelrud, Arild; Anders, J. B. (Technical Monitor)
2002-01-01
A 2-D high-lift system experiment was conducted in August 1996 in the Low Turbulence Pressure Tunnel at NASA Langley Research Center, Hampton, VA. The purpose of the experiment was to obtain transition measurements on a three-element high-lift system for CFD code validation studies. A transition database has been created using the data from this experiment. The present report contains the analysis of the surface hot-film data in terms of the transition locations on the three elements. It also includes relevant information regarding the pressure loads and distributions and the wakes behind the model to aid in the interpretation of the transition data. For some of the configurations the current pressure data have been compared with previous wind tunnel entries of the same model. The methodology used to determine the regions of transitional flow is outlined, and each configuration tested has been analyzed. A discussion of interference effects, repeatability, and three-dimensional effects on the data is included.
Temperature-Driven Shape Changes of the Near Earth Asteroid Scout Solar Sail
NASA Technical Reports Server (NTRS)
Stohlman, Olive R.; Loper, Erik R.; Lockett, Tiffany E.
2017-01-01
Near Earth Asteroid Scout (NEA Scout) is a NASA deep space Cubesat, scheduled to launch on the Exploration Mission 1 flight of the Space Launch System. NEA Scout will use a deployable solar sail as its primary propulsion system. The sail is a square membrane supported by rigid metallic tapespring booms, and analysis predicts that these booms will experience substantial thermal warping if they are exposed to direct sunlight in the space environment. NASA has conducted sunspot chamber experiments to confirm the thermal distortion of this class of booms, demonstrating tip displacement of between 20 and 50 centimeters in a 4-meter boom. The distortion behavior of the boom is complex and demonstrates an application for advanced thermal-structural analysis. The needs of the NEA Scout project were supported by changing the solar sail design to keep the booms shaded during use of the solar sail, and an additional experiment in the sunspot chamber is presented in support of this solution.
The Use of a Microcomputer in Collecting Data from Cardiovascular Experiments on Muscle Relaxants
Thut, Paul D.; Polansky, Gregg; Pruzansky, Elysa
1983-01-01
The possible association of cardiovascular side-effects from potentially clinically useful non-depolarizing neuromuscular blocking drugs has been studied with the aid of a microcomputer. The maximal changes in heart rate, systolic, diastolic and mean arterial pressure, and pulse pressure were recorded in the onset, maximal effect, and recovery phases of relaxant activity in dogs anesthetized with isoflurane. The data collection system employed a Gould 2800S polygraph, an Apple II Plus microcomputer, a Cyborg Corp. ‘Issac’ 12-bit analog-to-digital converter, two 5 1/4″ floppy disk drives, a ‘Videoterm’ 80-column display board, and a 12″ green phosphor monitor. Prior to development of the computer system, direct analysis of polygraph records required more than three times as long as the actual experiment. With the aid of the computer, analysis of data, tabular and graphic presentation, and narrative reports were completed within 15 minutes after the end of the experiment.
Acquisition and analysis of accelerometer data
NASA Astrophysics Data System (ADS)
Verges, Keith R.
1990-08-01
Acceleration data reduction must be undertaken with a complete understanding of the physical process, the means by which the data are acquired, and finally, the calculations necessary to put the data into a meaningful format. Discussed here are the acceleration sensor requirements dictated by the measurements desired. Sensor noise, dynamic range, and linearity will be determined from the physical parameters of the experiment. The digitizer requirements are discussed. Here the system from sensor to digital storage medium will be integrated, and rules of thumb for experiment duration, filter response, and number of bits are explained. Data reduction techniques after storage are also discussed. Time domain operations including decimating, digital filtering, and averaging are covered, as well as frequency domain methods, including windowing and the difference between power and amplitude spectra, and simple noise determination via coherence analysis. Finally, an example experiment using the Teledyne Geotech Model 44000 Seismometer to measure from 1 Hz to 10(exp -6) Hz is discussed. The sensor, data acquisition system, and example spectra are presented.
Acquisition and analysis of accelerometer data
NASA Technical Reports Server (NTRS)
Verges, Keith R.
1990-01-01
Acceleration data reduction must be undertaken with a complete understanding of the physical process, the means by which the data are acquired, and finally, the calculations necessary to put the data into a meaningful format. Discussed here are the acceleration sensor requirements dictated by the measurements desired. Sensor noise, dynamic range, and linearity will be determined from the physical parameters of the experiment. The digitizer requirements are discussed. Here the system from sensor to digital storage medium will be integrated, and rules of thumb for experiment duration, filter response, and number of bits are explained. Data reduction techniques after storage are also discussed. Time domain operations including decimating, digital filtering, and averaging are covered, as well as frequency domain methods, including windowing and the difference between power and amplitude spectra, and simple noise determination via coherence analysis. Finally, an example experiment using the Teledyne Geotech Model 44000 Seismometer to measure from 1 Hz to 10(exp -6) Hz is discussed. The sensor, data acquisition system, and example spectra are presented.
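For a flavor of the frequency-domain operations the abstract lists (windowing, power spectra, decimation), here is a minimal NumPy sketch. The Hann window, the scaling convention, and the synthetic 5 Hz test signal are assumptions for illustration, not details of the Teledyne Geotech system described in the paper:

```python
import numpy as np

def power_spectrum(x, fs):
    # One-sided power spectrum with a Hann window to reduce leakage.
    # Scaled so an on-bin sinusoid of amplitude A peaks near A**2 / 2.
    n = len(x)
    w = np.hanning(n)
    spec = np.fft.rfft((x - x.mean()) * w)
    power = (np.abs(spec) ** 2) * 2.0 / (w.sum() ** 2)
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    return freqs, power

def decimate(x, factor):
    # Naive decimation: boxcar average then subsample (a crude
    # anti-alias low-pass; a proper filter would be used in practice).
    trimmed = x[: len(x) // factor * factor]
    return trimmed.reshape(-1, factor).mean(axis=1)

# Example: 3g-amplitude 5 Hz sinusoid sampled at 100 Hz for 10 s
fs = 100.0
t = np.arange(0, 10, 1.0 / fs)
sig = 3.0 * np.sin(2 * np.pi * 5.0 * t)
freqs, power = power_spectrum(sig, fs)
peak_freq = freqs[np.argmax(power)]
```

The spectral peak lands at 5 Hz with power near 3**2 / 2 = 4.5; reaching 10^-6 Hz as in the example experiment instead requires records of weeks, which is where decimation and averaging become essential.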
Piezoelectric power generation using friction-induced vibration
NASA Astrophysics Data System (ADS)
Tadokoro, Chiharu; Matsumoto, Aya; Nagamine, Takuo; Sasaki, Shinya
2017-06-01
In order to examine the feasibility of power generation using friction-induced vibration with a piezoelectric element, we performed experiments and numerical analysis. In the experiments, the power generated in the piezoelectric element and the displacement of an oscillator were measured by a newly developed apparatus that embodied a single-degree-of-freedom (1-DOF) system with friction. In the numerical analysis, an analytical model of a 1-DOF system with friction and a piezoelectric element was proposed to simulate the experiments. The experimental results demonstrated that power of a few microwatts was generated by sliding between a steel ball and a steel plate lubricated with glycerol. In this study, a maximum power of approximately 10 μW was generated at a driving velocity of 40 mm s-1 and a normal load of 15 N. The numerical results demonstrated good qualitative agreement with the experimental results. This implies that the analytical model can be applied to optimize the oscillator design in piezoelectric power generation using friction-induced vibration.
Bouvignies, Guillaume; Hansen, D Flemming; Vallurupalli, Pramodh; Kay, Lewis E
2011-02-16
A method for quantifying millisecond time scale exchange in proteins is presented based on scaling the rate of chemical exchange using a 2D (15)N, (1)H(N) experiment in which (15)N dwell times are separated by short spin-echo pulse trains. Unlike the popular Carr-Purcell-Meiboom-Gill (CPMG) experiment where the effects of a radio frequency field on measured transverse relaxation rates are quantified, the new approach measures peak positions in spectra that shift as the effective exchange time regime is varied. The utility of the method is established through an analysis of data recorded on an exchanging protein-ligand system for which the exchange parameters have been accurately determined using alternative approaches. Computations establish that a combined analysis of CPMG and peak shift profiles extends the time scale that can be studied to include exchanging systems with highly skewed populations and exchange rates as slow as 20 s(-1).
NASA Technical Reports Server (NTRS)
1972-01-01
An analysis and conceptual design of a baseline mission and spacecraft are presented. Aspects of the HEAO-C discussed include: baseline experiments with X-ray observations of space, analysis of mission requirements, observatory design, structural analysis, thermal control, attitude sensing and control system, communication and data handling, and space shuttle launch and retrieval of HEAO-C.
Thermal integration of Spacelab experiments
NASA Technical Reports Server (NTRS)
Patterson, W. C.; Hopson, G. D.
1978-01-01
The method of thermally integrating the experiments for Spacelab is discussed. The scientific payload consists of a combination of European and United States sponsored experiments located in the module as well as on a single Spacelab pallet. The thermal integration must result in accommodating the individual experiment requirements as well as ensuring that the total payload is within the Spacelab Environmental Control System (ECS) resource capability. An integrated thermal/ECS analysis of the module and pallet is performed in concert with the mission timeline to ensure that the agreed-upon experiment requirements are accommodated and that the total payload is within the Spacelab ECS resources.
High Energy Astronomy Observatory, Mission C, Phase A. Volume 3: Appendices
NASA Technical Reports Server (NTRS)
1972-01-01
Technical data, and experiment and spacecraft alternatives are presented in support of the HEAO-C, whose primary objective is a detailed study of the more interesting high energy sources, using grazing incidence X-ray telescopes and a spacecraft pointing accuracy of + or - 1 arc minute. The analyses presented cover the mission analysis and launch vehicle; thermal control trade studies and supporting analyses; attitude sensing and control analyses; electrical systems; and reliability analysis. The alternate experiments which were considered are listed, and the advantages and disadvantages of several alternate observatory configurations are assessed.
2008-03-01
investigated, as well as the methodology used. Chapter IV presents the data collection and analysis procedures, and the resulting analysis and… interpolate the data, although a non-interpolating model is possible. For this research, Design and Analysis of Computer Experiments (DACE) is used… followed by the analysis. 4.1. Testing Approach: The initial SMOMADS algorithm used for this research was acquired directly from Walston [70]. The
Ontological and Epistemological Issues Regarding Climate Models and Computer Experiments
NASA Astrophysics Data System (ADS)
Vezer, M. A.
2010-12-01
Recent philosophical discussions (Parker 2009; Frigg and Reiss 2009; Winsberg 2009; Morgan 2002, 2003, 2005; Gula 2002) about the ontology of computer simulation experiments and the epistemology of inferences drawn from them are of particular relevance to climate science as computer modeling and analysis are instrumental in understanding climatic systems. How do computer simulation experiments compare with traditional experiments? Is there an ontological difference between these two methods of inquiry? Are there epistemological considerations that result in one type of inference being more reliable than the other? What are the implications of these questions with respect to climate studies that rely on computer simulation analysis? In this paper, I examine these philosophical questions within the context of climate science, instantiating concerns in the philosophical literature with examples found in analysis of global climate change. I concentrate on Wendy Parker’s (2009) account of computer simulation studies, which offers a treatment of these and other questions relevant to investigations of climate change involving such modelling. Two theses at the center of Parker’s account will be the focus of this paper. The first is that computer simulation experiments ought to be regarded as straightforward material experiments; which is to say, there is no significant ontological difference between computer and traditional experimentation. Parker’s second thesis is that some of the emphasis on the epistemological importance of materiality has been misplaced. I examine both of these claims. First, I inquire as to whether viewing computer and traditional experiments as ontologically similar in the way she does implies that there is no proper distinction between abstract experiments (such as ‘thought experiments’ as well as computer experiments) and traditional ‘concrete’ ones.
Second, I examine the notion of materiality (i.e., the material commonality between object and target systems) and some arguments for the claim that materiality entails some inferential advantage to traditional experimentation. I maintain that Parker’s account of the ontology of computer simulations has some interesting though potentially problematic implications regarding conventional distinctions between abstract and concrete methods of inquiry. With respect to her account of materiality, I outline and defend an alternative account, posited by Mary Morgan (2002, 2003, 2005), which holds that ontological similarity between target and object systems confers some epistemological advantage to traditional forms of experimental inquiry.
NASA Technical Reports Server (NTRS)
Poulos, Gregory S.; Stamus, Peter A.; Snook, John S.
2005-01-01
The Cold Land Processes Experiment (CLPX) emphasized the development of a strong synergism between process-oriented understanding, land surface models, and microwave remote sensing. Our work sought to investigate which topographically-generated atmospheric phenomena are most relevant to the CLPX MSAs for the purpose of evaluating their climatic importance to net local moisture fluxes and snow transport through the use of high-resolution data assimilation/atmospheric numerical modeling techniques. Our task was to create three long-term, scientific-quality atmospheric datasets for quantitative analysis (for all CLPX researchers) and provide a summary of the meteorologically-relevant phenomena of the three MSAs (see Figure) over northern Colorado. Our efforts required the ingest of a variety of CLPX datasets and the execution of an atmospheric and land surface data assimilation system based on the Navier-Stokes equations (the Local Analysis and Prediction System, LAPS, and an atmospheric numerical weather prediction model, as required) at topographically-relevant grid spacing (approx. 500 m). The resulting dataset will be analyzed by the CLPX community as part of their larger research goals to determine the relative influence of various atmospheric phenomena on processes relevant to CLPX scientific goals.
Feng, Haibo; Dong, Dinghui; Ma, Tengfei; Zhuang, Jinlei; Fu, Yili; Lv, Yi; Li, Liyi
2017-12-01
Surgical robot systems, which can significantly improve surgical procedures, have been widely used in laparoendoscopic single-site surgery (LESS). For relatively complex surgical procedures, the development of an in vivo visual robot system for LESS can effectively improve the visualization for surgical robot systems. In this work, an in vivo visual robot system with a new mechanism for LESS was investigated. A finite element method (FEM) analysis was carried out to ensure the safety of the in vivo visual robot during movement, the most important concern for surgical purposes. A master-slave control strategy was adopted, in which the control model was established by off-line experiments. The in vivo visual robot system was verified using a phantom box. The experiment results show that the robot system can successfully realize the expected functionalities and meet the demands of LESS, and indicate that the in vivo visual robot, with its high manipulability, has great potential in clinical application. Copyright © 2017 John Wiley & Sons, Ltd.
Modular space station phase B extension preliminary system design. Volume 7: Ancillary studies
NASA Technical Reports Server (NTRS)
Jones, A. L.
1972-01-01
Sortie mission analysis and reduced payloads size impact studies are presented. In the sortie mission analysis, a modular space station oriented experiment program to be flown by the space shuttle during the period prior to space station IOC is discussed. Experiments are grouped into experiment packages. Mission payloads are derived by grouping experiment packages and by adding support subsystems and structure. The operational and subsystems analyses of these payloads are described. Requirements, concepts, and shuttle interfaces are integrated. The sortie module/station module commonality and a sortie laboratory concept are described. In the payloads size analysis, the effect on the modular space station concept of reduced diameter and reduced length of the shuttle cargo bay is discussed. Design concepts are presented for reduced sizes of 12 by 60 ft, 14 by 40 ft, and 12 by 40 ft. Comparisons of these concepts with the modular station (14 by 60 ft) are made to show the impact of payload size changes.
Dicks, Sean Glenton; Ranse, Kristen; van Haren, Frank MP; Boer, Douglas P
2017-01-01
Information and compassion assist families of potential organ donors to make informed decisions. However, psychological implications of the in-hospital process are not well described with past research focusing on decision-making. To enhance understanding and improve service delivery, a systematic review was conducted. Inductive analysis and synthesis utilised Grounded Theory Methodology within a systems theory framework and contributed to a model proposing that family and staff form a System of Systems with shared responsibility for process outcomes. This model can guide evaluation and improvement of care and will be tested by means of a longitudinal study of family experiences. PMID:28680696
NASA Astrophysics Data System (ADS)
Williams, Caitlin R. S.; Sorrentino, Francesco; Murphy, Thomas E.; Roy, Rajarshi
2013-12-01
We experimentally study the complex dynamics of a unidirectionally coupled ring of four identical optoelectronic oscillators. The coupling between these systems is time-delayed in the experiment and can be varied over a wide range of delays. We observe that as the coupling delay is varied, the system may show different synchronization states, including complete isochronal synchrony, cluster synchrony, and two splay-phase states. We analyze the stability of these solutions through a master stability function approach, which we show can be effectively applied to all the different states observed in the experiment. Our analysis supports the experimentally observed multistability in the system.
Onset of dissolution-driven instabilities in fluids with nonmonotonic density profile
NASA Astrophysics Data System (ADS)
Jafari Raad, Seyed Mostafa; Hassanzadeh, Hassan
2015-11-01
Analog systems have recently been used in several experiments in the context of convective mixing of CO2. We generalize the nonmonotonic density dependence of the growth of instabilities and provide a scaling relation for the onset of instability. The results of linear stability analysis and direct numerical simulations show that these fluids do not resemble the dynamics of CO2-water convective instabilities. A typical analog system, such as water-propylene glycol, is found to be less unstable than CO2-water. These results provide a basis for further research and proper selection of analog systems and are essential to the interpretation of experiments.
Modal Parameter Identification of a Flexible Arm System
NASA Technical Reports Server (NTRS)
Barrington, Jason; Lew, Jiann-Shiun; Korbieh, Edward; Wade, Montanez; Tantaris, Richard
1998-01-01
In this paper, an experiment is designed for the modal parameter identification of a flexible arm system. The experiment uses a function generator to provide the input signal and an oscilloscope to record input and output response data. For each vibrational mode, many sets of sine-wave inputs with frequencies close to the natural frequency of the arm system are used to excite the vibration of that mode. A least-squares technique is then used to analyze the experimental input/output data to obtain the identified parameters for the mode. The identified results are compared with the analytical model obtained by applying finite element analysis.
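The least-squares analysis of sine-wave input/output records can be sketched as a linear fit of the steady-state response at the excitation frequency: amplitude and phase follow from the fitted sine and cosine coefficients. The three-parameter model and the synthetic record below are illustrative assumptions, not the paper's actual flexible-arm data:

```python
import numpy as np

def fit_sine_response(t, y, omega):
    # Linear least-squares fit of y(t) ≈ a*sin(omega*t) + b*cos(omega*t) + c.
    # The response amplitude and phase at the excitation frequency omega
    # follow directly from the coefficients (a, b).
    A = np.column_stack([np.sin(omega * t), np.cos(omega * t), np.ones_like(t)])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    a, b, c = coef
    amplitude = np.hypot(a, b)
    phase = np.arctan2(b, a)
    return amplitude, phase

# Synthetic "oscilloscope" record: response to a 7 rad/s sine input,
# with a known amplitude, phase lag, and DC offset (hypothetical values).
t = np.linspace(0, 5, 2000)
omega = 7.0
y = 1.8 * np.sin(omega * t + 0.4) + 0.05
amp, ph = fit_sine_response(t, y, omega)
```

Sweeping the excitation frequency through the resonance and repeating this fit yields the frequency response, from which natural frequency and damping for the mode can be identified.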
Summary Report of Mission Acceleration Measurements for STS-78. Launched June 20, 1996
NASA Technical Reports Server (NTRS)
Hakimzadeh, Roshanak; Hrovat, Kenneth; McPherson, Kevin M.; Moskowitz, Milton E.; Rogers, Melissa J. B.
1997-01-01
The microgravity environment of the Space Shuttle Columbia was measured during the STS-78 mission using accelerometers from three different instruments: the Orbital Acceleration Research Experiment, the Space Acceleration Measurement System and the Microgravity Measurement Assembly. The quasi-steady environment was also calculated in near real-time during the mission by the Microgravity Analysis Workstation. The Orbital Acceleration Research Experiment provided investigators with real-time quasi-steady acceleration measurements. The Space Acceleration Measurement System recorded higher frequency data on-board for post-mission analysis. The Microgravity Measurement Assembly provided investigators with real-time quasi-steady and higher frequency acceleration measurements. The Microgravity Analysis Workstation provided calculation of the quasi-steady environment. This calculation was presented to the science teams in real-time during the mission. The microgravity environment related to several different Orbiter, crew and experiment operations is presented and interpreted in this report. A radiator deploy, the Flight Control System checkout, and a vernier reaction control system reboost demonstration had minimal effects on the acceleration environment, with excitation of frequencies in the 0.01 to 10 Hz range. Flash Evaporator System venting had no noticeable effect on the environment while supply and waste water dumps caused excursions of 2 x 10(exp -6) to 4 x 10(exp -6) g in the Y(sub b) and Z(sub b) directions. Crew sleep and ergometer exercise periods can be clearly seen in the acceleration data, as expected. Accelerations related to the two Life Science Laboratory Equipment Refrigerator/Freezers were apparent in the data as are accelerations caused by the Johnson Space Center Projects Centrifuge. As on previous microgravity missions, several signals are present in the acceleration data for which a source has not been identified.
The causes of these accelerations are under investigation.
Delayed-Choice Experiments and the Metaphysics of Entanglement
NASA Astrophysics Data System (ADS)
Egg, Matthias
2013-09-01
Delayed-choice experiments in quantum mechanics are often taken to undermine a realistic interpretation of the quantum state. More specifically, Healey has recently argued that the phenomenon of delayed-choice entanglement swapping is incompatible with the view that entanglement is a physical relation between quantum systems. This paper argues against these claims. It first reviews two paradigmatic delayed-choice experiments and analyzes their metaphysical implications. It then applies the results of this analysis to the case of entanglement swapping, showing that such experiments pose no threat to realism about entanglement.
Lu, Liang-Xing; Wang, Ying-Min; Srinivasan, Bharathi Madurai; Asbahi, Mohamed; Yang, Joel K W; Zhang, Yong-Wei
2016-09-01
We perform a systematic two-dimensional energetic analysis to study the stability of various nanostructures formed by dewetting solid films deposited on patterned substrates. Our analytical results show that by controlling system parameters such as the substrate surface pattern, film thickness and wetting angle, a variety of equilibrium nanostructures can be obtained. Phase diagrams are presented to show the complex relations between these system parameters and the various nanostructure morphologies. We further carry out both phase field simulations and dewetting experiments to validate the analytically derived phase diagrams. Good agreement between the results from our energetic analyses and those from our phase field simulations and experiments verifies our analysis. Hence, the phase diagrams presented here provide guidelines for using solid-state dewetting as a tool to achieve various nanostructures.
Seeking Information with an Information Visualization System: A Study of Cognitive Styles
ERIC Educational Resources Information Center
Yuan, Xiaojun; Zhang, Xiangman; Chen, Chaomei; Avery, Joshua M.
2011-01-01
Introduction: This study investigated the effect of cognitive styles on users' information-seeking task performance using a knowledge domain information visualization system called CiteSpace. Method: Sixteen graduate students participated in a user experiment. Each completed an extended cognitive style analysis wholistic-analytic test (the…
Structure Prediction and Analysis of Neuraminidase Sequence Variants
ERIC Educational Resources Information Center
Thayer, Kelly M.
2016-01-01
Analyzing protein structure has become an integral aspect of understanding systems of biochemical import. The laboratory experiment endeavors to introduce protein folding to ascertain structures of proteins for which the structure is unavailable, as well as to critically evaluate the quality of the prediction obtained. The model system used is the…
2016-01-22
applications. For space applications, attitude control systems can provide good angular control of the antenna aperture with small residual angular...Bilyeu, and G.R. Veal, Development of Flight Hardware for a Large Inflatable-Deployable Antenna Experiment, Acta Astronautica, Vol. 38, Nos. 4-8
NASA Astrophysics Data System (ADS)
Petrosyan, A. Sh.
2016-09-01
PanDA (Production and Distributed Analysis System) is a workload management system, widely used for data processing at experiments on the Large Hadron Collider and elsewhere. COMPASS is a high-energy physics experiment at the Super Proton Synchrotron. Data processing for COMPASS runs locally at CERN, on lxbatch, with the data itself stored in CASTOR. In 2014 an idea arose to start running COMPASS production through PanDA. Such a transformation of the experiment's data processing will allow the COMPASS community to use not only CERN resources but also Grid resources worldwide. During the spring and summer of 2015, installation, validation and migration work was performed at JINR. Details and results of this process are presented in this paper.
Control-based continuation: Bifurcation and stability analysis for physical experiments
NASA Astrophysics Data System (ADS)
Barton, David A. W.
2017-02-01
Control-based continuation is a technique for tracking the solutions and bifurcations of nonlinear experiments. The idea is to apply the method of numerical continuation to a feedback-controlled physical experiment such that the control becomes non-invasive. Since in an experiment it is not (generally) possible to set the state of the system directly, the control target becomes a proxy for the state. Control-based continuation enables the systematic investigation of the bifurcation structure of a physical system, much as if it were a numerical model. However, stability information (and hence bifurcation detection and classification) is not readily available due to the presence of stabilising feedback control. This paper uses a periodic auto-regressive model with exogenous inputs (ARX) to approximate the time-varying linearisation of the experiment around a particular periodic orbit, thus providing the missing stability information. The method is demonstrated using a physical nonlinear tuned mass damper.
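The stability-recovery step described above can be sketched in a few lines: fit a discrete-time ARX model to measured perturbations and control inputs, then read stability from the magnitudes of the model's characteristic roots. This is only an illustrative sketch under assumed model orders and synthetic data, not the paper's actual implementation.

```python
import numpy as np

def fit_arx(x, u, na=2, nb=1):
    """Least-squares fit of x[k] = sum(a_i x[k-i]) + sum(b_j u[k-j])."""
    n = max(na, nb)
    rows = []
    for k in range(n, len(x)):
        # Regressor: past outputs (newest first), then past inputs
        rows.append(np.concatenate([x[k - na:k][::-1], u[k - nb:k][::-1]]))
    theta, *_ = np.linalg.lstsq(np.array(rows), x[n:], rcond=None)
    return theta[:na], theta[na:]

# Synthetic data from a known stable AR(2)-with-input process
rng = np.random.default_rng(0)
u = rng.normal(size=500)
x = np.zeros(500)
for k in range(2, 500):
    x[k] = 1.2 * x[k - 1] - 0.4 * x[k - 2] + 0.5 * u[k - 1]

a, b = fit_arx(x, u)
# Characteristic roots of z^2 - a1*z - a2 play the role of multipliers:
# all inside the unit circle means the underlying orbit is stable.
roots = np.roots([1.0, -a[0], -a[1]])
print(np.all(np.abs(roots) < 1.0))  # True: the fitted model is stable
```

In the experimental setting the same fit is applied to data recorded with the stabilising controller active; the non-invasiveness of the control is what makes the identified roots meaningful for the open-loop orbit.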
An Introduction to MAMA (Meta-Analysis of MicroArray data) System.
Zhang, Zhe; Fenstermacher, David
2005-01-01
Analyzing microarray data across multiple experiments has been proven advantageous. To support this kind of analysis, we are developing a software system called MAMA (Meta-Analysis of MicroArray data). MAMA utilizes a client-server architecture with a relational database on the server-side for the storage of microarray datasets collected from various resources. The client-side is an application running on the end user's computer that allows the user to manipulate microarray data and analytical results locally. MAMA implementation will integrate several analytical methods, including meta-analysis within an open-source framework offering other developers the flexibility to plug in additional statistical algorithms.
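MAMA's specific algorithms are not detailed in the abstract; as a hedged illustration of what cross-experiment meta-analysis involves, the sketch below combines one gene's effect size across studies with a standard inverse-variance fixed-effect model. The study values are hypothetical.

```python
import math

def fixed_effect(effects, variances):
    """Inverse-variance fixed-effect combination of per-study effect sizes."""
    weights = [1.0 / v for v in variances]
    combined = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    se = math.sqrt(1.0 / sum(weights))  # standard error of the combined estimate
    return combined, se

# Three hypothetical studies reporting a gene's log2 fold-change and variance
effects = [0.8, 1.1, 0.9]
variances = [0.04, 0.09, 0.06]
est, se = fixed_effect(effects, variances)
print(round(est, 3), round(se, 3))
```

More precise studies (smaller variance) dominate the combined estimate, which is the basic rationale for pooling microarray experiments rather than averaging them naively.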
A Control System and Streaming DAQ Platform with Image-Based Trigger for X-ray Imaging
NASA Astrophysics Data System (ADS)
Stevanovic, Uros; Caselle, Michele; Cecilia, Angelica; Chilingaryan, Suren; Farago, Tomas; Gasilov, Sergey; Herth, Armin; Kopmann, Andreas; Vogelgesang, Matthias; Balzer, Matthias; Baumbach, Tilo; Weber, Marc
2015-06-01
High-speed X-ray imaging applications play a crucial role in non-destructive investigations of the dynamics in material science and biology. On-line data analysis is necessary for quality assurance and data-driven feedback, leading to more efficient use of beam time and increased data quality. In this article we present a smart camera platform with embedded Field Programmable Gate Array (FPGA) processing that is able to stream and process data continuously in real-time. The setup consists of a Complementary Metal-Oxide-Semiconductor (CMOS) sensor, an FPGA readout card, and a readout computer. It is seamlessly integrated in a new custom experiment control system called Concert that provides a more efficient way of operating a beamline by integrating device control, experiment process control, and data analysis. The potential of the embedded processing is demonstrated by implementing an image-based trigger. It records the temporal evolution of physical events with increased speed while maintaining the full field of view. The complete data acquisition system, with Concert and the smart camera platform, was successfully integrated and used for fast X-ray imaging experiments at KIT's synchrotron radiation facility ANKA.
LDEF systems special investigation group overview
NASA Technical Reports Server (NTRS)
Mason, Jim; Dursch, Harry
1995-01-01
The Systems Special Investigation Group (Systems SIG), formed by the LDEF Project Office to perform post-flight analysis of LDEF systems hardware, was chartered to investigate the effects of the extended LDEF mission on both satellite and experiment systems and to coordinate and integrate all systems-related analyses performed during post-flight investigations. The Systems SIG published a summary report in April 1992, titled 'Analysis of Systems Hardware Flown on LDEF - Results of the Systems Special Investigation Group', that described findings through the end of 1991. The Systems SIG, unfunded in FY 92 and FY 93, has been funded in FY 94 to update this report with all new systems-related findings. This paper provides a brief summary of the highlights of earlier Systems SIG accomplishments and describes tasks the Systems SIG has been funded to accomplish in FY 94.
NASA Astrophysics Data System (ADS)
Grzeszczuk, A.; Kowalski, S.
2015-04-01
Compute Unified Device Architecture (CUDA) is a parallel computing platform developed by Nvidia to increase graphics performance through parallel calculation. The success of this solution opened General-Purpose Graphics Processing Unit (GPGPU) technology to applications not coupled with graphics. GPGPU systems can be applied as an effective tool for reducing the huge volumes of data from pulse shape analysis measurements, either by on-line recalculation or by a very fast compression system. The simplified structure of the CUDA system and its programming model, based on the example of an Nvidia GeForce GTX 580 card, are presented in our poster contribution, both in a stand-alone version and as a ROOT application.
Hilton, Gillean; Unsworth, Carolyn A; Stuckey, Ruth; Murphy, Gregory C
2018-01-01
Vocational potential in people with spinal cord injury (SCI) is unrealised, with employment rates substantially lower than both the labour force participation of the general population and pre-injury employment rates. The aim was to understand the experience and pathway of people achieving employment outcomes after traumatic spinal cord injury by: classifying participants into employment outcome groups of stable, unstable and without employment; identifying pre- and post-injury pathways for participants in each group; and exploring people's experiences of seeking, gaining and maintaining employment. Thirty-one participants were interviewed. A mixed methods approach was used, including interpretive phenomenological analysis and vocational pathway mapping of quantitative data. The most common pathway identified was from study and work pre-injury to stable employment post-injury. Four super-ordinate themes were identified from the interpretive phenomenological analysis: expectations of work, system impacts, worker identity and social supports. Implications for clinical practice include fostering cultural change, strategies for system navigation, promotion of worker identity and optimal use of social supports. The findings increase insight into and understanding of the complex experience of employment after spinal cord injury. There is opportunity to guide experimental research, policy development and education concerning the complexity of the return to work experience and the factors that influence pathways.
NASA Astrophysics Data System (ADS)
Yang, Xiaojun; Lu, Dun; Liu, Hui; Zhao, Wanhua
2018-06-01
The complicated electromechanical coupling phenomena due to different kinds of causes have significant influences on the dynamic precision of the direct driven feed system in machine tools. In this paper, a novel integrated modeling and analysis method for the multiple electromechanical couplings of the direct driven feed system in machine tools is presented. First, four different kinds of electromechanical coupling phenomena in the direct driven feed system are analyzed systematically. Then a novel integrated modeling and analysis method for the electromechanical coupling, which is influenced by multiple factors, is put forward. In addition, the effects of the multiple electromechanical couplings on the dynamic precision of the feed system and their main influencing factors are compared and discussed, respectively. Finally, the results of the modeling and analysis are verified by experiments. It is found that multiple electromechanical coupling loops, which overlap and influence each other, are the main causes of the displacement fluctuations in the direct driven feed system.
Polarization-insensitive techniques for optical signal processing
NASA Astrophysics Data System (ADS)
Salem, Reza
2006-12-01
This thesis investigates polarization-insensitive methods for optical signal processing. Two signal processing techniques are studied: clock recovery based on two-photon absorption in silicon and demultiplexing based on cross-phase modulation in highly nonlinear fiber. The clock recovery system is tested at an 80 Gb/s data rate for both back-to-back and transmission experiments. The demultiplexer is tested at a 160 Gb/s data rate in a back-to-back experiment. We experimentally demonstrate methods for eliminating polarization dependence in both systems. Our experimental results are confirmed by theoretical and numerical analysis.
Analysis and control of the vibration of doubly fed wind turbine
NASA Astrophysics Data System (ADS)
Yu, Manye; Lin, Ying
2017-01-01
The fault phenomena of violent vibration in a certain doubly-fed wind turbine were researched comprehensively, and the dynamic characteristics, load and fault conditions of the system were discussed. Firstly, a structural dynamics analysis of the wind turbine is made, and the dynamic model is built. Secondly, vibration testing of the wind turbine is done with the German test and analysis system BBM. Thirdly, the signals are analyzed and processed. Based on the experiment, spectrum analysis for motor dynamic balance can be made using the signal processing toolbox of MATLAB, and the analysis conclusions show that the vibration of the wind turbine is caused by dynamic imbalance. The results show that integrating mechanical system dynamics theory with advanced test technology can solve the vibration problem more successfully, which is important in the vibration diagnosis of mechanical equipment.
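The spectrum-analysis step described above can be sketched as follows: a dominant peak at the shaft rotation frequency (the 1x component) in the vibration spectrum is the classic signature of rotor imbalance. The sampling rate and rotation frequency below are assumed for illustration; the paper used MATLAB's signal processing toolbox, while this sketch uses NumPy.

```python
import numpy as np

fs = 1000.0      # sampling rate, Hz (assumed)
f_rot = 25.0     # shaft rotation frequency, Hz (assumed)
t = np.arange(0, 2.0, 1.0 / fs)

# Synthetic vibration signal: strong 1x component plus broadband noise
rng = np.random.default_rng(1)
signal = 2.0 * np.sin(2 * np.pi * f_rot * t) + 0.2 * rng.normal(size=t.size)

spectrum = np.abs(np.fft.rfft(signal))
freqs = np.fft.rfftfreq(t.size, 1.0 / fs)
peak_freq = freqs[np.argmax(spectrum[1:]) + 1]  # skip the DC bin
print(peak_freq)  # the dominant peak sits at the rotation frequency, 25 Hz
```

A peak at exactly 1x (rather than at 2x or at blade-passing frequencies) is what points the diagnosis toward dynamic imbalance rather than misalignment or aerodynamic faults.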
Transformation Systems at NASA Ames
NASA Technical Reports Server (NTRS)
Buntine, Wray; Fischer, Bernd; Havelund, Klaus; Lowry, Michael; Pressburger, TOm; Roach, Steve; Robinson, Peter; VanBaalen, Jeffrey
1999-01-01
In this paper, we describe the experiences of the Automated Software Engineering Group at the NASA Ames Research Center in the development and application of three different transformation systems. The systems span the entire technology range, from deductive synthesis, to logic-based transformation, to almost compiler-like source-to-source transformation. These systems also span a range of NASA applications, including solving solar system geometry problems, generating data analysis software, and analyzing multi-threaded Java code.
NASA Astrophysics Data System (ADS)
Khavekar, Rajendra; Vasudevan, Hari, Dr.; Modi, Bhavik
2017-08-01
Two well-known Design of Experiments (DoE) methodologies, Taguchi Methods (TM) and Shainin Systems (SS), are compared and analyzed in this study through their implementation in a plastic injection molding unit. Experiments were performed at a perfume bottle cap manufacturing company (the caps are made of acrylic material) using TM and SS to find the root cause of defects and to optimize the process parameters for minimum rejection. The experiments reduced the rejection rate from approximately 40% during trial runs to 8.57%, which is quite low, representing successful implementation of these DoE methods. The comparison showed that both methodologies identified the same set of variables as critical for defect reduction, but with a change in their order of significance. Also, Taguchi methods require a larger number of experiments and consume more time than the Shainin System. The Shainin System is less complicated and easy to implement, whereas Taguchi methods are statistically more reliable for optimization of process parameters. Finally, the experimentation implied that DoE methods are strong and reliable in implementation as organizations attempt to improve quality through optimization.
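To make the comparison above concrete, the sketch below computes Taguchi's "smaller-is-better" signal-to-noise ratio, the statistic TM uses to rank parameter settings when the response (here, rejection rate) should be minimized. The replicate values are hypothetical, loosely echoing the before/after rejection rates reported above.

```python
import math

def sn_smaller_is_better(ys):
    """Taguchi S/N ratio for a smaller-is-better response:
    SN = -10 * log10(mean(y_i^2)); larger SN is better."""
    return -10.0 * math.log10(sum(y * y for y in ys) / len(ys))

# Hypothetical rejection percentages from replicated runs of two settings
setting_a = [8.0, 9.5, 8.6]      # optimized parameters
setting_b = [38.0, 41.0, 40.5]   # baseline trial runs
print(sn_smaller_is_better(setting_a) > sn_smaller_is_better(setting_b))  # True
```

Ranking settings by S/N ratio rather than by the raw mean penalizes both a high average and high run-to-run variation, which is central to Taguchi's robustness argument.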
Online & Offline data storage and data processing at the European XFEL facility
NASA Astrophysics Data System (ADS)
Gasthuber, Martin; Dietrich, Stefan; Malka, Janusz; Kuhn, Manuela; Ensslin, Uwe; Wrona, Krzysztof; Szuba, Janusz
2017-10-01
For the upcoming experiments at the European XFEL light source facility, a new online and offline data processing and storage infrastructure is currently being built and verified. Based on the experience with the system developed for the Petra III light source at DESY, presented at the last CHEP conference, we further develop the system to cope with much higher volumes and rates (~50 GB/sec) together with more complex data analysis and infrastructure conditions (i.e. long-range InfiniBand connections). This work is carried out in collaboration between DESY/IT and European XFEL, with technology support from IBM/Research. This presentation will briefly wrap up the experience from one year of runtime of the Petra III ([3]) system, continue with a short description of the challenges for the European XFEL ([2]) experiments and, in the main section, show the proposed system for online and offline use, with initial results from the real implementation (HW & SW). This will cover the selected cluster filesystem GPFS ([5]), including Quality of Service (QoS), extensive use of flash-based subsystems and other new and unique features this architecture will benefit from.
Results from the testing and analysis of LDEF batteries
NASA Technical Reports Server (NTRS)
Spear, Steve; Dursch, Harry; Johnson, Chris
1992-01-01
Batteries were used on the Long Duration Exposure Facility (LDEF) to provide power to both the active experiments and the experiment support equipment such as the Experiment Initiative System, Experiment Power and Data System (data acquisition system), and the Environment Exposure Control Canisters. Three different types of batteries were used: lithium sulfur dioxide (LiSO2), lithium carbon monofluoride (LiCF), and nickel cadmium (NiCd). A total of 92 LiSO2, 10 LiCF, and 1 NiCd batteries were flown on the LDEF. In addition, approximately 20 LiSO2 batteries were kept in cold storage at NASA LaRC. The various investigations and post-flight analyses of the flight and control batteries are reviewed. The primary objectives of these studies were to identify degradation modes (if any) of the batteries and to provide information useful to future spacecraft missions. Systems SIG involvement in the post-flight evaluation of LDEF batteries was two-fold: (1) to fund SAFT (original manufacturer of the LiSO2 batteries) to perform characterization of 13 LiSO2 batteries (10 flight and 3 control batteries); and (2) to integrate investigator results.
SPIRE Data-Base Management System
NASA Technical Reports Server (NTRS)
Fuechsel, C. F.
1984-01-01
Spacelab Payload Integration and Rocket Experiment (SPIRE) data-base management system (DBMS) based on relational model of data bases. Data bases typically used for engineering and mission analysis tasks and, unlike most commercially available systems, allow data items and data structures to be stored in forms suitable for direct analytical computation. SPIRE DBMS designed to support data requests from interactive users as well as applications programs.
ERIC Educational Resources Information Center
Bye, Amanda; Aston, Megan
2016-01-01
Children with intellectual disabilities spend more time in the health-care system than mainstream children. Parents have to learn how to navigate the system by coordinating appointments, understanding the referral process, knowing what services are available, and advocating for those services. This places an incredible amount of responsibility on…
NASA Technical Reports Server (NTRS)
Hanson, Curt
2009-01-01
The NASA F/A-18 tail number (TN) 853 full-scale Integrated Resilient Aircraft Control (IRAC) testbed has been designed with a full array of capabilities in support of the Aviation Safety Program. Highlights of the system's capabilities include: 1) a quad-redundant research flight control system for safely interfacing controls experiments to the aircraft's control surfaces; 2) a dual-redundant airborne research test system for hosting multi-disciplinary state-of-the-art adaptive control experiments; 3) a robust reversionary configuration for recovery from unusual attitudes and configurations; 4) significant research instrumentation, particularly in the area of static loads; 5) extensive facilities for experiment simulation, data logging, real-time monitoring and post-flight analysis capabilities; and 6) significant growth capability in terms of interfaces and processing power.
Offline software for the DAMPE experiment
NASA Astrophysics Data System (ADS)
Wang, Chi; Liu, Dong; Wei, Yifeng; Zhang, Zhiyong; Zhang, Yunlong; Wang, Xiaolian; Xu, Zizong; Huang, Guangshun; Tykhonov, Andrii; Wu, Xin; Zang, Jingjing; Liu, Yang; Jiang, Wei; Wen, Sicheng; Wu, Jian; Chang, Jin
2017-10-01
A software system has been developed for the DArk Matter Particle Explorer (DAMPE) mission, a satellite-based experiment. The DAMPE software is mainly written in C++ and steered using a Python script. This article presents an overview of the DAMPE offline software, including the major architecture design and specific implementation for simulation, calibration and reconstruction. The whole system has been successfully applied to DAMPE data analysis. Some results obtained using the system, from simulation and beam test experiments, are presented. Supported by Chinese 973 Program (2010CB833002), the Strategic Priority Research Program on Space Science of the Chinese Academy of Science (CAS) (XDA04040202-4), the Joint Research Fund in Astronomy under cooperative agreement between the National Natural Science Foundation of China (NSFC) and CAS (U1531126) and 100 Talents Program of the Chinese Academy of Science
SHEFEX - the vehicle and sub-systems for a hypersonic re-entry flight experiment
NASA Astrophysics Data System (ADS)
Turner, John; Hörschgen, Marcus; Turner, Peter; Ettl, Josef; Jung, Wolfgang; Stamminger, Andreas
2005-08-01
The purpose of the Sharp Edge Flight Experiment (SHEFEX) is to investigate the aerodynamic behaviour and thermal problems of an unconventional shape for re-entry vehicles, comprising multi-facetted surfaces with sharp edges. The main objective of this experiment is the correlation of numerical analysis with real flight data in terms of the aerodynamic effects and the structural concept for the thermal protection system (TPS). The Mobile Rocket Base of the German Aerospace Center (DLR) is responsible for the test flight of SHEFEX on a two-stage unguided solid propellant sounding rocket, which is required to provide a velocity of the order of Mach 7 for more than 30 seconds during atmospheric re-entry. This paper discusses the problems associated with the mission requirements and the solutions developed for the vehicle and sub-systems.
Making Ice Creep in the Classroom
NASA Astrophysics Data System (ADS)
Prior, David; Vaughan, Matthew; Banjan, Mathilde; Hamish Bowman, M.; Craw, Lisa; Tooley, Lauren; Wongpan, Pat
2017-04-01
Understanding the creep of ice has direct application to the role of ice sheet flow in sea level and climate change and to the modelling of icy planets and satellites of the outer solar system. Additionally, ice creep can be used as an analogue for the high temperature creep of rocks, most particularly quartzites. We adapted technologies developed for ice creep experiments in the research lab to build some inexpensive (approximately EU200) rigs for conducting ice creep experiments in an undergraduate (200 and 300 level) class in rock deformation. The objective was to give the students an experience of laboratory rock deformation experiments so that they would better understand what controls the creep rate of ice and rocks. Students worked in eight groups of 5/6 students. Each group had one deformation rig and temperature control system. Each group conducted two experiments over a 2 week period. The results of all 16 experiments were then shared so that all students could analyse the mechanical data and generate a "flow law" for ice. Additionally, thin sections were made of each deformed sample so that some microstructural analysis could be incorporated in the data analysis. Students were able to derive a flow law that showed the relationship of creep rate to both stress and temperature. The flow law matches those from published research. The class provided a realistic introduction to laboratory rock deformation experiments and helped students' understanding of what controls the creep of rocks.
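The flow-law fitting the students performed can be sketched as a linear regression. Published flow laws for ice typically take the Glen form, strain_rate = A * sigma**n * exp(-Q/(R*T)), with stress exponent n near 3 and activation energy Q around 60 kJ/mol; those values and the data below are assumed for illustration, not taken from the class results.

```python
import numpy as np

R = 8.314                                  # gas constant, J/(mol K)
A, n_true, Q_true = 3.5e-10, 3.0, 60e3     # assumed flow-law parameters

# Synthetic "experiments" at various stresses (Pa) and temperatures (K)
sigma = np.array([0.2e6, 0.4e6, 0.8e6, 1.6e6])
T = np.array([263.0, 268.0, 258.0, 265.0])
rate = A * sigma**n_true * np.exp(-Q_true / (R * T))

# Linearize: ln(rate) = ln(A) + n*ln(sigma) - (Q/R)*(1/T)
X = np.column_stack([np.ones_like(sigma), np.log(sigma), 1.0 / T])
coef, *_ = np.linalg.lstsq(X, np.log(rate), rcond=None)
n_fit, Q_fit = coef[1], -coef[2] * R
print(round(n_fit, 2), round(Q_fit / 1e3, 1))  # recovers n = 3.0, Q = 60.0 kJ/mol
```

With real, noisy class data the same regression yields best-fit n and Q with uncertainties, which is what makes the comparison to published flow laws meaningful.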
NASA Technical Reports Server (NTRS)
2004-01-01
Since its founding in 1992, Global Science & Technology, Inc. (GST), of Greenbelt, Maryland, has been developing technologies and providing services in support of NASA scientific research. GST specialties include scientific analysis, science data and information systems, data visualization, communications, networking and Web technologies, computer science, and software system engineering. As a longtime contractor to Goddard Space Flight Center's Earth Science Directorate, GST scientific, engineering, and information technology staff have extensive qualifications with the synthesis of satellite, in situ, and Earth science data for weather- and climate-related projects. GST's experience in this arena is end-to-end, from building satellite ground receiving systems and science data systems, to product generation and research and analysis.
The Deep Impact Network Experiment Operations Center Monitor and Control System
NASA Technical Reports Server (NTRS)
Wang, Shin-Ywan (Cindy); Torgerson, J. Leigh; Schoolcraft, Joshua; Brenman, Yan
2009-01-01
The Interplanetary Overlay Network (ION) software at JPL is an implementation of Delay/Disruption Tolerant Networking (DTN) which has been proposed as an interplanetary protocol to support space communication. The JPL Deep Impact Network (DINET) is a technology development experiment intended to increase the technical readiness of the JPL implemented ION suite. The DINET Experiment Operations Center (EOC) developed by JPL's Protocol Technology Lab (PTL) was critical in accomplishing the experiment. EOC, containing all end nodes of simulated spaces and one administrative node, exercised publish and subscribe functions for payload data among all end nodes to verify the effectiveness of data exchange over ION protocol stacks. A Monitor and Control System was created and installed on the administrative node as a multi-tiered internet-based Web application to support the Deep Impact Network Experiment by allowing monitoring and analysis of the data delivery and statistics from ION. This Monitor and Control System includes the capability of receiving protocol status messages, classifying and storing status messages into a database from the ION simulation network, and providing web interfaces for viewing the live results in addition to interactive database queries.
Extended GTST-MLD for aerospace system safety analysis.
Guo, Chiming; Gong, Shiyu; Tan, Lin; Guo, Bo
2012-06-01
The hazards caused by complex interactions in aerospace systems have become a problem that urgently needs to be settled. This article introduces a method for aerospace system hazard interaction identification based on extended GTST-MLD (goal tree-success tree-master logic diagram) during the design stage. GTST-MLD is a functional modeling framework with a simple architecture. Ontology is used to extend the ability of system interaction description in GTST-MLD by adding system design knowledge and past accident experience. At the levels of functionality and equipment, respectively, this approach can help the technician detect potential hazard interactions. Finally, a case study is used to demonstrate the method. © 2011 Society for Risk Analysis.
Computational singular perturbation analysis of stochastic chemical systems with stiffness
NASA Astrophysics Data System (ADS)
Wang, Lijin; Han, Xiaoying; Cao, Yanzhao; Najm, Habib N.
2017-04-01
Computational singular perturbation (CSP) is a useful method for analysis, reduction, and time integration of stiff ordinary differential equation systems. It has found dominant utility, in particular, in chemical reaction systems with a large range of time scales at the continuum and deterministic level. On the other hand, CSP is not directly applicable to chemical reaction systems at the micro or meso-scale, where stochasticity plays a non-negligible role and thus has to be taken into account. In this work we develop a novel stochastic computational singular perturbation (SCSP) analysis and time integration framework, and an associated algorithm, that can be used not only to construct accurately and efficiently the numerical solutions to stiff stochastic chemical reaction systems, but also to analyze the dynamics of the reduced stochastic reaction systems. The algorithm is illustrated by an application to a benchmark stochastic differential equation model, and numerical experiments are carried out to demonstrate the effectiveness of the construction.
Knowledge Interaction Design for Creative Knowledge Work
NASA Astrophysics Data System (ADS)
Nakakoji, Kumiyo; Yamamoto, Yasuhiro
This paper describes our approach for the development of application systems for creative knowledge work, particularly for early stages of information design tasks. Being a cognitive tool serving as a means of externalization, an application system affects how the user is engaged in the creative process through its visual interaction design. Knowledge interaction design described in this paper is a framework where a set of application systems for different information design domains are developed based on an interaction model, which is designed for a particular model of a thinking process. We have developed two sets of application systems using the knowledge interaction design framework: one includes systems for linear information design, such as writing, movie-editing, and video-analysis; the other includes systems for network information design, such as file-system navigation and hypertext authoring. Our experience shows that the resulting systems encourage users to follow a certain cognitive path through graceful user experience.
Baedecker, P.A.; Grossman, J.N.
1995-01-01
A PC-based system has been developed for the analysis of gamma-ray spectra and for the complete reduction of data from INAA experiments, including software to average the results from multiple lines and multiple countings and to produce a final report of analysis. Graphics algorithms may be called for the analysis of complex spectral features, to compare the data from alternate photopeaks and to evaluate detector performance during a given counting cycle. A database of results for control samples can be used to prepare quality control charts to evaluate long-term precision and to search for systematic variations in data on reference samples as a function of time. The entire software library can be accessed through a user-friendly menu interface with internal help.
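The quality-control check described above can be sketched with a simple Shewhart-style control chart: flag any new reference-sample result that falls outside mean +/- k*sigma limits computed from the historical database. The concentrations and the 3-sigma limit below are hypothetical.

```python
import statistics

def out_of_control(history, new_values, k=3.0):
    """Return the new values falling outside mean +/- k*sigma control limits."""
    mu = statistics.mean(history)
    sigma = statistics.stdev(history)
    lo, hi = mu - k * sigma, mu + k * sigma
    return [v for v in new_values if not (lo <= v <= hi)]

# Concentrations (ppm) of one element in a control sample over past countings
history = [10.1, 9.8, 10.0, 10.3, 9.9, 10.2, 10.0, 9.7]
print(out_of_control(history, [10.1, 11.9, 9.9]))  # flags only 11.9
```

Plotting the same limits against time is what reveals slow systematic drifts, as opposed to single out-of-control points.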
DCMS: A data analytics and management system for molecular simulation.
Kumar, Anand; Grupcev, Vladimir; Berrada, Meryem; Fogarty, Joseph C; Tu, Yi-Cheng; Zhu, Xingquan; Pandit, Sagar A; Xia, Yuni
Molecular Simulation (MS) is a powerful tool for studying physical/chemical features of large systems and has seen applications in many scientific and engineering domains. During the simulation process, the experiments generate a very large number of atoms and intend to observe their spatial and temporal relationships for scientific analysis. The sheer data volumes and their intensive interactions impose significant challenges for data accessing, managing, and analysis. To date, existing MS software systems fall short on storage and handling of MS data, mainly because of the lack of a platform to support applications that involve intensive data access and analytical processing. In this paper, we present the database-centric molecular simulation (DCMS) system our team developed in the past few years. The main idea behind DCMS is to store MS data in a relational database management system (DBMS) to take advantage of the declarative query interface (i.e., SQL), data access methods, query processing, and optimization mechanisms of modern DBMSs. A unique challenge is to handle the analytical queries that are often compute-intensive. For that, we developed novel indexing and query processing strategies (including algorithms running on modern co-processors) as integrated components of the DBMS. As a result, researchers can upload and analyze their data using efficient functions implemented inside the DBMS. Index structures are generated to store analysis results that may be interesting to other users, so that the results are readily available without duplicating the analysis. We have developed a prototype of DCMS based on the PostgreSQL system, and experiments using real MS data and workloads show that DCMS significantly outperforms existing MS software systems. We also used it as a platform to test other data management issues such as security and compression.
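The core DCMS idea, storing per-frame atom positions in relational tables and expressing analyses as declarative SQL, can be sketched with an in-memory SQLite database (the actual system is built on PostgreSQL, and this schema is a guess for illustration, not DCMS's real one).

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE atoms (frame INT, atom_id INT, x REAL, y REAL, z REAL)")
rows = [
    (0, 1, 0.5, 0.5, 0.5),
    (0, 2, 5.0, 5.0, 5.0),
    (0, 3, 0.9, 0.2, 0.1),
    (1, 1, 2.0, 2.0, 2.0),
]
conn.executemany("INSERT INTO atoms VALUES (?, ?, ?, ?, ?)", rows)

# Analytical query in plain SQL: count frame-0 atoms inside the unit box
(count,) = conn.execute(
    "SELECT COUNT(*) FROM atoms WHERE frame = 0 "
    "AND x BETWEEN 0 AND 1 AND y BETWEEN 0 AND 1 AND z BETWEEN 0 AND 1"
).fetchone()
print(count)  # -> 2
```

The payoff of the relational approach is that spatial-region and trajectory queries like this one are optimized by the DBMS's own indexes and planner, rather than by ad hoc file scans in each analysis script.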
Huang, Qingchao; Liu, Dachang; Chen, Yinfang; Wang, Yuehui; Tan, Jun; Chen, Wei; Liu, Jianguo; Zhu, Ninghua
2018-05-14
A secure free-space optical (S-FSO) communication system based on a data fragmentation multipath transmission (DFMT) scheme is proposed and demonstrated for enhancing the security of FSO communications. By fragmenting the transmitted data and simultaneously distributing the fragments over different atmospheric channels, the S-FSO communication system can effectively protect confidential messages from eavesdropping. A field experiment of S-FSO communication between two buildings has been successfully undertaken, and the experimental results demonstrate the feasibility of the scheme. The transmission distance is 50 m and the maximum throughput is 1 Gb/s. We also established a theoretical model to analyze the security performance of the S-FSO communication system. To the best of our knowledge, this is the first application of a DFMT scheme in an FSO communication system.
Flight experience of Solar Mesosphere Explorer's power system over high temperature ranges
NASA Technical Reports Server (NTRS)
Faber, Jack; Hurley, Daniel
1987-01-01
The performance of the power system on the Solar Mesosphere Explorer (SME) satellite over the life of the mission, and the techniques used to ensure power system health, are summarized. Early in the mission, high cell imbalances in one of the batteries resulted in a loading scheme that attempted to minimize the cell imbalances without causing an undervoltage condition. A short-term model of the power system allowed planners to predict depth of discharge using the latest available data. Due to expected orbital shifts, the solar arrays experience extended periods of no eclipse. This has required special conditioning schemes to keep the batteries healthy when the eclipses return. Analysis of the SME data indicates long-term health of the SME power system as long as the conditioning scheme is continued.
Data acquisition and processing system for the HT-6M tokamak fusion experiment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shu, Y.T.; Liu, G.C.; Pang, J.Q.
1987-08-01
This paper describes a high-speed data acquisition and processing system which has been successfully operated on the HT-6M tokamak fusion experimental device. The system collects, archives and analyzes up to 512 kilobytes of data from each shot of the experiment. A shot lasts 50-150 milliseconds and occurs every 5-10 minutes. The system consists of two PDP-11/24 computer systems. One PDP-11/24 is used for real-time data taking and on-line data analysis. It is based upon five CAMAC crates organized into a parallel branch. Another PDP-11/24 is used for off-line data processing. Both the data acquisition software RSX-DAS and the data processing software RSX-DAP have modular, multi-tasking and concurrent processing features.
NASA Astrophysics Data System (ADS)
Marlin, Benjamin
Education planning provides the policy maker and the decision maker a logical framework in which to develop and implement education policy. At the international level, education planning is often confounded by both internal and external complexities, making the development of education policy difficult. This research presents a discrete event simulation in which individual students and teachers flow through the system across a variable time horizon. This simulation is then used with advancements in design of experiments, multivariate statistical analysis, and data envelopment analysis to provide a methodology designed to assist the international education planning community. We propose that this methodology will provide the education planner with insights into the complexity of the education system, the effects of both endogenous and exogenous factors upon the system, and the implications of policies as they pertain to potential futures of the system. We do this recognizing that there are multiple actors and stochastic events in play which, although they cannot be accurately forecast, must be accounted for within the education model. To test both the implementation and usefulness of such a model and to prove its relevance, we chose the Afghan education system as the focal point of this research. The Afghan education system is a complex, real-world system with competing actors, dynamic requirements, and ambiguous states. At the time of this writing, Afghanistan is at a pivotal point as a nation and has been the recipient of a tremendous amount of international support and attention. Finally, Afghanistan is a fragile state, and the proliferation of the current disparity in education across gender, districts, and ethnicity could provide the catalyst to drive the country into hostility. In order to prevent the failure of the current government, it is essential that the education system be able to meet the demands of the Afghan people.
This work provides insights into the Afghan education system, including the implications of security, the potential effects of societal issues, and prescriptive policy options. In using the proposed methodology, we provide justification for the future use of larger complex simulations in education planning, especially when such a simulation is integrated with efficient design of experiments and data envelopment analysis.
Next Generation Workload Management and Analysis System for Big Data
DOE Office of Scientific and Technical Information (OSTI.GOV)
De, Kaushik
We report on the activities and accomplishments of a four-year project (a three-year grant followed by a one-year no-cost extension) to develop a next-generation workload management system for Big Data. The new system is based on the highly successful PanDA software developed for High Energy Physics (HEP) in 2005. PanDA is used by the ATLAS experiment at the Large Hadron Collider (LHC) and the AMS experiment on the space station. The program of work described here was carried out by two teams of developers working collaboratively at Brookhaven National Laboratory (BNL) and the University of Texas at Arlington (UTA). These teams worked closely with the original PanDA team; for the sake of clarity, the work of the next-generation team is referred to as the BigPanDA project. Their work has led to the adoption of BigPanDA by the COMPASS experiment at CERN and many other experiments and science projects worldwide.
Stability analysis for a multi-camera photogrammetric system.
Habib, Ayman; Detchev, Ivan; Kwak, Eunju
2014-08-18
Consumer-grade digital cameras suffer from geometrical instability that may cause problems when used in photogrammetric applications. This paper provides a comprehensive review of the issue of interior orientation parameter variation over time; it explains the common ways of coping with the issue and describes the existing methods for performing stability analysis for a single camera. The paper then points out the lack of coverage of stability analysis for multi-camera systems, suggests a modification of the collinearity model to be used for the calibration of an entire photogrammetric system, and proposes three methods for system stability analysis. The proposed methods explore the impact of changes in the interior orientation and relative orientation/mounting parameters on the reconstruction process. Rather than relying on ground truth in real datasets to check the stability of the system calibration, the proposed methods are simulation-based. Experimental results are shown in which a multi-camera photogrammetric system was calibrated three times and stability analysis was performed on the system calibration parameters from the three sessions. The proposed simulation-based methods provided results that were compatible with a real-data-based approach for evaluating the impact of changes in the system calibration parameters on the three-dimensional reconstruction.
ERIC Educational Resources Information Center
Kyle, James; And Others
The guide provides multiple experience-based activities for use by secondary social studies students as they examine occupational possibilities in their communities. The purposes of the materials are to help students evaluate themselves and their value systems, examine occupations, and become aware of the changing philosophy and value of work in…
ERIC Educational Resources Information Center
Bridgeford, Nancy; Douglas, Marcia
A study assessed the activities of five state networks that were designed to transfer experience-based career education (EBCE) ownership to appropriate state and local institutions and to develop a state-level support system for continued implementation of EBCE in local districts. Focus of the analysis was on factors contributing to EBCE entry,…
ERIC Educational Resources Information Center
Burgess, K.; McKenzie, W.; Fehr, F.
2016-01-01
This pilot study explored the international female (IF) students' (n = 17) lived experiences of health care accessibility while studying in a small town in Canada. Analysis guided by a phenomenological method resulted in three major themes--(1) after arriving to attend university, IF students experienced challenges in staying healthy, such as…
ERIC Educational Resources Information Center
Sanseau, Pierre-Yves; Ansart, Sandrine
2013-01-01
In this paper, the researchers analyse how lifelong learning can be enriched and develop a different perspective based on the experiment involving the accreditation of prior experiential learning (APEL) conducted in France at the university level. The French system for the accreditation of prior experiential learning, called Validation des Acquis…
LDEF: 69 Months in Space. Second Post-Retrieval Symposium, part 2
NASA Technical Reports Server (NTRS)
Levine, Arlene S. (Editor)
1993-01-01
This document is a compilation of papers presented at the Second Long Duration Exposure Facility (LDEF) Post-Retrieval Symposium. The papers represent the data analysis of the 57 experiments flown on the LDEF. The experiments include materials, coatings, thermal systems, power and propulsion, science (cosmic ray, interstellar gas, heavy ions, micrometeoroid, etc.), electronics, optics, and life science.
LDEF: 69 Months in Space. First Post-Retrieval Symposium, part 3
NASA Technical Reports Server (NTRS)
Levine, Arlene S. (Editor)
1992-01-01
A compilation of papers presented at the First Long Duration Exposure Facility (LDEF) Post-Retrieval Symposium is presented. The papers represent the preliminary data analysis of the 57 experiments flown on the LDEF. The experiments include materials, coatings, thermal systems, power and propulsion, science (cosmic ray, interstellar gas, heavy ions, and micrometeoroid), electronics, optics, and life sciences.
Principal Components Analysis of Triaxial Vibration Data From Helicopter Transmissions
NASA Technical Reports Server (NTRS)
Tumer, Irem Y.; Huff, Edward M.
2001-01-01
Research on the nature of the vibration data collected from helicopter transmissions during flight experiments has led to several crucial observations believed to be responsible for the high rates of false alarms and missed detections in aircraft vibration monitoring systems. This work focuses on one such finding, namely, the need to consider additional sources of information about system vibrations. In this light, helicopter transmission vibration data, collected using triaxial accelerometers, were explored in three different directions, analyzed for content, and then combined using Principal Components Analysis (PCA) to analyze changes in directionality. In this paper, the PCA transformation is applied to 176 test conditions/data sets collected from an OH58C helicopter to derive the overall experiment-wide covariance matrix and its principal eigenvectors. The experiment-wide eigenvectors are then projected onto the individual test conditions to evaluate changes and similarities in their directionality based on the various experimental factors. The paper presents the foundations of the proposed approach, addressing the question of whether experiment-wide eigenvectors accurately model the vibration modes in individual test conditions. The results further determine the value of using directionality and triaxial accelerometers for vibration monitoring and anomaly detection.
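The covariance-and-principal-eigenvector step described above can be sketched in a few lines. The synthetic triaxial samples and the power-iteration routine below are illustrative stand-ins, not the paper's actual flight data or processing code; the sketch only shows how a dominant vibration direction falls out of the covariance of triaxial measurements.

```python
import random

def covariance(samples):
    """3x3 sample covariance of triaxial measurements [(x, y, z), ...]."""
    n = len(samples)
    means = [sum(s[k] for s in samples) / n for k in range(3)]
    cov = [[0.0] * 3 for _ in range(3)]
    for s in samples:
        d = [s[k] - means[k] for k in range(3)]
        for i in range(3):
            for j in range(3):
                cov[i][j] += d[i] * d[j] / (n - 1)
    return cov

def principal_eigenvector(cov, iters=200):
    """Dominant eigenvector via power iteration: the direction in
    which the measured vibration energy is concentrated."""
    v = [1.0, 1.0, 1.0]
    for _ in range(iters):
        w = [sum(cov[i][j] * v[j] for j in range(3)) for i in range(3)]
        norm = sum(x * x for x in w) ** 0.5
        v = [x / norm for x in w]
    return v

# Synthetic accelerometer samples: vibration dominated by the x axis.
random.seed(0)
samples = [(random.gauss(0, 5), random.gauss(0, 1), random.gauss(0, 0.5))
           for _ in range(500)]
v = principal_eigenvector(covariance(samples))
print(v)  # close to the x axis, i.e. |v[0]| near 1
```

Projecting each test condition's data onto experiment-wide eigenvectors obtained this way is what lets the paper compare directionality across conditions.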
First biological and dosimetric results of the free flyer biostack experiment AO015 on LDEF
NASA Technical Reports Server (NTRS)
Reitz, G.; Buecker, H.; Facius, R.; Horneck, G.; Schaeffer, M.; Schott, J. U.; Bayonove, J.; Beaujean, R.; Benton, E. V.; Delpoux, M.
1991-01-01
The main objectives of the Biostack Experiment are to study the effects of the structured components of cosmic radiation on bacterial spores, plant seeds, and animal cysts during a long-duration spaceflight, and to obtain dosimetric data, such as particle fluences, spectra, and total doses, for the Long Duration Exposure Facility orbit. The configuration of the experiment packages allows localization of each particle's trajectory in the biological layers and correlation of the potential biological impairment or injury with the physical characteristics of the responsible particle. Although the Biostack Experiment was designed for a flight duration of only nine months, most of the biological systems show a high hatching or germination rate. Among the first observations are an increase in the mutation rate of embryonic lethals in the second generation of Arabidopsis seeds, somatic mutations, a reduction in the growth rates of corn plants, and a reduction in the life span of Artemia salina shrimps. The different passive detector systems are also in good shape and permit a proper dosimetric analysis. The results are summarized, and some aspects of future analysis are outlined.
An integrated modeling and design tool for advanced optical spacecraft
NASA Technical Reports Server (NTRS)
Briggs, Hugh C.
1992-01-01
Consideration is given to the design and status of the Integrated Modeling of Optical Systems (IMOS) tool and to critical design issues. A multidisciplinary spacecraft design and analysis tool with support for structural dynamics, controls, thermal analysis, and optics, IMOS provides rapid and accurate end-to-end performance analysis, simulations, and optimization of advanced space-based optical systems. The requirements for IMOS-supported numerical arrays, user defined data structures, and a hierarchical data base are outlined, and initial experience with the tool is summarized. A simulation of a flexible telescope illustrates the integrated nature of the tools.
McMurray, Josephine; McNeil, Heather; Lafortune, Claire; Black, Samantha; Prorok, Jeanette; Stolee, Paul
2016-01-01
To identify key dimensions of patients' experience across the rehabilitative care system and to recommend a framework for developing survey items that measure the rehabilitative care experience. Data were sourced from a literature review that searched the MEDLINE (PubMed), CINAHL (Ebsco), and PsycINFO (APA PsycNET) databases from 2004 to 2014, the reference lists of the final accepted articles, and hand searches of relevant journals. Four reviewers performed the screening process on 2472 articles; 33 were included for analysis. Interrater reliability was confirmed through 2 rounds of title review and 1 round of abstract review, with an average κ score of .69. The final sample of 33 accepted articles was imported into a qualitative data analysis software application. Multiple levels of coding and a constant comparative methodology generated 6 themes. There were 502 discrete survey questions measuring patient experience, which were categorized using the following dimensions: rehabilitative care ecosystem, client and informal caregiver engagement, patient and health care provider relation, pain and functional status, group and individual identity, and open-ended. The most common survey questions examine the care delivery ecosystem (37%), the engagement of clients and their informal caregivers (24.9%), and the quality of relations between providers and patients (21.7%). Questions examining patients' functional status and management of pain account for 15.3% of the instruments' questions. Currently available instruments and questions that measure patients' experience in rehabilitative care are unable to assess the performance of rehabilitative delivery systems that aspire to integrate care across the continuum. However, question panels derived from our 6 key themes may measure the key concepts that define rehabilitative care and facilitate measurement of patient experience at the system level. Copyright © 2016 American Congress of Rehabilitation Medicine. Published by Elsevier Inc.
All rights reserved.
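The interrater reliability figure reported above (an average κ of .69) is Cohen's kappa: observed agreement between two reviewers, corrected for the agreement expected by chance. A minimal sketch with made-up include/exclude decisions:

```python
def cohens_kappa(rater_a: list, rater_b: list) -> float:
    """Cohen's kappa: (p_o - p_e) / (1 - p_e), where p_o is the observed
    agreement rate and p_e is the agreement expected by chance from each
    rater's label frequencies."""
    n = len(rater_a)
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    labels = set(rater_a) | set(rater_b)
    p_e = sum((rater_a.count(l) / n) * (rater_b.count(l) / n)
              for l in labels)
    return (p_o - p_e) / (1 - p_e)

# Hypothetical screening decisions from two reviewers (not study data).
a = ["y", "y", "y", "n", "n", "y", "n", "y", "n", "y"]
b = ["y", "y", "n", "n", "n", "y", "n", "y", "y", "y"]
print(round(cohens_kappa(a, b), 2))
```

Values near 0 mean chance-level agreement and 1 means perfect agreement; a κ of .69, as in the review above, is conventionally read as substantial agreement.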
Tsai, Hsiu-Hsin; Tsai, Yun-Fang; Huang, Hsiu-Li
2016-03-01
To explore the experiences of nursing home nurses when they transfer residents from nursing homes to the emergency department in Taiwan. The transfer of residents between nursing homes and emergency departments challenges continuity of care. Understanding nursing home nurses' experiences during these transfers may help to improve residents' continuity of care. However, few empirical data are available on these nurses' transfer experiences worldwide, and none could be found in Asian countries. Qualitative descriptive study. Data were collected from August 2012 to June 2013 in audiotaped, individual, in-depth interviews with 25 nurses at five nursing homes in Taiwan. Interview transcripts were analysed by constant comparative analysis. Analysis of the interview transcripts revealed that the core theme of nursing home nurses' transfer experience was discontinuity in nursing home to emergency department transitions. This core theme comprised three themes: discontinuity in family involvement, discontinuity in medical resources and expectations, and discontinuity in nurses' professional role. Nursing home nurses need a working environment that is better connected to residents' family members and more immediate and/or easier access to acute care for residents. Communication between nurses and residents' families could be improved by using text messages or social media on mobile phones, which are widely used in Taiwan and worldwide. To improve access to acute care, we suggest developing a real-time telehealth transfer system tailored to the medical culture and policies of each country. This system should facilitate communication among nursing home staff, family members and hospital staff. Our findings on nurses' experiences during the transfer of nursing home residents to the emergency department can be used to design more effective transfer policies, such as telemedicine systems, in Taiwan and other Asian countries or in those with large populations of Chinese immigrants.
© 2016 John Wiley & Sons Ltd.
Statistical evaluation of vibration analysis techniques
NASA Technical Reports Server (NTRS)
Milner, G. Martin; Miller, Patrice S.
1987-01-01
An evaluation methodology is presented for a selection of candidate vibration analysis techniques applicable to machinery representative of the environmental control and life support system of advanced spacecraft; illustrative results are given. Attention is given to the statistical analysis of small sample experiments, the quantification of detection performance for diverse techniques through the computation of probability of detection versus probability of false alarm, and the quantification of diagnostic performance.
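The detection-performance quantification described above, probability of detection versus probability of false alarm, is a standard receiver operating characteristic computation. A minimal sketch over hypothetical vibration-feature scores (the scores and threshold rule are assumptions for illustration, not the paper's data):

```python
def roc_points(healthy, faulty):
    """Sweep a detection threshold over all observed scores; a score
    above the threshold raises an alarm. Returns one
    (P_false_alarm, P_detection) pair per threshold."""
    points = []
    for t in sorted(set(healthy) | set(faulty)):
        p_fa = sum(s > t for s in healthy) / len(healthy)
        p_d = sum(s > t for s in faulty) / len(faulty)
        points.append((p_fa, p_d))
    return points

# Hypothetical feature scores for healthy and faulty machinery runs.
healthy = [0.1, 0.2, 0.3, 0.4]
faulty = [0.8, 0.9, 1.0]
pts = roc_points(healthy, faulty)
print(pts)
```

Well-separated score distributions, as in this toy example, yield an operating point with full detection at zero false alarms; overlapping distributions force a tradeoff, which is what the statistical evaluation above quantifies across techniques.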