Science.gov

Sample records for experiment cepex design

  1. Central Equatorial Pacific Experiment (CEPEX). Design document

    SciTech Connect

    Not Available

    1993-04-01

    The Earth's climate has varied significantly in the past, yet climate records reveal that in the tropics, sea surface temperatures seem to have been remarkably stable, varying by less than a few degrees Celsius over geologic time. Today, the large warm pool of the western Pacific shows similar characteristics. Its surface temperature always exceeds 27°C, but never 31°C. Heightened interest in this observation has been stimulated by questions of global climate change and the exploration of stabilizing climate feedback processes. Efforts to understand the observed weak sensitivity of tropical sea surface temperatures to climate forcing have led to a number of competing ideas about the nature of this apparent thermostat. Although there remains disagreement on the processes that regulate tropical sea surface temperature, most agree that further progress in resolving these differences requires comprehensive field observations of three-dimensional water vapor concentrations, solar and infrared radiative fluxes, surface fluxes of heat and water vapor, and cloud microphysical properties. This document describes the Central Equatorial Pacific Experiment (CEPEX) plan to collect such observations over the central equatorial Pacific Ocean during March of 1993.

  2. Central Equatorial Pacific Experiment (CEPEX)

    SciTech Connect

    Not Available

    1993-01-01

    The Earth's climate has varied significantly in the past, yet climate records reveal that in the tropics, sea surface temperatures seem to have been remarkably stable, varying by less than a few degrees Celsius over geologic time. Today, the large warm pool of the western Pacific shows similar characteristics. Its surface temperature always exceeds 27°C, but never 31°C. Heightened interest in this observation has been stimulated by questions of global climate change and the exploration of stabilizing climate feedback processes. Efforts to understand the observed weak sensitivity of tropical sea surface temperatures to climate forcing have led to a number of competing ideas about the nature of this apparent thermostat. Although there remains disagreement on the processes that regulate tropical sea surface temperature, most agree that further progress in resolving these differences requires comprehensive field observations of three-dimensional water vapor concentrations, solar and infrared radiative fluxes, surface fluxes of heat and water vapor, and cloud microphysical properties. This document describes the Central Equatorial Pacific Experiment (CEPEX) plan to collect such observations over the central equatorial Pacific Ocean during March of 1993.

  3. High albedos of cirrus in the tropical Pacific warm pool: Microphysical interpretation from CEPEX and from Kwajalein, Marshall Islands

    SciTech Connect

    Heymsfield, A.J.; McFarquhar, G.M.

    1996-09-01

    Recent studies suggest that extensive shields of cirrus clouds over the equatorial Pacific "warm pool" may have a significant influence on the global climate, yet details of the links between cloud microphysical properties, upper-tropospheric latent and radiative heating rates, and climate are poorly understood. This study addresses whether relatively reflective ice crystals with dimensions smaller than about 100 µm near the tops of tropical cirrus clouds, produced by deep convection when the sea surface temperature exceeds 300 K, are principally responsible for the high albedos observed in this region. In situ measurements of ice crystal size distributions and shapes, acquired during the Central Equatorial Pacific Experiment (CEPEX), are used to derive cloud ice water content (IWC), particle cross-sectional area (A), and other microphysical and optical properties from particles with sizes down to 5 µm. These measurements are needed to ascertain the microphysical properties primarily responsible for determining cloud optical depth and albedo in visible wavelengths. Analysis shows that IWC, A, and various measures of particle size all tend to decrease with decreasing temperature and increasing altitude, although considerable scatter is observed. Small ice crystals make up more than half the mass and cause more than half the extinction on average in the upper, colder parts of the cirrus; however, the predominantly large particles in the lower, warmer parts of the cirrus contain at least an order of magnitude greater mass and are dominant in producing the high observed albedos. An examination of the lidar and radiometer data acquired onboard the NASA ER-2, which overflew the Learjet during CEPEX, supports the conclusion that the higher, colder regions of the cirrus typically have volume extinction coefficients that are only about 10% of those in the lower, warmer regions. 36 refs., 25 figs., 4 tabs.
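
    The abstract above derives bulk quantities such as IWC and cross-sectional area by integrating measured size distributions. As a rough, hedged illustration of that kind of moment calculation (not the paper's actual retrieval), the sketch below integrates a made-up binned size distribution under an assumed mass-dimension power law and a geometric-optics extinction estimate; every number and coefficient is a placeholder.

```python
import numpy as np

# Illustrative only: integrate a binned ice-crystal size distribution N(D)
# to estimate ice water content (IWC), cross-sectional area (A), and a
# geometric-optics visible extinction coefficient. The bins, concentrations,
# and the mass-dimension law m = a*D**b are assumptions, not CEPEX values.
D  = np.array([10e-6, 30e-6, 75e-6, 200e-6, 500e-6])   # bin-center maximum dimension [m]
dD = np.array([10e-6, 30e-6, 60e-6, 190e-6, 400e-6])   # bin widths [m]
N  = np.array([5e8, 2e8, 4e7, 5e6, 2e5])               # number density per unit size [m^-4]

a, b = 0.007, 2.2                    # assumed mass-dimension coefficients (SI units)
mass = a * D**b                      # mass per crystal [kg]
area = 0.5 * (np.pi / 4.0) * D**2    # assumed projected area per crystal [m^2]

IWC = np.sum(mass * N * dD)          # ice water content [kg m^-3]
A = np.sum(area * N * dD)            # cross-sectional area per unit volume [m^-1]
beta_ext = 2.0 * A                   # extinction coefficient (geometric optics) [m^-1]

print(f"IWC = {IWC * 1e3:.4f} g m^-3, extinction = {beta_ext * 1e3:.2f} km^-1")
```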

  4. High Albedos of Cirrus in the Tropical Pacific Warm Pool: Microphysical Interpretations from CEPEX and from Kwajalein, Marshall Islands.

    NASA Astrophysics Data System (ADS)

    Heymsfield, Andrew J.; McFarquhar, Greg M.

    1996-09-01

    Recent studies suggest that extensive shields of cirrus clouds over the equatorial Pacific 'warm pool' may have a significant influence on the global climate, yet details of the links between cloud microphysical properties, upper-tropospheric latent and radiative heating rates, and climate are poorly understood. This study addresses whether relatively reflective ice crystals with dimensions smaller than about 100 µm near the tops of tropical cirrus clouds, produced by deep convection when the sea surface temperature exceeds 300 K, are principally responsible for the high albedos observed in this region. In situ measurements of ice crystal size distributions and shapes, acquired during the Central Equatorial Pacific Experiment (CEPEX), are used to derive cloud ice water content (IWC), particle cross-sectional area (A), and other microphysical and optical properties from particles with sizes down to 5 µm. These measurements are needed to ascertain the microphysical properties primarily responsible for determining cloud optical depth and albedo in visible wavelengths and were acquired by a Learjet flying in tropical cirrus and occasionally in convection between altitudes of 8 and 14 km (-20°C to -70°C). Previously unanalyzed microphysical measurements in the vicinity of Kwajalein, Marshall Islands, acquired in the mid-1970s from a WB57F aircraft between altitudes of 5 and 17 km, are also used to study the variation in microphysical properties from cirrus base to top, using a combination of constant-altitude penetrations and steep ascents and descents through cloud. Analysis shows that IWC, A, and various measures of particle size all tend to decrease with decreasing temperature and increasing altitude, although considerable scatter is observed. Small ice crystals make up more than half the mass and cause more than half the extinction on average in the upper, colder parts of the cirrus; however, the predominantly large particles in the lower, warmer parts of the cirrus contain at least an order of magnitude greater mass and are dominant in producing the high observed albedos.

  5. SEDS experiment design definition

    NASA Technical Reports Server (NTRS)

    Carroll, Joseph A.; Alexander, Charles M.; Oldson, John C.

    1990-01-01

    The Small Expendable-tether Deployment System (SEDS) program was established to design, build, integrate, fly, and safely deploy and release an expendable tether. A suitable concept for an on-orbit test of SEDS was developed. The following tasks were performed: (1) Define experiment objectives and requirements; (2) Define experiment concepts to reach those objectives; (3) Support NASA in experiment concept selection and definition; (4) Perform analyses and tests of SEDS hardware; (5) Refine the selected SEDS experiment concept; and (6) Support the interactive SEDS system definition process. Results and conclusions are given.

  6. Experiment design and operations

    NASA Technical Reports Server (NTRS)

    Sellers, P. J.; Hall, F. G.; Markham, B. J.; Wang, J. R.; Strebel, D. E.

    1990-01-01

    The objectives, design, and field operations of the First ISLSCP Field Experiment (FIFE) are described. The simultaneous acquisition of satellite, atmosphere, and surface data, and the understanding of the processes governing surface energy and mass exchange and how these are manifested in satellite-resolution radiometric data are identified as the specific objectives of the field-phase experiment. The central issues concerning the design of the field experiment are considered: the size of the site, the duration of the experiment, and the location of the site; it is noted that the Konza Prairie National Reserve was selected as the focus of the study. Field operations in 1987 and 1989 are discussed, and it is pointed out that a data set is available now from a single combined repository to all FIFE investigators, and that scientists can test models and algorithms on scales consistent with satellite observations and with enough supporting data on finer scales.

  7. DEWFALL validation experiment designs

    SciTech Connect

    Lowry, B.; Walsh, B.

    1989-09-30

    Three experiments are proposed as tests to validate the DEWFALL analysis model for the large vessel project. This document is a very brief record of the techniques and test designs that could be used for validation of the model. Processes of the model which require validation include: (1) vaporization and recondensation of the vessel wall material due to energy transfer from the source, (2) melt and refreeze of vessel wall material, and (3) condensation and solidification of the source material. A methodology was developed to analyze the maximum thickness of material melted and vaporized with given experimental configurations and initial energies. DEWFALL reference calculations are included in an appendix to the document. 2 refs., 3 figs., 3 tabs.

  8. Structural Assembly Demonstration Experiment (SADE) experiment design

    NASA Technical Reports Server (NTRS)

    Akin, D. L.; Bowden, M. L.

    1982-01-01

    The Structural Assembly Demonstration Experiment concept is to erect a hybrid deployed/assembled structure as an early space experiment in large space structures technology. The basic objectives can be broken down into three generic areas: (1) by performing assembly tasks both in space and in neutral buoyancy simulation, a mathematical basis will be found for the validity conditions of neutral buoyancy, thus enhancing the utility of water as a medium for simulation of weightlessness; (2) a data base will be established describing the capabilities and limitations of EVA crewmembers, including effects of such things as hardware size and crew restraints; and (3) experience of the M.I.T. Space Systems Lab in neutral buoyancy simulation of large space structures assembly indicates that the assembly procedure may create the largest loads that a structure will experience during its lifetime. Data obtained from the experiment will help establish an accurate loading model to aid designers of future space structures.

  9. Employment of CEPEX enclosures for monitoring toxicity of Hg and Zn on in situ structural and functional characteristics of algal communities of River Ganga in Varanasi, India.

    PubMed

    Rai, L C; Singh, A K; Mallick, N

    1990-10-01

    Effects of Hg and Zn on in situ nitrogen fixation, the autotrophic index, pigment diversity, 14CO2 uptake, and changes in algal community structure of Ganges water have been studied for the first time using CEPEX chambers in an aquatic ecosystem of India. A concentration-dependent decrease in the in situ nitrogenase activity of Ganges water with Hg and Zn has been noticed. No ethylene production was observed at 0.8 microgram/ml of Hg. However, an increase in the autotrophic index was observed in CEPEX enclosures treated with Hg and Zn. The AI value was maximum at 0.8 microgram/ml Hg after an incubation of 15 days. An increase in pigment diversity also followed the pattern of AI with the test metals used. Inhibition of 14CO2 uptake by phytoplankton of Ganges water was maximum at 0.8 microgram/ml Hg (79%) followed by Zn (69%). Carbon fixation showed an increase for 1 hr, after which no appreciable change was noticed. Maximum inhibition of algal number was observed at 0.8 microgram/ml Hg followed by 8.0 micrograms/ml of Zn in the CEPEX chamber. Members of the Chlorophyceae showed more tolerance than the Cyanophyceae and Bacillariophyceae. The filamentous forms were more tolerant to Hg and Zn; in contrast, unicellular forms were more sensitive to Hg. The test of significance (ANOVA) showed that metal-induced variations in pigment diversity, the autotrophic index, and 14CO2 uptake were highly significant (P < 0.001).

  10. Designing experiments through compressed sensing.

    SciTech Connect

    Young, Joseph G.; Ridzal, Denis

    2013-06-01

    In the following paper, we discuss how to design an ensemble of experiments through the use of compressed sensing. Specifically, we show how to conduct a small number of physical experiments and then use compressed sensing to reconstruct a larger set of data. In order to accomplish this, we organize our results into four sections. We begin by extending the theory of compressed sensing to a finite product of Hilbert spaces. Then, we show how these results apply to experiment design. Next, we develop an efficient reconstruction algorithm that allows us to reconstruct experimental data projected onto a finite element basis. Finally, we verify our approach with two computational experiments.
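
    As a much-simplified illustration of the idea of running a few physical experiments and reconstructing a larger data set, the sketch below recovers a sparse coefficient vector from a small number of random linear measurements using orthogonal matching pursuit. The matrix, problem sizes, and sparsity level are invented for illustration; this is not the paper's Hilbert-space formulation or finite element reconstruction algorithm.

```python
import numpy as np

# Sketch: sparse recovery from a few "experiments" via orthogonal matching pursuit.
rng = np.random.default_rng(0)
n_features, n_experiments, sparsity = 100, 25, 4

x_true = np.zeros(n_features)
x_true[rng.choice(n_features, sparsity, replace=False)] = rng.normal(size=sparsity)

Phi = rng.normal(size=(n_experiments, n_features)) / np.sqrt(n_experiments)  # measurement design
y = Phi @ x_true                                                              # observed responses

def omp(Phi, y, k):
    """Greedy orthogonal matching pursuit for k-sparse recovery."""
    residual, support = y.copy(), []
    for _ in range(k):
        support.append(int(np.argmax(np.abs(Phi.T @ residual))))
        coef, *_ = np.linalg.lstsq(Phi[:, support], y, rcond=None)
        residual = y - Phi[:, support] @ coef
    x_hat = np.zeros(Phi.shape[1])
    x_hat[support] = coef
    return x_hat

x_hat = omp(Phi, y, sparsity)
print("max recovery error:", np.max(np.abs(x_hat - x_true)))
```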

  11. Student-Designed Field Experiences

    ERIC Educational Resources Information Center

    Permaul, Jane S.

    1976-01-01

    Opportunities should be made available for students to design their own field experiences with the use of learning contracts. This approach affords the student flexibility, emphasizes initiative and involvement, and aids in the resolution of the problem of school-to-work transition. (Author/JDS)

  12. Designing Effective Undergraduate Research Experiences

    NASA Astrophysics Data System (ADS)

    Severson, S.

    2010-12-01

    I present a model for designing student research internships that is informed by the best practices of the Center for Adaptive Optics (CfAO) Professional Development Program. The dual strands of the CfAO education program include: the preparation of early-career scientists and engineers in effective teaching; and changing the learning experiences of students (e.g., undergraduate interns) through inquiry-based "teaching laboratories." This paper will focus on the carry-over of these ideas into the design of laboratory research internships such as the CfAO Mainland internship program as well as NSF REU (Research Experiences for Undergraduates) and senior-thesis or "capstone" research programs. Key ideas in maximizing student learning outcomes and generating productive research during internships include: defining explicit content, scientific process, and attitudinal goals for the project; assessment of student prior knowledge and experience, then following up with formative assessment throughout the project; setting reasonable goals with timetables and addressing motivation; and giving students ownership of the research by implementing aspects of the inquiry process within the internship.

  13. Menagerie: designing a virtual experience

    NASA Astrophysics Data System (ADS)

    Fisher, Scott S.; Amkraut, Susan; Girard, Michael; Trayle, Mark

    1994-04-01

    This paper describes an ongoing effort to develop one of the first fully immersive virtual environment installations that is inhabited by virtual characters and presences specially designed to respond to and interact with its users. This experience allows a visitor to become visually and aurally immersed in a 3D computer generated environment that is inhabited by many virtual animals. As a user explores the virtual space, he/she encounters several species of computer generated animals, birds, and insects that move about independently, and interactively respond to the user's presence in various ways. The hardware configuration of this system includes a head-coupled, stereoscopic color viewer, and special DSP hardware that provides realistic, 3D localized sound cues linked to characters and events in the virtual space. Also, the virtual environment and characters surrounding the user are generated by a high performance, real-time computer graphics platform. The paper describes the computer programs that model the motion of the animals, the system configuration that supports the experience, and the design issues involved in developing a virtual environment system for public installation.

  14. Role-Based Design: Design Experiences

    ERIC Educational Resources Information Center

    Miller, Charles; Hokanson, Brad; Doering, Aaron; Brandt, Tom

    2010-01-01

    This is the fourth and final installment in a series of articles presenting a new outlook on the methods of instructional design. These articles examine the nature of the process of instructional design and are meant to stimulate discussion about the roles of designers in the fields of instructional design, the learning sciences, and interaction…

  15. Experimenting with Science Facility Design.

    ERIC Educational Resources Information Center

    Butterfield, Eric

    1999-01-01

    Discusses the modern school science facility and how computers and teaching methods are changing their design. Issues include power, lighting, and space requirements; funding for planning; architect assessment; materials requirements for work surfaces; and classroom flexibility. (GR)

  16. Experiment Design and Analysis Guide - Neutronics & Physics

    SciTech Connect

    Misti A Lillo

    2014-06-01

    The purpose of this guide is to provide a consistent, standardized approach to performing neutronics/physics analysis for experiments inserted into the Advanced Test Reactor (ATR). This document provides neutronics/physics analysis guidance to support experiment design and analysis needs for experiments irradiated in the ATR. This guide addresses neutronics/physics analysis in support of experiment design, experiment safety, and experiment program objectives and goals. The intent of this guide is to provide a standardized approach for performing typical neutronics/physics analyses. Deviation from this guide is allowed provided that neutronics/physics analysis details are properly documented in an analysis report.

  17. Nova pulse power design and operational experience

    NASA Astrophysics Data System (ADS)

    Whitham, K.; Larson, D.; Merritt, B.; Christie, D.

    1987-01-01

    Nova is a 100 TW Nd++ solid state laser designed for experiments with laser fusion at Lawrence Livermore National Laboratory (LLNL). The pulsed power for Nova includes a 58 MJ capacitor bank driving 5336 flashlamps with millisecond pulses and subnanosecond high voltages for electro optics. This paper summarizes the pulsed power designs and the operational experience to date.

  18. Spaceflight payload design flight experience G-408

    NASA Technical Reports Server (NTRS)

    Durgin, William W.; Looft, Fred J.; Sacco, Albert, Jr.; Thompson, Robert; Dixon, Anthony G.; Roberti, Dino; Labonte, Robert; Moschini, Larry

    1992-01-01

    Worcester Polytechnic Institute's first payload of spaceflight experiments flew aboard Columbia, STS-40, during June of 1991 and culminated eight years of work by students and faculty. The Get Away Special (GAS) payload was installed on the GAS bridge assembly at the aft end of the cargo bay behind the Spacelab Life Sciences (SLS-1) laboratory. The experiments were turned on by astronaut signal after reaching orbit and then functioned for 72 hours. Environmental and experimental measurements were recorded on three cassette tapes which, together with zeolite crystals grown on orbit, formed the basis of subsequent analyses. The experiments were developed over a number of years by undergraduate students meeting their project requirements for graduation. The experiments included zeolite crystal growth, fluid behavior, and microgravity acceleration measurement in addition to environmental data acquisition. Preparation also included structural design, thermal design, payload integration, and experiment control. All of the experiments functioned on orbit and the payload system performed within design estimates.

  19. Design and experiment of silicon PCR chips

    NASA Astrophysics Data System (ADS)

    Cui, Zheng; Zhao, Zhan; Xia, Shanhong

    2002-04-01

    There is considerable interest in integrating the polymerase chain reaction (PCR) on a microchip. Although a microchip can achieve much faster heating and cooling rates, the delicacy of its structure makes the PCR experiment difficult, and cracks often occur, particularly for the thin-membrane type of PCR chip. A design study and experiments on silicon PCR chips are presented with the aim of identifying the problems encountered in experiments and finding an optimum chip structure. Heating characteristics of four different heater designs have been compared, as have PCR chambers with a fixed frame and with a suspended frame. The thermal stress analysis has shown that the structure and heater design can make a significant difference in heating characteristics and in reducing the failure of PCR chips. Different solutions to reduce PCR chip failure have been proposed. One of the solutions was implemented in the experiment, confirming the design study results. Silicon PCR chips have been fabricated. Thermal cycling and initial DNA amplification results are presented.

  20. Super Spool: An Experiment in Powerplant Design

    ERIC Educational Resources Information Center

    Kesler, Ronald

    1974-01-01

    Discusses the use of rubberbands, an empty wooden thread spool, two wooden matches, a wax washer, and a small nail to conduct an experiment or demonstration in powerplant design. Detailed procedures and suggested activities are included. (CC)

  1. Design for Engaging Experience and Social Interaction

    ERIC Educational Resources Information Center

    Harteveld, Casper; ten Thij, Eleonore; Copier, Marinka

    2011-01-01

    One of the goals of game designers is to design for an engaging experience and for social interaction. The question is how. We know that games can be engaging and allow for social interaction, but how do we achieve this or even improve on it? This article provides an overview of several scientific approaches that deal with this question. It…

  2. Adaptive design of visual perception experiments

    NASA Astrophysics Data System (ADS)

    O'Connor, John D.; Hixson, Jonathan; Thomas, James M., Jr.; Peterson, Matthew S.; Parasuraman, Raja

    2010-04-01

    Meticulous experimental design may not always prevent confounds from affecting experimental data acquired during visual perception experiments. Although experimental controls reduce the potential effects of foreseen sources of interference, interaction, or noise, they are not always adequate for preventing the confounding effects of unforeseen forces. Visual perception experimentation is vulnerable to unforeseen confounds because of the nature of the associated cognitive processes involved in the decision task. Some confounds are beyond the control of experimentation, such as what a participant does immediately prior to experimental participation, or the participant's attitude or emotional state. Other confounds may occur through ignorance of practical control methods on the part of the experiment's designer. The authors conducted experiments related to experimental fatigue and initially achieved significant results that were, upon re-examination, attributable to a lack of adequate controls. Re-examination of the original results and the processes and events that led to them yielded a second experimental design with more experimental controls and significantly different results. The authors propose that designers of visual perception experiments can benefit from planning to use a test-fix-test or adaptive experimental design cycle, so that unforeseen confounds in the initial design can be remedied.

  3. Analysis of designed experiments with complex aliasing

    SciTech Connect

    Hamada, M.; Wu, C.F.J.

    1992-07-01

    Traditionally, Plackett-Burman (PB) designs have been used in screening experiments for identifying important main effects. The PB designs whose run sizes are not a power of two have been criticized for their complex aliasing patterns, which according to conventional wisdom give confusing results. This paper goes beyond the traditional approach by proposing an analysis strategy that entertains interactions in addition to main effects. Based on the precepts of effect sparsity and effect heredity, the proposed procedure exploits the designs' complex aliasing patterns, thereby turning their 'liability' into an advantage. Demonstration of the procedure on three real experiments shows the potential for extracting important information available in the data that has, until now, been missed. Some limitations are discussed, and extensions to overcome them are given. The proposed procedure also applies to more general mixed level designs that have become increasingly popular. 16 refs.
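
    For readers unfamiliar with the complex aliasing being exploited, a minimal sketch follows: it builds the standard 12-run Plackett-Burman design from cyclic shifts of its generator row and shows that a main-effect column is only partially correlated (magnitude 1/3) with a two-factor interaction of other columns. This illustrates the general property the abstract refers to; it is not the authors' analysis procedure.

```python
import numpy as np

# Build the 12-run Plackett-Burman design: 11 cyclic shifts of the standard
# generator row plus a final row of -1s, giving 12 runs x 11 factor columns.
gen = np.array([+1, +1, -1, +1, +1, +1, -1, -1, -1, +1, -1])
X = np.vstack([np.roll(gen, i) for i in range(11)] + [-np.ones(11, dtype=int)])

# Partial aliasing: correlate factor A's main-effect column with the B*C
# interaction column. In PB(12) the magnitude is 1/3 rather than 0 or 1.
interaction_BC = X[:, 1] * X[:, 2]
r = (X[:, 0] @ interaction_BC) / len(X)
print("partial aliasing of A with BC:", r)
```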

  4. Design and Simulation of Hybridization Experiments

    1995-11-28

    DB EXP DESIGN is a suite of three UNIX shell-like programs: DWC, which computes oligomer composition of DNA texts using directed acyclic word data structures; DWO, which simulates hybridization experiments; and DMI, which calculates the information content of individual probes, their mutual information content, and their joint information content through estimation of Markov trees.

  5. Designing a Curriculum for Clinical Experiences

    ERIC Educational Resources Information Center

    Henning, John E.; Erb, Dorothy J.; Randles, Halle Schoener; Fults, Nanette; Webb, Kathy

    2016-01-01

    The purpose of this article is to describe a collaborative effort among five teacher preparation programs to create a conceptual tool designed to put clinical experiences at the center of our programs. The authors refer to the resulting product as a clinical curriculum. The clinical curriculum describes a developmental sequence of clinical…

  6. Power and replication - designing powerful experiments

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Biological research is expensive, with monetary costs to granting agencies and emotional costs to researchers. As such, biological researchers should always follow the mantra, "failure is not an option." A failed experimental design is generally manifested as an experiment with high P-values, leavin...
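
    The record is truncated, but its point is a standard prospective power calculation. As a concrete example of the kind of design check it argues for, the sketch below solves for the replicates per group needed in a two-sample t test using statsmodels; the effect size, significance level, and target power are illustrative choices, not values from the record.

```python
# Sketch of a prospective power calculation (illustrative numbers only).
from statsmodels.stats.power import TTestIndPower

n_per_group = TTestIndPower().solve_power(
    effect_size=0.5,        # assumed Cohen's d
    alpha=0.05,             # significance level
    power=0.80,             # desired power
    alternative="two-sided",
)
print(f"replicates needed per group: {n_per_group:.1f}")   # roughly 64
```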

  7. THE QUEST TO DESIGN BETTER EXPERIMENTS.

    PubMed

    Perkel, Jeffrey

    2016-01-01

    First suggested by R.A. Fisher in the 1930s, design of experiments (DOE) strategies are finding their way into modern life science research. Jeffrey Perkel looks at how DOE is impacting everything from genome editing to mass spectrometry. PMID:27401668

  8. Conceptual design for spacelab pool boiling experiment

    NASA Technical Reports Server (NTRS)

    Lienhard, J. H.; Peck, R. E.

    1978-01-01

    A pool boiling heat transfer experiment to be incorporated with a larger two-phase flow experiment on Spacelab was designed to confirm (or alter) the results of earth-normal gravity experiments which indicate that the hydrodynamic peak and minimum pool boiling heat fluxes vanish at very low gravity. Twelve small sealed test cells containing water, methanol or Freon 113 and cylindrical heaters of various sizes are to be built. Each cell will be subjected to one or more 45 sec tests in which the surface heat flux on the heaters is increased linearly until the surface temperature reaches a limiting value of 500 C. The entire boiling process will be photographed in slow-motion. Boiling curves will be constructed from thermocouple and electric input data, for comparison with the motion picture records. The conduct of the experiment will require no more than a few hours of operator time.

  9. Advanced ISDN satellite designs and experiments

    NASA Technical Reports Server (NTRS)

    Pepin, Gerard R.

    1992-01-01

    The research performed by GTE Government Systems and the University of Colorado in support of the NASA Satellite Communications Applications Research (SCAR) Program is summarized. Two levels of research were undertaken. The first dealt with providing Interim Service Integrated Services Digital Network (ISDN) Satellite (ISIS) capabilities that accented basic rate ISDN with a ground control similar to that of the Advanced Communications Technology Satellite (ACTS). The ISIS Network Model development represents satellite systems like the ACTS orbiting switch. The ultimate aim is to move these ACTS ground control functions on-board the next generation of ISDN communications satellite to provide full-service ISDN satellite (FSIS) capabilities. The technical and operational parameters for the advanced ISDN communications satellite design are obtainable from the simulation of ISIS and FSIS engineering software models of the major subsystems of the ISDN communications satellite architecture. Discrete event simulation experiments would generate data for analysis against NASA SCAR performance measures and the data obtained from the ISDN satellite terminal adapter hardware (ISTA) experiments, also developed in the program. The Basic and Option 1 phases of the program are also described and include the following: literature search, traffic model, network model, scenario specifications, performance measures definitions, hardware experiment design, hardware experiment development, simulator design, and simulator development.

  10. Statistical considerations in design of spacelab experiments

    NASA Technical Reports Server (NTRS)

    Robinson, J.

    1978-01-01

    After making an analysis of experimental error sources, statistical models were developed for the design and analysis of potential Space Shuttle experiments. Guidelines for statistical significance and/or confidence limits of expected results were also included. The models were then tested out on the following proposed Space Shuttle biomedical experiments: (1) bone density by computer tomography; (2) basal metabolism; and (3) total body water. Analysis of those results and therefore of the models proved inconclusive due to the lack of previous research data and statistical values. However, the models were seen as possible guides to making some predictions and decisions.

  11. Simulator design for advanced ISDN satellite design and experiments

    NASA Technical Reports Server (NTRS)

    Pepin, Gerald R.

    1992-01-01

    This simulation design task completion report documents the simulation techniques associated with the network models of both the Interim Service ISDN (integrated services digital network) Satellite (ISIS) and the Full Service ISDN Satellite (FSIS) architectures. The ISIS network model design represents satellite systems like the Advanced Communication Technology Satellite (ACTS) orbiting switch. The FSIS architecture, the ultimate aim of this element of the Satellite Communications Applications Research (SCAR) program, moves all control and switching functions on-board the next generation ISDN communication satellite. The technical and operational parameters for the advanced ISDN communications satellite design will be obtained from the simulation of ISIS and FSIS engineering software models for their major subsystems. Discrete events simulation experiments will be performed with these models using various traffic scenarios, design parameters and operational procedures. The data from these simulations will be used to determine the engineering parameters for the advanced ISDN communications satellite.

  12. Simulation of integrated beam experiment designs

    NASA Astrophysics Data System (ADS)

    Grote, D. P.; Sharp, W. M.

    2005-05-01

    Simulations of designs of an Integrated Beam Experiment (IBX) class accelerator have been carried out. These simulations are an important tool for validating such designs. Issues such as envelope mismatch and emittance growth can be examined in a self-consistent manner including the details of injection, accelerator transitions, long-term transport, and longitudinal compression. The simulations are three-dimensional and time dependent, and begin at the source. They continue up through the end of the acceleration region, where the data are passed on to a separate simulation of the drift compression. Results are presented.

  13. Simulation of integrated beam experiment designs

    SciTech Connect

    Grote, D.P.; Sharp, W.M.

    2004-06-11

    Simulations of designs of an Integrated Beam Experiment (IBX) class accelerator have been carried out. These simulations are an important tool for validating such designs. Issues such as envelope mismatch and emittance growth can be examined in a self-consistent manner, including the details of injection, accelerator transitions, long-term transport, and longitudinal compression. The simulations are three-dimensional and time-dependent, and begin at the source. They continue up through the end of the acceleration region, at which point the data are passed on to a separate simulation of the drift compression. Results are presented.

  14. CMM Interim Check Design of Experiments (U)

    SciTech Connect

    Montano, Joshua Daniel

    2015-07-29

    Coordinate Measuring Machines (CMMs) are widely used in industry, throughout the Nuclear Weapons Complex, and at Los Alamos National Laboratory (LANL) to verify part conformance to design definition. Calibration cycles for CMMs at LANL are predominantly one year in length and include a weekly interim check to reduce risk. The CMM interim check makes use of Renishaw's Machine Checking Gauge, an off-the-shelf product that simulates a large sphere within a CMM's measurement volume and allows for error estimation. As verification of the interim check process, a design of experiments investigation was proposed to test two key factors (location and inspector). The results from the two-factor factorial experiment showed that location influenced results more than the inspector or the interaction.
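
    A minimal sketch of the two-factor factorial analysis described (location by inspector) is shown below using statsmodels; the measurement values are fabricated solely to show the mechanics of the analysis and are not LANL data.

```python
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

# Fabricated, balanced two-factor data set: 3 replicates per cell.
df = pd.DataFrame({
    "location":  ["front", "front", "back", "back"] * 3,
    "inspector": ["A", "B", "A", "B"] * 3,
    "error_um":  [1.2, 1.3, 2.1, 2.0, 1.1, 1.4, 2.2, 1.9, 1.3, 1.2, 2.0, 2.1],
})

# Two-way ANOVA with interaction: tests location, inspector, and location:inspector.
model = ols("error_um ~ C(location) * C(inspector)", data=df).fit()
print(sm.stats.anova_lm(model, typ=2))
```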

  15. Design Calculations For NIF Convergent Ablator Experiments

    SciTech Connect

    Olson, R E; Hicks, D G; Meezan, N B; Callahan, D A; Landen, O L; Jones, O S; Langer, S H; Kline, J L; Wilson, D C; Rinderknecht, H; Zylstra, A; Petrasso, R D

    2011-10-25

    The NIF convergent ablation tuning effort is underway. In the early experiments, we have discovered that the design code simulations over-predict the capsule implosion velocity and shock flash ρr, but under-predict the hohlraum x-ray flux measurements. The apparent inconsistency between the x-ray flux and radiography data implies that there are important unexplained aspects of the hohlraum and/or capsule behavior.

  16. Design calculations for NIF convergent ablator experiments

    NASA Astrophysics Data System (ADS)

    Olson, R. E.; Hicks, D. G.; Meezan, N. B.; Callahan, D. A.; Landen, O. L.; Jones, O. S.; Langer, S. H.; Kline, J. L.; Wilson, D. C.; Rinderknecht, H.; Zylstra, A.; Petrasso, R. D.

    2013-11-01

    The NIF convergent ablation tuning effort is underway. In the early experiments, we have discovered that the design code simulations over-predict the capsule implosion velocity and shock flash ρr, but under-predict the hohlraum x-ray flux measurements. The apparent inconsistency between the x-ray flux and radiography data implies that there are important unexplained aspects of the hohlraum and/or capsule behavior.

  17. GCFR plenum shield design: exit shield experiment

    SciTech Connect

    Muckenthaler, F.J.; Hull, J.L.; Manning, J.J.

    1981-05-01

    This report describes the integral flux, energy spectra, and dose rate measurements made for the Exit Shield Experiment at the Oak Ridge National Laboratory Tower Shielding Facility as part of the Gas Cooled Fast Breeder Reactor program. The source was the same mockup of fuel pins used in the previous Grid Plate Shield Experiment. Two mockups of the upper axial shield were studied: one with seven subassemblies prototypic of that portion of the Exit Shield without a control rod, and another that was representative of the shield region with a control rod. The experiment was performed to provide verification of: the shield design methods, the shield effectiveness of a prototypic mockup, the analytical ability to calculate streaming effects in the presence of a control rod, and the source term bias factors for the upper plenum.

  18. Design Calculations for NIF Convergent Ablator Experiments

    NASA Astrophysics Data System (ADS)

    Olson, R. E.; Callahan, D. A.; Hicks, D. G.; Landen, O. L.; Langer, S. H.; Meezan, N. B.; Spears, B. K.; Widmann, K.; Kline, J. L.; Wilson, D. C.; Petrasso, R. D.; Leeper, R. J.

    2010-11-01

    Design calculations for NIF convergent ablator experiments will be described. The convergent ablator experiments measure the implosion trajectory, velocity, and ablation rate of an x-ray driven capsule and are an important component of the U. S. National Ignition Campaign at NIF. The design calculations are post-processed to provide simulations of the key diagnostics -- 1) Dante measurements of hohlraum x-ray flux and spectrum, 2) streaked radiographs of the imploding ablator shell, 3) wedge range filter measurements of D-He3 proton output spectra, and 4) GXD measurements of the imploded core. The simulated diagnostics will be compared to the experimental measurements to provide an assessment of the accuracy of the design code predictions of hohlraum radiation temperature, capsule ablation rate, implosion velocity, shock flash areal density, and x-ray bang time. Post-shot versions of the design calculations are used to enhance the understanding of the experimental measurements and will assist in choosing parameters for subsequent shots and the path towards optimal ignition capsule tuning. *SNL, LLNL, and LANL are operated under US DOE contracts DE-AC04-94AL85000, DE-AC52-07NA27344, and DE-AC04-94AL85000.

  19. Design calculations for NIF convergent ablator experiments.

    SciTech Connect

    Callahan, Debra; Leeper, Ramon Joe; Spears, B. K.; Zylstra, A.; Seguin, F.; Landen, Otto L.; Petrasso, R. D.; Rinderknecht, H.; Kline, J. L.; Frenje, J.; Wilson, D. C.; Langer, S. H.; Widmann, K.; Meezan, Nathan B.; Hicks, Damien G.; Olson, Richard Edward

    2010-11-01

    Design calculations for NIF convergent ablator experiments will be described. The convergent ablator experiments measure the implosion trajectory, velocity, and ablation rate of an x-ray driven capsule and are an important component of the U. S. National Ignition Campaign at NIF. The design calculations are post-processed to provide simulations of the key diagnostics: (1) Dante measurements of hohlraum x-ray flux and spectrum, (2) streaked radiographs of the imploding ablator shell, (3) wedge range filter measurements of D-He3 proton output spectra, and (4) GXD measurements of the imploded core. The simulated diagnostics will be compared to the experimental measurements to provide an assessment of the accuracy of the design code predictions of hohlraum radiation temperature, capsule ablation rate, implosion velocity, shock flash areal density, and x-ray bang time. Post-shot versions of the design calculations are used to enhance the understanding of the experimental measurements and will assist in choosing parameters for subsequent shots and the path towards optimal ignition capsule tuning.

  20. Design of a water electrolysis flight experiment

    NASA Technical Reports Server (NTRS)

    Lee, M. Gene; Grigger, David J.; Thompson, C. Dean; Cusick, Robert J.

    1993-01-01

    Supply of oxygen (O2) and hydrogen (H2) by electrolyzing water in space will play an important role in meeting the National Aeronautics and Space Administration's (NASA's) needs and goals for future space missions. Both O2 and H2 are envisioned to be used in a variety of processes including crew life support, spacecraft propulsion, extravehicular activity, electrical power generation/storage, as well as in scientific experiments and manufacturing processes. The Electrolysis Performance Improvement Concept Study (EPICS) flight experiment described herein is sponsored by NASA Headquarters as a part of the In-Space Technology Experiment Program (IN-STEP). The objective of the EPICS is to further contribute to the improvement of SFE technology, specifically by demonstrating and validating the SFE electromechanical process in microgravity as well as investigating performance improvements projected to be possible in a microgravity environment. This paper defines the experiment objective and presents the results of the preliminary design of the EPICS. The experiment will include testing three subscale self-contained SFE units: one containing baseline components, and two units having variations in key component materials. Tests will be conducted at varying current and thermal conditions.

  1. The POLARBEAR Experiment: Design and Characterization

    NASA Astrophysics Data System (ADS)

    Kermish, Zigmund David

    We present the design and characterization of the POLARBEAR experiment. POLARBEAR is a millimeter-wave polarimeter that will measure the Cosmic Microwave Background (CMB) polarization. It was designed to have both the sensitivity and angular resolution to detect the expected B-mode polarization due to gravitational lensing at small angular scales while still enabling a search for the degree-scale B-mode polarization caused by inflationary gravitational waves. The instrument utilizes the Huan Tran Telescope (HTT), a 2.5-meter primary mirror telescope, coupled to a unique focal plane of 1,274 antenna-coupled transition-edge sensor (TES) detectors to achieve unprecedented sensitivity from the angular scales of the experiment's 4 arcminute beam to several degrees. This dissertation focuses on the design, integration and characterization of the cryogenic receiver for the POLARBEAR instrument. The receiver cools the ~20 cm focal plane to 0.25 Kelvin, with detector readout provided by a digital frequency-multiplexed SQUID system. The POLARBEAR receiver was successfully deployed on the HTT for an engineering run in the Eastern Sierras of California and is currently deployed on Cerro Toco in the Atacama Desert of Chile. We present results from lab tests done to characterize the instrument, results from the engineering run, and preliminary results from Chile.

  2. Principles of designing interpretable optogenetic behavior experiments

    PubMed Central

    Allen, Brian D.; Singer, Annabelle C.

    2015-01-01

    Over the last decade, there has been much excitement about the use of optogenetic tools to test whether specific cells, regions, and projection pathways are necessary or sufficient for initiating, sustaining, or altering behavior. However, the use of such tools can result in side effects that can complicate experimental design or interpretation. The presence of optogenetic proteins in cells, the effects of heat and light, and the activity of specific ions conducted by optogenetic proteins can result in cellular side effects. At the network level, activation or silencing of defined neural populations can alter the physiology of local or distant circuits, sometimes in undesired ways. We discuss how, in order to design interpretable behavioral experiments using optogenetics, one can understand, and control for, these potential confounds. PMID:25787711

  3. Investment casting design of experiment. Final report

    SciTech Connect

    Owens, R.

    1997-10-01

    Specific steps in the investment casting process were analyzed in a designed experiment. The casting's sensitivity to changes in these process steps was experimentally determined. Dimensional and radiographic inspection were used to judge the sensitivity of the casting. Thirty-six castings of different pedigrees were poured and measured. Some of the dimensional inspection was conducted during the processing. It was confirmed that wax fixturing, number of gates, gate location, pour and mold temperature, pour speed, and cooling profile all affected the radiographic quality of the casting. Gate and runner assembly techniques, number of gates, and mold temperature affect the dimensional quality of the casting.

  4. Preliminary design of the redundant software experiment

    NASA Technical Reports Server (NTRS)

    Campbell, Roy; Deimel, Lionel; Eckhardt, Dave, Jr.; Kelly, John; Knight, John; Lauterbach, Linda; Lee, Larry; Mcallister, Dave; Mchugh, John

    1985-01-01

    The goal of the present experiment is to characterize the fault distributions of highly reliable software replicates, constructed using techniques and environments which are similar to those used in contemporary industrial software facilities. The fault distributions and their effect on the reliability of fault tolerant configurations of the software will be determined through extensive life testing of the replicates against carefully constructed, randomly generated test data. Each detected error will be carefully analyzed to provide insight into its nature and cause. A direct objective is to develop techniques for reducing the intensity of coincident errors, thus increasing the reliability gain which can be achieved with fault tolerance. Data on the reliability gains realized and the cost of the fault tolerant configurations can be used to design a companion experiment to determine the cost effectiveness of the fault tolerant strategy. Finally, the data and analysis produced by this experiment will be valuable to the software engineering community as a whole because they will provide useful insight into the nature and cause of hard-to-find, subtle faults which escape standard software engineering validation techniques and thus persist far into the software life cycle.

  5. Design of Barker coded multiple pulse experiments

    NASA Astrophysics Data System (ADS)

    Zamlutti, C. J.

    1980-12-01

    The combination of Barker-coded pulse compression techniques with the multiple pulse technique in incoherent scatter studies of the lower ionosphere is discussed. The basic principles of both techniques are reviewed, and the combined technique is presented as consisting of the coding of each pulse of the multiple pulse scheme by a b-baud Barker code. Design considerations for measurements from which the autocorrelation function can be computed are examined for the case of the radar at Arecibo, and possible experiments for observations of sporadic E layers, man-made ionospheric modification and the nighttime E layer are proposed. It is noted that the advantage of the Barker-coded multiple pulse technique consists in the possibility of obtaining simultaneously height and frequency resolution, which is important in the observation of thin layers with narrow frequency spectra.
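
    For context on why Barker coding pairs well with pulse compression, the short sketch below computes the aperiodic autocorrelation of the commonly used 13-baud Barker code, whose 13:1 peak-to-sidelobe ratio is what makes the combined technique attractive; the example is generic and not tied to the Arecibo experiment design.

```python
import numpy as np

# 13-baud Barker code and its aperiodic autocorrelation:
# peak of 13 at zero lag, all sidelobes of magnitude <= 1.
barker13 = np.array([1, 1, 1, 1, 1, -1, -1, 1, 1, -1, 1, -1, 1])
acf = np.correlate(barker13, barker13, mode="full")
sidelobes = np.delete(acf, len(barker13) - 1)          # drop the zero-lag peak
print("peak:", acf.max(), " max sidelobe:", np.abs(sidelobes).max())
```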

  6. Interim Service ISDN Satellite (ISIS) hardware experiment design for advanced ISDN satellite design and experiments

    NASA Technical Reports Server (NTRS)

    Pepin, Gerard R.

    1992-01-01

    The Interim Service Integrated Services Digital Network (ISDN) Satellite (ISIS) Hardware Experiment Design for Advanced Satellite Designs describes the design of the ISDN Satellite Terminal Adapter (ISTA), which is capable of translating ISDN protocol traffic into time division multiple access (TDMA) signals for use by a communications satellite. The ISTA connects the Type 1 Network Termination (NT1) via the U-interface on the line termination side of the CPE to the V.35 interface for satellite uplink. With a simple switch setting, the same ISTA converts data in the opposite direction, from the V.35 interface back to the U-interface.

  7. Design Point for a Spheromak Compression Experiment

    NASA Astrophysics Data System (ADS)

    Woodruff, Simon; Romero-Talamas, Carlos A.; O'Bryan, John; Stuber, James; Darpa Spheromak Team

    2015-11-01

    Two principal issues for the spheromak concept remain to be addressed experimentally: formation efficiency and confinement scaling. We are therefore developing a design point for a spheromak experiment that will be heated by adiabatic compression, utilizing the CORSICA and NIMROD codes as well as analytic modeling, with target parameters R_initial = 0.3 m, R_final = 0.1 m, T_initial = 0.2 keV, T_final = 1.8 keV, n_initial = 10^19 m^-3, and n_final = 10^21 m^-3, with a radial convergence of C = 3. This low convergence differentiates the concept from MTF with C = 10 or more, since the plasma will be held in equilibrium throughout compression. We present results from CORSICA showing the placement of coils and passive structure to ensure stability during compression, and the design of the capacitor bank needed to both form the target plasma and compress it. We specify target parameters for the compression in terms of plasma beta, formation efficiency, and energy confinement. Work performed under DARPA grant N66001-14-1-4044.

  8. Design and construction experience from photovoltaic installations

    NASA Astrophysics Data System (ADS)

    Kauffman, W. R.; Lambarski, T. J.; Forrester, D. L.

    Lessons learned by one company during the design, construction, and operation of several prototype commercial photovoltaic (PV) power plants are reported. Intensive on-site quality control management was found necessary for controlling costs during installation of a 1 MW array for utility power production. A need to use modular components to offset high labor costs was noted. Installation of a 240 kW array of parabolic trough concentrators at a community school revealed the significance of stable support structures for tracking systems, which require reliable electronics for operation. A roof-mounted 50 kW array showed the potential hazards from wind damage and the greater economic benefits that accrue from maximizing output rather than cutting corners on the system costs. An airbase 40 kW array disclosed heat dissipation and moisture intrusion problems in ancillary equipment, and a residential 4.68 kW roof-mounted system demonstrated the increases in installation efficiency available with larger crews and greater experience.

  9. Distributed Design and Analysis of Computer Experiments

    2002-11-11

    DDACE is a C++ object-oriented software library for the design and analysis of computer experiments. DDACE can be used to generate samples from a variety of sampling techniques. These samples may be used as input to an application code. DDACE also contains statistical tools such as response surface models and correlation coefficients to analyze input/output relationships between variables in an application code. DDACE can generate input values for uncertain variables within a user's application. For example, a user might like to vary a temperature variable as well as some material variables in a series of simulations. Through the series of simulations the user might be looking for optimal settings of parameters based on some user criteria. Or the user may be interested in the sensitivity to input variability shown by an output variable. In either case, the user may provide information about the suspected ranges and distributions of a set of input variables, along with a sampling scheme, and DDACE will generate input points based on these specifications. The input values generated by DDACE and the one or more outputs computed through the user's application code can be analyzed with a variety of statistical methods. This can lead to a wealth of information about the relationships between the variables in the problem. While statistical and mathematical packages may be employed to carry out the analysis on the input/output relationships, DDACE also contains some tools for analyzing the simulation data. DDACE incorporates a software package called MARS (Multivariate Adaptive Regression Splines), developed by Jerome Friedman. MARS is used for generating a spline surface fit of the data. With MARS, a model simplification may be calculated using the input and corresponding output values for the user's application problem. The MARS grid data may be used for generating 3-dimensional response surface plots of the simulation data. DDACE also contains an implementation
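
    As a generic illustration of the stratified sampling step described above (a sketch, not DDACE's implementation), the code below draws a small Latin hypercube sample and maps one column onto a hypothetical temperature range.

```python
import numpy as np

def latin_hypercube(n_samples, n_vars, seed=None):
    """Simple Latin hypercube: each stratum in every dimension is used exactly once."""
    rng = np.random.default_rng(seed)
    strata = np.column_stack([rng.permutation(n_samples) for _ in range(n_vars)])
    return (strata + rng.random((n_samples, n_vars))) / n_samples

u = latin_hypercube(10, 3, seed=0)      # 10 runs over 3 uncertain inputs in [0, 1)
temperature = 300.0 + 100.0 * u[:, 0]   # map one input to a hypothetical 300-400 K range
print(u.shape, temperature.round(1))
```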

  10. Designing Effective Research Experiences for Undergraduates (Invited)

    NASA Astrophysics Data System (ADS)

    Jones Whyte, P.; Dalbotten, D. M.

    2009-12-01

    The undergraduate research experience has been recognized as a valuable component of preparation for graduate study. As competition for spaces in graduate schools becomes keener, students benefit from a formal introduction to the life of a scholar. Over the last twenty years a model of preparing students for graduate study with the research experience as the base has been refined at the University of Minnesota. The experience includes an assignment with a faculty member and a series of seminars that support the experience. The seminars cover topics including academic writing, scholarly literature review, writing of the abstract, research subject protection protocols, GRE test preparation, opportunities to interact with graduate students, preparing the graduate school application, and preparation of a poster to demonstrate the results of the research. The next phase of the process is to determine the role of the undergraduate research experience in the graduate school admission process.

  11. An Architectural Experience for Interface Design

    ERIC Educational Resources Information Center

    Gong, Susan P.

    2016-01-01

    The problem of human-computer interface design was brought to the foreground with the emergence of the personal computer, the increasing complexity of electronic systems, and the need to accommodate the human operator in these systems. With each new technological generation discovering the interface design problems of its own technologies, initial…

  12. Statistical design of a uranium corrosion experiment

    SciTech Connect

    Wendelberger, Joanne R; Moore, Leslie M

    2009-01-01

    This work supports an experiment being conducted by Roland Schulze and Mary Ann Hill to study hydride formation, one of the most important forms of corrosion observed in uranium and uranium alloys. The study goals and objectives are described in Schulze and Hill (2008), and the work described here focuses on development of a statistical experiment plan being used for the study. The results of this study will contribute to the development of a uranium hydriding model for use in lifetime prediction models. A parametric study of the effect of hydrogen pressure, gap size and abrasion on hydride initiation and growth is being planned where results can be analyzed statistically to determine individual effects as well as multi-variable interactions. Input to ESC from this experiment will include expected hydride nucleation, size, distribution, and volume on various uranium surface situations (geometry) as a function of age. This study will also address the effect of hydrogen threshold pressure on corrosion nucleation and the effect of oxide abrasion/breach on hydriding processes. Statistical experiment plans provide for efficient collection of data that aids in understanding the impact of specific experiment factors on initiation and growth of corrosion. The experiment planning methods used here also allow for robust data collection accommodating other sources of variation such as the density of inclusions, assumed to vary linearly along the cast rods from which samples are obtained.
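
    A minimal sketch of laying out the factorial parametric study over the three factors named above (hydrogen pressure, gap size, abrasion) follows; the factor levels and the randomized run order are placeholders, not the statistical experiment plan actually developed for the study.

```python
from itertools import product
import pandas as pd

# Placeholder levels for the three factors; 3 x 2 x 2 = 12 runs.
factors = {
    "H2_pressure": ["low", "mid", "high"],
    "gap_size":    ["small", "large"],
    "abrasion":    ["intact", "abraded"],
}

runs = pd.DataFrame(list(product(*factors.values())), columns=list(factors))
runs = runs.sample(frac=1, random_state=0).reset_index(drop=True)  # randomize run order
print(len(runs), "runs")
print(runs.head())
```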

  13. EXPERIENCES IN DESIGNING SOLVENTS FOR THE ENVIRONMENT

    EPA Science Inventory

    To meet the great need of replacing many harmful solvents commonly used by industry and the public with environmentally benign substitute solvents, the PARIS II solvent design software has been developed. Although the difficulty of successfully finding replacements increases with...

  14. Hybrid Rocket Experiment Station for Capstone Design

    NASA Technical Reports Server (NTRS)

    Conley, Edgar; Hull, Bethanne J.

    2012-01-01

    Portable hybrid rocket motors and test stands can be seen in many papers, but none have reported on a design or instrumentation at such a small scale. This hybrid rocket and test stand are designed to be small and portable (suitcase size). This basic apparatus will be used for demonstrations in rocket propulsion. The design had to include all of the hardware needed to operate the hybrid rocket unit (with the exception of the external oxygen tank). The design of this project includes making the correlation between the rocket's thrust and its size, selecting the appropriate transducers (physical size, resolution, range, and cost), compatibility with a laptop analog card, ease of setup, and portability.

  15. Thermal Characterization of Functionally Graded Materials: Design of Optimum Experiments

    NASA Technical Reports Server (NTRS)

    Cole, Kevin D.

    2003-01-01

    This paper is a study of optimal experiment design applied to the measurement of thermal properties in functionally graded materials. As a first step, a material with linearly-varying thermal properties is analyzed, and several different transient experimental designs are discussed. An optimality criterion, based on sensitivity coefficients, is used to identify the best experimental design. Simulated experimental results are analyzed to verify that the identified best experiment design has the smallest errors in the estimated parameters. This procedure is general and can be applied to design of experiments for a variety of materials.
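
    The optimality criterion based on sensitivity coefficients is not spelled out in the abstract; a common choice in this setting is D-optimality, i.e. maximizing det(SᵀS), where S collects the sensitivities of the measured temperature to the unknown thermal properties at the chosen measurement times. The sketch below compares two hypothetical measurement schedules under a toy sensitivity model; both the model and the schedules are illustrative assumptions, not taken from the paper:

```python
import numpy as np

def d_criterion(sensitivities: np.ndarray) -> float:
    """D-optimality score: det(S^T S), where row i of S holds the sensitivity
    coefficients dT/dp_j at measurement time t_i. Larger is better."""
    return float(np.linalg.det(sensitivities.T @ sensitivities))

# Toy sensitivity model for two parameters (illustrative, not the paper's model):
# dT/dp1 ~ 1 - exp(-t), dT/dp2 ~ t * exp(-t)
def sensitivity_matrix(times: np.ndarray) -> np.ndarray:
    return np.column_stack([1.0 - np.exp(-times), times * np.exp(-times)])

design_a = np.array([0.5, 1.0, 1.5, 2.0])   # early, closely spaced measurements
design_b = np.array([0.5, 1.5, 3.0, 6.0])   # measurements spread over the transient

print("design A:", d_criterion(sensitivity_matrix(design_a)))
print("design B:", d_criterion(sensitivity_matrix(design_b)))
```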

  16. Designing a successful HMD-based experience

    NASA Technical Reports Server (NTRS)

    Pierce, J. S.; Pausch, R.; Sturgill, C. B.; Christiansen, K. D.; Kaiser, M. K. (Principal Investigator)

    1999-01-01

    For entertainment applications, a successful virtual experience based on a head-mounted display (HMD) needs to overcome some or all of the following problems: entering a virtual world is a jarring experience, people do not naturally turn their heads or talk to each other while wearing an HMD, putting on the equipment is hard, and people do not realize when the experience is over. In the Electric Garden at SIGGRAPH 97, we presented the Mad Hatter's Tea Party, a shared virtual environment experienced by more than 1,500 SIGGRAPH attendees. We addressed these HMD-related problems with a combination of back story, see-through HMDs, virtual characters, continuity of real and virtual objects, and the layout of the physical and virtual environments.

  17. Hypersonic drone vehicle design: A multidisciplinary experience

    NASA Technical Reports Server (NTRS)

    1988-01-01

    UCLA's Advanced Aeronautic Design group focused their efforts on design problems of an unmanned hypersonic vehicle. It is felt that a scaled hypersonic drone is necessary to bridge the gap between present theory on hypersonics and the future reality of the National Aerospace Plane (NASP) for two reasons: (1) to fulfill a need for experimental data in the hypersonic regime, and (2) to provide a testbed for the scramjet engine which is to be the primary mode of propulsion for the NASP. The group concentrated on three areas of great concern to NASP design: propulsion, thermal management, and flight systems. Problem solving in these areas was directed toward design of the drone with the idea that the same design techniques could be applied to the NASP. A 70 deg swept double-delta wing configuration, developed in the 70's at NASA Langley, was chosen as the aerodynamic and geometric model for the drone. This vehicle would be air launched from a B-1 at Mach 0.8 and 48,000 feet, rocket boosted by two internal engines to Mach 10 and 100,000 feet, and allowed to cruise under power of the scramjet engine until burnout. It would then return to base for an unpowered landing. Preliminary energy calculations based on flight requirements give the drone a gross launch weight of 134,000 pounds and an overall length of 85 feet.

  18. STELLA Experiment: Design and Model Predictions

    SciTech Connect

    Kimura, W. D.; Babzien, M.; Ben-Zvi, I.; Campbell, L. P.; Cline, D. B.; Fiorito, R. B.; Gallardo, J. C.; Gottschalk, S. C.; He, P.; Kusche, K. P.; Liu, Y.; Pantell, R. H.; Pogorelsky, I. V.; Quimby, D. C.; Robinson, K. E.; Rule, D. W.; Sandweiss, J.; Skaritka, J.; van Steenbergen, A.; Steinhauer, L. C.; Yakimenko, V.

    1998-07-05

    The STaged ELectron Laser Acceleration (STELLA) experiment will be one of the first to examine the critical issue of staging the laser acceleration process. The BNL inverse free electron laser (IFEL) will serve as a prebuncher to generate ~1 µm long microbunches. These microbunches will be accelerated by an inverse Cerenkov acceleration (ICA) stage. A comprehensive model of the STELLA experiment is described. This model includes the IFEL prebunching, drift and focusing of the microbunches into the ICA stage, and their subsequent acceleration. The model predictions will be presented, including the results of a system error study to determine the sensitivity to uncertainties in various system parameters.

  19. Batch sequential designs for computer experiments

    SciTech Connect

    Moore, Leslie M; Williams, Brian J; Loeppky, Jason L

    2009-01-01

    Computer models simulating a physical process are used in many areas of science. Due to the complex nature of these codes it is often necessary to approximate the code, which is typically done using a Gaussian process. In many situations the number of code runs available to build the Gaussian process approximation is limited. When the initial design is small or the underlying response surface is complicated, this can lead to poor approximations of the code output. In order to improve the fit of the model, sequential design strategies must be employed. In this paper we introduce two simple distance-based metrics that can be used to augment an initial design in a batch sequential manner. In addition we propose a sequential updating strategy for an orthogonal array-based Latin hypercube sample. We show via various real and simulated examples that the distance metrics and the extension of the orthogonal array-based Latin hypercubes work well in practice.
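
    The paper's two distance metrics are not reproduced here. As a minimal sketch of the general idea of distance-based batch augmentation, the code below greedily adds points from a random candidate pool using a maximin-distance rule; the candidate pool, batch size, and greedy rule are assumptions for illustration, not the authors' algorithm:

```python
import numpy as np
from scipy.spatial.distance import cdist

def augment_maximin(design: np.ndarray, candidates: np.ndarray, batch: int) -> np.ndarray:
    """Greedily pick `batch` candidate points, each maximizing its minimum
    distance to the current design (initial design plus points added so far)."""
    current = design.copy()
    chosen = []
    for _ in range(batch):
        dmin = cdist(candidates, current).min(axis=1)   # nearest-design distance
        best = int(np.argmax(dmin))
        chosen.append(candidates[best])
        current = np.vstack([current, candidates[best]])
    return np.array(chosen)

rng = np.random.default_rng(0)
initial = rng.random((10, 2))          # small initial design in [0, 1]^2
pool = rng.random((2000, 2))           # dense candidate set
new_runs = augment_maximin(initial, pool, batch=5)
print(new_runs)
```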

  20. Experience in Constructions: Designing a Wall

    ERIC Educational Resources Information Center

    Glenn, Barbara

    1978-01-01

    Viewing a contemporary artist's works to learn about the artist and his/her personal vision is one thing for elementary school students. Adding an actual experience of doing makes the exposure much more alive. Students at Snail Lake Elementary School in Moundsview, Minnesota, viewed a Louise Nevelson exhibit and were inspired to new uses of art…

  1. Modal identification experiment design for large space structures

    NASA Technical Reports Server (NTRS)

    Kim, Hyoung M.; Doiron, Harold H.

    1991-01-01

    This paper describes an on-orbit modal identification experiment design for large space structures. Space Station Freedom (SSF) systems design definition and structural dynamic models were used as representative large space structures for optimizing the experiment design. Important structural modes of the study models were selected to provide a guide for experiment design and used to assess the design performance. A pulsed random excitation technique using propulsion jets was developed to identify closely spaced modes. A measurement location selection approach was developed to estimate accurate mode shapes as well as frequencies and damping factors. The data acquisition system and operational scenarios were designed to have minimal impacts on the SSF. A comprehensive simulation was conducted to assess the overall performance of the experiment design.

  2. Hypersonic drone design: A multidisciplinary experience

    NASA Technical Reports Server (NTRS)

    1988-01-01

    Efforts were focused on design problems of an unmanned hypersonic vehicle. It is felt that a scaled hypersonic drone is necessary to bridge the gap between present theory on hypersonics and the future reality of the National Aerospace Plane (NASP) for two reasons: to fulfill a need for experimental data in the hypersonic regime, and to provide a testbed for the scramjet engine which is to be the primary mode of propulsion for the NASP. Three areas of great concern to NASP design were examined: propulsion, thermal management, and flight systems. Problem solving in these areas was directed towards design of the drone with the idea that the same design techniques could be applied to the NASP. A seventy degree swept double delta wing configuration, developed in the 70's at NASA Langley, was chosen as the aerodynamic and geometric model for the drone. This vehicle would be air-launched from a B-1 at Mach 0.8 and 48,000 feet, rocket boosted by two internal engines to Mach 10 and 100,000 feet, and allowed to cruise under power of the scramjet engine until burnout. It would then return to base for an unpowered landing. Preliminary energy calculations based upon the flight requirements give the drone a gross launch weight of 134,000 lb. and an overall length of 85 feet.

  3. Principles of Designing Interpretable Optogenetic Behavior Experiments

    ERIC Educational Resources Information Center

    Allen, Brian D.; Singer, Annabelle C.; Boyden, Edward S.

    2015-01-01

    Over the last decade, there has been much excitement about the use of optogenetic tools to test whether specific cells, regions, and projection pathways are necessary or sufficient for initiating, sustaining, or altering behavior. However, the use of such tools can result in side effects that can complicate experimental design or interpretation.…

  4. Monte Carlo Experiments: Design and Implementation.

    ERIC Educational Resources Information Center

    Paxton, Pamela; Curran, Patrick J.; Bollen, Kenneth A.; Kirby, Jim; Chen, Feinian

    2001-01-01

    Illustrates the design and planning of Monte Carlo simulations, presenting nine steps in planning and performing a Monte Carlo analysis from developing a theoretically derived question of interest through summarizing the results. Uses a Monte Carlo simulation to illustrate many of the relevant points. (SLD)
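
    As a minimal, generic illustration of a Monte Carlo design (not the article's nine-step protocol), the sketch below treats sample size as the design factor and summarizes estimator bias and confidence-interval coverage across replications:

```python
import numpy as np

rng = np.random.default_rng(42)
true_mean, sigma, reps = 5.0, 2.0, 5000

for n in (10, 30, 100):                            # sample size as the design factor
    means = np.empty(reps)
    covered = 0
    for r in range(reps):
        x = rng.normal(true_mean, sigma, n)
        means[r] = x.mean()
        half = 1.96 * x.std(ddof=1) / np.sqrt(n)   # normal-theory 95% CI half-width
        covered += (x.mean() - half <= true_mean <= x.mean() + half)
    print(f"n={n:3d}  bias={means.mean() - true_mean:+.4f}  "
          f"coverage={covered / reps:.3f}")
```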

  5. Learning Experience as Transaction: A Framework for Instructional Design

    ERIC Educational Resources Information Center

    Parrish, Patrick E.; Wilson, Brent G.; Dunlap, Joanna C.

    2011-01-01

    This article presents a framework for understanding learning experience as an object for instructional design--as an object for design as well as research and understanding. Compared to traditional behavioral objectives or discrete cognitive skills, the object of experience is more holistic, requiring simultaneous attention to cognition, behavior,…

  6. A micrometeoroid deceleration and capture experiment: Conceptual experiment design description

    NASA Technical Reports Server (NTRS)

    Wolfe, J. H.; Ballard, R. W.; Carle, G. C.; Bunch, T. E.

    1986-01-01

    The preliminary conceptual design for a cosmic dust collector is described. For the case of low Earth orbit (LEO), dust particles enter the collector through the collimator at a few volts negative potential due to charging in the ionosphere, at a velocity of 1 to 50 km/sec. The particles then pass through an electron stream and are charged to about 1 kV negative (regardless of incoming polarity). The 1 kV negatively charged particle then passes through three sensing grids coupled to charge-sensitive preamps (CSP). The comparison of the two pulses provided by S(1) and S(2) is utilized by the microprocessor to determine the charge, q, on the particle (pulse amplitude) and its velocity, v (by time of flight). The third sensing grid, S(3), is kept at about 20 kV negative so that the dust particle will now be decelerated in passing from S(2) (zero potential) to S(3). S(3) is capacitively coupled to its CSP, and the pulse from S(3) is utilized by the microprocessor to determine the particle's energy, E, and therefore its mass, m (again by time of flight), by comparison with the pulses from S(1) and S(2). The microprocessor can now precisely program the high-voltage switching network for the proper timing in the grounding of the successive deceleration grids. As determined by the microprocessor, each successive deceleration grid is grounded just after the dust particle passes, thus reducing the particle's energy by the amount q*100 kV at each stage. The microprocessor also determines at which stage the particle will fall below a certain critical energy, where all remaining grids remain unswitched so that the particle will drift to the collector. The collector is kept at about 100 V positive and is covered with gold foil to eliminate contamination; it is removable for subsequent return to Earth for detailed analysis.

  7. Experiment to measure vacuum birefringence: Conceptual design

    NASA Astrophysics Data System (ADS)

    Mueller, Guido; Tanner, David; Doebrich, Babette; Poeld, Jan; Lindner, Axel; Willke, Benno

    2016-03-01

    Vacuum birefringence is another lingering challenge which will soon become accessible to experimental verification. The effect was first calculated by Euler and Heisenberg in 1936 and is these days described as a one-loop correction to the differential index of refraction between light which is polarized parallel and perpendicular to an external magnetic field. Our plan is to realize (and slightly modify) an idea which was originally published by Hall, Ye, and Ma using advanced LIGO and LISA technology and the infrastructure of the ALPS light-shining-through-walls experiment following the ALPS IIc science run. This work is supported by the Deutsche Forschungsgemeinschaft and the Heising-Simons Foundation.

  8. Student designed experiments to learn fluids

    NASA Astrophysics Data System (ADS)

    Stern, Catalina

    2013-11-01

    Lasers and high speed cameras are a wonderful tool to visualize the very complex behavior of fluids, and to help students grasp concepts like turbulence, surface tension and vorticity. In this work we present experiments done by physics students in their senior year at the School of Science of the National University of Mexico as a final project in the continuum mechanics course. Every semester, the students make an oral presentation of their work and videos and images are kept in the web page ``Pasión por los Fluidos''. I acknowledge support from the Physics Department of Facultad de Ciencias, Universidad Nacional Autónoma de México.

  9. Proper battery system design for GAS experiments

    NASA Technical Reports Server (NTRS)

    Calogero, Stephen A.

    1992-01-01

    The purpose of this paper is to help the GAS experimenter to design a battery system that meets mission success requirements while at the same time reducing the hazards associated with the battery system. Lead-acid, silver-zinc and alkaline chemistry batteries will be discussed. Lithium batteries will be briefly discussed with emphasis on back-up power supply capabilities. The hazards associated with different battery configurations will be discussed along with the controls necessary to make the battery system two-fault tolerant.

  10. Design for a High Energy Density Kelvin-Helmholtz Experiment

    SciTech Connect

    Hurricane, O A

    2007-10-29

    While many high energy density physics (HEDP) Rayleigh-Taylor and Richtmyer-Meshkov instability experiments have been fielded as part of basic HEDP and astrophysics studies, not one HEDP Kelvin-Helmholtz (KH) experiment has been successfully performed. Herein, a design for a novel HEDP x-ray driven KH experiment is presented along with supporting radiation-hydrodynamic simulation and theory.

  11. Hypersonic Wind Tunnel Calibration Using the Modern Design of Experiments

    NASA Technical Reports Server (NTRS)

    Rhode, Matthew N.; DeLoach, Richard

    2005-01-01

    A calibration of a hypersonic wind tunnel has been conducted using formal experiment design techniques and response surface modeling. Data from a compact, highly efficient experiment was used to create a regression model of the pitot pressure as a function of the facility operating conditions as well as the longitudinal location within the test section. The new calibration utilized far fewer design points than prior experiments, but covered a wider range of the facility's operating envelope while revealing interactions between factors not captured in previous calibrations. A series of points chosen randomly within the design space was used to verify the accuracy of the response model. The development of the experiment design is discussed along with tactics used in the execution of the experiment to defend against systematic variation in the results. Trends in the data are illustrated, and comparisons are made to earlier findings.

  12. Design and Analysis of a Static Aeroelastic Experiment

    NASA Astrophysics Data System (ADS)

    Hou, Ying-Yu; Yuan, Kai-Hua; Lv, Ji-Nan; Liu, Zi-Qiang

    2016-06-01

    Static aeroelastic experiments are very common in the United States and Russia. The objective of a static aeroelastic experiment is to investigate the deformation and loads of an elastic structure in a flow field. Generally speaking, a prerequisite of such an experiment is that the stiffness distribution of the structure is known. This paper describes a method for designing experimental models in the case where the stiffness distribution and boundary condition of a real aircraft are both uncertain. The stiffness distribution of the structure can be calculated via finite element modeling and simulation, and F141 steel and rigid foam are used to make the elastic model. The design and manufacturing process of static aeroelastic models is presented; a model was designed to simulate the stiffness of the designed wings, and a set of experiments was designed to check the results. The test results show that the experimental method can effectively complete the design work of an elastic model. This paper introduces the whole process of the static aeroelastic experiment and analyzes the experimental results, developing a static aeroelasticity experiment technique and establishing an experiment model targeting the swept wing of a large-aspect-ratio aircraft.

  13. Divertor design for the Tokamak Physics Experiment

    SciTech Connect

    Hill, D.N.; Braams, B.; Brooks, J.N.

    1994-05-01

    In this paper we discuss the present divertor design for the planned TPX tokamak, which will explore the physics and technology of steady-state (1000-s pulses) heat and particle removal in high-confinement (2-4× L-mode), high-beta (β_N ≥ 3) divertor plasmas sustained by non-inductive current drive. The TPX device will operate in the double-null divertor configuration, with actively cooled graphite targets forming a deep (0.5 m) slot at the outer strike point. The peak heat flux on the highly tilted (74° from normal), re-entrant targets (tilted to recycle ions back toward the separatrix) will be in the range of 4-6 MW/m² with 18 MW of neutral beam and RF heating power. The combination of active pumping and gas puffing (deuterium plus impurities), along with higher heating power (45 MW maximum), will allow testing of radiative divertor concepts at ITER-like power densities.

  14. Recent experience with design and manufacture of cine lenses

    NASA Astrophysics Data System (ADS)

    Thorpe, Michael D.; Dalzell, Kristen E.

    2015-09-01

    Modern cine lenses require a high degree of aberration correction over a large and ever expanding image size. At low to medium volume production levels, these highly corrected designs also require a workable tolerance set and compensation scheme for successful manufacture. In this paper we discuss the design and manufacture of cine lenses with reference to current designs both internal and in the patent literature and some experience in design, tolerancing and manufacturing these lenses in medium volume production.

  15. User experience interaction design for digital educational games

    NASA Astrophysics Data System (ADS)

    Yuan, Jiugen; Zhang, Wenting; Xing, Ruonan

    2014-04-01

    Bringing the elements of games into education is one of the newest teaching concepts in the field of educational technology: healthy games are used to stimulate and sustain the learner's motivation, improve learning efficiency, and let learners gain experience by playing. This article first introduces the concepts of Digital Game and User Experience and clarifies the essence of digital games, then constructs a framework for user experience interaction design for digital educational games and offers a design approach for the development of related products, in the hope that digital games will continue to bring innovative experiences.

  16. Design Considerations for Large Mass Ultra-Low Background Experiments

    SciTech Connect

    Aguayo Navarrete, Estanislao; Reid, Douglas J.; Fast, James E.; Orrell, John L.

    2011-07-01

    The objective of this document is to present the designers of the next generation of large-mass, ultra-low background experiments with lessons learned and design strategies from previous experimental work. Design issues, divided by topic into mechanical, thermal and electrical requirements, are addressed. Large-mass low-background experiments have been recognized by the scientific community as appropriate tools to aid in the refinement of the standard model. The design of these experiments is very costly and a rigorous engineering review is required for their success. The extreme conditions that the components of the experiment must withstand (heavy shielding, vacuum/pressure and temperature gradients), in combination with unprecedented noise levels, necessitate engineering guidance to support quality construction and safe operating conditions. Physical properties and analytical results of typical construction materials are presented. Design considerations for achieving ultra-low-noise data acquisition systems are addressed. Five large-mass, low-background conceptual designs for the one-tonne scale germanium experiment are proposed and analyzed. The result is a series of recommendations for the engineering of future experiments and for the Majorana simulation task group to evaluate the different design approaches.

  17. Conceptual design of liquid droplet radiator shuttle-attached experiment

    NASA Technical Reports Server (NTRS)

    Pfeiffer, Shlomo L.

    1989-01-01

    The conceptual design of a shuttle-attached liquid droplet radiator (LDR) experiment is discussed. The LDR is an advanced, lightweight heat rejection concept that can be used to reject heat from future high-powered space platforms. In the LDR concept, submillimeter-sized droplets are generated, pass through space, radiate heat before they are collected, and recirculated back to the heat source. The LDR experiment is designed to be attached to the shuttle longeron and integrated into the shuttle bay using standard shuttle/experiment interfaces. Overall power, weight, and data requirements of the experiment are detailed. The conceptual designs of the droplet radiator, droplet collector, and the optical diagnostic system are discussed in detail. Shuttle integration and safety design issues are also discussed.

  18. Electrical design of payload G-534: The Pool Boiling Experiment

    NASA Technical Reports Server (NTRS)

    Francisco, David R.

    1992-01-01

    Payload G-534, the Pool Boiling Experiment (PBE), is a Get Away Special that is scheduled to fly on the shuttle in 1992. This paper will give a brief overall description of the experiment with the main discussion being the electrical design with a detailed description of the power system and interface to the GAS electronics. The batteries used and their interface to the experiment Power Control Unit (PCU) and GAS electronics will be examined. The design philosophy for the PCU will be discussed in detail. The criteria for selection of fuses, relays, power semiconductors and other electrical components along with grounding and shielding policy for the entire experiment will be presented. The intent of this paper is to discuss the use of military tested parts and basic design guidelines to build a quality experiment for minimal additional cost.

  19. Selecting the best design for nonstandard toxicology experiments.

    PubMed

    Webb, Jennifer M; Smucker, Byran J; Bailer, A John

    2014-10-01

    Although many experiments in environmental toxicology use standard statistical experimental designs, there are situations that arise where no such standard design is natural or applicable because of logistical constraints. For example, the layout of a laboratory may suggest that each shelf serve as a block, with the number of experimental units per shelf either greater than or less than the number of treatments in a way that precludes the use of a typical block design. In such cases, an effective and powerful alternative is to employ optimal experimental design principles, a strategy that produces designs with precise statistical estimates. Here, a D-optimal design was generated for an experiment in environmental toxicology that has 2 factors, 16 treatments, and constraints similar to those described above. After initial consideration of a randomized complete block design and an intuitive cyclic design, it was decided to compare a D-optimal design and a slightly more complicated version of the cyclic design. Simulations were conducted generating random responses under a variety of scenarios that reflect conditions motivated by a similar toxicology study, and the designs were evaluated via D-efficiency as well as by a power analysis. The cyclic design performed well compared to the D-optimal design.
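
    A common way to compare such designs is relative D-efficiency, here taken as (|XᵀX| for the test design divided by |XᵀX| for the reference design)^(1/p) for a model with p columns; this definition and the toy model matrices below are assumptions for illustration, not the designs from the study:

```python
import numpy as np

def d_efficiency(x_test: np.ndarray, x_ref: np.ndarray) -> float:
    """Relative D-efficiency of x_test vs. x_ref for the same linear model:
    (|X_test' X_test| / |X_ref' X_ref|) ** (1 / p), p = number of model columns."""
    p = x_ref.shape[1]
    return (np.linalg.det(x_test.T @ x_test) /
            np.linalg.det(x_ref.T @ x_ref)) ** (1.0 / p)

# Toy main-effects model matrices (intercept + two two-level factors), 8 runs each
rng = np.random.default_rng(3)
x_balanced = np.column_stack([np.ones(8), np.tile([-1, 1], 4), np.repeat([-1, 1], 4)])
x_alternative = np.column_stack([np.ones(8), rng.choice([-1, 1], 8), rng.choice([-1, 1], 8)])

print("D-efficiency of alternative vs. balanced design:",
      round(d_efficiency(x_alternative, x_balanced), 3))
```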

  20. Assay optimization: a statistical design of experiments approach.

    PubMed

    Altekar, Maneesha; Homon, Carol A; Kashem, Mohammed A; Mason, Steven W; Nelson, Richard M; Patnaude, Lori A; Yingling, Jeffrey; Taylor, Paul B

    2007-03-01

    With the transition from manual to robotic HTS in the last several years, assay optimization has become a significant bottleneck. Recent advances in robotic liquid handling have made it feasible to reduce assay optimization timelines with the application of statistically designed experiments. When implemented, they can efficiently optimize assays by rapidly identifying significant factors, complex interactions, and nonlinear responses. This article focuses on the use of statistically designed experiments in assay optimization.

  1. Adaptive multibeam phased array design for a Spacelab experiment

    NASA Technical Reports Server (NTRS)

    Noji, T. T.; Fass, S.; Fuoco, A. M.; Wang, C. D.

    1977-01-01

    The parametric tradeoff analyses and design for an Adaptive Multibeam Phased Array (AMPA) for a Spacelab experiment are described. This AMPA Experiment System was designed with particular emphasis on maximizing channel capacity and minimizing implementation and cost impacts for future austere maritime and aeronautical users, operating with a low-gain hemispherical-coverage antenna element, low effective radiated power, and low antenna gain-to-system noise temperature ratio.

  2. 2011 AERA Presidential Address: Designing Resilient Ecologies--Social Design Experiments and a New Social Imagination

    ERIC Educational Resources Information Center

    Gutiérrez, Kris D.

    2016-01-01

    This article is about designing for educational possibilities--designs that in their inception, social organization, and implementation squarely address issues of cultural diversity, social inequality, and robust learning. I discuss an approach to design-based research, social design experiments, that privileges a social scientific inquiry…

  3. Teaching Optimal Design of Experiments Using a Spreadsheet

    ERIC Educational Resources Information Center

    Goos, Peter; Leemans, Herlinde

    2004-01-01

    In this paper, we present an interactive teaching approach to introduce the concept of optimal design of experiments to students. Our approach is based on the use of spreadsheets. One advantage of this approach is that no complex mathematical theory is needed nor that any design construction algorithm has to be discussed at the introductory stage.…

  4. Overview of the ICF 1000 MJ experiment chamber design

    SciTech Connect

    Slaughter, D.

    1988-09-23

    A conceptual design of an experiment chamber for a high gain ICF facility (1000 MJ) is being developed. Performance goals have been established. Several design approaches are being evaluated through computer simulation, engineering analysis, and experimental testing of candidate first wall components. 10 refs., 3 figs.

  5. Recreation Programming: Designing Leisure Experiences. 5th Edition

    ERIC Educational Resources Information Center

    Rossman, J. Robert; Schlatter, Barbara Elwood

    2008-01-01

    Originally published in 1989, "Recreation Programming: Designing Leisure Experiences" has become a standard in the park, recreation, and leisure service industry. This title has been used to teach beginning and experienced programmers in over 100 higher-education institutions, both nationally and internationally. Designed in a user-friendly…

  6. Building a Framework for Engineering Design Experiences in High School

    ERIC Educational Resources Information Center

    Denson, Cameron D.; Lammi, Matthew

    2014-01-01

    In this article, Denson and Lammi put forth a conceptual framework that will help promote the successful infusion of engineering design experiences into high school settings. When considering a conceptual framework of engineering design in high school settings, it is important to consider the complex issue at hand. For the purposes of this…

  7. Factorial Design: An Eight Factor Experiment Using Paper Helicopters

    NASA Technical Reports Server (NTRS)

    Kozma, Michael

    1996-01-01

    The goal of this paper is to present the analysis of the multi-factor experiment (factorial design) conducted in EG490, Junior Design at Loyola College in Maryland. The discussion of this paper concludes the experimental analysis and ties the individual class papers together.

  8. On Design Experiment Teaching in Engineering Quality Cultivation

    ERIC Educational Resources Information Center

    Chen, Xiao

    2008-01-01

    A design experiment refers to an experiment designed and conducted by students independently, and it is an important method to cultivate students' comprehensive quality. According to the development and requirements of experimental teaching, this article carries out a study and analysis on the purpose, significance, denotation, connotation and…

  9. Thinking about "Design Thinking": A Study of Teacher Experiences

    ERIC Educational Resources Information Center

    Retna, Kala S.

    2016-01-01

    Schools are continuously looking for new ways of enhancing student learning to equip students with skills that would enable them to cope with twenty-first century demands. One promising approach focuses on design thinking. This study examines teachers' perceptions, experiences and challenges faced in adopting design thinking. There is a lack of…

  10. Resolution of an Orbital Issue: A Designed Experiment

    NASA Technical Reports Server (NTRS)

    Huddleston, Lisa L.

    2011-01-01

    Design of Experiments (DOE) is a systematic approach to investigation of a system or process. A series of structured tests are designed in which planned changes are made to the input variables of a process or system. The effects of these changes on a pre-defined output are then assessed. DOE is a formal method of maximizing information gained while minimizing resources required.

  11. Design of spatial experiments: Model fitting and prediction

    SciTech Connect

    Fedorov, V.V.

    1996-03-01

    The main objective of the paper is to describe and develop model oriented methods and algorithms for the design of spatial experiments. Unlike many other publications in this area, the approach proposed here is essentially based on the ideas of convex design theory.

  12. Thermal design, analysis and testing of the Halogen Occultation Experiment

    NASA Technical Reports Server (NTRS)

    Foss, Richard A.; Smith, Dewey M.

    1987-01-01

    This paper briefly introduces the Halogen Occultation Experiment (HALOE) and describes the thermal requirements in some detail. The thermal design of the HALOE is described, together with the design process and the analytical techniques used to arrive at this design. The flight hardware has undergone environmental testing in a thermal vacuum chamber to validate the thermal design. The HALOE is a unique problem in thermal control due to its variable solar loading, its extremely sensitive optical components and the high degree of pointing accuracy required. This paper describes the flight hardware, the design process and its verification.

  13. Aeroassist Flight Experiment Reaction Control System preliminary design

    NASA Technical Reports Server (NTRS)

    Langford, G. K.; Price, D. E.; Gallaher, M. W.

    1990-01-01

    The Aeroassist Flight Experiment (AFE) has several different flight modes associated with its mission. The effect the spacecraft attitude control system (ACS) has on the Reaction Control System (RCS) requirements for all the flight modes is discussed. The ACS requirements and their consequences on the design of the RCS is then discussed in detail. Special problems in the RCS design unique to the AFE mission and the design solutions to these problems are presented.

  14. Design of experiments in Biomedical Signal Processing Course.

    PubMed

    Li, Ling; Li, Bin

    2008-01-01

    Biomedical Signal Processing is one of the most important subjects in Biomedical Engineering. Its contents include the theory of digital signal processing, knowledge of the different biomedical signals, physiology, and the ability to program computers. Based on our past five years of teaching experience, we found that, in order to let students master each signal processing algorithm well, the design of the experiment that follows the algorithm is very important. In this paper we present the ideas and aims behind the design of these experiments. The results showed that our methods facilitated the study of abstract signal processing algorithms and made biomedical signals easier to understand.

  15. Advances in Experiment Design for High Performance Aircraft

    NASA Technical Reports Server (NTRS)

    Morelli, Eugene A.

    1998-01-01

    A general overview and summary of recent advances in experiment design for high performance aircraft is presented, along with results from flight tests. General theoretical background is included, with some discussion of various approaches to maneuver design. Flight test examples from the F-18 High Alpha Research Vehicle (HARV) are used to illustrate applications of the theory. Input forms are compared using Cramer-Rao bounds for the standard errors of estimated model parameters. Directions for future research in experiment design for high performance aircraft are identified.
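
    The Cramer-Rao bound referred to above can be computed, for additive white Gaussian measurement noise, as the square root of the diagonal of the inverse Fisher information matrix built from the output sensitivities. The sketch below applies that textbook formulation to two made-up sensitivity (Jacobian) matrices standing in for different input forms; it is not NASA's implementation:

```python
import numpy as np

def cramer_rao_bounds(jacobian: np.ndarray, noise_std: float) -> np.ndarray:
    """Lower bounds on parameter standard errors for additive white Gaussian
    measurement noise: sqrt(diag(F^-1)), with F = J'J / sigma^2."""
    fisher = jacobian.T @ jacobian / noise_std**2
    return np.sqrt(np.diag(np.linalg.inv(fisher)))

# Toy output sensitivities for two maneuver (input) forms, three parameters each
rng = np.random.default_rng(7)
t = np.linspace(0.0, 10.0, 200)
j_structured = np.column_stack([np.sin(t), np.cos(t), t * np.exp(-0.3 * t)])
j_random = rng.normal(size=(200, 3))

for name, jac in [("structured input", j_structured), ("random input", j_random)]:
    print(name, cramer_rao_bounds(jac, noise_std=0.1))
```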

  16. Functional design to support CDTI/DABS flight experiments

    NASA Technical Reports Server (NTRS)

    Goka, T.

    1982-01-01

    The objectives of this project are to: (1) provide a generalized functional design of CDTI avionics using the FAA-developed DABS/ATARS ground system as the 'traffic sensor', (2) specify software modifications and/or additions to the existing DABS/ATARS ground system to support CDTI avionics, (3) assess the existing avionics of a NASA research aircraft in terms of CDTI applications, and (4) apply the generalized functional design to provide research flight experiment capability. DABS Data Link Formats are first specified for CDTI flight experiments. The set of CDTI/DABS Format specifications becomes a vehicle to coordinate the CDTI avionics and ground system designs, and hence, to develop overall system requirements. The report is the first iteration of a system design and development effort to support eventual CDTI flight test experiments.

  17. Vestibular Function Research (VFR) experiment. Phase B: Design definition study

    NASA Technical Reports Server (NTRS)

    1978-01-01

    The Vestibular Functions Research (VFR) Experiment was established to investigate the neurosensory and related physiological processes believed to be associated with the space flight nausea syndrome and to develop logical means for its prediction, prevention and treatment. The VFR Project consists of ground and spaceflight experimentation using frogs as specimens. The phase B Preliminary Design Study provided for the preliminary design of the experiment hardware, preparation of performance and hardware specification and a Phase C/D development plan, establishment of STS (Space Transportation System) interfaces and mission operations, and the study of a variety of hardware, experiment and mission options. The study consist of three major tasks: (1) mission mode trade-off; (2) conceptual design; and (3) preliminary design.

  18. Design of Orion Soil Impact Study using the Modern Design of Experiments

    NASA Technical Reports Server (NTRS)

    DeLoach, Richard

    2010-01-01

    Two conventional One Factor At a Time (OFAT) test matrices under consideration for an Orion Landing System subscale soil impact study are reviewed. Certain weaknesses in the designs, systemic to OFAT experiment designs generally, are identified. An alternative test matrix is proposed that is based in the Modern Design of Experiments (MDOE), which achieves certain synergies by combining the original two test matrices into one. The attendant resource savings are quantified and the impact on uncertainty is discussed.

  19. Design of Experiments, Model Calibration and Data Assimilation

    SciTech Connect

    Williams, Brian J.

    2014-07-30

    This presentation provides an overview of emulation, calibration and experiment design for computer experiments. Emulation refers to building a statistical surrogate from a carefully selected and limited set of model runs to predict unsampled outputs. The standard kriging approach to emulation of complex computer models is presented. Calibration refers to the process of probabilistically constraining uncertain physics/engineering model inputs to be consistent with observed experimental data. An initial probability distribution for these parameters is updated using the experimental information. Markov chain Monte Carlo (MCMC) algorithms are often used to sample the calibrated parameter distribution. Several MCMC algorithms commonly employed in practice are presented, along with a popular diagnostic for evaluating chain behavior. Space-filling approaches to experiment design for selecting model runs to build effective emulators are discussed, including Latin Hypercube Design and extensions based on orthogonal array skeleton designs and imposed symmetry requirements. Optimization criteria that further enforce space-filling, possibly in projections of the input space, are mentioned. Designs to screen for important input variations are summarized and used for variable selection in a nuclear fuels performance application. This is followed by illustration of sequential experiment design strategies for optimization, global prediction, and rare event inference.
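
    A basic random Latin hypercube design of the kind mentioned above can be generated with a few lines of NumPy; the sketch below is a generic implementation (without the orthogonal-array or symmetry extensions discussed in the presentation):

```python
import numpy as np

def latin_hypercube(n_runs: int, n_dims: int, rng=None) -> np.ndarray:
    """n_runs points in [0, 1]^n_dims with exactly one point per equal-width
    stratum in every dimension (a basic random Latin hypercube design)."""
    rng = np.random.default_rng(rng)
    samples = np.empty((n_runs, n_dims))
    for d in range(n_dims):
        strata = (np.arange(n_runs) + rng.random(n_runs)) / n_runs  # jitter within stratum
        samples[:, d] = rng.permutation(strata)                     # shuffle stratum order
    return samples

design = latin_hypercube(20, 3, rng=0)
print(design.round(3))
```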

  20. Structural Design Feasibility Study for the Global Climate Experiment

    SciTech Connect

    Lewin,K.F.; Nagy, J.

    2008-12-01

    Neon, Inc. is proposing to establish a Global Change Experiment (GCE) Facility to increase our understanding of how ecological systems differ in their vulnerability to changes in climate and other relevant global change drivers, as well as provide the mechanistic basis for forecasting ecological change in the future. The experimental design was initially envisioned to consist of two complementary components: (A) a multi-factor experiment manipulating CO2, temperature and water availability and (B) a water balance experiment. As the design analysis and cost estimates progressed, it became clear that (1) the technical difficulties of obtaining tight temperature control and maintaining elevated atmospheric carbon dioxide levels within an enclosure were greater than had been expected and (2) the envisioned study would not fit into the expected budget envelope if this was done in a partially or completely enclosed structure. After discussions between NEON management, the GCE science team, and Keith Lewin, NEON, Inc. requested Keith Lewin to expand the scope of this design study to include open-field exposure systems. In order to develop the GCE design to the point where it can be presented within a proposal for funding, a feasibility study of climate manipulation structures must be conducted to determine design approaches and rough cost estimates, and to identify advantages and disadvantages of these approaches including the associated experimental artifacts. NEON, Inc. requested this design study in order to develop concepts for the climate manipulation structures to support the NEON Global Climate Experiment. This study summarizes the design concepts considered for constructing and operating the GCE Facility and their associated construction, maintenance and operations costs. Comparisons and comments about experimental artifacts, construction challenges and operational uncertainties are provided to assist in selecting the final facility design. The overall goal

  1. Transforming Inclusion: Designing in the Experience of Greater Technological Possibility.

    PubMed

    Bridge, Catherine; Demirbilek, Oya; Mintzes, Alicia

    2016-01-01

    Universal Design seeks to contribute to the sustainability and inclusivity of communities and co-design and participatory methods are a critical tool in this evolution. The fact that technology permeates our society is undeniable and the form and materials that technology takes in turn shape the basics of human life such as being able to shower and toilet oneself. In contrast, the various existing approaches to co-design have very different sorts of metaphysical, epistemological and normative assumptions behind them. As a result, design has recognised a set of problems surrounding the position of the "user" in design innovation. Additionally, there are many different perspectives on technology and the role of technology in co-design methods. Consequently, there are a number of different ways of conceiving of the "problem" of integrating technologies into co-design methods. Traditionally, participatory design has been viewed as merely the insertion of a more public dialog of the potential target market within technological design practices. Our research indicates that most if not all co-designers rely on their own personal and collective knowledge and experience and that if this is not actively explored as a part of a co-design methodology that both participation and innovation will be less than hoped for. For instance, assuming only known fixtures, fittings with current codes and standards is unlikely to result in product innovation. PMID:27534298

  3. Photon Detection System Designs for the Deep Underground Neutrino Experiment

    SciTech Connect

    Whittington, Denver

    2015-11-19

    The Deep Underground Neutrino Experiment (DUNE) will be a premier facility for exploring long-standing questions about the boundaries of the standard model. Acting in concert with the liquid argon time projection chambers underpinning the far detector design, the DUNE photon detection system will capture ultraviolet scintillation light in order to provide valuable timing information for event reconstruction. To maximize the active area while maintaining a small photocathode coverage, the experiment will utilize a design based on plastic light guides coated with a wavelength-shifting compound, along with silicon photomultipliers, to collect and record scintillation light from liquid argon. This report presents recent preliminary performance measurements of this baseline design and several alternative designs which promise significant improvements in sensitivity to low-energy interactions.

  4. Photon detection system designs for the Deep Underground Neutrino Experiment

    NASA Astrophysics Data System (ADS)

    Whittington, D.

    2016-05-01

    The Deep Underground Neutrino Experiment (DUNE) will be a premier facility for exploring long-standing questions about the boundaries of the standard model. Acting in concert with the liquid argon time projection chambers underpinning the far detector design, the DUNE photon detection system will capture ultraviolet scintillation light in order to provide valuable timing information for event reconstruction. To maximize the active area while maintaining a small photocathode coverage, the experiment will utilize a design based on plastic light guides coated with a wavelength-shifting compound, along with silicon photomultipliers, to collect and record scintillation light from liquid argon. This report presents recent preliminary performance measurements of this baseline design and several alternative designs which promise significant improvements in sensitivity to low-energy interactions.

  5. Preliminary Design Program: Vapor Compression Distillation Flight Experiment Program

    NASA Technical Reports Server (NTRS)

    Schubert, F. H.; Boyda, R. B.

    1995-01-01

    This document provides a description of the results of a program to prepare a preliminary design of a flight experiment to demonstrate the function of a Vapor Compression Distillation (VCD) Wastewater Processor (WWP) in microgravity. This report describes the test sequence to be performed and the hardware, control/monitor instrumentation and software designs prepared to perform the defined tests. The purpose of the flight experiment is to significantly reduce the technical and programmatic risks associated with implementing a VCD-based WWP on board the International Space Station Alpha.

  6. Operational experience and design recommendations for teleoperated flight hardware

    NASA Technical Reports Server (NTRS)

    Burgess, T. W.; Kuban, D. P.; Hankins, W. W.; Mixon, R. W.

    1988-01-01

    Teleoperation (remote manipulation) will someday supplement/minimize astronaut extravehicular activity in space to perform such tasks as satellite servicing and repair, and space station construction and servicing. This technology is being investigated by NASA with teleoperation of two space-related tasks having been demonstrated at the Oak Ridge National Lab. The teleoperator experiments are discussed and the results of these experiments are summarized. The related equipment design recommendations are also presented. In addition, a general discussion of equipment design for teleoperation is also presented.

  7. Linear design considerations for TO-10 candidate experiment

    SciTech Connect

    Atchison, Walter A; Rousculp, Christopher L

    2011-01-12

    As part of the LANL/VNIIEF collaboration, a high-velocity cylindrical liner driven Hugoniot experiment is being designed to be driven by a VNIIEF Disk Explosive Magnetic (flux compression) Generator (DEMG). Several variations in drive current and liner thickness have been proposed. This presentation will describe the LANL 1D and 2D simulations used to evaluate those designs. The presentation will also propose an analysis technique to assess a high-current drive system's ability to stably and optimally drive a cylindrical aluminum liner for this type of experiment.

  8. A Bubble Mixture Experiment Project for Use in an Advanced Design of Experiments Class

    ERIC Educational Resources Information Center

    Steiner, Stefan H.; Hamada, Michael; White, Bethany J.Giddings; Kutsyy, Vadim; Mosesova, Sofia; Salloum, Geoffrey

    2007-01-01

    This article gives an example of how student-conducted experiments can enhance a course in the design of experiments. We focus on a project whose aim is to find a good mixture of water, soap and glycerin for making soap bubbles. This project is relatively straightforward to implement and understand. At its most basic level the project introduces…

  9. Designing a Hybrid Laminar-Flow Control Experiment: The CFD-Experiment Connection

    NASA Technical Reports Server (NTRS)

    Streett, C. L.

    2003-01-01

    The NASA/Boeing hybrid laminar flow control (HLFC) experiment, designed during 1993-1994 and conducted in the NASA LaRC 8-foot Transonic Pressure Tunnel in 1995, utilized computational fluid dynamics and numerical simulation of complex fluid mechanics to an unprecedented extent for the design of the test article and measurement equipment. CFD was used in: the design of the test wing, which was carried from definition of desired disturbance growth characteristics through to the final airfoil shape that would produce those growth characteristics; the design of the suction-surface perforation pattern that produced enhanced crossflow-disturbance growth; and the design of the hot-wire traverse system that produced minimal influence on measured disturbance growth. These and other aspects of the design of the test are discussed, after the historical and technical context of the experiment is described.

  10. Design and implementation of coupled thermomechanical failure experiments.

    SciTech Connect

    Dempsey, J. Franklin; Wellman, Gerald William; Antoun, Bonnie R.; Connelly, Kevin; Scherzinger, William Mark

    2010-03-01

    Coupled thermal-mechanical experiments with well-defined, controlled boundary conditions were designed through an iterative process involving a team of experimentalists, material modelers and computational analysts. First the basic experimental premise was selected: an axisymmetric tubular specimen mechanically loaded by internal pressurization and thermally loaded asymmetrically by side radiant heating. Then several integrated experimental-analytical steps were taken to determine the experimental details. The boundary conditions were mostly thermally driven and were chosen so they could be modeled accurately; the experimental fixtures were designed to ensure that the boundary conditions were met. Preliminary, uncoupled analyses were used to size the specimen diameter, height and thickness with experimental consideration of maximum pressure loads and fixture design. Iterations of analyses and experiments were used to efficiently determine heating parameters including lamp and heating shroud design, set off distance between the lamps and shroud and between the shroud and specimen, obtainable ramp rates, and the number and spatial placement of thermocouples. The design process and the experimental implementation of the final coupled thermomechanical failure experiment design will be presented.

  11. A microcomputer system designed for psychological and behavioural experiments.

    PubMed

    Popplewell, D A; Burton, M J

    1985-05-01

    This paper describes a relatively cheap MC6809-based microcomputer designed to run experiments in real-time, and to use the hardware and software facilities of a larger (HOST) computer. Each microcomputer is capable of controlling a wide range of psychological and behavioural experiments, and includes 32K RAM, 4K EPROM, 32 digital input lines, 32 digital output lines, analogue/digital converters, and programmable timers. Any programming language may be used, providing a cross-compiler generating MC6809 executable code exists for the HOST. Following over a year of use we can confirm that this system provides an effective method of running psychological and behavioural experiments.

  12. Preliminary design of two Space Shuttle fluid physics experiments

    NASA Technical Reports Server (NTRS)

    Gat, N.; Kropp, J. L.

    1984-01-01

    The mid-deck lockers of the STS and the requirements for operating an experiment in this region are described. The designs of the surface tension induced convection and the free surface phenomenon experiments both use a two-locker volume with an experiment-unique structure as a housing. A manual mode is developed for the Surface Tension Induced Convection experiment. The fluid is maintained in an accumulator pre-flight. To begin the experiment, pressurized gas drives the fluid into the experiment container. The fluid is an inert silicone oil, and the container material is selected to be compatible with it. A wound-wire heater, located axisymmetrically above the fluid, can deliver three wattages, ranging from 1 to 15 watts, to a spot on the fluid surface. Fluid flow is observed through the motion of particles in the fluid. A 5 mW He/Ne laser illuminates the container. Scattered light is recorded by a 35 mm camera. The free surface phenomena experiment consists of a trapezoidal cell which is filled from the bottom. The fluid is photographed at high speed using a 35 mm camera which captures the entire cell length in the field of view. The assembly can incorporate four cells in one flight. For each experiment, an electronics block diagram is provided. A control panel concept is given for the surface tension induced convection experiment. Both experiments are within the mid-deck locker weight and c.g. limits.

  13. Optimizing an experimental design for an electromagnetic experiment

    NASA Astrophysics Data System (ADS)

    Roux, Estelle; Garcia, Xavier

    2013-04-01

    Most geophysical studies focus on data acquisition and analysis, but another aspect which is gaining importance is the acquisition of suitable datasets. This can be done through the design of an optimal experiment. Optimizing an experimental design implies a compromise between maximizing the information we get about the target and reducing the cost of the experiment, considering a wide range of constraints (logistical, financial, experimental …). We are currently developing a method to design an optimal controlled-source electromagnetic (CSEM) experiment to detect a potential CO2 reservoir and monitor this reservoir during and after CO2 injection. Our statistical algorithm combines the use of linearized inverse theory (to evaluate the quality of one given design via the objective function) and stochastic optimization methods like genetic algorithms (to examine a wide range of possible surveys). The particularity of our method is that it uses a multi-objective genetic algorithm that searches for designs that fit several objective functions simultaneously. One main advantage of this kind of design technique is that it does not require the acquisition of any data and can thus be easily conducted before any geophysical survey. Our new experimental design algorithm has been tested with a realistic one-dimensional resistivity model of the Earth in the region of study (northern Spain CO2 sequestration test site). We show that a small number of well-distributed observations have the potential to resolve the target. This simple test also points out the importance of a well-chosen objective function. Finally, in the context of CO2 sequestration that motivates this study, we might be interested in maximizing the information we get about the reservoir layer. In that case, we show how the combination of two different objective functions considerably improves its resolution.
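
    Neither the authors' objective function nor their multi-objective genetic algorithm is reproduced here. The toy sketch below only illustrates the evaluation step such a workflow relies on: scoring candidate receiver offsets by a log-determinant measure of a synthetic sensitivity matrix, searched here by crude random sampling rather than a genetic algorithm. All quantities are invented for illustration:

```python
import numpy as np

def design_score(offsets_km: np.ndarray) -> float:
    """Toy linearized-design objective: log-det of J'J for a synthetic two-parameter
    sensitivity model (illustrative only; not the survey's real physics)."""
    # Pretend sensitivities of the data to (resistivity, depth) decay with offset
    j = np.column_stack([np.exp(-offsets_km / 3.0),
                         offsets_km * np.exp(-offsets_km / 5.0)])
    sign, logdet = np.linalg.slogdet(j.T @ j)
    return logdet if sign > 0 else -np.inf

rng = np.random.default_rng(11)
best = max((rng.uniform(0.5, 12.0, size=6) for _ in range(500)),
           key=design_score)                      # crude random search, not a GA
print("best receiver offsets (km):", np.sort(best).round(2))
```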

  14. From Content to Context: Videogames as Designed Experience

    ERIC Educational Resources Information Center

    Squire, Kurt

    2006-01-01

    Interactive immersive entertainment, or videogame playing, has emerged as a major entertainment and educational medium. As research and development initiatives proliferate, educational researchers might benefit by developing more grounded theories about them. This article argues for framing game play as a "designed experience." Players'…

  15. The Design of Learning Experiences: A Connection to Physical Environments.

    ERIC Educational Resources Information Center

    Stueck, Lawrence E.; Tanner, C. Kenneth

    The school environment must create a rich, beautiful, dynamic, and meaningful experience for students to learn; however, architects, school boards, and the state focus almost exclusively on the building when making design decisions. This document lists specific aspects of developing a visionary campus: one that provides a three-dimensional…

  16. Lessons Learned from a First Instructional Design Experience

    ERIC Educational Resources Information Center

    Hodges, Charles B.

    2006-01-01

    This case study presents the experience of an instructional designer working on an instructional development project for the first time. The project team was creating online course materials for a college level business calculus course. Project management skills, considerations in the project budget, and communication issues are suggested as…

  17. Experiences of Computer Science Curriculum Design: A Phenomenological Study

    ERIC Educational Resources Information Center

    Sloan, Arthur; Bowe, Brian

    2015-01-01

    This paper presents a qualitative study of 12 computer science lecturers' experiences of curriculum design of several degree programmes during a time of transition from year-long to semesterised courses, due to institutional policy change. The background to the study is outlined, as are the reasons for choosing the research methodology. The main…

  18. Experimental design principles for isotopically instationary 13C labeling experiments.

    PubMed

    Nöh, Katharina; Wiechert, Wolfgang

    2006-06-01

    13C metabolic flux analysis (MFA) is a well-established tool in Metabolic Engineering that has found numerous applications in recent years. However, one strong limitation of the current method is the requirement of an (at least approximate) isotopic stationary state at sampling time. This requirement imposes, in principle, a lower limit on the duration of a 13C labeling experiment. A new methodological development is based on repeated sampling during the instationary transient of the 13C labeling dynamics. The statistical and computational treatment of such instationary experiments is completely new terrain. The computational effort is very high because large systems of differential equations have to be solved and, moreover, the intracellular pool sizes play a significant role. For this reason, the present contribution works out principles and strategies for the experimental design of instationary experiments based on a simple example network. In this way, the potential of isotopically instationary experiments is investigated in detail. Various statistical results on instationary flux identifiability are presented and possible pitfalls of experimental design are discussed. Finally, a framework for almost optimal experimental design of isotopically instationary experiments is proposed which provides a practical guideline for the analysis of large-scale networks.
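
    Why the transient carries extra information can be seen in a toy example. The sketch below is a minimal illustration with an invented two-pool pathway and assumed flux and pool sizes, not the paper's network: it integrates the labeling enrichment equations at metabolic steady state, and the shape of the transient depends on the pool sizes as well as the flux, which is exactly what instationary sampling exploits.

```python
# Toy isotopically instationary labeling dynamics for a linear pathway
# substrate -> A -> B at metabolic steady state. The flux v and the pool
# sizes cA, cB are assumed values; the transient enrichment curves depend on
# both, which is the information that instationary sampling exploits.
import numpy as np
from scipy.integrate import solve_ivp

v, cA, cB = 2.0, 1.0, 5.0        # flux (mmol/h) and pool sizes (mmol), assumed
x_in = 1.0                       # fully 13C-labeled substrate from t = 0

def label_odes(t, x):
    xA, xB = x
    dxA = v * (x_in - xA) / cA   # enrichment balance on pool A
    dxB = v * (xA - xB) / cB     # enrichment balance on pool B
    return [dxA, dxB]

t_eval = np.linspace(0.0, 10.0, 11)
sol = solve_ivp(label_odes, (0.0, 10.0), [0.0, 0.0], t_eval=t_eval)

for t, xA, xB in zip(sol.t, *sol.y):
    print(f"t = {t:4.1f} h   xA = {xA:.3f}   xB = {xB:.3f}")
```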

  19. Visual experiments on the web: design of a web-based visual experiment management system

    NASA Astrophysics Data System (ADS)

    Zuffi, Silvia; Beltrame, Elisa; Scala, Paolo

    2008-01-01

    In psychological research, it is common to perform investigations on the World Wide Web in the form of questionnaires to collect data from a large number of participants. By comparison, visual experiments have mainly been performed in the laboratory, where it is possible to use calibrated devices and controlled viewing conditions. Recently, the Web has also been exploited for "uncontrolled" visual experiments, despite the lack of control over image rendering at the client side, on the assumption that the large number of participants involved in a Web investigation "averages out" the parameters that the experiments would require to be kept fixed if, following a traditional approach, they were performed under controlled conditions. This paper describes the design and implementation of a Web-based visual experiment management system, which acts as a repository of visual experiments and is designed with the purpose of facilitating the publishing of online investigations.

  20. Designing Undergraduate Research Experiences: A Multiplicity of Options

    NASA Astrophysics Data System (ADS)

    Manduca, C. A.

    2001-12-01

    Research experiences for undergraduate students can serve many goals including: developing student understanding of the process of science; providing opportunities for students to develop professional skills or test career plans; completing publishable research; enabling faculty professional development; or enhancing the visibility of a science program. The large range of choices made in the design of an undergraduate research program or opportunity must reflect the goals of the program, the needs and abilities of the students and faculty, and the available resources including both time and money. Effective program design, execution, and evaluation can all be enhanced if the goals of the program are clearly articulated. Student research experiences can be divided into four components: 1) defining the research problem; 2) developing the research plan or experiment design; 3) collecting and interpreting data, and 4) communicating results. In each of these components, the program can be structured in a wide variety of ways and students can be given more or less guidance or freedom. While a feeling of ownership of the research project appears to be very important, examples of successful projects displaying a wide range of design decisions are available. Work with the Keck Geology Consortium suggests that four strategies can enhance the likelihood of successful student experiences: 1) students are well-prepared for research experience (project design must match student preparation); 2) timelines and events are structured to move students through intermediate goals to project completion; 3) support for the emotional, financial, academic and technical challenges of a research project is in place; 4) strong communications between students and faculty set clear expectations and enable mid-course corrections in the program or project design. Creating a research culture for the participants or embedding a project in an existing research culture can also assist students in

  1. A strategic map for high-impact virtual experience design

    NASA Astrophysics Data System (ADS)

    Faste, Haakon; Bergamasco, Massimo

    2009-02-01

    We have employed methodologies of human centered design to inspire and guide the engineering of a definitive low-cost aesthetic multimodal experience intended to stimulate cultural growth. Using a combination of design research, trend analysis and the programming of immersive virtual 3D worlds, over 250 innovative concepts have been brainstormed, prototyped, evaluated and refined. These concepts have been used to create a strategic map for the development of high-impact virtual art experiences, the most promising of which have been incorporated into a multimodal environment programmed in the online interactive 3D platform XVR. A group of test users have evaluated the experience as it has evolved, using a multimodal interface with stereo vision, 3D audio and haptic feedback. This paper discusses the process, content, results, and impact on our engineering laboratory that this research has produced.

  2. Facilitating an accelerated experience-based co-design project.

    PubMed

    Tollyfield, Ruth

    This article describes an accelerated experience-based co-design (AEBCD) quality improvement project that was undertaken in an adult critical care setting and the facilitation of that process. In doing so the aim is to encourage other clinical settings to engage with their patients, carers and staff alike and undertake their own quality improvement project. Patient, carer and staff experience and its place in the quality sphere is outlined and the importance of capturing patient, carer and staff feedback established. Experience-based co-design (EBCD) is described along with the recently tested accelerated version of the process. An overview of the project and outline of the organisational tasks and activities undertaken by the facilitator are given. The facilitation of the process and key outcomes are discussed and reflected on. Recommendations for future undertakings of the accelerated process are given and conclusions drawn.

  3. Optimal Experiment Design for Thermal Characterization of Functionally Graded Materials

    NASA Technical Reports Server (NTRS)

    Cole, Kevin D.

    2003-01-01

    The purpose of the project was to investigate methods to accurately verify that designed materials meet thermal specifications. The project involved heat transfer calculations and optimization studies; no laboratory experiments were performed. One part of the research involved the study of materials in which conduction heat transfer predominates. Results include techniques to choose among several experimental designs, and protocols for determining the optimum experimental conditions for determination of thermal properties. Metal foam materials, in which both conduction and radiation heat transfer are present, were also studied. Results of this work include procedures to optimize the design of experiments to accurately measure both conductive and radiative thermal properties. Detailed results in the form of three journal papers have been appended to this report.

  4. Design of a proof of principle high current transport experiment

    SciTech Connect

    Lund, S.M.; Bangerter, R.O.; Barnard, J.J.; Celata, C.M.; Faltens, A.; Friedman, A.; Kwan, J.W.; Lee, E.P.; Seidl, P.A.

    2000-01-15

    Preliminary designs of an intense heavy-ion beam transport experiment to test issues for Heavy Ion Fusion (HIF) are presented. This transport channel will represent a single high-current-density beam at full driver scale and will evaluate practical issues such as aperture filling factors, electrons, halo, imperfect vacuum, etc., that cannot be fully tested using scaled experiments. Various machine configurations are evaluated in the context of the range of physics and technology issues that can be explored in a manner relevant to a full-scale driver. It is anticipated that results from this experiment will allow confident construction of the next-generation "Integrated Research Experiments" leading to a full-scale driver for energy production.

  5. Explorations in Teaching Sustainable Design: A Studio Experience in Interior Design/Architecture

    ERIC Educational Resources Information Center

    Gurel, Meltem O.

    2010-01-01

    This article argues that a design studio can be a dynamic medium to explore the creative potential of the complexity of sustainability from its technological to social ends. The study seeks to determine the impact of an interior design/architecture studio experience that was initiated to teach diverse meanings of sustainability and to engage the…

  6. Developing Conceptual Hypersonic Airbreathing Engines Using Design of Experiments Methods

    NASA Technical Reports Server (NTRS)

    Ferlemann, Shelly M.; Robinson, Jeffrey S.; Martin, John G.; Leonard, Charles P.; Taylor, Lawrence W.; Kamhawi, Hilmi

    2000-01-01

    Designing a hypersonic vehicle is a complicated process due to the multi-disciplinary synergy that is required. The greatest challenge involves propulsion-airframe integration. In the past, a two-dimensional flowpath was generated based on the engine performance required for a proposed mission. A three-dimensional CAD geometry was produced from the two-dimensional flowpath for aerodynamic analysis, structural design, and packaging. The aerodynamics, engine performance, and mass properties are inputs to the vehicle performance tool to determine if the mission goals were met. If the mission goals were not met, then a flowpath and vehicle redesign would begin. This design process might have to be performed several times to produce a "closed" vehicle. This paper will describe an attempt to design a hypersonic cruise vehicle propulsion flowpath using a Design of Experiments method to reduce the resources necessary to produce a conceptual design with fewer iterations of the design cycle. These methods also allow for more flexible mission analysis and incorporation of additional design constraints at any point. A design system was developed using an object-based software package that would quickly generate each flowpath in the study given the values of the geometric independent variables. These flowpath geometries were put into a hypersonic propulsion code and the engine performance was generated. The propulsion results were loaded into statistical software to produce regression equations that were combined with an aerodynamic database to optimize the flowpath at the vehicle performance level. For this example, the design process was executed twice. The first pass was a cursory look at the independent variables selected to determine which variables are the most important and to test all of the inputs to the optimization process. The second cycle is a more in-depth study with more cases and higher order equations representing the design space.

  7. Optimized design and analysis of sparse-sampling FMRI experiments.

    PubMed

    Perrachione, Tyler K; Ghosh, Satrajit S

    2013-01-01

    Sparse-sampling is an important methodological advance in functional magnetic resonance imaging (fMRI), in which silent delays are introduced between MR volume acquisitions, allowing for the presentation of auditory stimuli without contamination by acoustic scanner noise and for overt vocal responses without motion-induced artifacts in the functional time series. As such, the sparse-sampling technique has become a mainstay of principled fMRI research into the cognitive and systems neuroscience of speech, language, hearing, and music. Despite being in use for over a decade, there has been little systematic investigation of the acquisition parameters, experimental design considerations, and statistical analysis approaches that bear on the results and interpretation of sparse-sampling fMRI experiments. In this report, we examined how design and analysis choices related to the duration of repetition time (TR) delay (an acquisition parameter), stimulation rate (an experimental design parameter), and model basis function (an analysis parameter) act independently and interactively to affect the neural activation profiles observed in fMRI. First, we conducted a series of computational simulations to explore the parameter space of sparse design and analysis with respect to these variables; second, we validated the results of these simulations in a series of sparse-sampling fMRI experiments. Overall, these experiments suggest the employment of three methodological approaches that can, in many situations, substantially improve the detection of neurophysiological response in sparse fMRI: (1) Sparse analyses should utilize a physiologically informed model that incorporates hemodynamic response convolution to reduce model error. (2) The design of sparse fMRI experiments should maintain a high rate of stimulus presentation to maximize effect size. (3) TR delays of short to intermediate length can be used between acquisitions of sparse-sampled functional image volumes to increase
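
    The interplay of the three parameters studied above (TR delay, stimulation rate, and the response model) can be mocked up in a few lines. The sketch below is only an illustration under assumed values: it convolves a periodic stimulus train with a conventional double-gamma hemodynamic response shape and samples the result at sparse acquisition instants, which is the basic simulation ingredient behind such parameter explorations.

```python
# Toy sparse-sampling design: a periodic stimulus train is convolved with a
# double-gamma hemodynamic response function (HRF) and the predicted signal
# is sampled only at the sparse acquisition instants. HRF shape, stimulation
# rate, acquisition time, and TR delay are assumed values for illustration.
import numpy as np
from scipy.stats import gamma

dt = 0.1                                        # simulation resolution (s)
t_hrf = np.arange(0.0, 30.0, dt)
hrf = gamma.pdf(t_hrf, 6) - 0.35 * gamma.pdf(t_hrf, 16)   # double-gamma shape
hrf /= hrf.max()

run_len = 300.0                                 # run duration (s)
time = np.arange(0.0, run_len, dt)
stim_interval = 4.0                             # one stimulus every 4 s (design choice)
stim = np.zeros_like(time)
stim[np.round(np.arange(0.0, run_len, stim_interval) / dt).astype(int)] = 1.0

bold = np.convolve(stim, hrf)[: len(time)]      # predicted continuous response

ta, tr_delay = 2.0, 6.0                         # acquisition time and silent delay (s)
acq_times = np.arange(ta + tr_delay, run_len, ta + tr_delay)
samples = bold[np.round(acq_times / dt).astype(int)]

print(f"{len(acq_times)} sparse volumes, mean sampled response = {samples.mean():.3f}")
```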

  8. SSSFD manipulator engineering using statistical experiment design techniques

    NASA Technical Reports Server (NTRS)

    Barnes, John

    1991-01-01

    The Satellite Servicer System Flight Demonstration (SSSFD) program is a series of Shuttle flights designed to verify major on-orbit satellite servicing capabilities, such as rendezvous and docking of free flyers, Orbital Replacement Unit (ORU) exchange, and fluid transfer. A major part of this system is the manipulator system that will perform the ORU exchange. The manipulator must possess adequate toolplate dexterity to maneuver a variety of EVA-type tools into position to interface with ORU fasteners, connectors, latches, and handles on the satellite, and to move workpieces and ORUs through 6 degree of freedom (dof) space from the Target Vehicle (TV) to the Support Module (SM) and back. Two cost efficient tools were combined to perform a study of robot manipulator design parameters. These tools are graphical computer simulations and Taguchi Design of Experiment methods. Using a graphics platform, an off-the-shelf robot simulation software package, and an experiment designed with Taguchi's approach, the sensitivities of various manipulator kinematic design parameters to performance characteristics are determined with minimal cost.
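
    For readers unfamiliar with the Taguchi approach mentioned above, the sketch below shows the bare mechanics: an L8(2^7) orthogonal array is generated from a 2^3 factorial and its interaction columns, a performance measure is obtained for each run, and factor levels are compared through larger-is-better signal-to-noise ratios. The factor names and the dexterity() stand-in for the graphical robot simulation are invented for illustration.

```python
# Bare mechanics of a Taguchi-style parameter study. An L8(2^7) orthogonal
# array is built from a 2^3 factorial and its interaction columns; the
# dexterity() response and the factor names are invented stand-ins for the
# graphical manipulator simulation.
import itertools
import numpy as np

base = np.array(list(itertools.product([0, 1], repeat=3)))
a, b, c = base[:, 0], base[:, 1], base[:, 2]
L8 = np.column_stack([a, b, a ^ b, c, a ^ c, b ^ c, a ^ b ^ c])   # 8 runs x 7 factors

factors = ["link1", "link2", "link3", "offset", "twist1", "twist2", "twist3"]

rng = np.random.default_rng(1)
def dexterity(run):
    """Hypothetical performance measure (bigger is better)."""
    weights = np.array([3.0, 2.0, 1.5, 1.0, 0.8, 0.5, 0.3])
    return 10.0 + weights @ run + rng.normal(0.0, 0.2)

y = np.array([dexterity(run) for run in L8])
sn = 10 * np.log10(y ** 2)        # larger-is-better S/N ratio, single replicate

for j, name in enumerate(factors):
    lo, hi = sn[L8[:, j] == 0].mean(), sn[L8[:, j] == 1].mean()
    print(f"{name:7s}  S/N(level 0) = {lo:6.2f}  S/N(level 1) = {hi:6.2f}  effect = {hi - lo:+.2f}")
```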

  9. Design of experiments applications in bioprocessing: concepts and approach.

    PubMed

    Kumar, Vijesh; Bhalla, Akriti; Rathore, Anurag S

    2014-01-01

    Most biotechnology unit operations are complex in nature with numerous process variables, feed material attributes, and raw material attributes that can have significant impact on the performance of the process. A design of experiments (DOE)-based approach offers a solution to this conundrum and allows for an efficient estimation of the main effects and the interactions with a minimal number of experiments. Numerous publications illustrate the application of DOE towards development of different bioprocessing unit operations. However, a systematic approach for evaluation of the different DOE designs and for choosing the optimal design for a given application has not been published yet. Through this work we have compared the I-optimal and D-optimal designs to the commonly used central composite and Box-Behnken designs for bioprocess applications. A systematic methodology is proposed for construction of the model and for precise prediction of the responses for the three case studies involving some of the commonly used unit operations in downstream processing. Use of the Akaike information criterion for model selection has been examined and found to be suitable for the applications under consideration.
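
    Two of the ingredients discussed above, ranking designs by an optimality criterion and selecting models by the Akaike information criterion, can be illustrated generically. The sketch below builds textbook face-centered central composite and Box-Behnken layouts for three factors, compares their det(X'X) for a full quadratic model (a fuller comparison would normalize by the number of runs), and applies a Gaussian AIC to simulated responses; it is not the authors' procedure or data.

```python
# Generic illustration of two ideas from the abstract: ranking candidate
# designs by the D-optimality criterion det(X'X) for a full quadratic model in
# three factors, and choosing between fitted models with AIC. The designs are
# textbook face-centered CCD and Box-Behnken layouts; the responses are
# simulated, not real bioprocess data.
import itertools
import numpy as np

def quadratic_model_matrix(D):
    """Columns: intercept, x1..x3, x1^2..x3^2, all two-factor interactions."""
    cols = [np.ones(len(D))]
    cols += [D[:, i] for i in range(3)]
    cols += [D[:, i] ** 2 for i in range(3)]
    cols += [D[:, i] * D[:, j] for i, j in itertools.combinations(range(3), 2)]
    return np.column_stack(cols)

# Face-centered CCD: 2^3 factorial + 6 axial points + 1 center point.
factorial = np.array(list(itertools.product([-1, 1], repeat=3)), float)
ccd = np.vstack([factorial, np.eye(3), -np.eye(3), np.zeros((1, 3))])

# Box-Behnken: all edge midpoints (+/-1, +/-1, 0) + 1 center point.
bbd = []
for i, j in itertools.combinations(range(3), 2):
    for si, sj in itertools.product([-1, 1], repeat=2):
        p = np.zeros(3); p[i], p[j] = si, sj
        bbd.append(p)
bbd = np.vstack(bbd + [np.zeros(3)])

for name, D in [("CCD", ccd), ("BBD", bbd)]:
    X = quadratic_model_matrix(D)
    # A fuller comparison would normalize det(X'X) by the number of runs.
    print(f"{name}: {len(D)} runs, det(X'X) = {np.linalg.det(X.T @ X):.3e}")

# Gaussian AIC for model selection on simulated responses from the CCD runs.
rng = np.random.default_rng(0)
y = quadratic_model_matrix(ccd) @ rng.normal(size=10) + rng.normal(0, 0.1, len(ccd))
for label, X in [("quadratic", quadratic_model_matrix(ccd)),
                 ("linear", quadratic_model_matrix(ccd)[:, :4])]:
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    rss = float(((y - X @ beta) ** 2).sum())
    n, k = X.shape
    print(f"{label:9s} model: AIC = {n * np.log(rss / n) + 2 * k:.1f}")
```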

  11. Experimental Design and Power Calculation for RNA-seq Experiments.

    PubMed

    Wu, Zhijin; Wu, Hao

    2016-01-01

    Power calculation is a critical component of RNA-seq experimental design. The flexibility of RNA-seq experiments and the wide dynamic range of transcription they measure make the technology attractive for whole transcriptome analysis. These features, in addition to the high dimensionality of RNA-seq data, bring complexity to experimental design, making an analytical power calculation no longer realistic. In this chapter we review the major factors that influence the statistical power of detecting differential expression, and give examples of power assessment using the R package PROPER. PMID:27008024
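
    The simulation-based power assessment described above is implemented in R by PROPER; the toy sketch below conveys the same idea generically for a single gene, with assumed mean count, dispersion, fold change, and significance level, and a deliberately crude test on log counts rather than a proper count model.

```python
# Toy power simulation for a two-group comparison of a single gene, in the
# spirit of the simulation-based power assessment described above (PROPER
# implements a far more complete version in R). Mean count, dispersion, fold
# change, and alpha are assumed values.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

def nb_counts(mean, dispersion, size):
    """Negative-binomial counts parameterized by mean and dispersion."""
    r = 1.0 / dispersion
    return rng.negative_binomial(r, r / (r + mean), size=size)

def power(n_per_group, base_mean=100.0, fold_change=1.5,
          dispersion=0.2, alpha=0.05, n_sim=2000):
    hits = 0
    for _ in range(n_sim):
        a = nb_counts(base_mean, dispersion, n_per_group)
        b = nb_counts(base_mean * fold_change, dispersion, n_per_group)
        _, p = stats.ttest_ind(np.log1p(a), np.log1p(b))   # crude stand-in test
        hits += p < alpha
    return hits / n_sim

for n in (3, 5, 10):
    print(f"n = {n:2d} per group -> estimated power {power(n):.2f}")
```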

  12. Beryllium ignition target design for indirect drive NIF experiments

    NASA Astrophysics Data System (ADS)

    Simakov, A. N.; Wilson, D. C.; Yi, S. A.; Kline, J. L.; Salmonson, J. D.; Clark, D. S.; Milovich, J. L.; Marinak, M. M.

    2016-03-01

    A beryllium (Be) ablator offers multiple advantages over carbon-based ablators for indirectly driven NIF ICF ignition targets: a higher mass ablation rate, ablation pressure, and ablation velocity; a lower capsule albedo; and higher thermal conductivity at cryogenic temperatures. These advantages can be used to improve target robustness and performance. While previous NIF Be target designs exist, they were obtained a long time ago and do not incorporate the latest improved physical understanding and models based upon NIF experiments. Herein, we propose a new NIF Be ignition target design at 1.45 MJ, 430 TW that takes all of this knowledge into account.

  13. Skylab Medical Experiments Altitude Test /SMEAT/ facility design and operation.

    NASA Technical Reports Server (NTRS)

    Hinners, A. H., Jr.; Correale, J. V.

    1973-01-01

    This paper presents the design approaches and test facility operation methods used to successfully accomplish a 56-day test for Skylab to permit evaluation of selected Skylab medical experiments in a ground test simulation of the Skylab environment with an astronaut crew. The systems designed for this test include the two-gas environmental control system, the fire suppression and detection system, equipment transfer lock, ground support equipment, safety systems, potable water system, waste management system, lighting and power system, television monitoring, communications and recreation systems, and food freezer.

  14. Aircraft integrated design and analysis: A classroom experience

    NASA Technical Reports Server (NTRS)

    Weisshaar, Terrence A.

    1989-01-01

    AAE 451 is the capstone course required of all senior undergraduates in the School of Aeronautics and Astronautics at Purdue University. During the past year the first steps of a long evolutionary process were taken to change the content and expectations of this course. These changes are the result of the availability of advanced computational capabilities and sophisticated electronic media availability at Purdue. This presentation will describe both the long range objectives and this year's experience using the High Speed Commercial Transport design, the AIAA Long Duration Aircraft design and RPV design proposal as project objectives. The central goal of these efforts is to provide a user-friendly, computer-software-based environment to supplement traditional design course methodology. The Purdue University Computer Center (PUCC), the Engineering Computer Network (ECN) and stand-alone PC's are being used for this development. This year's accomplishments center primarily on aerodynamics software obtained from NASA/Langley and its integration into the classroom. Word processor capability for oral and written work and computer graphics were also blended into the course. A total of ten HSCT designs were generated, ranging from twin-fuselage aircraft, forward swept wing aircraft to the more traditional delta and double-delta wing aircraft. Four Long Duration Aircraft designs were submitted, together with one RPV design tailored for photographic surveillance.

  15. Scaling studies and conceptual experiment designs for NGNP CFD assessment

    SciTech Connect

    D. M. McEligot; G. E. McCreery

    2004-11-01

    The objective of this report is to document scaling studies and conceptual designs for flow and heat transfer experiments intended to assess CFD codes and their turbulence models proposed for application to prismatic NGNP concepts. The general approach of the project is to develop new benchmark experiments for assessment in parallel with CFD and coupled CFD/systems code calculations for the same geometry. Two aspects of the complex flow in an NGNP are being addressed: (1) flow and thermal mixing in the lower plenum ("hot streaking" issue) and (2) turbulence and resulting temperature distributions in reactor cooling channels ("hot channel" issue). Current prismatic NGNP concepts are being examined to identify their proposed flow conditions and geometries over the range from normal operation to decay heat removal in a pressurized cooldown. Approximate analyses have been applied to determine key non-dimensional parameters and their magnitudes over this operating range. For normal operation, the flow in the coolant channels can be considered to be dominant turbulent forced convection with slight transverse property variation. In a pressurized cooldown (LOFA) simulation, the flow quickly becomes laminar with some possible buoyancy influences. The flow in the lower plenum can locally be considered to be a situation of multiple hot jets into a confined crossflow -- with obstructions. Flow is expected to be turbulent with momentum-dominated turbulent jets entering; buoyancy influences are estimated to be negligible in normal full power operation. Experiments are needed for the combined features of the lower plenum flows. Missing from the typical jet experiments available are interactions with nearby circular posts and with vertical posts in the vicinity of vertical walls - with near stagnant surroundings at one extreme and significant crossflow at the other. Two types of heat transfer experiments are being considered. One addresses the "hot channel" problem, if necessary

  16. Tracer experiment design for unique identification of nonlinear physiological systems.

    PubMed

    DiStefano, J J

    1976-02-01

    The design of tracer kinetic experiments, the purpose of which is to elucidate uniquely the internal couplings of a nonlinear dynamic system, is considered for a practical class of models of physiological systems. The extent of information about the real system contained in tracer kinetic data is a central issue. Criteria for determining whether nonlinear model parameters can be estimated from small-signal, "linearizing" tracer experiments are developed and illustrated by examples. The concept of "structural identifiability" is employed in this analysis to determine which model parameters can be and which cannot be determined "uniquely" from given input-output data; a step-by-step procedure based on an extension of this concept is presented for adapting the overall approach to the experimental design problem. Estimation of unmeasurable endogenous inputs and system state variables, problems that are intimately related to parameter estimation for physiological systems, are also considered. PMID:1259027
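
    A practical, numerical companion to the structural-identifiability analysis described above is a local check based on the rank of the output sensitivity matrix. The sketch below applies it to a toy two-compartment tracer model with assumed rate constants and a bolus input; near-zero singular values would flag parameter combinations that the sampled output cannot resolve. This illustrates the general idea, not the paper's method.

```python
# Local identifiability check for a toy two-compartment tracer model with a
# bolus input into, and measurement of, compartment 1. The sensitivity matrix
# of the sampled output with respect to the parameters is built by finite
# differences; near-zero singular values would flag parameter combinations the
# experiment cannot resolve. Rate constants and sampling times are assumed.
import numpy as np
from scipy.integrate import solve_ivp

t_obs = np.linspace(0.5, 10.0, 12)            # sampling schedule (arbitrary)

def output(theta):
    k01, k21, k12 = theta                     # elimination and exchange rates
    def rhs(t, x):
        x1, x2 = x
        return [-(k01 + k21) * x1 + k12 * x2,
                k21 * x1 - k12 * x2]
    sol = solve_ivp(rhs, (0.0, t_obs[-1]), [1.0, 0.0],
                    t_eval=t_obs, rtol=1e-9, atol=1e-12)
    return sol.y[0]                           # only compartment 1 is measured

theta0 = np.array([0.3, 0.5, 0.2])            # nominal (assumed) parameter values
y0 = output(theta0)

S = np.empty((len(t_obs), len(theta0)))       # sensitivity matrix dy/dtheta
for j in range(len(theta0)):
    h = 1e-5 * theta0[j]
    tp = theta0.copy(); tp[j] += h
    S[:, j] = (output(tp) - y0) / h

sv = np.linalg.svd(S, compute_uv=False)
print("singular values of the sensitivity matrix:", np.round(sv, 4))
print("locally identifiable parameters:", int(np.sum(sv > 1e-6 * sv[0])))
```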

  17. Design and experiment performances of an inchworm type rotary actuator.

    PubMed

    Li, Jianping; Zhao, Hongwei; Shao, Mingkun; Zhou, Xiaoqin; Huang, Hu; Fan, Zunqiang

    2014-08-01

    A piezo-driven rotary actuator based on the inchworm principle is proposed in this paper. Six piezo-stacks and flexure hinges are used to realize large rotation ranges with high accuracy in both the forward and backward motions. Four right-angle flexure hinges and two right-circular flexure hinges are applied in the stator. The motion principle and theoretical analysis of the designed actuator are discussed. In order to investigate its working characteristics, a prototype actuator was manufactured and a series of experimental tests was carried out. The test results indicate that the maximum rotation velocity is 71,300 μrad/s and the maximum output torque is 19.6 N mm. The experimental results confirm that the designed actuator can achieve large rotation ranges with relatively high output torque at different rotation speeds, depending on the driving voltages and frequencies.

  18. Statistical design of quantitative mass spectrometry-based proteomic experiments.

    PubMed

    Oberg, Ann L; Vitek, Olga

    2009-05-01

    We review the fundamental principles of statistical experimental design, and their application to quantitative mass spectrometry-based proteomics. We focus on class comparison using Analysis of Variance (ANOVA), and discuss how randomization, replication and blocking help avoid systematic biases due to the experimental procedure, and help optimize our ability to detect true quantitative changes between groups. We also discuss the issues of pooling multiple biological specimens for a single mass analysis, and calculation of the number of replicates in a future study. When applicable, we emphasize the parallels between designing quantitative proteomic experiments and experiments with gene expression microarrays, and give examples from that area of research. We illustrate the discussion using theoretical considerations, and using real-data examples of profiling of disease.
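
    As a small illustration of the randomization and blocking principles reviewed above, the sketch below allocates samples from two biological groups to mass-spectrometry batches so that each batch contains both groups, and randomizes the run order within each batch; the sample labels and batch count are hypothetical.

```python
# Randomization and blocking in miniature: samples from two biological groups
# are dealt out to mass-spectrometry batches (blocks) so that each batch
# contains both groups, and the run order is randomized within each batch.
# Sample labels and the number of batches are hypothetical.
import numpy as np

rng = np.random.default_rng(7)
disease = rng.permutation([f"disease_{i}" for i in range(6)])
control = rng.permutation([f"control_{i}" for i in range(6)])
n_batches = 3

batches = [list(disease[i::n_batches]) + list(control[i::n_batches])
           for i in range(n_batches)]

for b, batch in enumerate(batches, start=1):
    order = rng.permutation(batch)            # randomized run order within the block
    print(f"batch {b}: " + ", ".join(order))
```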

  19. Design and development of PROBA-3 rendezvous experiment

    NASA Astrophysics Data System (ADS)

    Bastante, Juan C.; Vasconcelos, José; Hagenfeldt, Miguel; Peñín, Luis F.; Dinis, João; Rebordão, José

    2014-09-01

    PROBA-3 is a technology demonstration mission with the objective of, among others, raising the Formation Flying (FF) technology up to Technology Readiness Level (TRL) 8 or 9. The context of this mission has strong synergies with the knowledge areas considered in Rendezvous (RV), namely the fields of GNC, metrology, actuator systems, etc. This common ground between FF and RV allowed for a dedicated Rendezvous Experiment (RVX) to be performed in the scope of PROBA-3. The RVX is based only on camera measurements, and designed for highly elliptical orbits with strong constraints on relative position and attitude. This paper presents the design and development of the RVX experiment, with the goal to demonstrate the feasibility of vision-based RV and to increase the associated TRL.

  20. Thermal design support for the Explorer gamma ray experiment telescope

    NASA Technical Reports Server (NTRS)

    Almgren, D. W.; Lee, W. D.; Mathias, S.

    1975-01-01

    The results of a thermal design definition study for the GSFC Explorer Gamma Ray Experiment Telescope (EGRET) were documented. A thermal computer model of EGRET with 241 nodes was developed and used to analyze the thermal performance of the experiment for a range of orbits, payload orientations, and internal power dissipations. The recommended thermal design utilizes a small radiator with an area of 1.78 square feet on the anti-sun side of the mission adaptor and circumferential heat pipes on the interior of the same adaptor to transfer heat from the electronics compartments to the single radiator. Fifty watts of thermostatically controlled heater power are used to control the temperature level to 10 C ± 20 C inside the insulated dome structure.

  1. Design of the NASA Lewis 4-Port Wave Rotor Experiment

    NASA Technical Reports Server (NTRS)

    Wilson, Jack

    1997-01-01

    Pressure exchange wave rotors, used in a topping stage, are currently being considered as a possible means of increasing the specific power, and reducing the specific fuel consumption of gas turbine engines. Despite this interest, there is very little information on the performance of a wave rotor operating on the cycle (i.e., set of waves) appropriate for use in a topping stage. One such cycle, which has the advantage of being relatively easy to incorporate into an engine, is the four-port cycle. Consequently, an experiment to measure the performance of a four-port wave rotor for temperature ratios relevant to application as a topping cycle for a gas turbine engine has been designed and built at NASA Lewis. The design of the wave rotor is described, together with the constraints on the experiment.

  2. Paradigms for adaptive statistical information designs: practical experiences and strategies.

    PubMed

    Wang, Sue-Jane; Hung, H M James; O'Neill, Robert

    2012-11-10

    We highlight the substantial risk of planning the sample size for confirmatory trials when information is very uninformative and stipulate the advantages of adaptive statistical information designs for planning exploratory trials. Practical experiences and strategies, as lessons learned from more recent adaptive design proposals, will be discussed to pinpoint the improved utilities of adaptive design clinical trials and their potential to increase the chance of a successful drug development.

  3. The Modern Design of Experiments: A Technical and Marketing Framework

    NASA Technical Reports Server (NTRS)

    DeLoach, R.

    2000-01-01

    A new wind tunnel testing process under development at NASA Langley Research Center, called Modern Design of Experiments (MDOE), differs from conventional wind tunnel testing techniques on a number of levels. Chief among these is that MDOE focuses on the generation of adequate prediction models rather than high-volume data collection. Some cultural issues attached to this and other distinctions between MDOE and conventional wind tunnel testing are addressed in this paper.

  4. Analysis of Variance in the Modern Design of Experiments

    NASA Technical Reports Server (NTRS)

    Deloach, Richard

    2010-01-01

    This paper is a tutorial introduction to the analysis of variance (ANOVA), intended as a reference for aerospace researchers who are being introduced to the analytical methods of the Modern Design of Experiments (MDOE), or who may have other opportunities to apply this method. One-way and two-way fixed-effects ANOVA, as well as random effects ANOVA, are illustrated in practical terms that will be familiar to most practicing aerospace researchers.
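
    A minimal one-way fixed-effects ANOVA of the kind the tutorial introduces can be reproduced in a few lines. The data below are invented replicates for three hypothetical configurations; the F statistic is computed both with scipy and from the ANOVA sums of squares to make the identity explicit.

```python
# One-way fixed-effects ANOVA on invented replicates for three hypothetical
# configurations. The F statistic is computed with scipy and again from the
# ANOVA sums of squares.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
groups = {
    "config_A": rng.normal(0.30, 0.02, 8),    # e.g. drag-coefficient replicates
    "config_B": rng.normal(0.31, 0.02, 8),
    "config_C": rng.normal(0.35, 0.02, 8),
}

f_stat, p_value = stats.f_oneway(*groups.values())
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")

all_y = np.concatenate(list(groups.values()))
grand = all_y.mean()
ss_between = sum(len(y) * (y.mean() - grand) ** 2 for y in groups.values())
ss_within = sum(((y - y.mean()) ** 2).sum() for y in groups.values())
df_b, df_w = len(groups) - 1, len(all_y) - len(groups)
print(f"F (from sums of squares) = {(ss_between / df_b) / (ss_within / df_w):.2f}")
```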

  5. Design and conduct of a windshear detection flight experiment

    NASA Technical Reports Server (NTRS)

    Lewis, Michael S.; Yenni, Kenneth R.; Verstynen, Harry A.; Person, Lee H.

    1992-01-01

    A description is presented of the design and conduct of a series of flight experiments that tested the performance of candidate windshear detection devices. A NASA 737 test aircraft with prototype windshear sensors installed flew numerous low altitude penetrations of microburst windshear conditions. These tests were preceded by extensive preparations including piloted simulations, determination of safe operating conditions, and the development of displays, unique flight test hardware, and procedures.

  6. Design choices for the integrated beam experiment (IBX)

    SciTech Connect

    Leitner, M.A.; Celata, C.M.; Lee, E.P.; Logan, B.G.; Sabbi, G.; Waldron, W.L.; Barnard, J.J.

    2003-05-01

    Over the next three years the research program of the Heavy Ion Fusion Virtual National Laboratory (HIF-VNL), a collaboration among LBNL, LLNL, and PPPL, is focused on separate scientific experiments in the injection, transport and focusing of intense heavy ion beams at currents from 100 mA to 1 A. As a next major step in the HIF-VNL program, the collaboration aims for a complete "source-to-target" experiment, the Integrated Beam Experiment (IBX). By combining the experience gained in the current separate beam experiments, IBX would allow the integrated scientific study of the evolution of a high-current (approximately 1 A) single heavy ion beam through all sections of a possible heavy ion fusion accelerator: injection, acceleration, compression, and beam focusing. This paper describes the main parameters and technology choices of the proposed IBX experiment. IBX will accelerate singly charged potassium or argon ion beams up to 10 MeV final energy and a longitudinal beam compression ratio of 10, resulting in a beam current at the target of more than 10 amperes. The different accelerator cell design options are described in detail, in particular the induction core modules incorporating either room-temperature pulsed focusing magnets or superconducting magnets.

  7. Optimal Color Design of Psychological Counseling Room by Design of Experiments and Response Surface Methodology

    PubMed Central

    Chen, Hua; Ye, Chenyu

    2014-01-01

    Color is one of the most powerful aspects of a psychological counseling environment. Little scientific research has been conducted on color design and much of the existing literature is based on observational studies. Using design of experiments and response surface methodology, this paper proposes an optimal color design approach for transforming patients’ perception into color elements. Six indices, pleasant-unpleasant, interesting-uninteresting, exciting-boring, relaxing-distressing, safe-fearful, and active-inactive, were used to assess patients’ impression. A total of 75 patients participated, including 42 for Experiment 1 and 33 for Experiment 2. 27 representative color samples were designed in Experiment 1, and the color sample (L = 75, a = 0, b = -60) was the most preferred one. In Experiment 2, this color sample was set as the ‘central point’, and three color attributes were optimized to maximize the patients’ satisfaction. The experimental results show that the proposed method can get the optimal solution for color design of a counseling room. PMID:24594683
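
    The response-surface step described above can be sketched generically: fit a second-order model of a satisfaction score in the three color attributes around the central point and locate the stationary point of the fitted surface. In the sketch below, the 27 coded combinations echo Experiment 1, but the "scores" are simulated from an assumed true optimum rather than taken from the study.

```python
# Response-surface sketch: fit a second-order model of a satisfaction score in
# the three color attributes (L, a, b) around the central point and locate the
# stationary point of the fitted surface. The 27 coded combinations echo
# Experiment 1, but the scores are simulated from an assumed true optimum.
import itertools
import numpy as np

rng = np.random.default_rng(0)
center = np.array([75.0, 0.0, -60.0])           # L, a, b of the central point
spread = np.array([10.0, 10.0, 15.0])           # assumed step sizes

coded = np.array(list(itertools.product([-1, 0, 1], repeat=3)), float)   # 27 samples
lab = center + coded * spread

def simulated_score(p):                         # hidden "patient preference"
    opt = np.array([78.0, -3.0, -55.0])
    return 9.0 - 0.002 * np.sum((p - opt) ** 2) + rng.normal(0.0, 0.1)

y = np.array([simulated_score(p) for p in lab])

def design_matrix(Z):                           # intercept, linear, quadratic, interactions
    cols = [np.ones(len(Z))] + [Z[:, i] for i in range(3)]
    cols += [Z[:, i] ** 2 for i in range(3)]
    cols += [Z[:, i] * Z[:, j] for i, j in itertools.combinations(range(3), 2)]
    return np.column_stack(cols)

beta, *_ = np.linalg.lstsq(design_matrix(coded), y, rcond=None)
b = beta[1:4]                                   # linear coefficients
B = np.diag(beta[4:7])                          # quadratic and interaction terms
B[0, 1] = B[1, 0] = beta[7] / 2
B[0, 2] = B[2, 0] = beta[8] / 2
B[1, 2] = B[2, 1] = beta[9] / 2
z_star = np.linalg.solve(-2 * B, b)             # stationary point: b + 2 B z = 0
print("predicted optimum (L, a, b):", np.round(center + z_star * spread, 1))
```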

  8. Regional climate: Design and analysis of computer experiments?

    NASA Astrophysics Data System (ADS)

    Nychka, D. W.

    2011-12-01

    As attention shifts from broad global summaries of climate change to more specific regional impacts, there is a need for the data sciences to quantify the uncertainty in regional predictions. This talk will provide an overview of regional climate experiments with an emphasis on the statistical problems of interpreting these large and complex simulations. A regional climate model is a computer code based on physics that simulates the detailed flow of the atmosphere in a particular region from the large-scale information of a global climate model. One intent is to compare simulations under the current climate to future scenarios to infer the nature of climate change expected at a location. There exists a mature sub-discipline in engineering and statistics on the design and analysis of computer experiments. This talk will sketch how general methods from this area may apply to the interpretation of climate model experiments and to what extent the problems of interpreting climate projections are unique and require new ideas.
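
    The design-and-analysis-of-computer-experiments workflow alluded to above typically pairs a space-filling design with a statistical emulator. The sketch below uses a Latin hypercube over two uncertain inputs of a made-up stand-in "simulator" and fits a cheap polynomial surrogate (a Gaussian-process emulator is the more usual choice); the input names, ranges, and simulator are all illustrative assumptions.

```python
# Design-and-analysis-of-computer-experiments workflow in miniature: choose
# simulator settings with a Latin hypercube, run the (here, made-up) simulator
# at those settings, and fit a cheap emulator for untried settings.
import numpy as np
from scipy.stats import qmc

def simulator(params):
    """Toy stand-in for a regional climate model run."""
    sensitivity, forcing = params
    return 1.5 + 0.8 * sensitivity * np.log1p(forcing) + 0.1 * sensitivity ** 2

sampler = qmc.LatinHypercube(d=2, seed=0)        # space-filling design
lo, hi = np.array([1.5, 0.5]), np.array([4.5, 8.0])
X = qmc.scale(sampler.random(n=20), lo, hi)
y = np.array([simulator(x) for x in X])

A = np.column_stack([np.ones(len(X)), X, X ** 2, X[:, :1] * X[:, 1:]])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)     # quadratic emulator

x_new = np.array([3.0, 4.0])
a_new = np.concatenate([[1.0], x_new, x_new ** 2, [x_new[0] * x_new[1]]])
print(f"emulator: {a_new @ coef:.3f}   simulator: {simulator(x_new):.3f}")
```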

  9. Recent developments in virtual experience design and production

    NASA Astrophysics Data System (ADS)

    Fisher, Scott S.

    1995-03-01

    Today, the media of VR and Telepresence are in their infancy and the emphasis is still on technology and engineering. But, it is not the hardware people might use that will determine whether VR becomes a powerful medium--instead, it will be the experiences that they are able to have that will drive its acceptance and impact. A critical challenge in the elaboration of these telepresence capabilities will be the development of environments that are as unpredictable and rich in interconnected processes as an actual location or experience. This paper will describe the recent development of several Virtual Experiences including: `Menagerie', an immersive Virtual Environment inhabited by virtual characters designed to respond to and interact with its users; and `The Virtual Brewery', an immersive public VR installation that provides multiple levels of interaction in an artistic interpretation of the brewing process.

  10. Optimal experiment design for model selection in biochemical networks

    PubMed Central

    2014-01-01

    Background Mathematical modeling is often used to formalize hypotheses on how a biochemical network operates by discriminating between competing models. Bayesian model selection offers a way to determine the amount of evidence that data provides to support one model over the other while favoring simple models. In practice, the amount of experimental data is often insufficient to make a clear distinction between competing models. Often one would like to perform a new experiment which would discriminate between competing hypotheses. Results We developed a novel method to perform Optimal Experiment Design to predict which experiments would most effectively allow model selection. A Bayesian approach is applied to infer model parameter distributions. These distributions are sampled and used to simulate from multivariate predictive densities. The method is based on a k-Nearest Neighbor estimate of the Jensen Shannon divergence between the multivariate predictive densities of competing models. Conclusions We show that the method successfully uses predictive differences to enable model selection by applying it to several test cases. Because the design criterion is based on predictive distributions, which can be computed for a wide range of model quantities, the approach is very flexible. The method reveals specific combinations of experiments which improve discriminability even in cases where data is scarce. The proposed approach can be used in conjunction with existing Bayesian methodologies where (approximate) posteriors have been determined, making use of relations that exist within the inferred posteriors. PMID:24555498
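
    The design criterion can be illustrated with two toy competing models: sample their parameters from assumed posteriors, simulate the predicted readout of each candidate experiment, and score the experiment by the Jensen-Shannon divergence between the two predictive distributions. The sketch below uses a kernel-density estimate rather than the paper's k-nearest-neighbor estimator, and both models and "posteriors" are invented.

```python
# Score candidate experiments by the Jensen-Shannon divergence (JSD) between
# the predictive distributions of two competing toy models. Parameters are
# drawn from assumed posteriors; a kernel-density estimate replaces the
# paper's k-nearest-neighbor estimator.
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(1)

def model_A(dose, k):  return dose / (k + dose)       # saturating readout
def model_B(dose, a):  return np.tanh(a * dose)       # alternative wiring

def predictive_samples(model, params, dose, noise=0.05):
    return model(dose, params) + rng.normal(0.0, noise, size=len(params))

def jsd(samples_p, samples_q):
    p, q = gaussian_kde(samples_p), gaussian_kde(samples_q)
    m = lambda x: 0.5 * (p(x) + q(x))
    kl_pm = np.mean(np.log(p(samples_p) / m(samples_p)))
    kl_qm = np.mean(np.log(q(samples_q) / m(samples_q)))
    return 0.5 * (kl_pm + kl_qm)

k_post = rng.normal(1.0, 0.2, 500)        # assumed posterior for model A
a_post = rng.normal(1.2, 0.2, 500)        # assumed posterior for model B

for dose in (0.1, 0.5, 2.0, 8.0):
    score = jsd(predictive_samples(model_A, k_post, dose),
                predictive_samples(model_B, a_post, dose))
    print(f"candidate dose = {dose:4.1f}  ->  JSD score {score:.3f}")
```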

  11. An industrial approach to design compelling VR and AR experience

    NASA Astrophysics Data System (ADS)

    Richir, Simon; Fuchs, Philippe; Lourdeaux, Domitile; Buche, Cédric; Querrec, Ronan

    2013-03-01

    The convergence of technologies currently observed in the fields of VR, AR, robotics and consumer electronics reinforces the trend of new applications appearing every day. But when transferring knowledge acquired from research to businesses, research laboratories are often at a loss because of a lack of knowledge of the design and integration processes involved in creating an industrial-scale product. In fact, the innovation approaches that take a good idea from the laboratory to a successful industrial product are often little known to researchers. The objective of this paper is to present the results of the work of several research teams that have finalized a working method for researchers and manufacturers that allows them to design virtual or augmented reality systems and enable their users to enjoy "a compelling VR experience". That approach, called "the I2I method", presents 11 phases, from "Establishing technological and competitive intelligence and industrial property" to "Improvements" through the "Definition of the Behavioral Interface, Virtual Environment and Behavioral Software Assistance". As a result of the experience gained by various research teams, this design approach benefits from contributions from current VR and AR research. Our objective is to validate and continuously move such multidisciplinary design team methods forward.

  12. HTGR nuclear heat source component design and experience

    SciTech Connect

    Peinado, C.O.; Wunderlich, R.G.; Simon, W.A.

    1982-05-01

    The high-temperature gas-cooled reactor (HTGR) nuclear heat source components have been under design and development since the mid-1950's. Two power plants have been designed, constructed, and operated: the Peach Bottom Atomic Power Station and the Fort St. Vrain Nuclear Generating Station. Recently, development has focused on the primary system components for a 2240-MW(t) steam cycle HTGR capable of generating about 900 MW(e) electric power or alternately producing high-grade steam and cogenerating electric power. These components include the steam generators, core auxiliary heat exchangers, primary and auxiliary circulators, reactor internals, and thermal barrier system. A discussion of the design and operating experience of these components is included.

  13. Report on the first VLHC photon stop cryogenic design experiment

    SciTech Connect

    Michael Geynisman et al.

    2003-09-15

    As part of Fermilab's study of a Very Large Hadron Collider, a water-cooled photon stop was proposed as a device to intercept the synchrotron radiation emitted by the high-energy proton beams in the high field superconducting magnets with minimal plug-cooling power. Photon stops are radiation absorbers operating at room temperature that protrude into the beam tube at the end of each bending magnet to scrape the synchrotron light emitted by the beam one magnet up-stream. Among the technological challenges regarding photon stops is their cryo-design. The photon stop is water-cooled and operates in a cryogenic environment. A careful cryo-design is therefore essential to enable operation at minimum heat transfer between the room temperature sections and the cryogenic parts. A photon stop cryo-design was developed and a prototype was built. This paper presents the results of the cryogenic experiments conducted on the first VLHC photon stop prototype.

  14. Thermal Design and Analysis for the Cryogenic MIDAS Experiment

    NASA Technical Reports Server (NTRS)

    Amundsen, Ruth McElroy

    1997-01-01

    The Materials In Devices As Superconductors (MIDAS) spaceflight experiment is a NASA payload which launched in September 1996 on the Shuttle, and was transferred to the Mir Space Station for several months of operation. MIDAS was developed and built at NASA Langley Research Center (LaRC). The primary objective of the experiment was to determine the effects of microgravity and spaceflight on the electrical properties of high-temperature superconductive (HTS) materials. The thermal challenge on MIDAS was to maintain the superconductive specimens at or below 80 K for the entire operation of the experiment, including all ground testing and 90 days of spaceflight operation. Cooling was provided by a small tactical cryocooler. The superconductive specimens and the coldfinger of the cryocooler were mounted in a vacuum chamber, with vacuum levels maintained by an ion pump. The entire experiment was mounted for operation in a stowage locker inside Mir, with the only heat dissipation capability provided by a cooling fan exhausting to the habitable compartment. The thermal environment on Mir can potentially vary over the range 5 to 40 C; this was the range used in testing, and this wide range adds to the difficulty in managing the power dissipated from the experiment's active components. Many issues in the thermal design are discussed, including: thermal isolation methods for the cryogenic samples; design for cooling to cryogenic temperatures; cryogenic epoxy bonds; management of ambient temperature components self-heating; and fan cooling of the enclosed locker. Results of the design are also considered, including the thermal gradients across the HTS samples and cryogenic thermal strap, electronics and thermal sensor cryogenic performance, and differences between ground and flight performance. Modeling was performed in both SINDA-85 and MSC/PATRAN (with direct geometry import from the CAD design tool Pro/Engineer). Advantages of both types of models are discussed

  15. Aircraft integrated design and analysis: A classroom experience

    NASA Technical Reports Server (NTRS)

    1988-01-01

    AAE 451 is the capstone course required of all senior undergraduates in the School of Aeronautics and Astronautics at Purdue University. During the past year the first steps of a long evolutionary process were taken to change the content and expectations of this course. These changes are the result of the availability of advanced computational capabilities and sophisticated electronic media availability at Purdue. This presentation will describe both the long range objectives and this year's experience using the High Speed Commercial Transport (HSCT) design, the AIAA Long Duration Aircraft design and a Remotely Piloted Vehicle (RPV) design proposal as project objectives. The central goal of these efforts was to provide a user-friendly, computer-software-based, environment to supplement traditional design course methodology. The Purdue University Computer Center (PUCC), the Engineering Computer Network (ECN), and stand-alone PC's were used for this development. This year's accomplishments centered primarily on aerodynamics software obtained from the NASA Langley Research Center and its integration into the classroom. Word processor capability for oral and written work and computer graphics were also blended into the course. A total of 10 HSCT designs were generated, ranging from twin-fuselage and forward-swept wing aircraft, to the more traditional delta and double-delta wing aircraft. Four Long Duration Aircraft designs were submitted, together with one RPV design tailored for photographic surveillance. Supporting these activities were three video satellite lectures beamed from NASA/Langley to Purdue. These lectures covered diverse areas such as an overview of HSCT design, supersonic-aircraft stability and control, and optimization of aircraft performance. Plans for next year's effort will be reviewed, including dedicated computer workstation utilization, remote satellite lectures, and university/industrial cooperative efforts.

  16. Creating meaningful learning experiences: Understanding students' perspectives of engineering design

    NASA Astrophysics Data System (ADS)

    Aleong, Richard James Chung Mun

    , relevance, and transfer. With this framework of student learning, engineering educators can enhance learning experiences by engaging all three levels of students' understanding. The curriculum studies orientation applied the three holistic elements of curriculum---subject matter, society, and the individual---to conceptualize design considerations for engineering curriculum and teaching practice. This research supports the characterization of students' learning experiences to help educators and students optimize their teaching and learning of design education.

  17. Designing Experiments to Discriminate Families of Logic Models

    PubMed Central

    Videla, Santiago; Konokotina, Irina; Alexopoulos, Leonidas G.; Saez-Rodriguez, Julio; Schaub, Torsten; Siegel, Anne; Guziolowski, Carito

    2015-01-01

    Logic models of signaling pathways are a promising way of building effective in silico functional models of a cell, in particular of signaling pathways. The automated learning of Boolean logic models describing signaling pathways can be achieved by training to phosphoproteomics data, which is particularly useful if it is measured upon different combinations of perturbations in a high-throughput fashion. However, in practice, the number and type of allowed perturbations are not exhaustive. Moreover, experimental data are unavoidably subjected to noise. As a result, the learning process results in a family of feasible logical networks rather than in a single model. This family is composed of logic models implementing different internal wirings for the system and therefore the predictions of experiments from this family may present a significant level of variability, and hence uncertainty. In this paper, we introduce a method based on Answer Set Programming to propose an optimal experimental design that aims to narrow down the variability (in terms of input–output behaviors) within families of logical models learned from experimental data. We study how the fitness with respect to the data can be improved after an optimal selection of signaling perturbations and how we learn optimal logic models with a minimal number of experiments. The methods are applied on signaling pathways in human liver cells and phosphoproteomics experimental data. Using 25% of the experiments, we obtained logical models with fitness scores (mean square error) within 15% of those obtained using all experiments, illustrating the impact that our approach can have on the design of experiments for efficient model calibration. PMID:26389116
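
    The underlying idea, independent of the Answer Set Programming machinery, can be shown on a miniature example: enumerate the Boolean wirings consistent with the observations in hand, then pick the untried perturbation whose predicted output disagrees most across that family. The candidate wirings and data in the sketch below are invented.

```python
# Miniature version of the idea: enumerate Boolean wirings consistent with the
# observations in hand, then choose the untried perturbation whose predicted
# output disagrees most across that family.
import itertools

# Candidate wirings for how output O depends on stimuli S1, S2 and inhibitor I.
candidates = {
    "AND":     lambda s1, s2, i: (s1 and s2) and not i,
    "OR":      lambda s1, s2, i: (s1 or s2) and not i,
    "S1-only": lambda s1, s2, i: s1 and not i,
    "S2-only": lambda s1, s2, i: s2 and not i,
}

# Observations already in hand: (S1, S2, I) -> O
data = {(1, 1, 0): 1, (0, 0, 0): 0, (1, 1, 1): 0}

family = {name: f for name, f in candidates.items()
          if all(f(*x) == y for x, y in data.items())}
print("models consistent with the data:", sorted(family))

untried = [x for x in itertools.product([0, 1], repeat=3) if x not in data]

def disagreement(x):
    preds = [bool(f(*x)) for f in family.values()]
    return min(sum(preds), len(preds) - sum(preds))   # 0 means every model agrees

best = max(untried, key=disagreement)
print("most discriminating next perturbation (S1, S2, I):", best)
```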

  18. Designing Experiments to Discriminate Families of Logic Models.

    PubMed

    Videla, Santiago; Konokotina, Irina; Alexopoulos, Leonidas G; Saez-Rodriguez, Julio; Schaub, Torsten; Siegel, Anne; Guziolowski, Carito

    2015-01-01

    Logic models of signaling pathways are a promising way of building effective in silico functional models of a cell, in particular of signaling pathways. The automated learning of Boolean logic models describing signaling pathways can be achieved by training to phosphoproteomics data, which is particularly useful if it is measured upon different combinations of perturbations in a high-throughput fashion. However, in practice, the number and type of allowed perturbations are not exhaustive. Moreover, experimental data are unavoidably subjected to noise. As a result, the learning process results in a family of feasible logical networks rather than in a single model. This family is composed of logic models implementing different internal wirings for the system and therefore the predictions of experiments from this family may present a significant level of variability, and hence uncertainty. In this paper, we introduce a method based on Answer Set Programming to propose an optimal experimental design that aims to narrow down the variability (in terms of input-output behaviors) within families of logical models learned from experimental data. We study how the fitness with respect to the data can be improved after an optimal selection of signaling perturbations and how we learn optimal logic models with a minimal number of experiments. The methods are applied on signaling pathways in human liver cells and phosphoproteomics experimental data. Using 25% of the experiments, we obtained logical models with fitness scores (mean square error) within 15% of those obtained using all experiments, illustrating the impact that our approach can have on the design of experiments for efficient model calibration.

  19. Tuning Parameters in Heuristics by Using Design of Experiments Methods

    NASA Technical Reports Server (NTRS)

    Arin, Arif; Rabadi, Ghaith; Unal, Resit

    2010-01-01

    With the growing complexity of today's large scale problems, it has become more difficult to find optimal solutions by using exact mathematical methods. The need to find near-optimal solutions in an acceptable time frame requires heuristic approaches. In many cases, however, most heuristics have several parameters that need to be "tuned" before they can reach good results. The problem then turns into "finding the best parameter setting" for the heuristics to solve the problems efficiently and timely. The One-Factor-At-a-Time (OFAT) approach to parameter tuning neglects the interactions between parameters. Design of Experiments (DOE) tools can instead be employed to tune the parameters more effectively. In this paper, we seek the best parameter setting for a Genetic Algorithm (GA) to solve the single machine total weighted tardiness problem, in which n jobs must be scheduled on a single machine without preemption and the objective is to minimize the total weighted tardiness. Benchmark instances for the problem are available in the literature. To fine-tune the GA parameters in the most efficient way, we compare multiple DOE models including 2-level (2^k) full factorial design, orthogonal array design, central composite design, D-optimal design and signal-to-noise (S/N) ratios. In each DOE method, a mathematical model is created using regression analysis, and solved to obtain the best parameter setting. After verification runs using the tuned parameter setting, the preliminary results for optimal solutions of multiple instances were found efficiently.
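
    The DOE-based tuning loop can be sketched for the simplest of the designs compared above, a 2-level full factorial over three GA parameters: run the heuristic at each corner, fit a regression with interactions, and read off the effects and the best setting. The run_ga() stub below stands in for solving a real weighted-tardiness instance, and its response surface is invented.

```python
# DOE-based tuning loop for the simplest design in the comparison above: a
# 2-level full factorial over three GA parameters, a regression with two-factor
# interactions, and the best setting read from the runs. run_ga() is an
# invented stand-in for solving a real weighted-tardiness instance.
import itertools
import numpy as np

levels = {"pop_size": (50, 200), "crossover": (0.6, 0.9), "mutation": (0.01, 0.10)}
names = list(levels)

rng = np.random.default_rng(5)
def run_ga(pop_size, crossover, mutation):
    """Hypothetical mean total weighted tardiness returned by the tuned GA."""
    return (1000 - 1.2 * pop_size - 300 * crossover
            + 2000 * (mutation - 0.04) ** 2 + rng.normal(0.0, 10.0))

coded = np.array(list(itertools.product([-1, 1], repeat=3)), float)   # 2^3 runs
settings = [{n: levels[n][(int(c) + 1) // 2] for n, c in zip(names, row)}
            for row in coded]
y = np.array([run_ga(**s) for s in settings])

X = np.column_stack([np.ones(8), coded,
                     coded[:, 0] * coded[:, 1],
                     coded[:, 0] * coded[:, 2],
                     coded[:, 1] * coded[:, 2]])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
for name, effect in zip(names, 2 * beta[1:4]):       # effect = 2 x coded coefficient
    print(f"effect of {name:9s}: {effect:+.1f}")

print("best setting among the factorial runs:", settings[int(np.argmin(y))])
```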

  20. Engineering design of the National Spherical Torus Experiment

    SciTech Connect

    C. Neumeyer; P. Heitzenroeder; J. Spitzer, J. Chrzanowski; et al

    2000-05-11

    NSTX is a proof-of-principle experiment aimed at exploring the physics of the "spherical torus" (ST) configuration, which is predicted to exhibit more efficient magnetic confinement than conventional large-aspect-ratio tokamaks, amongst other advantages. The low aspect ratio (R/a, typically 1.2-2 in ST designs compared to 4-5 in conventional tokamaks) decreases the available cross-sectional area through the center of the torus for toroidal and poloidal field coil conductors, the vacuum vessel wall, plasma facing components, etc., thus increasing the need to deploy all components within the so-called "center stack" in the most efficient manner possible. Several unique design features have been developed for the NSTX center stack, and careful engineering of this region of the machine, utilizing materials up to their engineering allowables, has been key to meeting the desired objectives. The design and construction of the machine have been accomplished in a rapid and cost-effective manner thanks to the availability of extensive facilities, a strong experience base from the TFTR era, and good cooperation between institutions.

  1. Modeling and design for a new ionospheric modification experiment

    NASA Astrophysics Data System (ADS)

    Sales, Gary S.; Platt, Ian G.; Haines, D. Mark; Huang, Yuming; Heckscher, John L.

    1990-10-01

    Plans are now underway to carry out new high-frequency oblique ionospheric modification experiments with increased radiated power using a new high-gain antenna system and a 1 MW transmitter. The output of this large transmitting system will approach 90 dBW. An important part of this program is to determine the existence of a threshold for nonlinear effects by varying the transmitter output. For these experiments, a high-frequency probe system, a low-power oblique sounder, is introduced to be used along the same propagation path as the high-power disturbing transmitter. The concept was first used by Soviet researchers to ensure that this diagnostic signal always passes through the modified region of the ionosphere. The HF probe system will use a low-power (150 W) CW signal shifted by approximately 40 kHz from the frequency used by the high-power system. The transmitter for the probe system will be at the same location as the high-power transmitter, with multiple antennas used to measure the vertical and azimuthal angle of arrival as well as the Doppler frequency shift of the arriving probe signal. The three-antenna array will be in an 'L' configuration to measure the phase differences between the antennas. At the midpath point a vertical sounder will provide the ionospheric information necessary for the frequency management of the experiment. Real-time processing will permit the site operators to evaluate the performance of the system and make adjustments during the experiment. A special ray-tracing computer will be used to provide real-time frequencies and elevation beam steering during the experiment. A description of the system and the analysis used in the design of the experiment are presented.

  2. Propagation of Computational Uncertainty Using the Modern Design of Experiments

    NASA Technical Reports Server (NTRS)

    DeLoach, Richard

    2007-01-01

    This paper describes the use of formally designed experiments to aid in the error analysis of a computational experiment. A method is described by which the underlying code is approximated with relatively low-order polynomial graduating functions represented by truncated Taylor series approximations to the true underlying response function. A resource-minimal approach is outlined by which such graduating functions can be estimated from a minimum number of case runs of the underlying computational code. Certain practical considerations are discussed, including ways and means of coping with high-order response functions. The distributional properties of prediction residuals are presented and discussed. A practical method is presented for quantifying that component of the prediction uncertainty of a computational code that can be attributed to imperfect knowledge of independent variable levels. This method is illustrated with a recent assessment of uncertainty in computational estimates of Space Shuttle thermal and structural reentry loads attributable to ice and foam debris impact on ascent.
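
    A minimal sketch of the central idea, fitting a low-order polynomial graduating function to a small number of case runs and then propagating input uncertainty through it, is shown below; the quadratic form, the stand-in "code", and all numerical values are assumptions for illustration rather than the paper's actual response model:

      # Sketch: approximate an expensive computational code with a quadratic
      # graduating function, then propagate uncertainty in the independent
      # variable through the cheap surrogate (all values illustrative).
      import numpy as np

      def expensive_code(x):
          # Stand-in for the true underlying computational code.
          return np.exp(0.3 * x) + 0.1 * x**2

      # A small, designed set of case runs of the code.
      x_runs = np.array([-2.0, -1.0, 0.0, 1.0, 2.0])
      y_runs = expensive_code(x_runs)

      # Least-squares fit of a truncated (second-order) polynomial.
      surrogate = np.poly1d(np.polyfit(x_runs, y_runs, deg=2))

      # Imperfect knowledge of the input, propagated through the surrogate.
      rng = np.random.default_rng(0)
      x_samples = rng.normal(loc=0.5, scale=0.2, size=10_000)
      y_samples = surrogate(x_samples)
      print(f"surrogate-predicted mean = {y_samples.mean():.3f}, "
            f"std due to input uncertainty = {y_samples.std():.3f}")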

  3. Vanguard/PLACE experiment system design and test plan

    NASA Technical Reports Server (NTRS)

    Taylor, R. E.

    1973-01-01

    A system design and test plan are described for operational evaluation of the NASA-Goddard position location and aircraft communications equipment (PLACE), at C band (4/6 GHz), using NASA's ship, the USNS Vanguard, and the ATS 3 and ATS 5 synchronous satellites. The Sea Test phase, extending from March 29, 1973 to April 15, 1973, was successfully completed; the principal objectives of the experiment were achieved. Typical PLACE-computed position-location data are shown for the Vanguard. Position-location and voice-quality measurements were excellent; ship position was determined to within 2 nmi, and high-quality, 2-way voice transmissions resulted, as determined from audience participation, intelligibility, and articulation-index analysis. A C band/L band satellite trilateration experiment is discussed.

  4. Design challenges and safety concept for the AVANTI experiment

    NASA Astrophysics Data System (ADS)

    Gaias, G.; Ardaens, J.-S.

    2016-06-01

    AVANTI is a formation-flight experiment involving two noncooperative satellites. After a brief overview of the challenges induced by the experiment design and scenario, this paper presents the safety concept retained to guarantee the safety of the formation. The peculiarity of the proposed approach is that it does not rely on the continuous availability of tracking data of the client spacecraft but rather exploits the concept of passive safety of special relative trajectories. To this end, the formation safety criterion based on the minimum distance normal to the flight direction has been extended so that it is also applicable to drifting relative orbits, which result from a non-vanishing relative semi-major axis encountered during a rendezvous or produced by the action of differential aerodynamic drag.

  5. Design and status of the Mu2e experiment

    NASA Astrophysics Data System (ADS)

    Miscetti, Stefano

    2016-04-01

    The Mu2e experiment aims to measure the charged-lepton flavor violating, neutrino-less conversion of a negative muon into an electron in the field of an aluminum nucleus. The objective is to improve on the previous measurement by four orders of magnitude using a similar technique. For the conversion process, the signal will appear as a mono-energetic electron with energy very close to the muon rest mass. In the Standard Model these processes have negligible rates; however, in many Beyond the Standard Model scenarios their rates are within the reach of next-generation experiments. In this paper, we explain the sensitivity to the new-physics scale and the complementarity in approach and reach provided by Mu2e with respect to Mu3e and the MEG upgrade. The Mu2e experimental technique, design, and status are the focus of this paper.

  6. CELSS experiment model and design concept of gas recycle system

    NASA Technical Reports Server (NTRS)

    Nitta, K.; Oguchi, M.; Kanda, S.

    1986-01-01

    In order to prolong the duration of manned missions around the Earth and to expand the region of human presence from the Earth to other destinations such as a lunar base or a manned Mars flight mission, the controlled ecological life support system (CELSS) becomes an essential element of the future technology to be developed through utilization of the space station. Preliminary system engineering and integration efforts regarding CELSS have been carried out by the Japanese CELSS concept study group to clarify the feasibility of hardware development for space station experiments and to establish the time-phased mission sets after FY 1992. The results of these studies are briefly summarized, and the design and utilization methods of a Gas Recycle System for CELSS experiments are discussed.

  7. Active array design for FAME: Freeform Active Mirror Experiment

    NASA Astrophysics Data System (ADS)

    Jaskó, Attila; Aitink-Kroes, Gabby; Agócs, Tibor; Venema, Lars; Hugot, Emmanuel; Schnetler, Hermine; Bányai, Evelin

    2014-07-01

    In this paper a status report is given on the development of the FAME (Freeform Active Mirror Experiment) active array. Further information regarding this project can be found in the paper by Venema et al. (this conference). Freeform optics provide the opportunity to drastically reduce the complexity of future optical instruments. In order to produce these non-axisymmetric freeform optics with up to 1 mm deviation from the best-fit sphere, it is necessary to develop new design and manufacturing methods. The way we would like to create novel freeform optics is by fine-tuning a preformed, high surface-quality thin mirror using an array that is actively controlled by actuators. In the following we introduce the tools deployed to create and assess the individual designs. The result is an active array having an optimal number and layout of actuators.

  8. Spacecraft and mission design for the SP-100 flight experiment

    NASA Technical Reports Server (NTRS)

    Deininger, William D.; Vondra, Robert J.

    1988-01-01

    The design and performance of a spacecraft employing arcjet nuclear electric propulsion, suitable for use in the SP-100 Space Reactor Power System (SRPS) Flight Experiment, are outlined. The vehicle design is based on a 93 kW(e) ammonia arcjet system operating at an experimentally measured specific impulse of 1031 s and an efficiency of 42.3 percent. The arcjet/gimbal assemblies, power conditioning subsystem, propellant feed system, propulsion system thermal control, spacecraft diagnostic instrumentation, and the telemetry requirements are described. A 100 kW(e) SRPS is assumed. The spacecraft mass is baselined at 5675 kg excluding the propellant and propellant feed system. Four mission scenarios are described which are capable of demonstrating the full capability of the SRPS. The missions considered include spacecraft deployment to possible surveillance platform orbits, a spacecraft storage mission, and an orbit raising round trip corresponding to possible orbit transfer vehicle (OTV) missions.

  9. Scaling and design of landslide and debris-flow experiments

    USGS Publications Warehouse

    Iverson, Richard M.

    2015-01-01

    Scaling plays a crucial role in designing experiments aimed at understanding the behavior of landslides, debris flows, and other geomorphic phenomena involving grain-fluid mixtures. Scaling can be addressed by using dimensional analysis or – more rigorously – by normalizing differential equations that describe the evolving dynamics of the system. Both of these approaches show that, relative to full-scale natural events, miniaturized landslides and debris flows exhibit disproportionately large effects of viscous shear resistance and cohesion as well as disproportionately small effects of excess pore-fluid pressure that is generated by debris dilation or contraction. This behavioral divergence grows in proportion to H^3, where H is the thickness of a moving mass. Therefore, to maximize geomorphological relevance, experiments with wet landslides and debris flows must be conducted at the largest feasible scales. Another important consideration is that, unlike stream flows, landslides and debris flows accelerate from statically balanced initial states. Thus, no characteristic macroscopic velocity exists to guide experiment scaling and design. On the other hand, macroscopic gravity-driven motion of landslides and debris flows evolves over a characteristic time scale (L/g)^(1/2), where g is the magnitude of gravitational acceleration and L is the characteristic length of the moving mass. Grain-scale stress generation within the mass occurs on a shorter time scale, H/(gL)^(1/2), which is inversely proportional to the depth-averaged material shear rate. A separation of these two time scales exists if the criterion H/L << 1 is satisfied, as is commonly the case. This time scale separation indicates that steady-state experiments can be used to study some details of landslide and debris-flow behavior but cannot be used to study macroscopic landslide or debris-flow dynamics.
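
    To make the two time scales concrete, the short calculation below evaluates (L/g)^(1/2) and H/(gL)^(1/2) for illustrative dimensions of a large experiment; the values of H and L are hypothetical and not taken from the paper:

      # Illustrative evaluation of the two time scales discussed above
      # (hypothetical, roughly flume-scale dimensions).
      g = 9.81    # gravitational acceleration, m/s^2
      L = 100.0   # characteristic length of the moving mass, m (hypothetical)
      H = 1.0     # characteristic thickness of the moving mass, m (hypothetical)

      t_macro = (L / g) ** 0.5        # macroscopic gravity-driven time scale
      t_grain = H / (g * L) ** 0.5    # grain-scale stress-generation time scale

      print(f"macroscopic time scale: {t_macro:.2f} s")   # ~3.2 s
      print(f"grain-scale time scale: {t_grain:.3f} s")   # ~0.03 s
      print(f"scale separation criterion H/L = {H / L:.2f} (<< 1)")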

  10. The high temperature superconductivity space experiment (HTSSE-II) design

    SciTech Connect

    Kawecki, T.G.; Golba, G.A.; Price, G.E.; Rose, V.S.; Meyers, W.J.

    1996-07-01

    The high temperature superconductivity space experiment (HTSSE) program, initiated by the Naval Research Laboratory (NRL) in 1988, is described. The HTSSE program focuses high temperature superconductor (HTS) technology applications on space systems. The program phases, goals, and objectives are discussed. The devices developed for the HTSSE-II phase of the program and their suppliers are enumerated. Eight space-qualified components were integrated as a cryogenic experimental payload on DOD's ARGOS spacecraft. The payload was designed and built using a unique NRL/industry partnership and was integrated and space-qualified at NRL.

  11. Statistical design of experiments as a tool in mass spectrometry.

    PubMed

    Riter, Leah S; Vitek, Olga; Gooding, Karen M; Hodge, Barry D; Julian, Randall K

    2005-05-01

    This Tutorial is an introduction to statistical design of experiments (DOE) with focus on demonstration of how DOE can be useful to the mass spectrometrist. In contrast with the commonly used one factor at a time approach, DOE methods address the issue of interaction of variables and are generally more efficient. The complex problem of optimizing data-dependent acquisition parameters in a bottom-up proteomics LC-MS/MS analysis is used as an example of the power of the technique. Using DOE, a new data-dependent method was developed that improved the quantity of confidently identified peptides from rat serum.

  12. The design and analysis of transposon insertion sequencing experiments.

    PubMed

    Chao, Michael C; Abel, Sören; Davis, Brigid M; Waldor, Matthew K

    2016-02-01

    Transposon insertion sequencing (TIS) is a powerful approach that can be extensively applied to the genome-wide definition of loci that are required for bacterial growth under diverse conditions. However, experimental design choices and stochastic biological processes can heavily influence the results of TIS experiments and affect downstream statistical analysis. In this Opinion article, we discuss TIS experimental parameters and how these factors relate to the benefits and limitations of the various statistical frameworks that can be applied to the computational analysis of TIS data.

  13. On Becoming a Civic-Minded Instructional Designer: An Ethnographic Study of an Instructional Design Experience

    ERIC Educational Resources Information Center

    Yusop, Farrah Dina; Correia, Ana-Paula

    2014-01-01

    This ethnographic study took place in a graduate course at a large research university in the Midwestern United States. It presents an in-depth examination of the experiences and challenges of a group of four students learning to be Instructional Design and Technology professionals who are concerned with the well-being of all members of a society,…

  14. Designing Statistical Language Learners: Experiments on Noun Compounds

    NASA Astrophysics Data System (ADS)

    Lauer, Mark

    1996-09-01

    The goal of this thesis is to advance the exploration of the statistical language learning design space. In pursuit of that goal, the thesis makes two main theoretical contributions: (i) it identifies a new class of designs by specifying an architecture for natural language analysis in which probabilities are given to semantic forms rather than to more superficial linguistic elements; and (ii) it explores the development of a mathematical theory to predict the expected accuracy of statistical language learning systems in terms of the volume of data used to train them. The theoretical work is illustrated by applying statistical language learning designs to the analysis of noun compounds. Both syntactic and semantic analysis of noun compounds are attempted using the proposed architecture. Empirical comparisons demonstrate that the proposed syntactic model is significantly better than those previously suggested, approaching the performance of human judges on the same task, and that the proposed semantic model, the first statistical approach to this problem, exhibits significantly better accuracy than the baseline strategy. These results suggest that the new class of designs identified is a promising one. The experiments also serve to highlight the need for a widely applicable theory of data requirements.

  15. Frac-and-pack stimulation: Application, design, and field experience

    SciTech Connect

    Roodhart, L.P.; Fokker, P.A.; Davies, D.R.; Shlyapobersky, J.; Wong, G.K.

    1994-03-01

    This paper discusses the criteria for selecting wells to be frac-and-packed. The authors show how systematic study of the inflow performance can be used to assess the potential of frac-and-packed wells, to identify the controlling factors, and to optimize design parameters. They also show that fracture conductivity is often the key to successful treatment. This conductivity depends largely on proppant size; formation permeability damage around the created fracture has less effect. Appropriate allowance needs to be made for flow restrictions caused by the presence of the perforations, partial penetration, and non-Darcy effects. They describe the application of the overpressure-calibrated hydraulic fracture model in frac-and-pack treatment design, and discuss some operational considerations with reference to field examples. The full potential of this promising new completion method can be achieved only if the design is tailored to the individual well. This demands high-quality input data, which can be obtained only from a calibration test. This paper presents their strategy for frac-and-pack design, drawing on examples from field experience. They also point out several areas that the industry needs to address, such as the sizing of proppant in soft formations and the interaction between fracturing fluids and resin in resin-coated proppant.

  16. Optimal experiment design for time-lapse traveltime tomography

    SciTech Connect

    Ajo-Franklin, J.B.

    2009-10-01

    Geophysical monitoring techniques offer the only noninvasive approach capable of assessing both the spatial and temporal dynamics of subsurface fluid processes. Increasingly, permanent sensor arrays in boreholes and on the ocean floor are being deployed to improve the repeatability and increase the temporal sampling of monitoring surveys. Because permanent arrays require a large up-front capital investment and are difficult (or impossible) to re-configure once installed, a premium is placed on selecting a geometry capable of imaging the desired target at minimum cost. We present a simple approach to optimizing downhole sensor configurations for monitoring experiments making use of differential seismic traveltimes. In our case, we use a design quality metric based on the accuracy of tomographic reconstructions for a suite of imaging targets. By not requiring an explicit singular value decomposition of the forward operator, evaluation of this objective function scales to problems with a large number of unknowns. We also restrict the design problem by recasting the array geometry into a low-dimensional form more suitable for optimization at a reasonable computational cost. We test two search algorithms on the design problem: the Nelder-Mead downhill simplex method and the Multilevel Coordinate Search algorithm. The algorithms are tested on four crosswell acquisition scenarios relevant to continuous seismic monitoring: a two-parameter array optimization, several scenarios involving four-parameter length/offset optimizations, and a comparison of optimal multi-source designs. In the last case, we also examine trade-offs between source sparsity and the quality of tomographic reconstructions. One general observation is that asymmetric array lengths improve localized image quality in crosswell experiments with a small number of sources and a large number of receivers. Preliminary results also suggest that high-quality differential images can be generated using only a small number of sources.
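
    The optimization loop described here, a low-dimensional parameterization of the array geometry searched with the Nelder-Mead simplex method, can be sketched as follows; the two-parameter geometry, the toy image-quality objective, and all numbers are assumptions rather than the study's actual design metric:

      # Sketch: optimize a two-parameter downhole array geometry with the
      # Nelder-Mead simplex method (toy objective, hypothetical parameterization).
      import numpy as np
      from scipy.optimize import minimize

      def design_quality(params):
          """Toy stand-in for the tomographic reconstruction error of a target
          imaged by a receiver string of given (length, vertical offset)."""
          array_length, offset = params
          illumination_penalty = 50.0 / max(array_length, 1.0)   # short arrays image poorly
          installation_cost = 0.1 * array_length                 # long arrays cost more
          mispositioning = 0.05 * (offset - 20.0) ** 2           # target assumed near 20 m depth
          return illumination_penalty + installation_cost + mispositioning

      result = minimize(design_quality, x0=np.array([30.0, 0.0]), method="Nelder-Mead")
      print("optimized (length, offset):", result.x, "objective:", round(result.fun, 3))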

  17. Accelerating Vaccine Formulation Development Using Design of Experiment Stability Studies.

    PubMed

    Ahl, Patrick L; Mensch, Christopher; Hu, Binghua; Pixley, Heidi; Zhang, Lan; Dieter, Lance; Russell, Ryann; Smith, William J; Przysiecki, Craig; Kosinski, Mike; Blue, Jeffrey T

    2016-10-01

    Vaccine drug product thermal stability often depends on formulation input factors and how they interact. Scientific understanding and professional experience typically allow vaccine formulators to accurately predict the thermal stability output based on formulation input factors such as pH, ionic strength, and excipients. Thermal stability predictions, however, are not enough for regulators: stability claims must be supported by experimental data. The Quality by Design approach of Design of Experiment (DoE) is well suited to describing formulation outputs such as thermal stability in terms of formulation input factors. A DoE approach, particularly at elevated temperatures that induce accelerated degradation, can provide empirical understanding of how vaccine formulation input factors and their interactions affect vaccine stability output performance. This is possible even when clear scientific understanding of particular formulation stability mechanisms is lacking. A DoE approach was used in an accelerated 37°C stability study of an aluminum adjuvant Neisseria meningitidis serogroup B vaccine. Formulation stability differences were identified after only 15 days of the study. We believe this study demonstrates the power of combining DoE methodology with accelerated stress stability studies to accelerate and improve vaccine formulation development programs, particularly during the preformulation stage. PMID:27522919

  18. Control system design for spacecraft formation flying: Theory and experiment

    NASA Astrophysics Data System (ADS)

    Robertson, Andrew Dunbar

    Spacecraft formation flying is an enabling technology for many future space science missions, such as separated spacecraft interferometers (SSI). However the sensing, control and coordination of such instruments pose many new design challenges. SSI missions will require precise relative sensing and control, fuel-efficient, fuel-balanced operation to maximize mission life and group-level autonomy to reduce operations costs. Enabling these new formation flying capabilities requires precise relative sensing and estimation, enhanced control capabilities such as cooperative control (multiple independent spacecraft acting together), group-level formation management and informed design of a system architecture to manage distributed sensing and control-system resources. This research defines an end-to-end control system, including the key elements unique to the formation flying problem: cooperative control, relative sensing, coordination, and the control-system architecture. A new control-system design optimizes performance under typical spacecraft constraints (e.g., on-off actuators, finite fuel, limited computation power, limited contact with ground control, etc.). Standard control techniques have been extended, and new ones synthesized to meet these goals. In designing this control system, several contributions have been made to the field of spacecraft formation flying control including: an analytic two-vehicle fuel-time-optimal cooperative control algorithm, a fast numeric multi-vehicle, optimal cooperative control algorithm that can be used as a feedforward or a feedback controller, a fleet-level coordinator for autonomous fuel balancing, validation of GPS-based relative sensing for formation flying, and trade studies of the relative control and relative-estimation-architecture design problems. These research contributions are mapped to possible applications for three spacecraft formation flying missions currently in development. The lessons learned from this research

  19. Design of MagLIF experiments using the Z facility

    NASA Astrophysics Data System (ADS)

    Sefkow, Adam

    2013-10-01

    The MagLIF (Magnetized Liner Inertial Fusion) concept has been presented as a path toward obtaining substantial fusion yields using the Z facility, and related experiments have begun in earnest at Sandia National Laboratories. We present fully integrated numerical magnetohydrodynamic simulations of the MagLIF concept, which include laser preheating of the fuel, the presence of electrodes, and end-loss effects. These simulations have been used to design neutron-producing integrated MagLIF experiments on the Z facility for the capabilities that presently exist, namely, D2 fuel, peak currents of I_max = 15-18 MA, pre-seeded axial magnetic fields of B_z0 = 7-10 T, and laser preheat energies of E_laser = 2-3 kJ delivered in 2 ns. The first fully integrated experiments, based on these simulations, are planned to occur in 2013. Neutron yields in excess of 10^11 are predicted with the available laser preheat energy and accelerator drive energy. In several years, we plan to upgrade the laser to increase E_laser by several more kJ, provide B_z0 up to 30 T, deliver I_max = 22 MA or more to the load, and develop the capability to use DT fuel. Sandia is a multiprogram laboratory operated by Sandia Corporation, a Lockheed Martin Company, for the United States Department of Energy's National Nuclear Security Administration under contract DE-AC04-94AL85000.

  20. Parameter Screening and Optimisation for ILP Using Designed Experiments

    NASA Astrophysics Data System (ADS)

    Srinivasan, Ashwin; Ramakrishnan, Ganesh

    Reports of experiments conducted with an Inductive Logic Programming system rarely describe how specific values of parameters of the system are arrived at when constructing models. Usually, no attempt is made to identify sensitive parameters, and those that are used are often given "factory-supplied" default values, or values obtained from some non-systematic exploratory analysis. The immediate consequence of this is, of course, that it is not clear if better models could have been obtained if some form of parameter selection and optimisation had been performed. Questions follow inevitably on the experiments themselves: specifically, are all algorithms being treated fairly, and is the exploratory phase sufficiently well-defined to allow the experiments to be replicated? In this paper, we investigate the use of parameter selection and optimisation techniques grouped under the study of experimental design. Screening and "response surface" methods determine, in turn, sensitive parameters and good values for these parameters. This combined use of parameter selection and response surface-driven optimisation has a long history of application in industrial engineering, and its role in ILP is investigated using two well-known benchmarks. The results suggest that computational overheads from this preliminary phase are not substantial, and that much can be gained, both on improving system performance and on enabling controlled experimentation, by adopting well-established procedures such as the ones proposed here.
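
    The screening step described above, estimating main effects from a two-level design so that only the sensitive parameters are carried forward to response-surface optimisation, can be sketched generically as follows; the factor names, coded design matrix, and response values are placeholders, not measurements from an actual ILP system:

      # Sketch of a two-level screening analysis: estimate main effects from a
      # 2^3 full factorial design (placeholder ILP parameters and responses).
      import numpy as np
      from itertools import product

      factors = ["clause_length", "min_accuracy", "search_nodes"]   # hypothetical parameters
      design = np.array(list(product([-1, 1], repeat=3)))           # coded -1/+1 levels, 8 runs

      # Placeholder predictive accuracies observed at the 8 runs.
      response = np.array([0.61, 0.64, 0.70, 0.74, 0.62, 0.66, 0.71, 0.76])

      # Main effect = mean response at the high level minus mean at the low level.
      for name, column in zip(factors, design.T):
          effect = response[column == 1].mean() - response[column == -1].mean()
          print(f"main effect of {name}: {effect:+.3f}")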

  1. Target Station Design for the Mu2e Experiment

    SciTech Connect

    Pronskikh, Vitaly; Ambrosio, Giorgio; Campbell, Michael; Coleman, Richard; Ginther, George; Kashikhin, Vadim; Krempetz, Kurt; Lamm, Michael; Lee, Ang; Leveling, Anthony; Mokhov, Nikolai; Nagaslaev, Vladimir; Stefanik, Andrew; Striganov, Sergei; Werkema, Steven; Bartoszek, Larry; Densham, Chris; Loveridge, Peter; Lynch, Kevin; Popp, James

    2014-07-01

    The Mu2e experiment at Fermilab is devoted to the search for the conversion of a negative muon into an electron in the field of a nucleus without emission of neutrinos. One of the main parts of the Mu2e experimental setup is its Target Station, in which negative pions are generated in interactions of the 8-GeV primary proton beam with a tungsten target. A large-aperture 5-T superconducting production solenoid (PS) enhances pion collection, and an S-shaped transport solenoid (TS) delivers muons and pions to the Mu2e detector. The heat and radiation shield (HRS) protects the PS and the first TS coils. A beam dump absorbs the spent beam. In order for the PS superconducting magnet to operate reliably, the sophisticated HRS was designed and optimized for performance and cost. The beam dump was designed to absorb the spent beam while maintaining its temperature and the air activation in the hall at allowable levels. Comprehensive MARS15 simulations have been carried out to optimize all of these parts while maximizing muon yield. Results of simulations of critical radiation quantities and their implications for the overall Target Station design and integration are reported.

  2. Design and Implementation of a Laboratory-Based Drug Design and Synthesis Advanced Pharmacy Practice Experience

    PubMed Central

    Philip, Ashok; Stephens, Mark; Mitchell, Sheila L.

    2015-01-01

    Objective. To provide students with an opportunity to participate in medicinal chemistry research within the doctor of pharmacy (PharmD) curriculum. Design. We designed and implemented a 3-course sequence in drug design or drug synthesis for pharmacy students consisting of a 1-month advanced elective followed by two 1-month research advanced pharmacy practice experiences (APPEs). To maximize student involvement, this 3-course sequence was offered to third-year and fourth-year students twice per calendar year. Assessment. Students were evaluated based on their commitment to the project’s success, productivity, and professionalism. Students also evaluated the course sequence using a 14-item course evaluation rubric. Student feedback was overwhelmingly positive. Students found the experience to be a valuable component of their pharmacy curriculum. Conclusion. We successfully designed and implemented a 3-course research sequence that allows PharmD students in the traditional 4-year program to participate in drug design and synthesis research. Students report the sequence enhanced their critical-thinking and problem-solving skills and helped them develop as independent learners. Based on the success achieved with this sequence, efforts are underway to develop research APPEs in other areas of the pharmaceutical sciences. PMID:25995518

  3. FLOSS UX Design: An Analysis of User Experience Design in Firefox and OpenOffice.org

    NASA Astrophysics Data System (ADS)

    Bach, Paula M.; Carroll, John M.

    We describe two cases of open user experience (UX) design using the Firefox web browser and OpenOffice.org office suite as case studies. We analyze the social complexity of integrating UX practices into the two open source projects using activity awareness, a framework for understanding team performance in collective endeavors of significant scope, duration, and complexity. The facets of activity awareness are common ground, community of practice, social capital, and human development. We found that differences between the communities include different strategies for community building, UX status in the community, type of open UX design, and different ways to share information.

  4. The MARTE VNIR imaging spectrometer experiment: design and analysis.

    PubMed

    Brown, Adrian J; Sutter, Brad; Dunagan, Stephen

    2008-10-01

    We report on the design, operation, and data analysis methods employed on the VNIR imaging spectrometer instrument that was part of the Mars Astrobiology Research and Technology Experiment (MARTE). The imaging spectrometer is a hyperspectral scanning pushbroom device sensitive to VNIR wavelengths from 400-1000 nm. During the MARTE project, the spectrometer was deployed to the Río Tinto region of Spain. We analyzed subsets of three cores from Río Tinto using a new band modeling technique. We found most of the MARTE drill cores to contain predominantly goethite, though spatially coherent areas of hematite were identified in Core 23. We also distinguished non Fe-bearing minerals that were subsequently analyzed by X-ray diffraction (XRD) and found to be primarily muscovite. We present drill core maps that include spectra of goethite, hematite, and non Fe-bearing minerals.

  5. The MARTE VNIR Imaging Spectrometer Experiment: Design and Analysis

    NASA Astrophysics Data System (ADS)

    Brown, Adrian J.; Sutter, Brad; Dunagan, Stephen

    2008-10-01

    We report on the design, operation, and data analysis methods employed on the VNIR imaging spectrometer instrument that was part of the Mars Astrobiology Research and Technology Experiment (MARTE). The imaging spectrometer is a hyperspectral scanning pushbroom device sensitive to VNIR wavelengths from 400-1000 nm. During the MARTE project, the spectrometer was deployed to the Río Tinto region of Spain. We analyzed subsets of three cores from Río Tinto using a new band modeling technique. We found most of the MARTE drill cores to contain predominantly goethite, though spatially coherent areas of hematite were identified in Core 23. We also distinguished non Fe-bearing minerals that were subsequently analyzed by X-ray diffraction (XRD) and found to be primarily muscovite. We present drill core maps that include spectra of goethite, hematite, and non Fe-bearing minerals.

  6. Design of experiments for thermal protection system process optimization

    NASA Astrophysics Data System (ADS)

    Longani, Hans R.

    2000-01-01

    Solid Rocket Booster (SRB) structures were protected from heating due to aeroshear, radiation, and plume impingement by a Thermal Protection System (TPS) known as Marshall Sprayable Ablative (MSA-2). MSA-2 contains Volatile Organic Compounds (VOCs) and, due to strict environmental legislation, was eliminated. MSA-2 was also classified as hazardous waste, which made its disposal very costly. Marshall Convergent Coating (MCC-1) replaced MSA-2 and eliminated the use of solvents by delivering the dry filler materials and the fluid resin system to a patented spray gun that utilizes the Convergent Spray Technologies spray process. The selection of the TPS material was based on risk assessment, performance comparisons, processing, application, and cost. The Design of Experiments technique was used to optimize the spraying parameters.

  7. Computational design aspects of a NASP nozzle/afterbody experiment

    NASA Technical Reports Server (NTRS)

    Ruffin, Stephen M.; Venkatapathy, Ethiraj; Keener, Earl R.; Nagaraj, N.

    1989-01-01

    This paper highlights the influence of computational methods on design of a wind tunnel experiment which generically models the nozzle/afterbody flow field of the proposed National Aerospace Plane. The rectangular slot nozzle plume flow field is computed using a three-dimensional, upwind, implicit Navier-Stokes solver. Freestream Mach numbers of 5.3, 7.3, and 10 are investigated. Two-dimensional parametric studies of various Mach numbers, pressure ratios, and ramp angles are used to help determine model loads and afterbody ramp angle and length. It was found that the center of pressure on the ramp occurs at nearly the same location for all ramp angles and test conditions computed. Also, to prevent air liquefaction, it is suggested that a helium-air mixture be used as the jet gas for the highest Mach number test case.

  8. Design of Experiments Results for the Feedthru Insulator

    SciTech Connect

    BENAVIDES,GILBERT L.; VAN ORNUM,DAVID J.; BACA,MAUREEN R.; APPEL,PATRICIA E.

    1999-12-01

    A design of experiments (DoE) was performed at Ceramtec to improve the yield of a cermet part known as the feedthru insulator. The factors chosen to be varied in this DoE were syringe orifice size, fill condition, solvent, and surfactant. These factors were chosen because of their anticipated effect on the cermet slurry and its consequences to the feedthru insulator in succeeding fabrication operations. Response variables to the DoE were chosen to be indirect indicators of production yield for the feedthru insulator. The solvent amount used to mix the cermet slurry had the greatest overall effect on the response variables. Based upon this DoE, there is the potential to improve the yield not only for the feedthru insulator but for other cermet parts as well. This report thoroughly documents the DoE and contains additional information regarding the feedthru insulator.

  9. Gender Consideration in Experiment Design for Air Break in Prebreathe

    NASA Technical Reports Server (NTRS)

    Conkin, Johnny; Dervay, Joseph P.; Gernhardt, Michael L.

    2007-01-01

    If gender is a confounder of the decompression sickness (DCS) or venous gas emboli (VGE) outcomes of a proposed air break in oxygen prebreathe (PB) project, then decisions about the final experiment design must be made. We evaluated whether the incidence of DCS and VGE from tests in altitude chambers over 20 years was different between men and women after resting and exercise PB protocols. Nitrogen washout during PB is our primary risk mitigation strategy to prevent subsequent DCS and VGE in subjects. Bubbles in the pulmonary artery (venous blood) were detected from the precordial position using Doppler ultrasound bubble detectors. The subjects were monitored for VGE for four minutes at about 15-minute intervals for the duration of the altitude exposure, with the maximum bubble grade assigned a Spencer Grade of IV.
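
    A comparison of incidence between two groups of the kind described here reduces to a 2x2 contingency-table test; the sketch below uses Fisher's exact test with invented counts, not the chamber-test data of the study:

      # Sketch: test whether DCS incidence differs between two groups using a
      # 2x2 contingency table (counts are hypothetical, not the study's data).
      from scipy.stats import fisher_exact

      #            DCS  no DCS
      table = [[12, 88],    # men   (hypothetical)
               [9, 61]]     # women (hypothetical)

      odds_ratio, p_value = fisher_exact(table)
      print(f"odds ratio = {odds_ratio:.2f}, p = {p_value:.3f}")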

  10. PV-Diesel Hybrid SCADA Experiment Network Design

    NASA Technical Reports Server (NTRS)

    Kalu, Alex; Durand, S.; Emrich, Carol; Ventre, G.; Wilson, W.; Acosta, R.

    1999-01-01

    The essential features of an experimental network for renewable power system satellite-based supervisory control and data acquisition (SCADA) are communication links, controllers, diagnostic equipment, and a hybrid power system. Required components for implementing the network consist of two satellite ground stations, two satellite modems, two 486 PCs, two telephone receivers, two telephone modems, two analog telephone lines, one digital telephone line, a hybrid power system equipped with a controller, and a satellite spacecraft. In the technology verification experiment (TVE) conducted by Savannah State University and the Florida Solar Energy Center, the renewable energy hybrid system is the Apex-1000 Mini-Hybrid, which is equipped with the NGC3188 for user interface and remote control and the NGC2010 for monitoring and basic control tasks. This power system is connected to a satellite modem via a smart interface, RS232. Commands are sent to the power system control unit through a control PC designated PC1. PC1 is thus connected to a satellite modem through RS232. A second PC, designated PC2, the diagnostic PC, is connected to both satellite modems via separate analog telephone lines for checking the modems' health. PC2 is also connected to PC1 via a telephone line. Due to the unavailability of a second ground station for the ACTS, one ground station is used to serve both the sending and receiving functions in this experiment. The signal is sent from the control PC to the hybrid system at a frequency f_1, different from f_2, the frequency of the signal from the hybrid system to the control PC. f_1 and f_2 are sufficiently separated to avoid interference.

  11. Designing an experiment to measure cellular interaction forces

    NASA Astrophysics Data System (ADS)

    McAlinden, Niall; Glass, David G.; Millington, Owain R.; Wright, Amanda J.

    2013-09-01

    Optical trapping is a powerful tool in Life Science research and is becoming commonplace in many microscopy laboratories and facilities. The force applied by the laser beam on the trapped object can be accurately determined, allowing any external forces acting on the trapped object to be deduced. We aim to design a series of experiments that use an optical trap to measure and quantify the interaction force between immune cells. In order to cause minimum perturbation to the sample we plan to directly trap T cells and remove the need to introduce exogenous beads to the sample. This poses a series of challenges and raises questions that need to be answered in order to design a set of effective end-point experiments. A typical cell is large compared to the beads normally trapped and highly non-uniform - can we reliably trap such objects and prevent them from rolling and re-orientating? In this paper we show how a spatial light modulator can produce a triple-spot trap, as opposed to a single-spot trap, giving complete control over the object's orientation and preventing it from rolling due, for example, to Brownian motion. To use an optical trap as a force transducer to measure an external force, you must first have a reliably calibrated system. The optical trapping force is typically measured either by using the theory of equipartition and observing the Brownian motion of the trapped object or by using an escape-force method, e.g. the viscous drag force method. In this paper we examine the relationship between force and displacement, as well as measuring the maximum displacement from the equilibrium position before an object falls out of the trap, hence determining the conditions under which the different calibration methods should be applied.
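
    For the equipartition calibration mentioned above, trap stiffness follows from k = k_B T / <x^2>; a minimal sketch with simulated position records standing in for real video or photodiode tracking data is:

      # Equipartition calibration sketch: k = kB * T / <x^2>, using simulated
      # position fluctuations in place of measured ones.
      import numpy as np

      kB = 1.380649e-23   # Boltzmann constant, J/K
      T = 298.0           # absolute temperature, K

      true_k = 1e-6       # assumed stiffness (N/m), used only to generate fake data
      rng = np.random.default_rng(1)
      x = rng.normal(0.0, np.sqrt(kB * T / true_k), size=100_000)   # positions, m

      k_estimate = kB * T / np.var(x)
      print(f"estimated trap stiffness: {k_estimate:.2e} N/m")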

  12. Design, Construction, Alignment, and Calibration of a Compact Velocimetry Experiment

    SciTech Connect

    Kaufman, Morris I.; Malone, Robert M.; Frogget, Brent C.; Esquibel, David L.; Romero, Vincent T.; Lare, Gregory A.; Briggs, Bart; Iverson, Adam J.; Frayer, Daniel K.; DeVore, Douglas; Cata, Brian

    2007-09-21

    A velocimetry experiment has been designed to measure shock properties for small cylindrical metal targets (8-mm-diameter by 2-mm thick). A target is accelerated by high explosives, caught, and retrieved for later inspection. The target is expected to move at a velocity of 0.1 to 3 km/sec. The complete experiment canister is approximately 105 mm in diameter and 380 mm long. Optical velocimetry diagnostics include the Velocity Interferometer System for Any Reflector (VISAR) and Photon Doppler Velocimetry (PDV). The packaging of the velocity diagnostics is not allowed to interfere with the catchment or an X-ray imaging diagnostic. A single optical relay, using commercial lenses, collects Doppler-shifted light for both VISAR and PDV. The use of fiber optics allows measurement of point velocities on the target surface during accelerations occurring over 15 mm of travel. The VISAR operates at 532 nm and has separate illumination fibers requiring alignment. The PDV diagnostic operates at 1550 nm, but is aligned and focused at 670 nm. The VISAR and PDV diagnostics are complementary measurements and they image spots in close proximity on the target surface. Because the optical relay uses commercial glass, the axial positions of the optical fibers for PDV and VISAR are offset to compensate for chromatic aberrations. The optomechanical design requires careful attention to fiber management, mechanical assembly and disassembly, positioning of the foam catchment, and X-ray diagnostic field-of-view. Calibration and alignment data are archived at each stage of the assembly sequence.

  13. Laser communication experiment. Volume 1: Design study report: Spacecraft transceiver. Part 3: LCE design specifications

    NASA Technical Reports Server (NTRS)

    1970-01-01

    The requirements for the design, fabrication, performance, and testing of a 10.6 micron optical heterodyne receiver subsystem for use in a laser communication system are presented. The receiver subsystem, as a part of the laser communication experiment operates in the ATS 6 satellite and in a transportable ground station establishing two-way laser communications between the spacecraft and the transportable ground station. The conditions under which environmental tests are conducted are reported.

  14. Fast ignition integrated experiments and high-gain point design

    SciTech Connect

    Shiraga, H.; Nagatomo, H.; Theobald, W.; Solodov, A. A.; Tabak, M.

    2014-04-17

    Here, integrated fast ignition experiments were performed at ILE, Osaka, and LLE, Rochester, in which a nanosecond driver laser implodes a deuterated plastic shell in front of the tip of a hollow metal cone and an intense ultrashort-pulse laser is injected through the cone to heat the compressed plasma. Based on the initial successful results of fast electron heating of cone-in-shell targets, large-energy short-pulse laser beam lines were constructed and became operational: OMEGA-EP at Rochester and LFEX at Osaka. Neutron enhancement due to heating with a ~kJ short-pulse laser has been demonstrated in the integrated experiments at Osaka and Rochester. The neutron yields are being analyzed by comparing the experimental results with simulations. Details of the fast electron beam transport and the electron energy deposition in the imploded fuel plasma are complicated and further studies are imperative. The hydrodynamics of the implosion was studied including the interaction of the imploded core plasma with the cone tip. Theory and simulation studies are presented on the hydrodynamics of a high-gain target for a fast ignition point design.

  15. Design and Assembly of the Magnetized Dusty Plasma Experiment (MDPX)

    NASA Astrophysics Data System (ADS)

    Fisher, Ross; Artis, Darrick; Lynch, Brian; Wood, Keith; Shaw, Joseph; Gilmore, Kevin; Robinson, Daniel; Polka, Christian; Konopka, Uwe; Thomas, Edward; Merlino, Robert; Rosenberg, Marlene

    2013-10-01

    Over the last two years, the Magnetized Dusty Plasma Experiment (MDPX) has been under construction at Auburn University. This new research device, whose assembly will be completed in late summer 2013, uses a four-coil, superconducting, high magnetic field system (|B| >= 4 T) to investigate confinement, charging, transport, and instabilities in a dusty plasma. A new feature of the MDPX device is the ability to operate the magnetic coils independently to allow a variety of magnetic configurations, from highly uniform to quadrupole-like. Envisioned as a multi-user facility, the MDPX device features a cylindrical vacuum vessel whose primary experimental region is an octagonal chamber that has a 35.5 cm inner diameter and is 19 cm tall. There is substantial diagnostic and optical access through eight 10.2 cm × 12.7 cm side ports. The chamber can also be equipped with two 15.2 cm diameter, 76 cm long extensions to allow long plasma column experiments, particularly long-wavelength dust wave studies. This presentation will discuss the final design, assembly, and installation of the MDPX device and will describe its supporting laboratory facility. This work is supported by a National Science Foundation - Major Research Instrumentation (NSF-MRI) award, PHY-1126067.

  16. Design study for a diverging supernova explosion experiment on NIF

    NASA Astrophysics Data System (ADS)

    Flaig, Markus; Plewa, Tomasz; Keiter, Paul; Grosskopf, Michael; Kuranz, Carolyn; Drake, Paul; Park, Hye-Sook

    2013-10-01

    We report on design simulations of a spherically-diverging, multi-interface, supernova-relevant Rayleigh-Taylor experiment (DivSNRT) to be carried out at the National Ignition Facility (NIF). The simulations are performed in two and three dimensions using the block-adaptive, multi-group radiative diffusion hydrodynamics code CRASH and the FLASH-based MHD code Proteus. In the present study, we concentrate mainly on a planar variant of the experiment. We assess the sensitivity of the Rayleigh-Taylor instability growth on numerical discretization, variations in the laser drive energy and the manufacturing noise at the material interfaces. We find that a simple buoyancy-drag model accurately predicts the mixed-layer width obtained in the simulations. We use synthetic radiographs to optimize the diagnostic system and the experimental setup. Finally, we perform a series of exploratory MHD simulations and investigate the self-generation of magnetic fields and their role in the system evolution. Supported by the DOE grant DE-SC0008823.
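
    The buoyancy-drag comparison mentioned above can be sketched with a generic one-dimensional model, dh/dt = v and dv/dt = A*g - Cd*v^2/h; the Atwood number, drag coefficient, deceleration, and initial conditions below are arbitrary illustrations, not the calibrated values of the CRASH or Proteus simulations:

      # Generic buoyancy-drag sketch for Rayleigh-Taylor mixed-layer growth
      # (all coefficients and initial conditions are illustrative only).
      from scipy.integrate import solve_ivp

      A = 0.5       # Atwood number (hypothetical)
      g = 1.0e12    # effective acceleration, cm/s^2 (hypothetical)
      Cd = 2.0      # drag coefficient (hypothetical)

      def rhs(t, y):
          h, v = y
          return [v, A * g - Cd * v * abs(v) / max(h, 1e-6)]

      sol = solve_ivp(rhs, t_span=(0.0, 20e-9), y0=[1e-4, 0.0], max_step=1e-11)
      print(f"mixed-layer width after 20 ns: {sol.y[0, -1]:.3e} cm")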

  17. Design of a miniature explosive isentropic compression experiment

    SciTech Connect

    Tasker, Douglas G

    2010-01-01

    The purpose of this design study is to adapt the High Explosive Pulsed Power Isentropic Compression Experiment (HEPP-ICE) to milligram quantities of materials at stresses of ~100 GPa. For this miniature application we assume that a parallel plate stripline of ~2.5 mm width is needed to compress the samples. In any parallel plate load, the rising currents flow preferentially along the outside edges of the load where the specific impedance is a minimum [1]. Therefore, the peak current must be between 1 and 2 MA to reach a stress of 100 GPa in the center of a 2.5 mm wide parallel plate load; these are small relative to typical HEPP-ICE currents. We show that a capacitor bank alone exceeds the requirements of this miniature ICE experiment and a flux compression generator (FCG) is not necessary. The proposed circuit will comprise one half of the 2.4-MJ bank, i.e., the 6-mF, 20-kV, 1.2-MJ capacitor bank used in the original HEPP-ICE circuit. Explosive opening and closing switches will still be required because the rise time of the capacitor circuit would be of the order of 30 μs without them. For isentropic loading in these small samples, stress rise times of ~200 ns are required.

  18. The application of statistically designed experiments to resistance spot welding

    NASA Technical Reports Server (NTRS)

    Hafley, Robert A.; Hales, Stephen J.

    1991-01-01

    State-of-the-art Resistance Spot Welding (RSW) equipment has the potential to permit real-time monitoring of operations through advances in computerized process control. In order to realize adaptive feedback capabilities, it is necessary to establish correlations among process variables, welder outputs, and weldment properties. The initial step toward achieving this goal must involve assessment of the effect of specific process inputs, and the interactions among these variables, on spot weld characteristics. This investigation evaluated these effects through the application of a statistically designed experiment to the RSW process. A half-factorial Taguchi L16 design was used to understand and refine a RSW schedule developed for welding dissimilar aluminum-lithium alloys of different thickness. The baseline schedule had been established previously by traditional trial-and-error methods based on engineering judgment and one-factor-at-a-time studies. A hierarchy of inputs with respect to each other was established, and the significance of these inputs with respect to experimental noise was determined. Useful insight was gained into the effect of interactions among process variables, particularly with respect to weldment defects. The effects of equipment-related changes associated with disassembly and recalibration were also identified. In spite of an apparent decrease in equipment performance, a significant improvement in the maximum strength of defect-free welds compared to the baseline schedule was achieved.

  19. Propagation-related AMT design aspects and supporting experiments

    NASA Technical Reports Server (NTRS)

    Dessouky, Khaled; Estabrook, Polly

    1991-01-01

    The ACTS Mobile Terminal (AMT) is presently being developed with the goal of significantly extending commercial satellite applications and their user base. A thorough knowledge of the Ka-band channel characteristics is essential to the proper design of a commercially viable system that efficiently utilizes the valuable resources. To date, only limited tests have been performed to characterize the Ka-band channel, and they have focused on the needs of fixed terminals. Part of the value of the AMT as a Ka-band test bed is its function as a vehicle through which tests specifically applicable to mobile satellite communications can be performed. The exact propagation environment, with the proper set of elevation angles, vehicle antenna gains and patterns, roadside shadowing, rain, and Doppler, is encountered. The ability to measure all of the above, as well as to correlate their effects with observed communication system performance, creates an invaluable opportunity to understand in depth Ka-band's potential for supporting mobile and personal communications. This paper discusses the propagation information required for system design, the setup with ACTS that will enable obtaining this information, and finally the types of experiments to be performed and data to be gathered by the AMT to meet this objective.

  20. Thermal Design of a Bitter-Type Electromagnet for Dusty Plasma Experiments: Prototype Design and Construction

    NASA Astrophysics Data System (ADS)

    Birmingham, W. J.; Bates, E. M.; Romero-Talamás, Carlos; Rivera, W. F.

    2015-11-01

    For the purpose of analyzing magnetized dusty plasma at the University of Maryland, Baltimore County (UMBC) Dusty Plasma Laboratory, we are designing a resistive, water-cooled Bitter-type electromagnet. When completed, the magnet will be programmable to generate fields of up to 10 T for at least 10 seconds and up to several minutes. An analytic thermal design method was developed for establishing the location of elongated axial cooling passages. Comparisons with finite element analysis (FEA) data reveal that the thermal design method was capable of generating cooling channel patterns that establish manageable temperature profiles within the magnet. With our analytic method, cooling hole patterns can be generated in seconds instead of hours with FEA software. To further validate our thermal analysis as well as the manufacturing techniques of our magnet design, we are now constructing a prototype electromagnet. The prototype is designed to operate continuously at 1 T with a current of 750 A, and has four diagnostic ports that can accommodate thermocouples and optical access to the water flow. A 1.25-inch-diameter bore allows for axial field measurements and provides space for small-scale experiments. Thermal analysis and specifics of the electromagnet design are presented.

  1. Interim Service ISDN Satellite (ISIS) hardware experiment development for advanced ISDN satellite designs and experiments

    NASA Technical Reports Server (NTRS)

    Pepin, Gerard R.

    1992-01-01

    The Interim Service Integrated Services Digital Network (ISDN) Satellite (ISIS) Hardware Experiment Development for Advanced Satellite Designs describes the development of the ISDN Satellite Terminal Adapter (ISTA), which is capable of translating ISDN protocol traffic into Time Division Multiple Access (TDMA) signals for use by a communications satellite. The ISTA connects the Type 1 Network Termination (NT1) via the U-interface on the line termination side of the CPE to the RS-499 interface for satellite uplink. In the opposite direction, the same ISTA converts RS-499 data back to the U-interface with a simple switch setting.

  2. Design issues in toxicogenomics using DNA microarray experiment

    SciTech Connect

    Lee, Kyoung-Mu; Kim, Ju-Han; Kang, Daehee (E-mail: dhkang@snu.ac.kr)

    2005-09-01

    The methods of toxicogenomics might be classified into omics studies (e.g., genomics, proteomics, and metabolomics) and population studies focusing on risk assessment and gene-environment interaction. In omics studies, microarray is the most popular approach. Up to 20,000 genes falling into several categories (e.g., xenobiotic metabolism, cell cycle control, DNA repair, etc.) can be selected according to an a priori hypothesis. The appropriate type of samples and species should be selected in advance. Multiple doses and varied exposure durations are suggested to identify those genes clearly linked to toxic response. Microarray experiments can be affected by numerous nuisance variables, including the experimental design, sample extraction, and type of scanner. The number of slides might be determined from the magnitude and variance of expression change, the false-positive rate, and the desired power; pooling samples is an alternative. Online databases on chemicals with known exposure-disease outcomes and genetic information can aid the interpretation of the normalized results. Gene function can be inferred from microarray data analyzed by bioinformatics methods such as cluster analysis. The population study often adopts a hospital-based or nested case-control design. Biases in subject selection and exposure assessment should be minimized, and confounding bias should also be controlled for in stratified or multiple regression analysis. Optimal sample sizes depend on the statistical test for gene-to-environment or gene-to-gene interaction. The design issues addressed in this mini-review are crucial in conducting toxicogenomics studies. In addition, an integrative approach combining exposure assessment, epidemiology, and clinical trials is required.
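
    As a hedged illustration of the sample-size consideration, the sketch below applies a standard two-sample normal approximation; the effect size, variance, and error rates are invented, and a real microarray study would also adjust the per-gene false-positive rate for multiple testing:

      # Sketch: arrays needed per group to detect a mean log2 fold change delta
      # with two-sided type I error alpha and power 1 - beta (normal approximation).
      from math import ceil
      from scipy.stats import norm

      delta = 1.0     # hypothetical log2 fold change to detect
      sigma = 1.2     # hypothetical per-gene standard deviation of log2 expression
      alpha = 0.001   # stringent per-gene false-positive rate (not FDR-adjusted)
      power = 0.8

      z_alpha = norm.ppf(1 - alpha / 2)
      z_power = norm.ppf(power)
      n_per_group = ceil(2 * ((z_alpha + z_power) * sigma / delta) ** 2)
      print("arrays needed per group:", n_per_group)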

  3. Experiments and other methods for developing expertise with design of experiments in a classroom setting

    NASA Technical Reports Server (NTRS)

    Patterson, John W.

    1990-01-01

    The only way to gain genuine expertise in Statistical Process Control (SPC) and the design of experiments (DOX) is with repeated practice, but not on canned problems with dead data sets. Rather, one must negotiate a wide variety of problems, each with its own peculiarities and its own constantly changing data. The problems should not be of the type for which there is a single, well-defined answer that can be looked up in a fraternity file or in some text. The problems should match as closely as possible the open-ended types for which there is always an abundance of uncertainty. These are the only kinds that arise in real research, whether that be basic research in academe or engineering research in industry. To gain this kind of experience, either as a professional consultant or as an industrial employee, takes years. Vast amounts of money, not to mention careers, must be put at risk. The purpose here is to outline some realistic simulation-type lab exercises that are so simple and inexpensive to run that the students can repeat them as often as desired at virtually no cost. Simulations also allow the instructor to design problems whose outcomes are as noisy as desired but still predictable within limits. Also, the instructor and the students can learn a great deal more from the postmortem conducted after the exercise is completed. One never knows for sure what the true data should have been when dealing only with real-life experiments. To add a bit more realism to the exercises, it is sometimes desirable to make the students pay for each experimental result from a make-believe budget allocation for the problem.

  4. Having One's Cake and Eating It, Too: Combining True Experiments with Regression Discontinuity Designs

    ERIC Educational Resources Information Center

    Mandell, Marvin B.

    2008-01-01

    Both true experiments and regression discontinuity (RD) designs produce unbiased estimates of effects. However, true experiments are, of course, often criticized on equity grounds, whereas RD designs entail sacrifices in terms of statistical precision. In this article, a hybrid of true experiments and RD designs is considered. This hybrid entails…

  5. Interlopers 3D: experiences designing a stereoscopic game

    NASA Astrophysics Data System (ADS)

    Weaver, James; Holliman, Nicolas S.

    2014-03-01

    Background: In recent years, 3D-enabled televisions, VR headsets and computer displays have become more readily available in the home. This presents an opportunity for game designers to explore new stereoscopic game mechanics and techniques that have previously been unavailable in monocular gaming. Aims: To investigate the visual cues that are present in binocular and monocular vision, identifying which are relevant when gaming using a stereoscopic display, and to implement a game whose mechanics are so reliant on binocular cues that the game becomes impossible, or at least very difficult, to play in non-stereoscopic mode. Method: A stereoscopic 3D game was developed whose objective was to shoot down advancing enemies (the Interlopers) before they reached their destination. Scoring highly required players to make accurate depth judgments and target the closest enemies first. A group of twenty participants played both a basic and an advanced version of the game in both monoscopic 2D and stereoscopic 3D. Results: In both the basic and the advanced game, participants achieved higher scores when playing in stereoscopic 3D. The advanced game showed that disrupting the depth-from-motion cue made the game more difficult in monoscopic 2D. The results also show a certain amount of learning taking place, meaning that players were able to score higher and finish the game faster as the experiment progressed. Conclusions: Although the game was not impossible to play in monoscopic 2D, participants' results show that it put them at a significant disadvantage compared with playing in stereoscopic 3D.

  6. Designing Effective Field Experiences for Nontraditional Preservice Special Educators.

    ERIC Educational Resources Information Center

    Rosenberg, Michael S.; Jackson, Lewis; Yeh, Chyong-Hwa

    1996-01-01

    Five alternatives to traditional field experiences in special education are described and contrasted, including: infusion of experiences within content courses; traditional experiences offered during the summer session; working practica; roving practica; and specialized field-experiences. It is argued that these alternatives can provide…

  7. GPS Antenna Characterization Experiment (ACE): Receiver Design and Initial Results

    NASA Technical Reports Server (NTRS)

    Martzen, Phillip; Highsmith, Dolan E.; Valdez, Jennifer E.; Parker, Joel J. K.; Moreau, Michael C.

    2015-01-01

    The GPS Antenna Characterization Experiment (ACE) is a research collaboration between Aerospace and NASA Goddard to characterize the gain patterns of the GPS L1 transmit antennas. High altitude GPS observations are collected at a ground station through a transponder-based or "bent-pipe" architecture where the GPS L1 RF spectrum is received at a platform in geosynchronous orbit and relayed to the ground for processing. The focus of this paper is the unique receiver algorithm design and implementation. The high-sensitivity GPS C/A-code receiver uses high fidelity code and carrier estimates and externally supplied GPS message bit data in a batch algorithm with settings for a 0 dB-Hz threshold. The resulting carrier-to-noise measurements are used in a GPS L1 transmit antenna pattern reconstruction. This paper shows initial transmit gain patterns averaged over each block of GPS satellites, including comparisons to available pre-flight gain measurements from the GPS vehicle contractors. These results provide never-before-seen assessments of the full, in-flight transmit gain patterns.
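    In very rough terms, the pattern-reconstruction step can be illustrated as range-loss compensation of the measured carrier-to-noise ratios followed by binning by off-nadir angle. The sketch below uses synthetic data, and all constant terms (transmit power, receiver G/T, losses) are lumped into an arbitrary offset, so only a relative pattern is recovered; it is not the actual ACE processing chain.

    ```python
    # Rough sketch: relative transmit-gain pattern from C/N0 measurements.
    import numpy as np

    rng = np.random.default_rng(0)
    n = 500
    off_nadir_deg = rng.uniform(0.0, 20.0, n)        # hypothetical geometry
    range_m = rng.uniform(6.0e7, 7.5e7, n)           # hypothetical transmitter-receiver ranges
    true_gain_db = 13.0 - 0.02 * off_nadir_deg**2    # toy pattern used to fake the data
    range_loss_db = 20.0 * np.log10(range_m)         # range-dependent part of free-space loss
    c_n0_dbhz = true_gain_db - range_loss_db + 180.0 + rng.normal(0.0, 0.5, n)  # lumped offset

    # Compensate the range loss, then bin-average by off-nadir angle.
    relative_gain_db = c_n0_dbhz + range_loss_db
    edges = np.arange(0.0, 21.0, 1.0)
    idx = np.digitize(off_nadir_deg, edges)
    pattern = [relative_gain_db[idx == k].mean()
               for k in range(1, len(edges)) if np.any(idx == k)]
    print(np.round(pattern, 2))    # relative pattern, arbitrary dB offset
    ```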

  8. Irradiation Experiment Conceptual Design Parameters for NBSR Fuel Conversion

    SciTech Connect

    Brown, N. R.; Brown, N. R.; Baek, J. S; Hanson, A. L.; Cuadra, A.; Cheng, L. Y.; Diamond, D. J.

    2014-04-30

    It has been proposed to convert the National Institute of Standards and Technology (NIST) research reactor, known as the NBSR, from high-enriched uranium (HEU) fuel to low-enriched uranium (LEU) fuel. The motivation to convert the NBSR to LEU fuel is to reduce the risk of proliferation of special nuclear material. This report is a compilation of relevant information from recent studies related to the proposed conversion using a metal alloy of LEU with 10 w/o molybdenum. The objective is to inform the design of the mini-plate and full-size-plate irradiation experiments that are being planned. This report provides relevant dimensions of the fuel elements, and the following parameters at steady state: average and maximum fission rate density and fission density, fuel temperature distribution for the plate with maximum local temperature, and two-dimensional heat flux profiles of fuel plates with high power densities. The latter profiles are given for plates in both the inner and outer core zones and for cores with both fresh and depleted shim arms (reactivity control devices). A summary of the methodology to obtain these results is presented. Fuel element tolerance assumptions and hot channel factors used in the safety analysis are also given.

  9. Irradiation Experiment Conceptual Design Parameters for NBSR Fuel Conversion

    SciTech Connect

    Brown N. R.; Brown,N.R.; Baek,J.S; Hanson, A.L.; Cuadra,A.; Cheng,L.Y.; Diamond, D.J.

    2013-03-31

    It has been proposed to convert the National Institute of Standards and Technology (NIST) research reactor, known as the NBSR, from high-enriched uranium (HEU) fuel to low-enriched uranium (LEU) fuel. The motivation to convert the NBSR to LEU fuel is to reduce the risk of proliferation of special nuclear material. This report is a compilation of relevant information from recent studies related to the proposed conversion using a metal alloy of LEU with 10 w/o molybdenum. The objective is to inform the design of the mini-plate and full-size plate irradiation experiments that are being planned. This report provides relevant dimensions of the fuel elements, and the following parameters at steady state: average and maximum fission rate density and fission density, fuel temperature distribution for the plate with maximum local temperature, and two-dimensional heat flux profiles of fuel plates with high power densities. The latter profiles are given for plates in both the inner and outer core zones and for cores with both fresh and depleted shim arms (reactivity control devices). In addition, a summary of the methodology to obtain these results is presented.

  10. Bio-inspired design of dental multilayers: experiments and model.

    PubMed

    Niu, Xinrui; Rahbar, Nima; Farias, Stephen; Soboyejo, Wole

    2009-12-01

    This paper combines experiments, simulations and analytical modeling that are inspired by the stress reductions associated with the functionally graded structures of the dentin-enamel junctions (DEJs) in natural teeth. Unlike conventional crown structures in which ceramic crowns are bonded to the bottom layer with an adhesive layer, real teeth do not have a distinct "adhesive layer" between the enamel and the dentin layers. Instead, there is a graded transition from enamel to dentin within an approximately 10 to 100 μm thick regime that is called the dentin-enamel junction (DEJ). In this paper, a micro-scale, bio-inspired functionally graded structure is used to bond the top ceramic layer (zirconia) to a dentin-like ceramic-filled polymer substrate. The bio-inspired functionally graded material (FGM) is shown to exhibit higher critical loads over a wide range of loading rates. The measured critical loads are predicted using a rate-dependent slow crack growth (RDSCG) model. The implications of the results are then discussed for the design of bio-inspired dental multilayers.

  11. Design Criteria and Machine Integration of the Ignitor Experiment

    NASA Astrophysics Data System (ADS)

    Bianchi, A.; Coppi, B.

    2010-11-01

    High field, high density compact experiments are the only ones capable of producing, on the basis of available technology and knowledge of plasma physics, plasmas that can reach ignition conditions. The Ignitor machine (R_0 = 1.32 m, a x b = 0.47 x 0.83 m^2, B_T <= 13 T, I_p <= 11 MA) is characterized by a complete structural integration of its major components. A sophisticated Poloidal Field system provides the flexibility to produce the expected sequence of plasma equilibrium configurations during the plasma current and pressure rise. The structural concept of the machine is based on an optimized combination of "bucking" and "wedging". All components, with the exception of the vacuum vessel, are cooled before each plasma pulse by means of He gas, to an optimal temperature of 30 K, at which the ratio of the electrical resistivity to the specific heat of copper is minimum. The 3D and 2D design and integration of all the core machine components, including electro-fluidic and fluidic lines, has been produced using the Dassault CATIA-V software. A complete structural analysis has verified that the machine can withstand the forces produced for all the main operational scenarios.

  12. Magnetohydrodynamic Augmented Propulsion Experiment: I. Performance Analysis and Design

    NASA Technical Reports Server (NTRS)

    Litchford, R. J.; Cole, J. W.; Lineberry, J. T.; Chapman, J. N.; Schmidt, H. J.; Lineberry, C. W.

    2003-01-01

    The performance of conventional thermal propulsion systems is fundamentally constrained by the specific energy limitations associated with chemical fuels and the thermal limits of available materials. Electromagnetic thrust augmentation represents one intriguing possibility for improving the fuel composition of thermal propulsion systems, thereby increasing overall specific energy characteristics; however, realization of such a system requires an extremely high-energy-density electrical power source as well as an efficient plasma acceleration device. This Technical Publication describes the development of an experimental research facility for investigating the use of cross-field magnetohydrodynamic (MHD) accelerators as a possible thrust augmentation device for thermal propulsion systems. In this experiment, a 1.5-MW(sub e) Aerotherm arc heater is used to drive a 2-MW(sub e) MHD accelerator. The heatsink MHD accelerator is configured as an externally diagonalized, segmented channel, which is inserted into a large-bore, 2-T electromagnet. The performance analysis and engineering design of the flow path are described as well as the parameter measurements and flow diagnostics planned for the initial series of test runs.

  13. The OECI certification/designation program: the Genoa experience.

    PubMed

    Orengo, Giovanni; Pronzato, Paolo; Ferrarini, Manlio

    2015-01-01

    Accreditation and designation procedures by the Organisation of European Cancer Institutes (OECI) have represented a considerable challenge for most of the Italian cancer centers. We summarize the experience of the San Martino-IST in Genoa, which, on the whole, was satisfactory, albeit demanding for the staff. The reorganization of most oncology/hematology operations within the disease management teams was probably the key point that allowed us to obtain approval, as it brought about the possibility of bringing in uniform methods of diagnosis/treatment, increasing patient recruitment in clinical trials, and fostering translational research by promoting collaboration between clinicians and laboratory investigators. The creation of a more cohesive supportive and terminal care team facilitated both the OECI procedures and the operations within the institution. Finally, some considerations are added regarding the doctor and nurse management roles in Italian hospitals, which are characterized by noticeable differences from northern Europe. These differences may represent an extra challenge for hospital management and evaluator teams more used to the northern European type of organization. PMID:27096267

  14. The LHCb Simulation Application, Gauss: Design, Evolution and Experience

    NASA Astrophysics Data System (ADS)

    Clemencic, M.; Corti, G.; Easo, S.; Jones, C. R.; Miglioranzi, S.; Pappagallo, M.; Robbe, P.; LHCb Collaboration

    2011-12-01

    The LHCb simulation application, Gauss, is based on the Gaudi framework and on experiment basic components such as the Event Model and Detector Description. Gauss also depends on external libraries for the generation of the primary events (PYTHIA 6, EvtGen, etc.) and on GEANT4 for particle transport in the experimental setup. The application supports the production of different types of events from minimum bias to B physics signals and particle guns. It is used for purely generator-level studies as well as full simulations. Gauss is used both directly by users and in massive central productions on the grid. The design and implementation of the application and its evolution due to changing requirements will be described, as in the case of the recently adopted Python-based configuration or the possibility of taking detector conditions into account via a Simulation Conditions database. The challenge of simultaneously supporting the flexibility needed for the different tasks for which it is used, from evaluation of physics reach to background modeling, together with the stability and reliability of the code, will also be described.

  15. Conceptual design study for Infrared Limb Experiment (IRLE)

    NASA Technical Reports Server (NTRS)

    Baker, Doran J.; Ulwick, Jim; Esplin, Roy; Batty, J. C.; Ware, Gene; Tew, Craig

    1989-01-01

    The phase A engineering design study for the Infrared Limb Experiment (IRLE) instrument, the infrared portion of the Mesosphere-Lower Thermosphere Explorer (MELTER) satellite payload, is given. The IRLE instrument is a satellite instrument, based on the heritage of the Limb Infrared Monitor of the Stratosphere (LIMS) program, that will make global measurements of O3, CO2, NO, NO2, H2O, and OH from earth limb emissions. These measurements will be used to provide improved understanding of the photochemistry, radiation, dynamics, energetics, and transport phenomena in the lower thermosphere, mesosphere, and stratosphere. MELTER is being proposed to NASA Goddard by a consortium consisting of the University of Michigan, the University of Colorado, and NASA Langley. It is proposed that the Space Dynamics Laboratory at Utah State University (SDL/USU) build the IRLE instrument for NASA Langley. MELTER is scheduled for launch in November 1994 into a sun-synchronous, 650-km circular orbit with an inclination angle of 97.8 deg and an ascending node at 3:00 p.m. local time.

  16. Introduction to the Design and Optimization of Experiments Using Response Surface Methodology. A Gas Chromatography Experiment for the Instrumentation Laboratory

    ERIC Educational Resources Information Center

    Lang, Patricia L.; Miller, Benjamin I.; Nowak, Abigail Tuttle

    2006-01-01

    The study describes how to design and optimize an experiment with multiple factors and multiple responses. The experiment uses fractional factorial analysis as a screening experiment only to identify important instrumental factors and does not use response surface methodology to find the optimal set of conditions.
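    As an illustration of the screening step, a two-level fractional factorial can be generated and its main effects estimated from hypothetical responses. The sketch below uses a 2^(4-1) design with the generator D = ABC; the factor labels and response values are invented and are not the chromatographic factors from the article.

    ```python
    # 2^(4-1) fractional factorial screening design, generator D = ABC (resolution IV).
    import itertools
    import numpy as np

    base = np.array(list(itertools.product([-1, 1], repeat=3)))             # full factorial in A, B, C
    design = np.column_stack([base, base[:, 0] * base[:, 1] * base[:, 2]])  # D = ABC
    print(design)    # 8 runs x 4 coded factors

    # Hypothetical responses (e.g., peak resolution) for the 8 screening runs.
    rng = np.random.default_rng(1)
    y = 10 + 3 * design[:, 0] - 1.5 * design[:, 2] + rng.normal(0, 0.5, 8)

    # Main-effect estimates: mean(high) - mean(low) for each factor.
    effects = {name: y[design[:, j] == 1].mean() - y[design[:, j] == -1].mean()
               for j, name in enumerate("ABCD")}
    print(effects)   # large |effect| flags factors worth carrying into the optimization step
    ```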

  17. Design and Implementation of an experiment-specific Payload Orientation Platform for balloon-borne Experiment .

    NASA Astrophysics Data System (ADS)

    Devarajan, Anand; Rodi, Ashish; Ojha, Devendra

    2012-07-01

    To investigate the mesospheric dynamics and its coupling to the upper atmospheric regions above, a Balloon-borne optical Investigation of Regional-atmospheric Dynamics (BIRD) experiment was jointly conducted by the Physical Research Laboratory, Ahmedabad, and Boston University on 08 March 2010 from the TIFR Balloon Facility, Hyderabad. Along with the BIRD payload, a nano payload from the University of York, Canada, was also flown for aerosol studies during sunset. The balloon carrying the 335 kg BIRD payload was launched at 1052 hrs, reached a float altitude of 34.8 km amsl at 1245 hrs, and was allowed to float till 1825 hrs before it was parachuted down. To achieve the experimental objectives, it was essential that the payload gondola, comprising two optical spectrographs, be programmed to rotate azimuthally in 3 steps of 30 degrees each from the East-West (E-W) to the North-South (N-S) direction, stop at each step for 5 minutes for data acquisition, return to the original E-W position, and keep repeating the sequence continuously, with a provision to start or stop the orientation from the ground station through telecommand. To meet these unique requirements, we designed, developed, and implemented a Payload Orientation Platform (POP), using a flux-gate magnetometer for direction finding, which worked satisfactorily in the BIRD flight. This paper presents an overview of the POP implemented, focuses on the design considerations of the associated electronics, and finally presents the results of its performance during the entire balloon flight.
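    The stepping sequence described above can be sketched as a simple control loop. The hardware interfaces in the sketch (gondola drive, magnetometer-based pointing, telecommand flag) are hypothetical placeholders, not the flight software.

    ```python
    # Sketch of the azimuthal stepping sequence: E-W to N-S in three 30-degree steps,
    # a dwell at each step for spectrograph data, return to E-W, repeat, gated by telecommand.
    import time

    STEP_DEG, N_STEPS, DWELL_S = 30.0, 3, 5 * 60

    def rotate_to(azimuth_deg):
        # Placeholder for the gondola drive: slew until the flux-gate magnetometer
        # heading matches azimuth_deg (hardware interface not shown).
        print(f"rotating gondola to {azimuth_deg:.0f} deg")

    def orientation_enabled():
        # Placeholder for the telecommand start/stop flag from the ground station.
        return True

    def run_sequence(home_azimuth_deg=90.0, cycles=2, dwell_s=DWELL_S):
        """One E-W -> N-S -> E-W scan per cycle, dwelling at each 30-degree step."""
        for _ in range(cycles):
            if not orientation_enabled():
                time.sleep(1.0)
                continue
            for k in range(1, N_STEPS + 1):
                rotate_to(home_azimuth_deg + k * STEP_DEG)   # next 30-degree step
                time.sleep(dwell_s)                          # data acquisition dwell
            rotate_to(home_azimuth_deg)                      # return to the E-W position

    run_sequence(cycles=1, dwell_s=0.1)    # shortened dwell, demo only
    ```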

  18. Stillbirth Collaborative Research Network: design, methods and recruitment experience.

    PubMed

    Parker, Corette B; Hogue, Carol J R; Koch, Matthew A; Willinger, Marian; Reddy, Uma M; Thorsten, Vanessa R; Dudley, Donald J; Silver, Robert M; Coustan, Donald; Saade, George R; Conway, Deborah; Varner, Michael W; Stoll, Barbara; Pinar, Halit; Bukowski, Radek; Carpenter, Marshall; Goldenberg, Robert

    2011-09-01

    The Stillbirth Collaborative Research Network (SCRN) has conducted a multisite, population-based, case-control study, with prospective enrollment of stillbirths and livebirths at the time of delivery. This paper describes the general design, methods and recruitment experience. The SCRN attempted to enroll all stillbirths and a representative sample of livebirths occurring to residents of pre-defined geographical catchment areas delivering at 59 hospitals associated with five clinical sites. Livebirths <32 weeks gestation and women of African descent were oversampled. The recruitment hospitals were chosen to ensure access to at least 90% of all stillbirths and livebirths to residents of the catchment areas. Participants underwent a standardised protocol including maternal interview, medical record abstraction, placental pathology, biospecimen testing and, in stillbirths, post-mortem examination. Recruitment began in March 2006 and was completed in September 2008 with 663 women with a stillbirth and 1932 women with a livebirth enrolled, representing 69% and 63%, respectively, of the women identified. Additional surveillance for stillbirths continued until June 2009 and a follow-up of the case-control study participants was completed in December 2009. Among consenting women, there were high consent rates for the various study components. For the women with stillbirths, 95% agreed to a maternal interview, chart abstraction and a placental pathological examination; 91% of the women with a livebirth agreed to all of these components. Additionally, 84% of the women with stillbirths agreed to a fetal post-mortem examination. This comprehensive study is poised to systematically study a wide range of potential causes of, and risk factors for, stillbirths and to better understand the scope and incidence of the problem.

  19. Stillbirth Collaborative Research Network: Design, Methods and Recruitment Experience

    PubMed Central

    Parker, Corette B.; Hogue, Carol J. Rowland; Koch, Matthew A.; Willinger, Marian; Reddy, Uma; Thorsten, Vanessa R.; Dudley, Donald J.; Silver, Robert M.; Coustan, Donald; Saade, George R.; Conway, Deborah; Varner, Michael W.; Stoll, Barbara; Pinar, Halit; Bukowski, Radek; Carpenter, Marshall; Goldenberg, Robert

    2013-01-01

    SUMMARY The Stillbirth Collaborative Research Network (SCRN) has conducted a multisite, population-based, case-control study, with prospective enrollment of stillbirths and live births at the time of delivery. This paper describes the general design, methods, and recruitment experience. The SCRN attempted to enroll all stillbirths and a representative sample of live births occurring to residents of pre-defined geographic catchment areas delivering at 59 hospitals associated with five clinical sites. Live births <32 weeks gestation and women of African descent were oversampled. The recruitment hospitals were chosen to ensure access to at least 90% of all stillbirths and live births to residents of the catchment areas. Participants underwent a standardized protocol including maternal interview, medical record abstraction, placental pathology, biospecimen testing, and, in stillbirths, postmortem examination. Recruitment began in March 2006 and was completed in September 2008 with 663 women with a stillbirth and 1932 women with a live birth enrolled, representing 69% and 63%, respectively, of the women identified. Additional surveillance for stillbirth continued through June 2009 and a follow-up of the case-control study participants was completed in December 2009. Among consenting women, there were high consent rates for the various study components. For the women with stillbirth, 95% agreed to maternal interview, chart abstraction, and placental pathologic examination; 91% of the women with live birth agreed to all of these components. Additionally, 84% of the women with stillbirth agreed to a fetal postmortem examination. This comprehensive study is poised to systematically study a wide range of potential causes of, and risk factors for, stillbirth and to better understand the scope and incidence of the problem. PMID:21819424

  20. Gender Consideration in Experiment Design for Air Break in Prebreathe

    NASA Technical Reports Server (NTRS)

    Conkin, Johnny; Gernhardt, Michael I.; Dervay, Joseph P.

    2007-01-01

    If gender is a confounder of the decompression sickness (DCS) or venous gas emboli (VGE) outcomes of a proposed air break in oxygen prebreathe (PB) project, then decisions about the final experiment design must be made. We evaluated whether the incidence of DCS and VGE from tests in altitude chambers over 20 years was different between men and women after resting and exercise prebreathe protocols. Nitrogen washout during PB is our primary risk mitigation strategy to prevent subsequent DCS and VGE in subjects. Bubbles in the pulmonary artery (venous blood) were detected from the precordial position using Doppler ultrasound bubble detectors. The subjects were monitored for VGE for four min at about 15 min intervals for the duration of the altitude exposure, with maximum bubble grade assigned a Spencer Grade of IV. There was no difference in DCS incidence between men and women in either PB protocol. The incidence of VGE and Grade IV VGE is statistically lower in women compared to men after resting PB. Even when 10 tests were compared with the Mantel-Haenszel chi-square where both men (n = 168) and women (n = 92) appeared, the p-value for VGE incidence was still significant at 0.03. The incidence of VGE and Grade IV VGE is not statistically lower in women compared to men after exercise PB. Even when six tests were compared with the Mantel-Haenszel chi-square where both men (n = 165) and women (n = 49) appeared, the p-value for VGE incidence was still not significant at 0.90. Our goal is to understand the risk of brief air breaks during PB without other confounding variables invalidating our conclusions. The cost to additionally account for the confounding role of gender on VGE outcome after resting PB is judged excessive. Our decision is to evaluate air breaks only in the exercise PB protocol, so there is no restriction on recruiting women as test subjects.
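    The stratified comparisons described above rely on a Mantel-Haenszel chi-square computed across tests. A minimal sketch of that statistic for a few hypothetical 2x2 strata (rows = men/women, columns = VGE yes/no) is shown below; the counts are invented and are not the chamber-test data.

    ```python
    # Mantel-Haenszel chi-square across K strata of 2x2 tables.
    # Each stratum: [[a, b], [c, d]] = [[men VGE+, men VGE-], [women VGE+, women VGE-]].
    import numpy as np

    strata = [np.array([[12, 20], [4, 15]]),     # hypothetical counts, one array per test
              np.array([[9, 25], [3, 18]]),
              np.array([[15, 30], [5, 22]])]

    num, var_sum = 0.0, 0.0
    for tab in strata:
        a, b = tab[0]
        c, d = tab[1]
        n1, n2 = a + b, c + d            # row totals (men, women)
        m1, m2 = a + c, b + d            # column totals (VGE+, VGE-)
        N = n1 + n2
        num += a - n1 * m1 / N           # observed minus expected count in the (1,1) cell
        var_sum += n1 * n2 * m1 * m2 / (N**2 * (N - 1))

    chi2_mh = (abs(num) - 0.5) ** 2 / var_sum    # with continuity correction
    print(f"Mantel-Haenszel chi-square: {chi2_mh:.2f}")
    ```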

  1. Skylab Earth Resource Experiment Package critical design review. [conference

    NASA Technical Reports Server (NTRS)

    1973-01-01

    An outline of the conference for reviewing the design of the EREP is presented. Systems reviewed include: the tape recorder, support equipment, view finder/tracking, support hardware, and the control and display panel.

  2. Viking dynamics experience with application to future payload design

    NASA Technical Reports Server (NTRS)

    Barrett, S.; Rader, W. P.; Payne, K. R.

    1978-01-01

    Analytical and test techniques are discussed. Areas in which hindsight indicated erroneous, redundant, or unnecessarily severe design and test specifications are identified. Recommendations are made for improvements in the dynamic design and criteria philosophy, aimed at reducing costs for payloads.

  3. Predictive Model for the Design of Zwitterionic Polymer Brushes: A Statistical Design of Experiments Approach.

    PubMed

    Kumar, Ramya; Lahann, Joerg

    2016-07-01

    The performance of polymer interfaces in biology is governed by a wide spectrum of interfacial properties. With the ultimate goal of identifying design parameters for stem cell culture coatings, we developed a statistical model that describes the dependence of brush properties on surface-initiated polymerization (SIP) parameters. Employing a design of experiments (DOE) approach, we identified operating boundaries within which four gel architecture regimes can be realized, including a new regime of associated brushes in thin films. Our statistical model can accurately predict the brush thickness and the degree of intermolecular association of poly[{2-(methacryloyloxy) ethyl} dimethyl-(3-sulfopropyl) ammonium hydroxide] (PMEDSAH), a previously reported synthetic substrate for feeder-free and xeno-free culture of human embryonic stem cells. DOE-based multifunctional predictions offer a powerful quantitative framework for designing polymer interfaces. For example, model predictions can be used to decrease the critical thickness at which the wettability transition occurs by simply increasing the catalyst quantity from 1 to 3 mol %.
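    A minimal sketch of the kind of DOE regression model described here: a quadratic response surface fitted to hypothetical brush-thickness measurements over two coded SIP factors (labelled here as catalyst fraction and polymerization time). The design, data, and coefficients are invented for illustration and are not the published PMEDSAH model.

    ```python
    # Quadratic response-surface fit: thickness ~ f(x1 = catalyst, x2 = time), coded units.
    import numpy as np

    # Hypothetical 3x3 face-centred design in two coded factors.
    X1, X2 = np.meshgrid([-1, 0, 1], [-1, 0, 1])
    x1, x2 = X1.ravel(), X2.ravel()

    rng = np.random.default_rng(3)
    thickness = 120 + 25 * x1 + 10 * x2 - 8 * x1**2 + 5 * x1 * x2 + rng.normal(0, 3, x1.size)

    # Design matrix for the full quadratic model and its least-squares fit.
    A = np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1**2, x2**2])
    coef, *_ = np.linalg.lstsq(A, thickness, rcond=None)
    terms = ["1", "x1", "x2", "x1*x2", "x1^2", "x2^2"]
    print(dict(zip(terms, np.round(coef, 2))))

    # Predict thickness at a new setting, e.g. catalyst at +0.5 and time at -0.5 (coded).
    x_new = np.array([1.0, 0.5, -0.5, 0.5 * -0.5, 0.5**2, (-0.5)**2])
    print("predicted thickness:", round(float(x_new @ coef), 1), "(arbitrary units)")
    ```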

  4. Predictive Model for the Design of Zwitterionic Polymer Brushes: A Statistical Design of Experiments Approach.

    PubMed

    Kumar, Ramya; Lahann, Joerg

    2016-07-01

    The performance of polymer interfaces in biology is governed by a wide spectrum of interfacial properties. With the ultimate goal of identifying design parameters for stem cell culture coatings, we developed a statistical model that describes the dependence of brush properties on surface-initiated polymerization (SIP) parameters. Employing a design of experiments (DOE) approach, we identified operating boundaries within which four gel architecture regimes can be realized, including a new regime of associated brushes in thin films. Our statistical model can accurately predict the brush thickness and the degree of intermolecular association of poly[{2-(methacryloyloxy) ethyl} dimethyl-(3-sulfopropyl) ammonium hydroxide] (PMEDSAH), a previously reported synthetic substrate for feeder-free and xeno-free culture of human embryonic stem cells. DOE-based multifunctional predictions offer a powerful quantitative framework for designing polymer interfaces. For example, model predictions can be used to decrease the critical thickness at which the wettability transition occurs by simply increasing the catalyst quantity from 1 to 3 mol %. PMID:27268965

  5. Model-Driven Design: Systematically Building Integrated Blended Learning Experiences

    ERIC Educational Resources Information Center

    Laster, Stephen

    2010-01-01

    Developing and delivering curricula that are integrated and that use blended learning techniques requires a highly orchestrated design. While institutions have demonstrated the ability to design complex curricula on an ad-hoc basis, these projects are generally successful at a great human and capital cost. Model-driven design provides a…

  6. Design of experiments for zeroth and first-order reaction rates.

    PubMed

    Amo-Salas, Mariano; Martín-Martín, Raúl; Rodríguez-Aragón, Licesio J

    2014-09-01

    This work presents optimum designs for reaction rate experiments. In these experiments, the times at which observations are to be made and the temperatures at which reactions are to be run need to be designed. Observations are performed over time under isothermal conditions. Each experiment is run at a fixed temperature so that the reaction can be measured at the designed times. A correlation structure has been considered for these observations, made under isothermal conditions on the same reaction. D-optimum designs for zeroth and first-order reaction rates are the aim of our work. These designs provide the temperatures for the isothermal experiments and the observation times that yield the most accurate estimates of the unknown parameters. D-optimum designs have been obtained for a single observation in each isothermal experiment and for several correlated observations. The robustness of the optimum designs over ranges of the correlation parameter and comparisons of the information gathered by different designs are also shown.
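    A toy version of the D-optimality idea can be shown for the simplest case: a first-order decay c(t) = c0*exp(-k*t) observed with independent errors at one fixed temperature, choosing two observation times that maximize the determinant of the Fisher information for (c0, k). The nominal parameter values are assumptions, and the published designs additionally handle the temperature dimension and correlated observations, which this sketch omits.

    ```python
    # Toy local D-optimal search for two observation times, model c(t) = c0 * exp(-k * t).
    import itertools
    import numpy as np

    c0_nom, k_nom = 1.0, 0.2      # nominal parameter values (local D-optimality)

    def info_matrix(times):
        F = np.zeros((2, 2))
        for t in times:
            e = np.exp(-k_nom * t)
            J = np.array([e, -c0_nom * t * e])   # sensitivities d c/d c0 and d c/d k
            F += np.outer(J, J)
        return F

    candidates = np.linspace(0.1, 30.0, 300)
    best = max(itertools.combinations(candidates, 2),
               key=lambda ts: np.linalg.det(info_matrix(ts)))
    print("D-optimal time pair:", tuple(round(t, 2) for t in best))
    # Expect one time at the lower bound and the other near t_min + 1/k.
    ```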

  7. Design of experiments for zeroth and first-order reaction rates.

    PubMed

    Amo-Salas, Mariano; Martín-Martín, Raúl; Rodríguez-Aragón, Licesio J

    2014-09-01

    This work presents optimum designs for reaction rate experiments. In these experiments, the times at which observations are to be made and the temperatures at which reactions are to be run need to be designed. Observations are performed over time under isothermal conditions. Each experiment is run at a fixed temperature so that the reaction can be measured at the designed times. A correlation structure has been considered for these observations, made under isothermal conditions on the same reaction. D-optimum designs for zeroth and first-order reaction rates are the aim of our work. These designs provide the temperatures for the isothermal experiments and the observation times that yield the most accurate estimates of the unknown parameters. D-optimum designs have been obtained for a single observation in each isothermal experiment and for several correlated observations. The robustness of the optimum designs over ranges of the correlation parameter and comparisons of the information gathered by different designs are also shown. PMID:27535778

  8. A Learning, Research and Development Framework to Design for a "Holistic" Learning Experience

    ERIC Educational Resources Information Center

    Carroll, Fiona; Kop, Rita

    2011-01-01

    The design of experiences and, in particular, educational experiences is a complex matter and involves not only using effective technologies and applying cognitive triggers, but there is a need to "think outside the box" in order to also design for the affective dimension of human experiences; the impressions, feelings and interactions that a…

  9. The Historical and Situated Nature of Design Experiments--Implications for Data Analysis

    ERIC Educational Resources Information Center

    Krange, I.; Ludvigsen, Sten

    2009-01-01

    This article is a methodological contribution to the use of design experiments in educational research. We will discuss the implications of a historical and situated interpretation to design experiments, the consequences this has for the analysis of the collected data and empirically based suggestions to improve the designs of the computer-based…

  10. The BWR advanced fuel design experience using Studsvik CMS

    SciTech Connect

    DiGiovine, A.S.; Gibbon, S.H.; Wiksell, G.

    1996-12-31

    The current trend within the nuclear industry is to maximize generation by extending cycle lengths and taking outages as infrequently as possible. As a result, many utilities have begun to use fuel designed to meet these more demanding requirements. These fuel designs are significantly more heterogeneous in mechanical and neutronic detail than prior designs. The question arises as to how existing in-core fuel management codes, such as Studsvik CMS perform in modeling cores containing these designs. While this issue pertains to both pressurized water reactors (PWRs) and boiling water reactors (BWRs), this summary focuses on BWR applications.

  11. Open Source Software for Experiment Design and Control. (tutorial)

    ERIC Educational Resources Information Center

    Hillenbrand, James M.; Gayvert, Robert T.

    2005-01-01

    The purpose of this paper is to describe a software package that can be used for performing such routine tasks as controlling listening experiments (e.g., simple labeling, discrimination, sentence intelligibility, and magnitude estimation), recording responses and response latencies, analyzing and plotting the results of those experiments,…

  12. Applying modeling Results in designing a global tropospheric experiment

    NASA Technical Reports Server (NTRS)

    1982-01-01

    A set of field experiments and advanced modeling studies which provide a strategy for a program of global tropospheric experiments was identified. An expanded effort to develop space applications for tropospheric air quality monitoring and studies was recommended. The tropospheric ozone, carbon, nitrogen, and sulfur cycles are addressed. Stratospheric-tropospheric exchange is discussed. Fast photochemical processes in the free troposphere are considered.

  13. On the design of experiments to study extreme field limits

    NASA Astrophysics Data System (ADS)

    Bulanov, S. S.; Chen, M.; Schroeder, C. B.; Esarey, E.; Leemans, W. P.; Bulanov, S. V.; Esirkepov, T. Zh.; Kando, M.; Koga, J. K.; Zhidkov, A. G.; Chen, P.; Mur, V. D.; Narozhny, N. B.; Popov, V. S.; Thomas, A. G. R.; Korn, G.

    2012-12-01

    We propose experiments on the collision of high intensity electromagnetic pulses with electron bunches and on the collision of multiple electromagnetic pulses for studying extreme field limits in the nonlinear interaction of electromagnetic waves. The effects of nonlinear QED will be revealed in these laser plasma experiments.

  14. Laser communication experiment. Volume 1: Design study report: Spacecraft transceiver. Part 1: Transceiver design

    NASA Technical Reports Server (NTRS)

    1970-01-01

    The ATS-F Laser Communications Experiment (LCE) is the first significant step in the application of laser systems to space communications. The space-qualified laser communications system being developed in this experiment, and the data resulting from its successful deployment in space, will be applicable to the use of laser communications systems in a wide variety of manned as well as unmanned space missions, both near earth and in deep space. Particular future NASA missions which can benefit from this effort are the Tracking and Data Relay Satellite System and the Earth Resources Satellites. The LCE makes use of carbon dioxide lasers to establish simultaneous, two-way communication between the ATS-F synchronous satellite and a ground station. In addition, the LCE is designed to permit communication with a similar spacecraft transceiver proposed to be flown on ATS-G, nominally one year after the launch of ATS-F. This would be the first attempt to employ lasers for satellite-to-satellite communications.

  15. Kinetics experiments and bench-scale system: Background, design, and preliminary experiments

    SciTech Connect

    Rofer, C.K.

    1987-10-01

    The project, Supercritical Water Oxidation of Hazardous Chemical Waste, is a Hazardous Waste Remedial Actions Program (HAZWRAP) Research and Development task being carried out by the Los Alamos National Laboratory. Its objective is to obtain information for use in understanding the basic technology and for scaling up and applying oxidation in supercritical water as a viable process for treating a variety of DOE-DP waste streams. This report gives the background and rationale for kinetics experiments on oxidation in supercritical water being carried out as a part of this HAZWRAP Research and Development task. It discusses supercritical fluid properties and their relevance to applying this process to the destruction of hazardous wastes. An overview is given of the small emerging industry based on applications of supercritical water oxidation. Factors that could lead to additional applications are listed. Modeling studies are described as a basis for the experimental design. The report describes plug flow reactor and batch reactor systems, and presents preliminary results. 28 refs., 4 figs., 5 tabs.

  16. Space Shuttle Orbiter thermal protection system design and flight experience

    NASA Technical Reports Server (NTRS)

    Curry, Donald M.

    1993-01-01

    The Space Shuttle Orbiter Thermal Protection System materials, design approaches associated with each material, and the operational performance experienced during fifty-five successful flights are described. The flights to date indicate that the thermal and structural design requirements were met and that the overall performance was outstanding.

  17. The Future of Management as Design: A Thought Experiment

    ERIC Educational Resources Information Center

    Bouchard, Veronique; del Forno, Leon

    2012-01-01

    Purpose: Management practices and education are presently in a stage of reappraisal and a growing number of scholars and experts are suggesting that managers should be taught and adopt the approach and methodologies of designers. The purpose of this paper is to imagine the impact of this move and to try and foresee whether "management as design"…

  18. Redesigning the Urban Design Studio: Two Learning Experiments

    ERIC Educational Resources Information Center

    Pak, Burak; Verbeke, Johan

    2013-01-01

    The main aim of this paper is to discuss how the combination of Web 2.0, social media and geographic technologies can provide opportunities for learning and new forms of participation in an urban design studio. This discussion is mainly based on our recent findings from two experimental urban design studio setups as well as former research and…

  19. Advances in growth chart design and use: the UK experience.

    PubMed

    Wright, Charlotte M; Williams, Anthony F; Cole, Tim J

    2013-01-01

    As part of the process of adopting the WHO standard in the United Kingdom, the Royal College of Paediatrics and Child Health (RCPCH) was commissioned by the UK Department of Health to design new UK-WHO growth charts. The working group for this project combined expertise ranging from statistics and graphic design to qualitative research, as well as paediatrics, nursing and dietetics. New charts for children under 4 years were published in 2009 and are now widely used in the UK and beyond (www.growthcharts.rcpch.ac.uk). This paper will describe what we have learned in general about the process of designing charts and how these principles were applied to the design of a novel chart designed specifically for sick and premature infants. A successful design first requires clarity about the exact purpose of the chart and who will use it. The layout of the chart can then be varied in many ways to fit that use and ensure users are not misled. Users need consistent and well-evidenced rules for chart use. Drafting the instructions serves as a powerful test of the validity and clarity of the design. However, charts need also to be formally evaluated, as expert views will not reflect those of the average user. The Neonatal and Infant Close Monitoring (NICM) chart included various novel design features, including date boxes for gestational age adjustment and low SD lines to help assess very small infants. It was evaluated at three stages using plotting exercises and each phase led to substantial design changes. Growth charts are conceptually very complex, with the capacity to mislead as well as inform and should always be formally evaluated before implementation.

  20. Design reuse experience of space and hazardous operations robots

    NASA Technical Reports Server (NTRS)

    Oneil, P. Graham

    1994-01-01

    A comparison of design drivers for space and hazardous nuclear waste operating robots details similarities and differences in operations, performance and environmental parameters for these critical environments. The similarities are exploited to provide low risk system components based on reuse principles and design knowledge. Risk reduction techniques are used for bridging areas of significant differences. As an example, risk reduction of a new sensor design for nuclear environment operations is employed to provide upgradeable replacement units in a reusable architecture for significantly higher levels of radiation.

  1. KiloPower Project - KRUSTY Experiment Nuclear Design

    SciTech Connect

    Poston, David Irvin; Godfroy, Thomas; Mcclure, Patrick Ray; Sanchez, Rene Gerardo

    2015-07-20

    This PowerPoint presentation covers the following topics: Reference Kilopower configuration; Reference KRUSTY configuration; KRUSTY design sensitivities; KRUSTY reactivity coefficients; KRUSTY criticality safety and control; KRUSTY core activation/dose; and KRUSTY shielding, room activation/dose.

  2. Geothermal FIT Design: International Experience and U.S. Considerations

    SciTech Connect

    Rickerson, W.; Gifford, J.; Grace, R.; Cory, K.

    2012-08-01

    Developing power plants is a risky endeavor, whether conventional or renewable generation. Feed-in tariff (FIT) policies can be designed to address some of these risks, and their design can be tailored to geothermal electric plant development. Geothermal projects face risks similar to other generation project development, including finding buyers for power, ensuring adequate transmission capacity, competing to supply electricity and/or renewable energy certificates (RECs), securing reliable revenue streams, navigating the legal issues related to project development, and reacting to changes in existing regulations or incentives. Although FITs have not been created specifically for geothermal in the United States to date, a variety of FIT design options could reduce geothermal power plant development risks and are explored. This analysis focuses on the design of FIT incentive policies for geothermal electric projects and how FITs can be used to reduce risks (excluding drilling unproductive exploratory wells).

  3. An R package for simulation experiments evaluating clinical trial designs.

    PubMed

    Wang, Yuanyuan; Day, Roger

    2010-01-01

    This paper presents an open-source application for evaluating competing clinical trial (CT) designs using simulations. The S4 system of classes and methods is utilized. Using object-oriented programming provides extensibility through careful, clear interface specification; using R, an open-source widely-used statistical language, makes the application extendible by the people who design CTs: biostatisticians. Four key classes define the specifications of the population models, CT designs, outcome models and evaluation criteria. Five key methods define the interfaces for generating patient baseline characteristics, stopping rule, assigning treatment, generating patient outcomes and calculating the criteria. Documentation of their connections with the user input screens, with the central simulation loop, and with each other facilitates the extensibility. New subclasses and instances of existing classes meeting these interfaces can integrate immediately into the application. To illustrate the application, we evaluate the effect of patient pharmacokinetic heterogeneity on the performance of a common Phase I "3+3" design. PMID:21347151
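    The simulation architecture described here can be sketched, in language-agnostic form (Python rather than the paper's R/S4 classes), as a Monte Carlo loop that repeatedly runs a standard Phase I "3+3" escalation against an assumed dose-toxicity curve and tallies the selected maximum tolerated dose (MTD). The toxicity probabilities are invented for illustration; this is not the package's API.

    ```python
    # Monte Carlo evaluation of a standard "3+3" dose-escalation design.
    import numpy as np

    TOX_PROB = [0.05, 0.10, 0.20, 0.35, 0.50]   # hypothetical per-dose DLT probabilities
    rng = np.random.default_rng(7)

    def simulate_3plus3(tox_prob):
        """Return the index of the declared MTD (-1 if even the lowest dose is too toxic)."""
        level = 0
        while level < len(tox_prob):
            dlts = rng.binomial(3, tox_prob[level])          # first cohort of 3
            if dlts == 0:
                level += 1                                   # escalate
            elif dlts == 1:
                dlts += rng.binomial(3, tox_prob[level])     # expand the cohort to 6
                if dlts <= 1:
                    level += 1
                else:
                    return level - 1                         # MTD = previous level
            else:
                return level - 1
        return len(tox_prob) - 1                             # escalated through all levels

    picks = np.array([simulate_3plus3(TOX_PROB) for _ in range(10_000)])
    freq = np.bincount(picks + 1, minlength=len(TOX_PROB) + 1) / picks.size
    print("P(no tolerated dose), then P(MTD = dose 1..5):", np.round(freq, 3))
    ```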

  4. Review Committee report on the conceptual design of the Tokamak Physics Experiment

    SciTech Connect

    Not Available

    1993-04-01

    This report discusses the following topics on the conceptual design of the Tokamak Physics Experiment: Role and mission of TPX; overview of design; physics design assessment; engineering design assessment; evaluation of cost, schedule, and management plans; and, environment safety and health.

  5. Being in the Users' Shoes: Anticipating Experience while Designing Online Courses

    ERIC Educational Resources Information Center

    Rapanta, Chrysi; Cantoni, Lorenzo

    2014-01-01

    While user-centred design and user experience are given much attention in the e-learning design field, no research has been found on how users are actually represented in the discussions during the design of online courses. In this paper we identify how and when end-users' experience--be they students or tutors--emerges in designers'…

  6. Structural Optimization of a Force Balance Using a Computational Experiment Design

    NASA Technical Reports Server (NTRS)

    Parker, P. A.; DeLoach, R.

    2002-01-01

    This paper proposes a new approach to force balance structural optimization featuring a computational experiment design. Currently, this multi-dimensional design process requires the designer to perform a simplification by executing parameter studies on a small subset of design variables. This one-factor-at-a-time approach varies a single variable while holding all others at a constant level. Consequently, subtle interactions among the design variables, which can be exploited to achieve the design objectives, are undetected. The proposed method combines Modern Design of Experiments techniques to direct the exploration of the multi-dimensional design space, and a finite element analysis code to generate the experimental data. To efficiently search for an optimum combination of design variables and minimize the computational resources, a sequential design strategy was employed. Experimental results from the optimization of a non-traditional force balance measurement section are presented. An approach to overcome the unique problems associated with the simultaneous optimization of multiple response criteria is described. A quantitative single-point design procedure that reflects the designer's subjective impression of the relative importance of various design objectives, and a graphical multi-response optimization procedure that provides further insights into available tradeoffs among competing design objectives, are illustrated. The proposed method enhances the intuition and experience of the designer by providing new perspectives on the relationships between the design variables and the competing design objectives, providing a systematic foundation for advancements in structural design.
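    The trade-off between competing objectives can be illustrated with a Derringer-style desirability combination of two surrogate responses. The quadratic surrogates, limits, and weights below are invented placeholders, not the balance models from the paper.

    ```python
    # Desirability-based trade-off between two competing design responses,
    # e.g. maximize measurement sensitivity while minimizing peak stress.
    import numpy as np
    from scipy.optimize import minimize

    def sensitivity(x):     # larger is better (hypothetical surrogate, coded variables)
        return 1.0 + 0.8 * x[0] - 0.3 * x[1] - 0.5 * x[0] ** 2

    def stress(x):          # smaller is better (hypothetical surrogate, coded variables)
        return 2.0 + 1.2 * x[0] + 0.9 * x[1] + 0.4 * x[0] * x[1]

    def d_larger(y, lo, hi):      # desirability for a larger-is-better response
        return np.clip((y - lo) / (hi - lo), 0.0, 1.0)

    def d_smaller(y, lo, hi):     # desirability for a smaller-is-better response
        return np.clip((hi - y) / (hi - lo), 0.0, 1.0)

    def overall(x, w_sens=0.6, w_stress=0.4):
        d1 = d_larger(sensitivity(x), lo=0.5, hi=1.5)
        d2 = d_smaller(stress(x), lo=1.0, hi=4.0)
        return (d1 ** w_sens) * (d2 ** w_stress)    # weighted geometric mean

    res = minimize(lambda x: -overall(x), x0=[0.0, 0.0], bounds=[(-1, 1), (-1, 1)])
    print("best coded design variables:", np.round(res.x, 3), "D =", round(-res.fun, 3))
    ```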

  7. DESIGN AND STATUS OF THE VISA II EXPERIMENT.

    SciTech Connect

    Andonian, G.; Babzien, M.; Ben-Zvi, I.; Yakimenko, Y.; et al.

    2004-03-24

    VISA II is the follow-up project to the successful Visible to Infrared SASE Amplifier (VISA) experiment at the Accelerator Test Facility (ATF) in Brookhaven National Lab (BNL). This paper will report the motivation for and status of the two main experiments associated with the VISA II program. One goal of VISA II is to perform an experimental study of the physics of a chirped beam SASE FEL at the upgraded facilities of the ATF. This requires a linearization of the transport line to preserve energy chirping of the electron beam at injection. The other planned project is a strong bunch compression experiment, where the electron bunch is compressed in the chicane and the dispersive beamline transport, allowing studies of deep saturation.

  8. The prototype design of the Stanford Relativity Gyro Experiment

    NASA Technical Reports Server (NTRS)

    Parkinson, Bradford W.; Everitt, C. W. Francis; Turneaure, John P.; Parmley, Richard T.

    1987-01-01

    The Stanford Relativity Gyroscope Experiment constitutes a fundamental test of Einstein's General Theory of Relativity, probing such heretofore untested aspects of the theory as those that relate to spin by means of drag-free satellite-borne gyroscopes. General Relativity's prediction of two orthogonal precessions (motional and geodetic) for a perfect Newtonian gyroscope in polar orbit has not yet been experimentally assessed, and will mark a significant advancement in experimental gravitation. The technology employed in the experiment has been under development for 25 years at NASA's Marshall Space Flight Center. Four fused quartz gyroscopes will be used.

  9. AGC-1 Experiment and Final Preliminary Design Report

    SciTech Connect

    Robert L. Bratton; Tim Burchell

    2006-08-01

    This report details the experimental plan and design as of the preliminary design review for the Advanced Test Reactor Graphite Creep-1 graphite compressive creep capsule. The capsule will contain five graphite grades that will be irradiated in the Advanced Test Reactor at the Idaho National Laboratory to determine the irradiation induced creep constants. Seven other grades of graphite will be irradiated to determine irradiated physical properties. The capsule will have an irradiation temperature of 900 C and a peak irradiation dose of 5.8 x 10{sup 21} n/cm{sup 2} [E > 0.1 MeV], or 4.2 displacements per atom.

  10. An Improved Design of a Simple Tubular Reactor Experiment.

    ERIC Educational Resources Information Center

    Asfour, Abdul-Fattah A.

    1985-01-01

    Background information, procedures used, and typical results obtained are provided for an experiment which: (1) examines the effect of residence time on conversion in a tubular flow reactor; and (2) compares the experimental conversions with those obtained from plug-flow and laminar-flow reactor models. (JN)
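    For first-order kinetics, the model comparison in point (2) reduces to two standard expressions: plug flow gives X = 1 - exp(-k*tau), while segregated laminar flow integrates the same kinetics over the laminar-flow residence-time distribution E(t) = tau^2/(2 t^3) for t >= tau/2. The rate constant and residence time below are arbitrary values for illustration, not data from the experiment.

    ```python
    # First-order conversion: plug-flow reactor vs. segregated laminar-flow reactor.
    import numpy as np
    from scipy.integrate import quad

    k = 0.5      # 1/min, hypothetical first-order rate constant
    tau = 4.0    # min, hypothetical mean residence time (V / v0)

    X_pfr = 1.0 - np.exp(-k * tau)

    # Laminar-flow RTD: E(t) = tau^2 / (2 t^3) for t >= tau/2, zero otherwise.
    integrand = lambda t: np.exp(-k * t) * tau**2 / (2.0 * t**3)
    unconverted, _ = quad(integrand, tau / 2.0, np.inf)
    X_lfr = 1.0 - unconverted

    print(f"PFR conversion: {X_pfr:.3f}, laminar-flow conversion: {X_lfr:.3f}")
    ```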

  11. Conditional Optimal Design in Three- and Four-Level Experiments

    ERIC Educational Resources Information Center

    Hedges, Larry V.; Borenstein, Michael

    2014-01-01

    The precision of estimates of treatment effects in multilevel experiments depends on the sample sizes chosen at each level. It is often desirable to choose sample sizes at each level to obtain the smallest variance for a fixed total cost, that is, to obtain optimal sample allocation. This article extends previous results on optimal allocation to…
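    For the simplest two-level case (clusters randomized to treatment, with a cost per cluster and a cost per subject), the optimal cluster size has the well-known closed form n* = sqrt((c_cluster/c_subject) * (1 - rho)/rho). The sketch below evaluates it, together with the number of affordable clusters and the resulting variance of a standardized treatment-effect estimate, for hypothetical costs and intraclass correlation; the article's three- and four-level extensions add further variance components but follow the same logic.

    ```python
    # Optimal sample allocation for a two-level cluster-randomized experiment.
    import math

    c_cluster, c_subject = 300.0, 10.0   # hypothetical cost per cluster and per subject
    rho = 0.10                           # hypothetical intraclass correlation
    budget = 20000.0                     # hypothetical total budget

    n_opt = math.sqrt((c_cluster / c_subject) * (1.0 - rho) / rho)   # subjects per cluster
    n_opt = max(1, round(n_opt))
    J = int(budget // (c_cluster + c_subject * n_opt))               # affordable clusters

    # Variance of the standardized treatment-effect estimate for a balanced design
    # (J clusters split equally between two arms, n_opt subjects per cluster).
    var_effect = 4.0 * (rho + (1.0 - rho) / n_opt) / J
    print(f"n per cluster: {n_opt}, clusters: {J}, effect-estimate variance: {var_effect:.4f}")
    ```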

  12. An experiment on simplifying conjoint analysis designs for measuring preferences.

    PubMed

    Maddala, Tara; Phillips, Kathryn A; Reed Johnson, F

    2003-12-01

    In conjoint analysis (CA) studies, choosing between scenarios with multiple health attributes may be demanding for respondents. This study examined whether simplifying the choice task in CA designs, by using a design with more overlap of attribute levels, provides advantages over standard minimal-overlap methods. Two experimental conditions, minimal and increased-overlap discrete choice CA designs, were administered to 353 respondents as part of a larger HIV testing preference survey. In the minimal-overlap survey, all six attribute levels were allowed to vary. In the increased-overlap survey, an average of two attribute levels were the same between each set of scenarios. We hypothesized that the increased-overlap design would reduce cognitive burden, while minimally impacting statistical efficiency. We did not find any significant improvement in consistency, willingness to trade, perceived difficulty, fatigue, or efficiency, although several results were in the expected direction. However, evidence suggested that there were differences in stated preferences. The results increase our understanding of how respondents answer CA questions and how to improve future surveys.

  13. Structuring the Group Experience: A Format for Designing Psychoeducational Groups.

    ERIC Educational Resources Information Center

    Furr, Susan R.

    2000-01-01

    Presents six-step model for moving from a general statement of purpose to a psychoeducational group design that includes didactic content, experiential activities, and processing. By following this model the group facilitator will be able to develop a psychoeducational group that provides a logical sequence of learning activities fostering…

  14. Unknown Gases: Student-Designed Experiments in the Introductory Laboratory.

    ERIC Educational Resources Information Center

    Hanson, John; Hoyt, Tim

    2002-01-01

    Introductory students design and carry-out experimental procedures to determine the identity of three unknown gases from a list of eight possibilities: air, nitrogen, oxygen, argon, carbon dioxide, helium, methane, and hydrogen. Students are excited and motivated by the opportunity to come up with their own experimental approach to solving a…

  15. Managerial-Skills Development: An Experience in Program Design

    ERIC Educational Resources Information Center

    Thorne, Edward H.; Marshall, Jean L.

    1976-01-01

    The article is an overview of the design of a Managerial Skills Development Program Model in an industrial setting which was based on adult education principles. Discussed are: program objectives and philosophy, educative environment, group commitment, group-centered action, program evaluation and revision, manager/instructor teams, and…

  16. User-Centered Design in Practice: The Brown University Experience

    ERIC Educational Resources Information Center

    Bordac, Sarah; Rainwater, Jean

    2008-01-01

    This article presents a case study in user-centered design that explores the needs and preferences of undergraduate users. An analysis of LibQual+ and other user surveys, interviews with public service staff, and a formal American with Disabilities Act accessibility review served as the basis for planning a redesign of the Brown University…

  17. Modified modular imaging system designed for a sounding rocket experiment

    NASA Astrophysics Data System (ADS)

    Veach, Todd J.; Scowen, Paul A.; Beasley, Matthew; Nikzad, Shouleh

    2012-09-01

    We present the design and system calibration results from the fabrication of a charge-coupled device (CCD) based imaging system designed using a modified modular imager cell (MIC) used in an ultraviolet sounding rocket mission. The heart of the imaging system is the MIC, which provides the video pre-amplifier circuitry and CCD clock level filtering. The MIC is designed with standard four-layer FR4 printed circuit board (PCB) with surface mount and through-hole components for ease of testing and lower fabrication cost. The imager is a 3.5k by 3.5k LBNL p-channel CCD with enhanced quantum efficiency response in the UV using delta-doping technology at JPL. The recently released PCIe/104 Small-Cam CCD controller from Astronomical Research Cameras, Inc (ARC) performs readout of the detector. The PCIe/104 Small-Cam system has the same capabilities as its larger PCI brethren, but in a smaller form factor, which makes it ideally suited for sub-orbital ballistic missions. The overall control is then accomplished using a PCIe/104 computer from RTD Embedded Technologies, Inc. The design, fabrication, and testing were done at the Laboratory for Astronomical and Space Instrumentation (LASI) at Arizona State University. Integration and flight calibration are to be completed at the University of Colorado Boulder before integration into CHESS.

  18. Staying True to the Core: Designing the Future Academic Library Experience

    ERIC Educational Resources Information Center

    Bell, Steven J.

    2014-01-01

    In 2014, the practice of user experience design in academic libraries continues to evolve. It is typically applied in the context of interactions with digital interfaces. Some academic librarians are applying user experience approaches more broadly to design both environments and services with human-centered strategies. As the competition for the…

  19. The design of experiments in the rubber industry: A European viewpoint

    SciTech Connect

    Hill, A.

    1991-04-01

    This article discusses the evolution of experiment design in Europe and Japan beginning about 1947. The topics include the use of statistics, optimization, relevant publications, experiment design in rubber manufacturing, product development, use of computers in compound development, and data analysis by computer. The review covers manufacturing from rubber to pharmaceuticals to automobiles.

  20. Carbon Taxes. A Review of Experience and Policy Design Considerations

    SciTech Connect

    Sumner, Jenny; Bird, Lori; Smith, Hillary

    2009-12-01

    State and local governments in the United States are evaluating a wide range of policies to reduce carbon emissions, including, in some instances, carbon taxes, which have existed internationally for nearly 20 years. This report reviews existing carbon tax policies both internationally and in the United States. It also analyzes carbon policy design and effectiveness. Design considerations include which sectors to tax, where to set the tax rate, how to use tax revenues, what the impact will be on consumers, and how to ensure emissions reduction goals are achieved. Emission reductions that are due to carbon taxes can be difficult to measure, though some jurisdictions have quantified reductions in overall emissions and other jurisdictions have examined impacts that are due to programs funded by carbon tax revenues.

  2. Vanguard/PLACE experiment system design and test plan

    NASA Technical Reports Server (NTRS)

    Taylor, R. E.

    1973-01-01

    The design, development, and testing of the NASA-GSFC Position Location and Aircraft Communications Equipment (PLACE) at C band frequency are discussed. The equipment was installed on the USNS Vanguard. The tests involved a sea test to evaluate the position-location, 2-way voice, and 2-way data communications capability of PLACE, and a trilateration test to position-fix the ATS-5 satellite using the PLACE system.

  3. Student-Designed Fluid Experiment for DIME Competition

    NASA Technical Reports Server (NTRS)

    2002-01-01

    Student-designed and -built apparatus for the second Dropping in a Microgravity Environment (DIME) competition held April 23-25, 2002, at NASA's Glenn Research Center. Competitors included two teams from Sycamore High School, Cincinnati, OH, and one each from Bay High School, Bay Village, OH, and COSI Academy, Columbus, OH. DIME is part of NASA's education and outreach activities. Details are on line at http://microgravity.grc.nasa.gov/DIME_2002.html.

  4. Integrated predictive modelling simulations of burning plasma experiment designs

    NASA Astrophysics Data System (ADS)

    Bateman, Glenn; Onjun, Thawatchai; Kritz, Arnold H.

    2003-11-01

    Models for the height of the pedestal at the edge of H-mode plasmas (Onjun T et al 2002 Phys. Plasmas 9 5018) are used together with the Multi-Mode core transport model (Bateman G et al 1998 Phys. Plasmas 5 1793) in the BALDUR integrated predictive modelling code to predict the performance of the ITER (Aymar A et al 2002 Plasma Phys. Control. Fusion 44 519), FIRE (Meade D M et al 2001 Fusion Technol. 39 336), and IGNITOR (Coppi B et al 2001 Nucl. Fusion 41 1253) fusion reactor designs. The simulation protocol used in this paper is tested by comparing predicted temperature and density profiles against experimental data from 33 H-mode discharges in the JET (Rebut P H et al 1985 Nucl. Fusion 25 1011) and DIII-D (Luxon J L et al 1985 Fusion Technol. 8 441) tokamaks. The sensitivities of the predictions are evaluated for the burning plasma experimental designs by using variations of the pedestal temperature model that are one standard deviation above and below the standard model. Simulations of the fusion reactor designs are carried out for scans in which the plasma density and auxiliary heating power are varied.

  5. Aerospace Engineering Systems and the Advanced Design Technologies Testbed Experience

    NASA Technical Reports Server (NTRS)

    VanDalsem, William R.; Livingston, Mary E.; Melton, John E.; Torres, Francisco J.; Stremel, Paul M.

    1999-01-01

    Continuous improvement of aerospace product development processes is a driving requirement across much of the aerospace community. As up to 90% of the cost of an aerospace product is committed during the first 10% of the development cycle, there is a strong emphasis on capturing, creating, and communicating better information (both requirements and performance) early in the product development process. The community has responded by pursuing the development of computer-based systems designed to enhance the decision-making capabilities of product development individuals and teams. Recently, the historical foci on sharing the geometrical representation and on configuration management are being augmented: 1) Physics-based analysis tools for filling the design space database; 2) Distributed computational resources to reduce response time and cost; 3) Web-based technologies to relieve machine-dependence; and 4) Artificial intelligence technologies to accelerate processes and reduce process variability. The Advanced Design Technologies Testbed (ADTT) activity at NASA Ames Research Center was initiated to study the strengths and weaknesses of the technologies supporting each of these trends, as well as the overall impact of the combination of these trends on a product development event. Lessons learned and recommendations for future activities are reported.

  6. Skylab SO71/SO72 circadian periodicity experiment. [experimental design and checkout of hardware

    NASA Technical Reports Server (NTRS)

    Fairchild, M. K.; Hartmann, R. A.

    1973-01-01

    The circadian rhythm hardware activities from 1965 through 1973 are considered. A brief history of the programs leading to the development of the combined Skylab SO71/SO72 Circadian Periodicity Experiment (CPE) is given. SO71 is the Skylab experiment number designating the pocket mouse circadian experiment, and SO72 designates the vinegar gnat circadian experiment. Final design modifications and checkout of the CPE, integration testing with the Apollo service module CSM 117 and the launch preparation and support tasks at Kennedy Space Center are reported.

  7. Design of experiments with multiple independent variables: a resource management perspective on complete and reduced factorial designs.

    PubMed

    Collins, Linda M; Dziak, John J; Li, Runze

    2009-09-01

    An investigator who plans to conduct an experiment with multiple independent variables must decide whether to use a complete or reduced factorial design. This article advocates a resource management perspective on making this decision, in which the investigator seeks a strategic balance between service to scientific objectives and economy. Considerations in making design decisions include whether research questions are framed as main effects or simple effects; whether and which effects are aliased (confounded) in a particular design; the number of experimental conditions that must be implemented in a particular design and the number of experimental subjects the design requires to maintain the desired level of statistical power; and the costs associated with implementing experimental conditions and obtaining experimental subjects. In this article 4 design options are compared: complete factorial, individual experiments, single factor, and fractional factorial. Complete and fractional factorial designs and single-factor designs are generally more economical than conducting individual experiments on each factor. Although relatively unfamiliar to behavioral scientists, fractional factorial designs merit serious consideration because of their economy and versatility.
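
    To make the run-count and aliasing trade-off described above concrete, the following minimal Python sketch (illustrative only, not from the article; the factor names A-D are hypothetical) builds a complete 2^4 factorial and a 2^(4-1) half fraction generated by D = ABC, then shows that the main effect of A becomes aliased with the BCD interaction in the reduced design.

      from itertools import product

      factors = ["A", "B", "C", "D"]

      # Complete factorial: every combination of the two coded levels (16 runs).
      full = [dict(zip(factors, levels)) for levels in product((-1, 1), repeat=4)]

      # Half fraction with generator D = ABC (defining relation I = ABCD):
      # keep only runs whose coded levels satisfy A*B*C*D = +1 (8 runs).
      half = [run for run in full if run["A"] * run["B"] * run["C"] * run["D"] == 1]

      print(f"complete factorial: {len(full)} runs, half fraction: {len(half)} runs")

      # In the half fraction the contrast column for A is identical to the column
      # for the BCD interaction, so the two effects are aliased (confounded).
      col_A = [run["A"] for run in half]
      col_BCD = [run["B"] * run["C"] * run["D"] for run in half]
      print("A aliased with BCD:", col_A == col_BCD)

    Running the sketch reports 16 runs versus 8 and confirms the aliasing, which is exactly the economy-versus-information balance the authors frame as a resource management decision.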

  8. Mini-columns for Conducting Breakthrough Experiments. Design and Construction

    SciTech Connect

    Dittrich, Timothy M.; Reimus, Paul William; Ware, Stuart Douglas

    2015-06-11

    Experiments with moderately and strongly sorbing radionuclides (i.e., U, Cs, Am) have shown that sorption between experimental solutions and traditional column materials must be accounted for to accurately determine stationary phase or porous media sorption properties (i.e., sorption site density, sorption site reaction rate coefficients, and partition coefficients or Kd values). This report details the materials and construction of mini-columns for use in breakthrough experiments to allow for accurate measurement and modeling of sorption parameters. Material selection, construction techniques, wet packing of columns, tubing connections, and lessons learned are addressed.

  9. Lockheed design of a wind satellite (WINDSAT) experiment

    NASA Technical Reports Server (NTRS)

    Osmundson, John S.; Martin, Stephen C.

    1985-01-01

    WINDSAT is a proposed space based global wind measuring system. A Shuttle-borne experiment is proposed as a proof-of-principle demonstration before development of a full operational system. WINDSAT goals are to measure wind speed and direction to + or - 1 m/s and 10 deg accuracy over the entire earth from 0 to 20 km altitude with 1 km altitude resolution. The wind measuring instrument is a coherent lidar incorporating a pulsed CO2 TEA laser transmitter and a continuously scanning 1.25 m diameter optical system. The wind speed is measured by heterodyne detection of the backscattered laser radiation and measurement of its frequency shift.

  10. Pixel multichip module design for a high energy physics experiment

    SciTech Connect

    Guilherme Cardoso et al.

    2003-11-05

    At Fermilab, a pixel detector multichip module is being developed for the BTeV experiment. The module is composed of three layers. The lowest layer is formed by the readout integrated circuits (ICs). The back of the ICs is in thermal contact with the supporting structure, while the top is flip-chip bump-bonded to the pixel sensor. A low-mass flex-circuit interconnect is glued on top of this assembly, and the readout IC pads are wire-bonded to the circuit. This paper presents recent results on the development of a multichip module prototype and summarizes its performance characteristics.

  11. On transonic flow models for optimized design and experiment

    NASA Astrophysics Data System (ADS)

    Stodůlka, Jiří; Sobieczky, Helmut

    2014-03-01

    In this paper, a near-sonic flow theory for flows with small perturbations to a sonic parallel flow is developed. The theory is based on the potential flow of a compressible fluid and makes it possible to obtain an exact solution for the flow parameters past transonic cusped airfoils together with their geometrical description. The generated airfoil shapes are tested using the ANSYS Fluent CFD code to validate the results. The numerical results obtained from this general-purpose commercial code show good agreement with the theory and confirm their value for future work in transonic design.

  12. Student-Designed Fluid Experiment for DIME Competition

    NASA Technical Reports Server (NTRS)

    2002-01-01

    Test tubes to hold different types of fluids while in free-fall were among the student-designed items for the second Dropping in a Microgravity Environment (DIME) competition held April 23-25, 2002, at NASA's Glenn Research Center. Competitors included two teams from Sycamore High School, Cincinnati, OH, and one each from Bay High School, Bay Village, OH, and COSI Academy, Columbus, OH. DIME is part of NASA's education and outreach activities. Details are on line at http://microgravity.grc.nasa.gov/DIME_2002.html.

  13. Engineering design of the FRX-C experiment

    SciTech Connect

    Kewish, R.W. Jr.; Bartsch, R.R.; Siemon, R.E.

    1981-01-01

    Research on Compact Toroid (CT) configurations has been greatly accelerated in the last few years because of their potential for providing a practical and economical fusion system. Los Alamos research is being concentrated on two types of configurations: (1) magnetized-gun-produced Spheromaks (configurations that contain a mixture of toroidal and poloidal fields); and (2) field-reversed configurations (FRCs) that contain purely poloidal magnetic field. This paper describes the design of FRX-C, a field-reversed theta pinch used to form FRCs.

  14. Design of a new liquid cell for shock experiments

    SciTech Connect

    Reinhart, W.D.; Chhabildas, L.C.

    1999-11-22

    A controlled impact methodology has been used on a powder gun to obtain the dynamic behavior properties of Tributyl Phosphate (TBP). A novel test methodology is used to provide extremely accurate equation-of-state data for the liquid. A thin aluminum plate used for confining the liquid also serves as a diagnostic to provide reshock states and subsequent release adiabats from the reshocked state. Polar-polymer polyvinylidene fluoride (PVDF) gauges and a velocity interferometer system for any reflector (VISAR) provided redundant and precise data, with temporal resolution down to five nanoseconds and shock velocity measurements accurate to better than 1%. The design and test methodologies are presented in this paper.

  15. Shuttle Orbiter Active Thermal Control Subsystem design and flight experience

    NASA Technical Reports Server (NTRS)

    Bond, Timothy A.; Metcalf, Jordan L.; Asuncion, Carmelo

    1991-01-01

    The paper examines the design of the Space Shuttle Orbiter Active Thermal Control Subsystem (ATCS), built to provide vehicle and payload cooling during all phases of a mission and during ground turnaround operations. The operation of the Shuttle ATCS and some of the problems encountered during the first 39 flights of the Shuttle program are described, with special attention given to the major problems encountered with the degradation of the Freon flow rate on the Orbiter Columbia, the Flash Evaporator Subsystem mission anomalies which occurred on STS-26 and STS-34, and problems encountered with the Ammonia Boiler Subsystem. The causes and the resolutions of these problems are discussed.

  16. Regulatory affairs in biotechnology: optimal statistical designs for biomedical experiments.

    PubMed

    Carriere, K C

    1998-01-01

    One of the major issues in all applications of biotechnology is how to regulate the process through which new technological information is produced. The end products of biotechnological applications are diverse (e.g., better drugs, better interventions, better fertilizers). Such applications should be properly regulated to obtain valid scientific findings in the most efficient way possible. Some statistically optimal designs are more popularly employed than others as regulatory tools in medical, pharmaceutical and clinical trials. Their statistical and practical properties (strengths and weaknesses) are presented so that their optimality can be better appreciated. Recent developments on some related issues are also reviewed.

  17. Design of Experiments for the Thermal Characterization of Metallic Foam

    NASA Technical Reports Server (NTRS)

    Crittenden, Paul E.; Cole, Kevin D.

    2003-01-01

    Metallic foams are being investigated for possible use in the thermal protection systems of reusable launch vehicles. As a result, the performance of these materials needs to be characterized over a wide range of temperatures and pressures. In this paper a radiation/conduction model is presented for heat transfer in metallic foams. Candidates for the optimal transient experiment to determine the intrinsic properties of the model are found by two methods. First, an optimality criterion is used to design a single heating event for estimating all of the parameters at once. Second, a pair of heating events is used, in which one heating event is optimal for finding the parameters related to conduction while the other is optimal for finding the parameters associated with radiation. Simulated data containing random noise were analyzed to determine the parameters using both methods. In all cases the parameter estimates could be improved by analyzing a larger data record than suggested by the optimality criterion.

  18. Young Children's Learning of Novel Digital Interfaces: How Technology Experience, Age, and Design Come into Play

    ERIC Educational Resources Information Center

    Gilutz, Shuli

    2009-01-01

    This study looks at the relationship between age, technology experience, and design factors in determining young children's comprehension of novel digital interfaces. In Experiment 1, 35 preschoolers played three games that varied in complexity and familiarity. Parental questionnaires were used to assess children's previous technology experience.…

  19. Characterizing Variability in Smestad and Gratzel's Nanocrystalline Solar Cells: A Collaborative Learning Experience in Experimental Design

    ERIC Educational Resources Information Center

    Lawson, John; Aggarwal, Pankaj; Leininger, Thomas; Fairchild, Kenneth

    2011-01-01

    This article describes a collaborative learning experience in experimental design that closely approximates what practicing statisticians and researchers in applied science experience during consulting. Statistics majors worked with a teaching assistant from the chemistry department to conduct a series of experiments characterizing the variation…

  20. Students' Design of Experiments: An Inquiry Module on the Conduction of Heat

    ERIC Educational Resources Information Center

    Hatzikraniotis, E.; Kallery, M.; Molohidis, A.; Psillos, D.

    2010-01-01

    This article examines secondary students' design of experiments after engagement in an innovative and inquiry-oriented module on heat transfer. The module consists of an integration of hands-on experiments, simulated experiments and microscopic model simulations, includes a structured series of guided investigative tasks and was implemented for a…

  1. A Case Study of Professors' and Instructional Designers' Experiences in the Development of Online Courses

    ERIC Educational Resources Information Center

    Stevens, Karl B.

    2012-01-01

    The purpose of this qualitative case study was to examine the experiences of instructional designers and professors during the online course development process and to determine if their experiences had an effect on the process itself. To gain an understanding of their experiences, open-ended interviews were conducted, seeking descriptions of…

  2. Inflatable Re-Entry Vehicle Experiment (IRVE) Design Overview

    NASA Technical Reports Server (NTRS)

    Hughes, Stephen J.; Dillman, Robert A.; Starr, Brett R.; Stephan, Ryan A.; Lindell, Michael C.; Player, Charles J.; Cheatwood, F. McNeil

    2005-01-01

    Inflatable aeroshells offer several advantages over traditional rigid aeroshells for atmospheric entry. Inflatables offer increased payload volume fraction of the launch vehicle shroud and the possibility to deliver more payload mass to the surface for equivalent trajectory constraints. An inflatable's diameter is not constrained by the launch vehicle shroud. The resultant larger drag area can provide deceleration equivalent to a rigid system at higher atmospheric altitudes, thus offering access to higher landing sites. When stowed for launch and cruise, inflatable aeroshells allow access to the payload after the vehicle is integrated for launch and offer direct access to vehicle structure for structural attachment with the launch vehicle. They also offer an opportunity to eliminate system duplication between the cruise stage and entry vehicle. There are, however, several potential technical challenges for inflatable aeroshells. First and foremost is the fact that they are flexible structures. That flexibility could lead to unpredictable drag performance or an aerostructural dynamic instability. In addition, durability of large inflatable structures may limit their application. They are susceptible to puncture, a potentially catastrophic insult, from many possible sources. Finally, aerothermal heating during planetary entry poses a significant challenge to a thin membrane. NASA Langley Research Center and NASA's Wallops Flight Facility are jointly developing inflatable aeroshell technology for use on future NASA missions. The technology will be demonstrated in the Inflatable Re-entry Vehicle Experiment (IRVE). This paper will detail the development of the initial IRVE inflatable system to be launched on a Terrier/Orion sounding rocket in the fourth quarter of CY2005. The experiment will demonstrate achievable packaging efficiency of the inflatable aeroshell for launch, inflation, leak performance of the inflatable system throughout the flight regime, structural

  3. Researching Design Practices and Design Cognition: Contexts, Experiences and Pedagogical Knowledge-in-Pieces

    ERIC Educational Resources Information Center

    Kali, Yael; Goodyear, Peter; Markauskaite, Lina

    2011-01-01

    If research and development in the field of learning design is to have a serious and sustained impact on education, then technological innovation needs to be accompanied--and probably guided--by good empirical studies of the design practices and design thinking of those who develop these innovations. This article synthesises two related lines of…

  4. Experimental design: computer simulation for improving the precision of an experiment.

    PubMed

    van Wilgenburg, Henk; Zillesen, Piet G van Schaick; Krulichova, Iva

    2004-06-01

    ExpDesign, an interactive computer-assisted learning program developed for simulating animal experiments, is introduced. The program guides students through the steps of designing animal experiments and estimating optimal sample sizes. Principles are introduced for controlling variation, establishing the experimental unit, selecting randomised block and factorial experimental designs, and applying the appropriate statistical analysis. Sample Power is a supporting tool that visualises the process of estimating the sample size. The aim of developing the ExpDesign program has been to make biomedical research workers more familiar with some basic principles of experimental design and statistics and to facilitate discussions with statisticians.
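
    As a hedged illustration of the sample-size step that a tool like Sample Power visualises (this is the standard normal-approximation formula for comparing two group means, not ExpDesign's own code; all numbers are made up), a short Python sketch:

      from math import ceil
      from scipy.stats import norm

      def n_per_group(delta, sigma, alpha=0.05, power=0.80):
          """Approximate animals per group for a two-sided, two-sample comparison of
          means with true difference `delta` and common standard deviation `sigma`."""
          z_alpha = norm.ppf(1 - alpha / 2)   # critical value of the two-sided test
          z_beta = norm.ppf(power)            # quantile giving the desired power
          return ceil(2 * ((z_alpha + z_beta) * sigma / delta) ** 2)

      # Example: detect a 10-unit difference when sigma = 12, alpha = 0.05, power = 0.80
      print(n_per_group(delta=10, sigma=12))   # about 23 animals per group

    Reducing the residual variation sigma (for example, by blocking) shrinks the required group size quadratically, which is the point such programs make interactively.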

  5. Workspace design for crane cabins applying a combined traditional approach and the Taguchi method for design of experiments.

    PubMed

    Spasojević Brkić, Vesna K; Veljković, Zorica A; Golubović, Tamara; Brkić, Aleksandar Dj; Kosić Šotić, Ivana

    2016-01-01

    Procedures in the development process of crane cabins are arbitrary and subjective. Since approximately 42% of incidents in the construction industry are linked to them, there is a need to collect fresh anthropometric data and provide additional recommendations for design. In this paper, dimensioning of the crane cabin interior space was carried out using a sample of 64 crane operators' anthropometric measurements, in the Republic of Serbia, by measuring workspace with 10 parameters using nine measured anthropometric data from each crane operator. This paper applies experiments run via full factorial designs using a combined traditional and Taguchi approach. The experiments indicated which design parameters are influenced by which anthropometric measurements and to what degree. The results are expected to be of use for crane cabin designers and should assist them to design a cabin that may lead to less strenuous sitting postures and fatigue for operators, thus improving safety and accident prevention.

  6. Tokamak Physics Experiment (TPX) power supply design and development

    SciTech Connect

    Neumeyer, C.; Bronner, G.; Lu, E.; Ramakrishnan, S.

    1995-04-01

    The Tokamak Physics Experiment (TPX) is an advanced tokamak project aimed at the production of quasi-steady state plasmas with advanced shape, heating, and particle control. TPX is to be built at the Princeton Plasma Physics Laboratory (PPPL) using many of the facilities from the Tokamak Fusion Test Reactor (TFTR). TPX will be the first tokamak to utilize superconducting (SC) magnets in both the toroidal field (TF) and poloidal field (PF) systems. This new feature requires a departure from the traditional tokamak power supply schemes. This paper describes the plan for the adaptation of the PPPL/TFTR power system facilities to supply TPX. Five major areas are addressed, namely the AC power system, the TF, PF and Fast Plasma Position Control (FPPC) power supplies, and quench protection for the TF and PF systems. Special emphasis is placed on the development of new power supply and protection schemes.

  7. Computational design and analysis of flatback airfoil wind tunnel experiment.

    SciTech Connect

    Mayda, Edward A.; van Dam, C.P.; Chao, David D.; Berg, Dale E.

    2008-03-01

    A computational fluid dynamics study of thick wind turbine section shapes in the test section of the UC Davis wind tunnel at a chord Reynolds number of one million is presented. The goals of this study are to validate standard wind tunnel wall corrections for high solid blockage conditions and to reaffirm the favorable effect of a blunt trailing edge or flatback on the performance characteristics of a representative thick airfoil shape prior to building the wind tunnel models and conducting the experiment. The numerical simulations prove the standard wind tunnel corrections to be largely valid for the proposed test of 40% maximum thickness to chord ratio airfoils at a solid blockage ratio of 10%. Comparison of the computed lift characteristics of a sharp trailing edge baseline airfoil and derived flatback airfoils reaffirms the earlier observed trend of reduced sensitivity to surface contamination with increasing trailing edge thickness.

  8. RoboJockey: Designing an Entertainment Experience with Robots.

    PubMed

    Yoshida, Shigeo; Shirokura, Takumi; Sugiura, Yuta; Sakamoto, Daisuke; Ono, Tetsuo; Inami, Masahiko; Igarashi, Takeo

    2016-01-01

    The RoboJockey entertainment system consists of a multitouch tabletop interface for multiuser collaboration. RoboJockey enables a user to choreograph a mobile robot or a humanoid robot by using a simple visual language. With RoboJockey, a user can coordinate the mobile robot's actions with a combination of back, forward, and rotating movements and coordinate the humanoid robot's actions with a combination of arm and leg movements. Every action is automatically performed to background music. RoboJockey was demonstrated to the public during two pilot studies, and the authors observed users' behavior. Here, they report the results of their observations and discuss the RoboJockey entertainment experience. PMID:25585412

  9. Conceptual design for spacelab two-phase flow experiments

    NASA Technical Reports Server (NTRS)

    Bradshaw, R. D.; King, C. D.

    1977-01-01

    KC-135 aircraft tests confirmed the gravity sensitivity of two phase flow correlations. The prime component of the apparatus is a 1.5 cm dia by 90 cm fused quartz tube test section selected for visual observation. The water-cabin air system with water recycle was a clear choice for a flow regime-pressure drop test since it was used satisfactorily on KC-135 tests. Freon-11 with either overboard dump or with liquid-recycle will be used for the heat transfer test. The two experiments use common hardware. The experimental plan covers 120 data points in six hours with mass velocities from 10 to 640 kg/sec-sq m and qualities 0.01 to 0.64. The apparatus with pump, separator, storage tank and controls is mounted in a double spacelab rack. Supporting hardware, procedures, measured variables and program costs are defined.

  10. How Instructional Design Experts Use Knowledge and Experience to Solve Ill-Structured Problems

    ERIC Educational Resources Information Center

    Ertmer, Peggy A.; Stepich, Donald A.; York, Cindy S.; Stickman, Ann; Wu, Xuemei (Lily); Zurek, Stacey; Goktas, Yuksel

    2008-01-01

    This study examined how instructional design (ID) experts used their prior knowledge and previous experiences to solve an ill-structured instructional design problem. Seven experienced designers used a think-aloud procedure to articulate their problem-solving processes while reading a case narrative. Results, presented in the form of four…

  11. How to Teach Engineering and Industrial Design: a U.K. Experience.

    ERIC Educational Resources Information Center

    Sheldon, D. F.

    1988-01-01

    Explored are the possibilities of teaching engineering through a project approach. Discussed are the introduction, clashing cultures of industrial and engineering design, skills required of a designer, teaching approach to the total design activity, CAD/CAM experiences, and conclusions. (Author/YP)

  12. Technologists Talk: Making the Links between Design, Problem-Solving and Experiences with Hard Materials

    ERIC Educational Resources Information Center

    Potter, Patricia

    2013-01-01

    Design and problem-solving is a key learning focus in technology education and remains a distinguishing factor that separates it from other subject areas. This research investigated how two expert designers considered that experiences with hard materials contributed to their learning of design and problem-solving with these materials. The research project…

  13. Preliminary design for Arctic atmospheric radiative transfer experiments

    NASA Technical Reports Server (NTRS)

    Zak, B. D.; Church, H. W.; Stamnes, K.; Shaw, G.; Filyushkin, V.; Jin, Z.; Ellingson, R. G.; Tsay, S. C.

    1995-01-01

    If current plans are realized, within the next few years, an extraordinary set of coordinated research efforts focusing on energy flows in the Arctic will be implemented. All are motivated by the prospect of global climate change. SHEBA (Surface Energy Budget of the Arctic Ocean), led by the National Science Foundation (NSF) and the Office of Naval Research (ONR), involves instrumenting an ice camp in the perennial Arctic ice pack, and taking data for 12-18 months. The ARM (Atmospheric Radiation Measurement) North Slope of Alaska and Adjacent Arctic Ocean (NSA/AAO) Cloud and Radiation Testbed (CART) focuses on atmospheric radiative transport, especially in the presence of clouds. The NSA/AAO CART involves instrumenting a sizeable area on the North Slope of Alaska and adjacent waters in the vicinity of Barrow, and acquiring data over a period of about 10 years. FIRE (First ISCCP (International Satellite Cloud Climatology Program) Regional Experiment) Phase 3 is a program led by the National Aeronautics and Space Administration (NASA) which focuses on Arctic clouds, and which is coordinated with SHEBA and ARM. FIRE has historically emphasized data from airborne and satellite platforms. All three programs anticipate initiating Arctic data acquisition during spring 1997. In light of this historic opportunity, the authors discuss a strawman atmospheric radiative transfer experimental plan that identifies which features of the radiative transport models they think should be tested, what experimental data are required for each type of test, the platforms and instrumentation necessary to acquire those data, and in general terms, how the experiments could be conducted. Aspects of the plan are applicable to all three programs.

  14. Designing Critical Experiments in Support of Full Burnup Credit

    SciTech Connect

    Mueller, Don; Roberts, Jeremy A

    2008-01-01

    Burnup credit is the process of accounting for the negative reactivity due to fuel burnup and generation of parasitic absorbers over fuel assembly lifetime. For years, the fresh fuel assumption was used as a simple bound in criticality work for used fuel storage and transportation. More recently, major actinides have been included [1]. However, even this yields a highly conservative estimate in criticality calculations. Because of the numerous economical benefits that including all available negative reactivity (i.e., full burnup credit) could provide [2], it is advantageous to work toward full burnup credit. Unfortunately, comparatively little work has been done to include non-major actinides and other fission products (FP) in burnup credit analyses, due in part to insufficient experimental data for validation of codes and nuclear data. The Burnup Credit Criticality Experiment (BUCCX) at Sandia National Laboratory was a set of experiments with ^{103}Rh that have relevance for burnup credit [3]. This work uses TSUNAMI-3D to investigate and adjust a BUCCX model to match isotope-specific, energy-dependent k_eff sensitivity profiles to those of a representative high-capacity cask model (GBC-32) [4] for each FP of interest. The isotopes considered are ^{149}Sm, ^{143}Nd, ^{103}Rh, ^{133}Cs, ^{155}Gd, ^{152}Sm, ^{99}Tc, ^{145}Nd, ^{153}Eu, ^{147}Sm, ^{109}Ag, ^{95}Mo, ^{150}Sm, ^{101}Ru, and ^{151}Eu. The goal is to understand the biases and bias uncertainties inherent in nuclear data, and ultimately, to apply these in support of full burnup credit.

  15. Preliminary design for Arctic atmospheric radiative transfer experiments

    NASA Astrophysics Data System (ADS)

    Zak, B. D.; Church, H. W.; Stamnes, K.; Shaw, G.; Filyushkin, V.; Jin, Z.; Ellingson, R. G.; Tsay, S. C.

    If current plans are realized, within the next few years, an extraordinary set of coordinated research efforts focusing on energy flows in the Arctic will be implemented. All are motivated by the prospect of global climate change. SHEBA (Surface Energy Budget of the Arctic Ocean), led by the National Science Foundation (NSF) and the Office of Naval Research (ONR), involves instrumenting an ice camp in the perennial Arctic ice pack, and taking data for 12-18 months. The ARM (Atmospheric Radiation Measurement) North Slope of Alaska and Adjacent Arctic Ocean (NSA/AAO) Cloud and Radiation Testbed (CART) focuses on atmospheric radiative transport, especially in the presence of clouds. The NSA/AAO CART involves instrumenting a sizeable area on the North Slope of Alaska and adjacent waters in the vicinity of Barrow, and acquiring data over a period of about 10 years. FIRE (First ISCCP (International Satellite Cloud Climatology Program) Regional Experiment) Phase 3 is a program led by the National Aeronautics and Space Administration (NASA) which focuses on Arctic clouds, and which is coordinated with SHEBA and ARM. FIRE has historically emphasized data from airborne and satellite platforms. All three programs anticipate initiating Arctic data acquisition during spring 1997. In light of this historic opportunity, the authors discuss a strawman atmospheric radiative transfer experimental plan that identifies which features of the radiative transport models they think should be tested, what experimental data are required for each type of test, the platforms and instrumentation necessary to acquire those data, and in general terms, how the experiments could be conducted. Aspects of the plan are applicable to all three programs.

  16. W/V-Band RF Propagation Experiment Design

    NASA Technical Reports Server (NTRS)

    Acosta, Roberto J.; Nessel, James A.; Simons, Rainee N.; Zemba, Michael J.; Morse, Jacquelynne Rose; Budinger, James M.

    2012-01-01

    The utilization of frequency spectrum for space-to-ground communications applications has generally progressed from the lowest available bands capable of supporting transmission through the atmosphere to the higher bands, which have required research and technological advancement to implement. As communications needs increase and the available spectrum in the microwave frequency bands (3-30 GHz) becomes congested globally, future systems will move into the millimeter wave (mm-wave) range (30-300 GHz). While current systems are operating in the Ka-band (20-30 GHz), systems planned for the coming decades will initiate operations in the Q-Band (33-50 GHz), V-Band (50-75 GHz) and W-Band (75-110 GHz) of the spectrum. These bands offer extremely broadband capabilities (contiguous allocations of 500 MHz to 1 GHz or more) and an uncluttered spectrum for a wide range of applications. NASA, DoD and commercial missions that can benefit from moving into the mm-wave bands include data relay and near-Earth data communications, unmanned aircraft communications, NASA science missions, and commercial broadcast/internet services, all able to be implemented via very small terminals. NASA Glenn Research Center has a long history of performing the inherently governmental function of opening new frequency spectrum by characterizing atmospheric effects on electromagnetic propagation and collaborating with the satellite communication industry to develop specific communications technologies for use by NASA and the nation. Along these lines, there are critical issues related to W/V-band propagation that need to be thoroughly understood before design of any operational system can commence. These issues arise primarily due to the limitations imposed on W/V-band signal propagation by the Earth's atmosphere, and to the fundamental lack of understanding of these effects with regards to proper system design and fade mitigation. In this paper, the GRC RF propagation team recommends measurements

  17. Designing Forest Adaptation Experiments through Manager-Scientist Partnerships

    NASA Astrophysics Data System (ADS)

    Nagel, L. M.; Swanston, C.; Janowiak, M.

    2014-12-01

    Three common forest adaptation options discussed in the context of an uncertain future climate are: creating resistance, promoting resilience, and enabling forests to respond to change. Though there is consensus on the broad management goals addressed by each of these options, translating these concepts into management plans specific to individual forest types that vary in structure, composition, and function remains a challenge. We will describe a decision-making framework that we employed within a manager-scientist partnership to develop a suite of adaptation treatments for two contrasting forest types as part of a long-term forest management experiment. The first, in northern Minnesota, is a red pine-dominated forest with components of white pine, aspen, paper birch, and northern red oak, with a hazel understory. The second, in southwest Colorado, is a warm-dry mixed conifer forest dominated by ponderosa pine, white fir, and Douglas-fir, with scattered aspen and an understory of Gambel oak. The current conditions at both sites are characterized by overstocking, moderate-to-high fuel loading, and vulnerability to numerous forest health threats, and are generally uncharacteristic of historic structure and composition. The desired future condition articulated by managers for each site included elements of historic structure and natural range of variability, but was greatly tempered by known vulnerabilities and projected changes to climate and disturbance patterns. The resulting range of treatments we developed is distinct for each forest type and addresses a wide range of management objectives.

  18. Design and implementation of the winter haze intensive tracer experiment

    SciTech Connect

    Malm, W.C.; Iyer, H.K. ); Pitchford, M. )

    1988-01-01

    Protection of vistas for certain national parks and wilderness areas, as provided by the Clean Air Act Amendments of 1977, has stimulated an interest in visibility research. Methods are being developed and used to characterize atmospheric transparency, to identify the relative importance of the various particulate and gaseous atmospheric materials, and to determine the role of man-made emissions. Much of the research has been conducted in the desert southwest, in particular in northern Arizona and southern Utah. According to the authors, the juxtaposition of energy resources (especially coal) and national parks (including Grand Canyon, Bryce Canyon and Canyonlands) in an area where small changes in aerosol concentration can significantly affect visibility justifies concern by government and private organizations for visibility impacts resulting from industrial emissions. Accordingly, a cooperative effort, the Subregional Cooperative Electric Utility, National Park Service (NPS), Environmental Protection Agency (EPA) and Department of Defense (DOD) study (SCENES), is centered in this area. It operates on a five-year plan (1984-1989) involving continual visibility and aerosol measurements at a dozen locations, plus more in-depth intensive and special studies conducted over shorter, seasonally representative periods. In this paper, the authors discuss the Winter Haze Intensive Tracer Experiment (WHITEX), which was conducted in January and February 1987 in the Colorado River area of the Colorado Plateau.

  19. Geothermal injection treatment: process chemistry, field experiences, and design options

    SciTech Connect

    Kindle, C.H.; Mercer, B.W.; Elmore, R.P.; Blair, S.C.; Myers, D.A.

    1984-09-01

    The successful development of geothermal reservoirs to generate electric power will require the injection disposal of approximately 700,000 gal/h (2.6 x 10^6 L/h) of heat-depleted brine for every 50,000 kW of generating capacity. To maintain injectability, the spent brine must be compatible with the receiving formation. The factors that influence this brine/formation compatibility and tests to quantify them are discussed in this report. Some form of treatment will be necessary prior to injection for most situations; the process chemistry involved to avoid and/or accelerate the formation of precipitate particles is also discussed. The treatment processes, either avoidance or controlled precipitation approaches, are described in terms of their principles and demonstrated applications in the geothermal field and, when such experience is limited, in other industrial use. Monitoring techniques for tracking particulate growth, the effect of process parameters on corrosion and well injectability are presented. Examples of brine injection, preinjection treatment, and recovery from injectivity loss are examined and related to the aspects listed above.

  20. Multichannel readout ASIC design flow for high energy physics and cosmic rays experiments

    NASA Astrophysics Data System (ADS)

    Voronin, A.; Malankin, E.

    2016-02-01

    In large-scale high energy physics and astrophysics experiments, multi-channel readout application specific integrated circuits (ASICs) are widely used. The ASICs for such experiments are complicated systems, which usually include both analog and digital building blocks. The complexity and large number of channels in such ASICs require a proper methodological approach to their design. The paper presents the mixed-signal design flow of ASICs for high energy physics and cosmic ray experiments. This flow was successfully applied to the development of the readout ASIC prototype for the muon chambers of the CBM experiment, and the approach was proven in the UMC CMOS MMRF 180 nm process. The design flow enables analysis of the mixed-signal system operation at different levels: functional, behavioural, schematic, and post-layout including parasitic elements. The proposed design flow reduces simulation time and eliminates functionality mismatches at a very early stage of the design.

  1. Design and bioinformatics analysis of genome-wide CLIP experiments

    PubMed Central

    Wang, Tao; Xiao, Guanghua; Chu, Yongjun; Zhang, Michael Q.; Corey, David R.; Xie, Yang

    2015-01-01

    The past decades have witnessed a surge of discoveries revealing RNA regulation as a central player in cellular processes. RNAs are regulated by RNA-binding proteins (RBPs) at all post-transcriptional stages, including splicing, transportation, stabilization and translation. Defects in the functions of these RBPs underlie a broad spectrum of human pathologies. Systematic identification of RBP functional targets is among the key biomedical research questions and provides a new direction for drug discovery. The advent of cross-linking immunoprecipitation coupled with high-throughput sequencing (genome-wide CLIP) technology has recently enabled the investigation of genome-wide RBP–RNA binding at single base-pair resolution. This technology has evolved through the development of three distinct versions: HITS-CLIP, PAR-CLIP and iCLIP. Meanwhile, numerous bioinformatics pipelines for handling the genome-wide CLIP data have also been developed. In this review, we discuss the genome-wide CLIP technology and focus on bioinformatics analysis. Specifically, we compare the strengths and weaknesses, as well as the scopes, of various bioinformatics tools. To assist readers in choosing optimal procedures for their analysis, we also review experimental design and procedures that affect bioinformatics analyses. PMID:25958398

  2. Design of experiments for enantiomeric separation in supercritical fluid chromatography.

    PubMed

    Landagaray, Elodie; Vaccher, Claude; Yous, Saïd; Lipka, Emmanuelle

    2016-02-20

    A new chiral melatoninergic ligand, potentially a successor of Valdoxan(®), presenting an improved pharmacological profile with regard to agomelatine, was chosen as a probe for a supercritical fluid chromatographic separation carried out on an amylose tris[(S)-1-α-methylbenzylcarbamate] based stationary phase. The goal of this work was to simultaneously optimize three factors identified as having a significant influence, in order to obtain the best resolution in the shortest analysis time (i.e., the retention time of the second-eluting enantiomer) for this chiral compound. For this purpose a central circumscribed composite (CCC) design was developed with three factors (the flow rate, the outlet pressure and the percentage of ethanol) to optimize two responses: shortest analysis time and best resolution. The optimal conditions obtained via the optimizer mode of the software (using the Nelder-Mead method), i.e., CO2/EtOH 86:14 (v/v), 104 bar, 3.2 mL min(-1) at 35°C, led to a resolution of 3.27 in less than 6 min. These conditions were transposed to a preparative scale, where a concentrated methanolic solution of 40 mM was injected with a sample loop of 100 μL. This step allowed around 65 mg of the racemic melatoninergic ligand to be separated in only 3 h with impressive yield (97%) and enantiomeric excess (99.5%). PMID:26765267
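
    The central circumscribed composite design mentioned above can be written out explicitly. The Python sketch below is an illustration only: the abstract does not give the factor ranges, so the centres and half-ranges are hypothetical. It builds the coded three-factor CCC design (cube, axial and centre points) and maps it onto flow rate, outlet pressure and ethanol percentage.

      from itertools import product
      import numpy as np

      k = 3
      alpha = (2 ** k) ** 0.25     # rotatable axial distance, about 1.682 for k = 3

      cube = np.array(list(product((-1.0, 1.0), repeat=k)))     # 8 factorial (cube) points
      axial = np.vstack([a * np.eye(k)[i]                       # 6 axial (star) points
                         for i in range(k) for a in (-alpha, alpha)])
      centre_pts = np.zeros((3, k))                             # 3 replicated centre points
      coded = np.vstack([cube, axial, centre_pts])              # 17 runs in coded units

      # Hypothetical factor centres and half-ranges (not the study's actual settings):
      # flow rate (mL/min), outlet pressure (bar), ethanol (%)
      factor_centre = np.array([3.0, 110.0, 12.0])
      half_range = np.array([1.0, 20.0, 6.0])
      actual = factor_centre + coded * half_range

      print(f"{len(coded)} experimental runs")
      print(np.round(actual, 2))

    Each run would then be chromatographed and the two responses (analysis time and resolution) modelled with a quadratic surface before an optimisation step such as the Nelder-Mead search reported in the abstract.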

  3. Clinical experience with therapeutic vaccines designed for patients with hepatitis.

    PubMed

    Batdelger, Dendev; Dandii, Dorjiin; Dahgwahdorj, Yagaanbuyant; Erdenetsogt, Erdene; Oyunbileg, Janchivyn; Tsend, Navaansodov; Bayarmagnai, Bold; Jirathitikal, Vichai; Bourinbaiar, Aldar S

    2009-01-01

    Franciscan missionary Giovanni Di Plano Carpini traveled in 1245 to a country named Yeke Tartar, to visit a certain man called Genghis Khan. His journey's report narrated peculiar dietary habits of the locals: "they eat anything, even lice". Little did Carpini know, he had actually documented the earliest record known to us of oral vaccination against blood-borne infections - an approach that is still used occasionally in present-day Mongolia for therapy of hepatitis. Currently, efforts aimed at developing therapeutic hepatitis vaccines have switched to a more palatable path, but we may still benefit from the insight of medieval Mongols. This review provides an update on development of hepatitis B and C vaccines as related to immunotherapy of hepatitis. Immune therapy is a fast-moving field but the results so far failed to pitch woo. Current trends in research on therapeutic vaccine candidates and liver immunology are discussed. We subscribe to the idea that viral hepatitis is essentially an autoimmune disease generating immune-mediated liver damage. Therapeutic vaccines need to be designed in such a way that the self-destructive immunity of the host is targeted, not the virus, which is not cytopathic.

  4. Experiment design for through-focus testing of intraocular lenses

    NASA Astrophysics Data System (ADS)

    Millán, María. S.; Alba-Bueno, Francisco; Vega, Fidel

    2013-11-01

    Eye models to test intraocular lenses (IOLs) on an optical bench are commonly designed in agreement with the ISO 11979-2 and 11979-9 standard requirements. However, modifications to the ISO eye model have been proposed to test IOLs in conditions closer to the real human eye. Wavefront analysis and aberration characterization, wavelength dependence, efficiency, off-axis performance and imaging degradation under a certain amount of misalignment can thus be measured in vitro. The main parts of the system to test IOLs are: the illumination system and object test, the eye model including the IOL immersed in a wet cell, and a microscope assembled to a sensor that magnifies and captures the aerial image of the object formed by the eye model. A problem concerning the simultaneous variation of defocus and magnification arises when using the microscope to capture out-of-focus images in a through-focus study. Using the eye model, we study the problem of implementing a through-focus measurement of the imaging quality of an IOL. We find a solution based on geometrical optics and compare it with other proposals reported in the literature. The effects on the measurement of the Modulation Transfer Function and the Point Spread Function are predicted. Experimental results are obtained and discussed.

  5. Best Practices for Operando Battery Experiments: Influences of X-ray Experiment Design on Observed Electrochemical Reactivity

    SciTech Connect

    Borkiewicz, O. J.; Wiaderek, Kamila M.; Chupas, Peter J.; Chapman, Karena W.

    2015-06-04

    Dynamic properties and multiscale complexities governing electrochemical energy storage in batteries are most ideally interrogated under simulated operating conditions within an electrochemical cell. We assess how electrochemical reactivity can be impacted by experiment design, including by the X-ray measurements themselves and by common features or adaptations of electrochemical cells that enable X-ray measurements.

  6. Superheavies: Short-Term Experiments and Far-Reaching Designs

    NASA Astrophysics Data System (ADS)

    Zagrebaev, V. I.; Karpov, A. V.; Mishustin, I. N.; Greiner, Walter

    Low values of the fusion cross sections and the very short half-lives of nuclei with Z>120 pose obstacles to the synthesis of new elements. However, fusion reactions of medium-mass projectiles with different actinide targets can still be used for the production of not-yet-synthesized SH nuclei. The gap of unknown SH nuclei, located between the isotopes which were produced earlier in the cold and hot fusion reactions, could be filled in fusion reactions of ^{48}Ca with available lighter isotopes of Pu, Am, and Cm. Cross sections for the production of these nuclei are predicted to be rather large, and the corresponding experiments can be easily performed at existing facilities. The use of heavier actinide targets gives us a chance to produce more neutron-enriched SH isotopes. Moreover, for the first time, a narrow pathway is found to the middle of the island of stability, owing to the possible β^+ decay of SH isotopes which can be formed in ordinary fusion reactions of stable nuclei. Multi-nucleon transfer processes in near-barrier collisions of heavy (and very heavy, U-like) ions seem to be a quite realistic reaction mechanism allowing us to produce new neutron-enriched heavy nuclei located in the unexplored upper part of the nuclear map. Neutron capture reactions can also be used for the production of long-lived neutron-rich SH nuclei. Strong neutron fluxes might be provided by pulsed nuclear reactors and by nuclear explosions in laboratory conditions, and by supernova explosions in nature. All these possibilities are discussed in the chapter.

  7. Optical Design of the MOSES Sounding Rocket Experiment

    NASA Technical Reports Server (NTRS)

    Thomas, Roger J.; Kankelborg, Charles C.; Fisher, Richard R. (Technical Monitor)

    2001-01-01

    The Multi-Order Solar EUV Spectrograph (MOSES) is a sounding rocket payload now being developed by Montana State University in collaboration with the Goddard Space Flight Center, Lockheed Martin Advanced Technology Center, and Mullard Space Science Laboratory. The instrument utilizes a unique optical design to provide solar EUV measurements with true 2-pixel resolutions of 1.0 arcsec and 60 mÅ over a full two-dimensional field of view of 1056 x 528 arcsec, all at a time cadence of 10 s. This unprecedented capability is achieved by means of an objective spherical grating 100 mm in diameter, ruled at 833 gr/mm. The concave grating focuses spectrally dispersed solar radiation onto three separate detectors, simultaneously recording the zero-order as well as the plus and minus first-spectral-order images. Data analysis procedures, similar to those used in X-ray tomography reconstructions, can then disentangle the mixed spatial and spectral information recorded by the multiple detectors. A flat folding mirror permits an imaging focal length of 4.74 m to be packaged within the payload's physical length of 2.82 m. Both the objective grating and folding flat have specialized, closely matched, multilayer coatings that strongly enhance their EUV reflectance while also suppressing off-band radiation that would otherwise complicate data inversion. Although the spectral bandpass is rather narrow, several candidate wavelength intervals are available to carry out truly unique scientific studies of the outer solar atmosphere. Initial flights of MOSES, scheduled to begin in 2004, will observe a 10 Angstrom band that covers very strong emission lines characteristic of both the sun's corona (Si XI 303 Angstroms) and transition region (He II 304 Angstroms). The MOSES program is supported by a grant from NASA's Office of Space Science.

  8. Optimal Experiment Design for Monoexponential Model Fitting: Application to Apparent Diffusion Coefficient Imaging

    PubMed Central

    Alipoor, Mohammad; Maier, Stephan E.; Gu, Irene Yu-Hua; Mehnert, Andrew; Kahl, Fredrik

    2015-01-01

    The monoexponential model is widely used in quantitative biomedical imaging. Notable applications include apparent diffusion coefficient (ADC) imaging and pharmacokinetics. The application of ADC imaging to the detection of malignant tissue has in turn prompted several studies concerning optimal experiment design for monoexponential model fitting. In this paper, we propose a new experiment design method that is based on minimizing the determinant of the covariance matrix of the estimated parameters (D-optimal design). In contrast to previous methods, D-optimal design is independent of the imaged quantities. Applying this method to ADC imaging, we demonstrate its steady performance for the whole range of input variables (imaged parameters, number of measurements, and range of b-values). Using Monte Carlo simulations we show that the D-optimal design outperforms existing experiment design methods in terms of accuracy and precision of the estimated parameters. PMID:26839880

  9. Optimal Experiment Design for Monoexponential Model Fitting: Application to Apparent Diffusion Coefficient Imaging.

    PubMed

    Alipoor, Mohammad; Maier, Stephan E; Gu, Irene Yu-Hua; Mehnert, Andrew; Kahl, Fredrik

    2015-01-01

    The monoexponential model is widely used in quantitative biomedical imaging. Notable applications include apparent diffusion coefficient (ADC) imaging and pharmacokinetics. The application of ADC imaging to the detection of malignant tissue has in turn prompted several studies concerning optimal experiment design for monoexponential model fitting. In this paper, we propose a new experiment design method that is based on minimizing the determinant of the covariance matrix of the estimated parameters (D-optimal design). In contrast to previous methods, D-optimal design is independent of the imaged quantities. Applying this method to ADC imaging, we demonstrate its steady performance for the whole range of input variables (imaged parameters, number of measurements, and range of b-values). Using Monte Carlo simulations we show that the D-optimal design outperforms existing experiment design methods in terms of accuracy and precision of the estimated parameters.
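
    The D-optimality idea in the two records above can be illustrated numerically. The sketch below is not the authors' code; S0, ADC, the noise model and the b-value grid are assumed for illustration. It searches for the pair of b-values that maximises the determinant of the Fisher information for the model s(b) = S0*exp(-b*ADC), which is equivalent to minimising the determinant of the parameter covariance.

      import numpy as np

      S0, ADC = 1.0, 1.0e-3                    # assumed signal scale and ADC (mm^2/s)
      grid = np.linspace(0.0, 3000.0, 301)     # candidate b-values (s/mm^2)

      def det_fisher(b_values):
          """det(X^T X) for parameters (S0, ADC), unit Gaussian noise assumed.
          One measurement per b-value; equal replication only rescales the result."""
          b = np.asarray(b_values, dtype=float)
          ds_dS0 = np.exp(-b * ADC)                # sensitivity to S0
          ds_dADC = -S0 * b * np.exp(-b * ADC)     # sensitivity to ADC
          X = np.column_stack([ds_dS0, ds_dADC])
          return np.linalg.det(X.T @ X)

      best = max(((b1, b2) for b1 in grid for b2 in grid if b2 > b1), key=det_fisher)
      print("D-optimal b-value pair:", best)      # lands at b1 = 0 and b2 near 1/ADC

    In this simple two-point, constant-noise setting the search lands on b1 = 0 and b2 = 1/ADC; the same determinant criterion extends directly to more b-values and to other noise assumptions.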

  10. Maximize, minimize or target - optimization for a fitted response from a designed experiment

    DOE PAGES

    Anderson-Cook, Christine Michaela; Cao, Yongtao; Lu, Lu

    2016-04-01

    One of the common goals of running and analyzing a designed experiment is to find a location in the design space that optimizes the response of interest. Depending on the goal of the experiment, we may seek to maximize or minimize the response, or set the process to hit a particular target value. After the designed experiment, a response model is fitted and the optimal settings of the input factors are obtained based on the estimated response model. Furthermore, the suggested optimal settings of the input factors are then used in the production environment.

  11. High Temperature Electrolysis Pressurized Experiment Design, Operation, and Results

    SciTech Connect

    J.E. O'Brien; X. Zhang; G.K. Housley; K. DeWall; L. Moore-McAteer

    2012-09-01

    A new facility has been developed at the Idaho National Laboratory for pressurized testing of solid oxide electrolysis stacks. Pressurized operation is envisioned for large-scale hydrogen production plants, yielding higher overall efficiencies when the hydrogen product is to be delivered at elevated pressure for tank storage or pipelines. Pressurized operation also supports higher mass flow rates of the process gases with smaller components. The test stand can accommodate planar cells with dimensions up to 8.5 cm x 8.5 cm and stacks of up to 25 cells. It is also suitable for testing other cell and stack geometries including tubular cells. The pressure boundary for these tests is a water-cooled spool-piece pressure vessel designed for operation up to 5 MPa. Pressurized operation of a ten-cell internally manifolded solid oxide electrolysis stack has been successfully demonstrated up to 1.5 MPa. The stack operates in cross-flow with an inverted-U flow pattern. Feed-throughs for gas inlets/outlets, power, and instrumentation are all located in the bottom flange. The entire spool piece, with the exception of the bottom flange, can be lifted to allow access to the internal furnace and test fixture. Lifting is accomplished with a motorized threaded drive mechanism attached to a rigid structural frame. Stack mechanical compression is accomplished using springs that are located inside of the pressure boundary, but outside of the hot zone. Initial stack heatup and performance characterization occurs at ambient pressure, followed by lowering and sealing of the pressure vessel and subsequent pressurization. Pressure equalization between the anode and cathode sides of the cells and the stack surroundings is ensured by combining all of the process gases downstream of the stack. Steady pressure is maintained by means of a backpressure regulator and a digital pressure controller. A full description of the pressurized test apparatus is provided in this report.

  12. Providing Novice Instructional Designers Real-World Experiences: The PacifiCorp Design and Development Competition

    ERIC Educational Resources Information Center

    Bishop, MJ; Schuch, Dan; Spector, J. Michael; Tracey, Monica W.

    2005-01-01

    According to the International Board of Standards for Training, Performance, and Instruction (ibstpi) new technologies and methods have made instructional design practice more complex and sophisticated today than it was in the early years of the field (Richey et al., 2001). The authors of the 2000 ibstpi instructional design standards claimed…

  13. Resist Profile Control Obtained Through A Desirability Function And Statistically Designed Experiments

    NASA Astrophysics Data System (ADS)

    Bell, Kenneth L.; Christensen, Lorna D.

    1989-07-01

    This paper describes a technique used to determine an optimized microlithographic process using statistical methods, which included a statistically designed experiment (SDE), a desirability function d(θ*), and a rigorous daily statistical process control (SPC) program.

  14. Electrical design of Space Shuttle payload G-534: The pool boiling experiment

    NASA Technical Reports Server (NTRS)

    Francisco, David R.

    1993-01-01

    Payload G-534, the Pool Boiling Experiment (PBE), is a Get Away Special (GAS) payload that flew on the Space Shuttle Spacelab Mission J (STS 47) on September 19-21, 1992. This paper gives a brief overall description of the experiment, with the main discussion devoted to the electrical design, including a detailed description of the power system and its interface to the GAS electronics. The batteries used and their interface to the experiment Power Control Unit (PCU) and GAS electronics will be examined. The design philosophy for the PCU will be discussed in detail. The criteria for selection of fuses, relays, power semiconductors, and other electrical components, along with the grounding and shielding policy for the entire experiment, are presented. The intent of this paper is to discuss the use of military-tested parts and basic design guidelines to build a quality experiment for minimal additional cost.

  15. Assessment of LWR piping design loading based on plant operating experience

    SciTech Connect

    Svensson, P. O.

    1980-08-01

    The objective of this study has been to: (1) identify current Light Water Reactor (LWR) piping design load parameters, (2) identify significant actual LWR piping loads from plant operating experience, (3) perform a comparison of these two sets of data and determine the significance of any differences, and (4) make an evaluation of the load representation in current LWR piping design practice, in view of plant operating experience with respect to piping behavior and response to loading.

  16. Design and calibration of the fast ion diagnostic experiment detector on the poloidal divertor experiment

    SciTech Connect

    Kaita, R.; Goldston, R.J.; Meyerhofer, D.; Eridon, J.

    1981-12-01

    A special purpose charge-exchange analyzer was constructed to measure the spatial distribution of hot-plasma ions, as a function of energy and time, in the poloidal divertor experiment (PDX). The fast neutrals produced by charge exchange within the tokamak are reionized as they pass through a helium stripping cell in the detector. The energies of these ions are determined by the trajectories they follow between cylindrical deflection plates which are set at known electrostatic potentials. We describe the technique used to calibrate the response of this system as it depends on the energies and the masses of the particles which are being detected.

  17. IsoDesign: a software for optimizing the design of 13C-metabolic flux analysis experiments.

    PubMed

    Millard, Pierre; Sokol, Serguei; Letisse, Fabien; Portais, Jean-Charles

    2014-01-01

    The growing demand for 13C-metabolic flux analysis (13C-MFA) in the field of metabolic engineering and systems biology is driving the need to rationalize expensive and time-consuming 13C-labeling experiments. Experimental design is a key step in improving both the number of fluxes that can be calculated from a set of isotopic data and the precision of flux values. We present IsoDesign, a software that enables these parameters to be maximized by optimizing the isotopic composition of the label input. It can be applied to 13C-MFA investigations using a broad panel of analytical tools (MS, MS/MS, 1H NMR, 13C NMR, etc.) individually or in combination. It includes a visualization module to intuitively select the optimal label input depending on the biological question to be addressed. Applications of IsoDesign are described, with an example of the entire 13C-MFA workflow from the experimental design to the flux map including important practical considerations. IsoDesign makes the experimental design of 13C-MFA experiments more accessible to a wider biological community. IsoDesign is distributed under an open source license at http://metasys.insa-toulouse.fr/software/isodes/

  18. Design of Experiments with Multiple Independent Variables: A Resource Management Perspective on Complete and Reduced Factorial Designs

    ERIC Educational Resources Information Center

    Collins, Linda M.; Dziak, John J.; Li, Runze

    2009-01-01

    An investigator who plans to conduct an experiment with multiple independent variables must decide whether to use a complete or reduced factorial design. This article advocates a resource management perspective on making this decision, in which the investigator seeks a strategic balance between service to scientific objectives and economy.…

  19. The Role of Flow Experience and CAD Tools in Facilitating Creative Behaviours for Architecture Design Students

    ERIC Educational Resources Information Center

    Dawoud, Husameddin M.; Al-Samarraie, Hosam; Zaqout, Fahed

    2015-01-01

    This study examined the role of flow experience in intellectual activity with an emphasis on the relationship between flow experience and creative behaviour in design using CAD. The study used confluence and psychometric approaches because of their unique abilities to depict a clear image of creative behaviour. A cross-sectional study…

  20. Experiences of Design-and-Make Interventions with Indian Middle School Students

    ERIC Educational Resources Information Center

    Khunyakari, Ritesh P.

    2015-01-01

    Enabling learning through meaningful classroom experiences has always been a challenge for teachers. Bringing about a balance of the "conceptual" and the "hands-on", along with contextual embeddedness in problem-solving situations, broadly characterises the experience of development and trials of three Design and Technology…

  1. The Expectations and Experiences of First-Year Students in Art & Design

    ERIC Educational Resources Information Center

    Yorke, Mantz; Vaughan, David

    2013-01-01

    This article reports on a survey of the expectations and experiences of first-year students who were enrolled in 2010 on programmes in Art & Design in the United Kingdom. The survey covered 20 institutions and received 778 usable responses. The results indicate considerable variation in both expectations and experiences, and provide a basis…

  2. Building International Experiences into an Engineering Curriculum--A Design Project-Based Approach

    ERIC Educational Resources Information Center

    Maldonado, Victor; Castillo, Luciano; Carbajal, Gerardo; Hajela, Prabhat

    2014-01-01

    This paper is a descriptive account of how short-term international and multicultural experiences can be integrated into early design experiences in an aerospace engineering curriculum. Such approaches are considered as important not only in fostering a student's interest in the engineering curriculum, but also exposing them to a…

  3. Designing an Acoustic Suspension Speaker System in the General Physics Laboratory: A Divergent experiment

    ERIC Educational Resources Information Center

    Horton, Philip B.

    1969-01-01

    Describes a student laboratory project involving the design of an acoustic suspension speaker system. The characteristics of the loudspeaker used are measured as an extension of the inertia-balance experiment. The experiment may be extended to a study of Helmholtz resonators, coupled oscillators, electromagnetic forces, thermodynamics and…

  4. Paragogy and Flipped Assessment: Experience of Designing and Running a MOOC on Research Methods

    ERIC Educational Resources Information Center

    Lee, Yenn; Rofe, J. Simon

    2016-01-01

    This study draws on the authors' first-hand experience of designing, developing and delivering (3Ds) a massive open online course (MOOC) entitled "Understanding Research Methods" since 2014, largely but not exclusively for learners in the humanities and social sciences. The greatest challenge facing us was to design an assessment…

  5. Thermal design, analysis, and testing of the CETA Space Shuttle Flight Experiment

    NASA Technical Reports Server (NTRS)

    Witsil, Amy K.; Foss, Richard A.

    1990-01-01

    Attention is given to the Crew and Equipment Translation Aid (CETA) Space Shuttle flight experiment designed to demonstrate techniques and equipment for propelling and restraining crew during EVA. Emphasis is placed on the thermal analysis of the CETA hardware, including thermal design trade-offs, modeling assumptions, temperature predictions, and testing activities.

  6. Attending to Cognitive Organization in the Design of Computer Menus: A Two-Experiment Study.

    ERIC Educational Resources Information Center

    Coll, Joan H.; And Others

    1993-01-01

    Discussion of the design of computer menus focuses on two experiments that examined task time across several orderings of menu lists. Findings are described that demonstrated that good physical layout in screen design is not sufficient without good conceptual layout. The size of menu lists for maximizing efficiency is considered. (Contains 17…

  7. Development of Metacognitive Skills: Designing Problem-Based Experiment with Prospective Science Teachers in Biology Laboratory

    ERIC Educational Resources Information Center

    Denis Çeliker, Huriye

    2015-01-01

    The purpose of this study is to investigate the effect of designing problem-based experiments (DPBE) on the level of metacognitive skills of prospective science teachers. For this purpose, a pretest-posttest design without a control group was used in the research. The research group of the study comprised 113 second-grade prospective science…

  8. Evaluating the Effectiveness of Developmental Mathematics by Embedding a Randomized Experiment within a Regression Discontinuity Design

    ERIC Educational Resources Information Center

    Moss, Brian G.; Yeaton, William H.; Lloyd, Jane E.

    2014-01-01

    Using a novel design approach, a randomized experiment (RE) was embedded within a regression discontinuity (RD) design (R-RE-D) to evaluate the impact of developmental mathematics at a large midwestern college ("n" = 2,122). Within a region of uncertainty near the cut-score, estimates of benefit from a prospective RE were closely…

  9. Lifting off the Ground to Return Anew: Mediated Praxis, Transformative Learning, and Social Design Experiments

    ERIC Educational Resources Information Center

    Gutierrez, Kris D.; Vossoughi, Shirin

    2010-01-01

    This article examines a praxis model of teacher education and advances a new method for engaging novice teachers in reflective practice and robust teacher learning. Social design experiments--cultural historical formations designed to promote transformative learning for adults and children--are organized around expansive notions of learning and…

  10. Design Your Own Workup: A Guided-Inquiry Experiment for Introductory Organic Laboratory Courses

    ERIC Educational Resources Information Center

    Mistry, Nimesh; Fitzpatrick, Christopher; Gorman, Stephen

    2016-01-01

    A guided-inquiry experiment was designed and implemented in an introductory organic chemistry laboratory course. Students were given a mixture of compounds and had to isolate two of the components by designing a viable workup procedure using liquid-liquid separation methods. Students were given the opportunity to apply their knowledge of chemical…

  11. Classroom Experiences in an Engineering Design Graphics Course with a CAD/CAM Extension.

    ERIC Educational Resources Information Center

    Barr, Ronald E.; Juricic, Davor

    1997-01-01

    Reports on the development of a new CAD/CAM laboratory experience for an Engineering Design Graphics (EDG) course. The EDG curriculum included freehand sketching, introduction to Computer-Aided Design and Drafting (CADD), and emphasized 3-D solid modeling. Reviews the project and reports on the testing of the new laboratory components which were…

  12. SSC dipole long magnet model cryostat design and initial production experience

    SciTech Connect

    Niemann, R.C.; Carson, J.A.; Engler, N.H.; Gonczy, J.D.; Nicol, T.H.

    1986-06-01

    The SSC dipole magnet development program includes the design and construction of full length magnet models for heat leak and magnetic measurements and for the evaluation of the performance of strings of magnets. The design of the model magnet cryostat is presented and the production experiences for the initial long magnet model, a heat leak measurement device, are related.

  13. Interim Service ISDN Satellite (ISIS) network model for advanced satellite designs and experiments

    NASA Technical Reports Server (NTRS)

    Pepin, Gerard R.; Hager, E. Paul

    1991-01-01

    The Interim Service Integrated Services Digital Network (ISDN) Satellite (ISIS) Network Model for Advanced Satellite Designs and Experiments describes a model suitable for discrete event simulations. A top-down model design uses the Advanced Communications Technology Satellite (ACTS) as its basis. The ISDN modeling abstractions are added to permit the determination of performance for the NASA Satellite Communications Research (SCAR) Program.

  14. An application of the IMC software to controller design for the JPL LSCL Experiment Facility

    NASA Technical Reports Server (NTRS)

    Zhu, Guoming; Skelton, Robert E.

    1993-01-01

    A software package which Integrates Model reduction and Controller design (The IMC software) is applied to design controllers for the JPL Large Spacecraft Control Laboratory Experiment Facility. Modal Cost Analysis is used for the model reduction, and various Output Covariance Constraints are guaranteed by the controller design. The main motivation is to find the controller with the 'best' performance with respect to output variances. Indeed it is shown that by iterating on the reduced order design model, the controller designed does have better performance than that obtained with the first model reduction.

  15. Design of a creep experiment for SiC/SiC composites in HFIR

    SciTech Connect

    Hecht, S.L.; Hamilton, M.L.; Jones, R.H.

    1997-08-01

    A new specimen was designed for performing in-reactor creep tests on composite materials, specifically on SiC/SiC composites. The design was tailored for irradiation at 800°C in an HFIR RB position. The specimen comprises a composite cylinder loaded by a pressurized internal bladder that is made of Nb1Zr. The experiment was designed for approximately a one-year irradiation.

  16. Design Principles for High School Engineering Design Challenges: Experiences from High School Science Classrooms

    ERIC Educational Resources Information Center

    Schunn, Christian

    2011-01-01

    At the University of Pittsburgh, the author and his colleagues have been exploring a range of approaches to design challenges for implementation in high school science classrooms. In general, their approach has always involved students working during class time over the course of many weeks. So, their understanding of what works must be…

  17. designGG: an R-package and web tool for the optimal design of genetical genomics experiments

    PubMed Central

    Li, Yang; Swertz, Morris A; Vera, Gonzalo; Fu, Jingyuan; Breitling, Rainer; Jansen, Ritsert C

    2009-01-01

    Background: High-dimensional biomolecular profiling of genetically different individuals in one or more environmental conditions is an increasingly popular strategy for exploring the functioning of complex biological systems. The optimal design of such genetical genomics experiments in a cost-efficient and effective way is not trivial. Results: This paper presents designGG, an R package for designing optimal genetical genomics experiments. A web implementation for designGG is available at . All software, including source code and documentation, is freely available. Conclusion: DesignGG allows users to intelligently select and allocate individuals to experimental units and conditions such as drug treatment. The user can maximize the power and resolution of detecting genetic, environmental and interaction effects in a genome-wide or local mode by giving more weight to genome regions of special interest, such as previously detected phenotypic quantitative trait loci. This will help to achieve high power and more accurate estimates of the effects of interesting factors, and thus yield a more reliable biological interpretation of data. DesignGG is applicable to linkage analysis of experimental crosses, e.g. recombinant inbred lines, as well as to association analysis of natural populations. PMID:19538731

  18. Physical barriers formed from gelling liquids: 1. numerical design of laboratory and field experiments

    SciTech Connect

    Finsterle, S.; Moridis, G.J.; Pruess, K.; Persoff, P.

    1994-01-01

    The emplacement of liquids under controlled viscosity conditions is investigated by means of numerical simulations. Design calculations are performed for a laboratory experiment on a decimeter scale, and a field experiment on a meter scale. The purpose of the laboratory experiment is to study the behavior of multiple grout plumes when injected into a porous medium. The calculations for the field trial aim at designing a grout injection test from a vertical well in order to create a grout plume of significant extent in the subsurface.

  19. Using reactor operating experience to improve the design of a new Broad Application Test Reactor

    SciTech Connect

    Fletcher, C.D.; Ryskamp, J.M.; Drexler, R.L.; Leyse, C.F.

    1993-07-01

    Increasing regulatory demands and effects of plant aging are limiting the operation of existing test reactors. Additionally, these reactors have limited capacities and capabilities for supporting future testing missions. A multidisciplinary team of experts developed sets of preliminary safety requirements, facility user needs, and reactor design concepts for a new Broad Application Test Reactor (BATR). Anticipated missions for the new reactor include fuels and materials irradiation testing, isotope production, space testing, medical research, fusion testing, intense positron research, and transmutation doping. The early BATR design decisions have benefited from operating experiences with existing reactors. This paper discusses these experiences and highlights their significance for the design of a new BATR.

  20. The role of integral experiments and nuclear cross section evaluations in space nuclear reactor design

    NASA Astrophysics Data System (ADS)

    Moses, David L.; McKnight, Richard D.

    The importance of the nuclear and neutronic properties of candidate space reactor materials to the design process has been acknowledged as has been the use of benchmark reactor physics experiments to verify and qualify analytical tools used in design, safety, and performance evaluation. Since June 1966, the Cross Section Evaluation Working Group (CSEWG) has acted as an interagency forum for the assessment and evaluation of nuclear reaction data used in the nuclear design process. CSEWG data testing has involved the specification and calculation of benchmark experiments which are used widely for commercial reactor design and safety analysis. These benchmark experiments preceded the issuance of the industry standards for acceptance, but the benchmarks exceed the minimum acceptance criteria for such data. Thus, a starting place has been provided in assuring the accuracy and uncertainty of nuclear data important to space reactor applications.

  1. Simulation study to determine the impact of different design features on design efficiency in discrete choice experiments

    PubMed Central

    Vanniyasingam, Thuva; Cunningham, Charles E; Foster, Gary; Thabane, Lehana

    2016-01-01

    Objectives: Discrete choice experiments (DCEs) are routinely used to elicit patient preferences to improve health outcomes and healthcare services. While many fractional factorial designs can be created, some are more statistically optimal than others. The objective of this simulation study was to investigate how varying the number of (1) attributes, (2) levels within attributes, (3) alternatives and (4) choice tasks per survey will improve or compromise the statistical efficiency of an experimental design. Design and methods: A total of 3204 DCE designs were created to assess how relative design efficiency (d-efficiency) is influenced by varying the number of choice tasks (2–20), alternatives (2–5), attributes (2–20) and attribute levels (2–5) of a design. Choice tasks were created by randomly allocating attribute and attribute level combinations into alternatives. Outcome: Relative d-efficiency was used to measure the optimality of each DCE design. Results: DCE design complexity influenced statistical efficiency. Across all designs, relative d-efficiency decreased as the number of attributes and attribute levels increased. It increased for designs with more alternatives. Lastly, relative d-efficiency converges as the number of choice tasks increases, where convergence may not be at 100% statistical optimality. Conclusions: Achieving 100% d-efficiency is heavily dependent on the number of attributes, attribute levels, choice tasks and alternatives. Further exploration of overlaps and block sizes is needed. This study's results are widely applicable for researchers interested in creating optimal DCE designs to elicit individual preferences on health services, programmes, policies and products. PMID:27436671
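
    As a rough illustration of how a candidate choice design can be scored, the sketch below computes the information matrix of a multinomial logit model with all coefficients set to zero (a common "utility-neutral" simplification) for a small effects-coded design and converts it to a D-error. The toy design, the coding, and the normalization are assumptions for illustration and may differ from the relative d-efficiency measure used in the study.

```python
# Hedged sketch: score a discrete choice design by its D-error under a multinomial
# logit model with zero prior coefficients (the "utility-neutral" case).  The toy
# design (4 tasks, 2 alternatives, 2 effects-coded attributes) is an assumption.
import numpy as np

def mnl_information(choice_sets):
    """Sum the contributions X' (diag(p) - p p') X over all choice sets."""
    K = choice_sets[0].shape[1]
    info = np.zeros((K, K))
    for X in choice_sets:                    # X: (alternatives x K) coded matrix
        J = X.shape[0]
        p = np.full(J, 1.0 / J)              # uniform choice probabilities at beta = 0
        info += X.T @ (np.diag(p) - np.outer(p, p)) @ X
    return info

def d_error(choice_sets):
    info = mnl_information(choice_sets)
    return np.linalg.det(info) ** (-1.0 / info.shape[0])   # lower = more efficient

design = [np.array([[ 1,  1], [-1, -1]]),
          np.array([[ 1, -1], [-1,  1]]),
          np.array([[-1,  1], [ 1, -1]]),
          np.array([[-1, -1], [ 1,  1]])]
print("D-error of the toy design (sketch):", d_error(design))
```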

  2. Estimating parameters with pre-specified accuracies in distributed parameter systems using optimal experiment design

    NASA Astrophysics Data System (ADS)

    Potters, M. G.; Bombois, X.; Mansoori, M.; Van den Hof, Paul M. J.

    2016-08-01

    Estimation of physical parameters in dynamical systems driven by linear partial differential equations is an important problem. In this paper, we introduce the least costly experiment design framework for these systems. It enables parameter estimation with an accuracy that is specified by the experimenter prior to the identification experiment, while at the same time minimising the cost of the experiment. We show how to adapt the classical framework for these systems and take into account scaling and stability issues. We also introduce a progressive subdivision algorithm that further generalises the experiment design framework in the sense that it returns the lowest cost by finding the optimal input signal, and optimal sensor and actuator locations. Our methodology is then applied to a relevant problem in heat transfer studies: estimation of conductivity and diffusivity parameters in front-face experiments. We find good correspondence between numerical and theoretical results.

  3. Pliocene Model Intercomparison Project (PlioMIP): Experimental Design and Boundary Conditions (Experiment 2)

    NASA Technical Reports Server (NTRS)

    Haywood, A. M.; Dowsett, H. J.; Robinson, M. M.; Stoll, D. K.; Dolan, A. M.; Lunt, D. J.; Otto-Bliesner, B.; Chandler, M. A.

    2011-01-01

    The Palaeoclimate Modelling Intercomparison Project has expanded to include a model intercomparison for the mid-Pliocene warm period (3.29 to 2.97 million yr ago). This project is referred to as PlioMIP (the Pliocene Model Intercomparison Project). Two experiments have been agreed upon and together compose the initial phase of PlioMIP. The first (Experiment 1) is being performed with atmosphere only climate models. The second (Experiment 2) utilizes fully coupled ocean-atmosphere climate models. Following on from the publication of the experimental design and boundary conditions for Experiment 1 in Geoscientific Model Development, this paper provides the necessary description of differences and/or additions to the experimental design for Experiment 2.

  4. Pliocene Model Intercomparison Project (PlioMIP): experimental design and boundary conditions (Experiment 2)

    USGS Publications Warehouse

    Haywood, A.M.; Dowsett, H.J.; Robinson, M.M.; Stoll, D.K.; Dolan, A.M.; Lunt, D.J.; Otto-Bliesner, B.; Chandler, M.A.

    2011-01-01

    The Palaeoclimate Modelling Intercomparison Project has expanded to include a model intercomparison for the mid-Pliocene warm period (3.29 to 2.97 million yr ago). This project is referred to as PlioMIP (the Pliocene Model Intercomparison Project). Two experiments have been agreed upon and together compose the initial phase of PlioMIP. The first (Experiment 1) is being performed with atmosphere-only climate models. The second (Experiment 2) utilises fully coupled ocean-atmosphere climate models. Following on from the publication of the experimental design and boundary conditions for Experiment 1 in Geoscientific Model Development, this paper provides the necessary description of differences and/or additions to the experimental design for Experiment 2.

  5. Optimal Design of Passive Flow Control for a Boundary-Layer-Ingesting Offset Inlet Using Design-of-Experiments

    NASA Technical Reports Server (NTRS)

    Allan, Brian G.; Owens, Lewis R.; Lin, John C.

    2006-01-01

    This research will investigate the use of Design-of-Experiments (DOE) in the development of an optimal passive flow control vane design for a boundary-layer-ingesting (BLI) offset inlet in transonic flow. This inlet flow control is designed to minimize the engine fan-face distortion levels and first five Fourier harmonic half amplitudes while maximizing the inlet pressure recovery. Numerical simulations of the BLI inlet are computed using the Reynolds-averaged Navier-Stokes (RANS) flow solver, OVERFLOW, developed at NASA. These simulations are used to generate the numerical experiments for the DOE response surface model. In this investigation, two DOE optimizations were performed using a D-Optimal Response Surface model. The first DOE optimization was performed using four design factors which were vane height and angles-of-attack for two groups of vanes. One group of vanes was placed at the bottom of the inlet and a second group symmetrically on the sides. The DOE design was performed for a BLI inlet with a free-stream Mach number of 0.85 and a Reynolds number of 2 million, based on the length of the fan-face diameter, matching an experimental wind tunnel BLI inlet test. The first DOE optimization required a fifth order model having 173 numerical simulation experiments and was able to reduce the DC60 baseline distortion from 64% down to 4.4%, while holding the pressure recovery constant. A second DOE optimization was performed holding the vane heights at a constant value from the first DOE optimization with the two vane angles-of-attack as design factors. This DOE only required a second order model fit with 15 numerical simulation experiments and reduced DC60 to 3.5% with small decreases in the fourth and fifth harmonic amplitudes. The second optimal vane design was tested at the NASA Langley 0.3-Meter Transonic Cryogenic Tunnel in a BLI inlet experiment. The experimental results showed an 80% reduction of DPCPavg, the circumferential distortion level at the

  6. Optimal Design of Passive Flow Control for a Boundary-Layer-Ingesting Offset Inlet Using Design-of-Experiments

    NASA Technical Reports Server (NTRS)

    Allan, Brian G.; Owens, Lewis R., Jr.; Lin, John C.

    2006-01-01

    This research will investigate the use of Design-of-Experiments (DOE) in the development of an optimal passive flow control vane design for a boundary-layer-ingesting (BLI) offset inlet in transonic flow. This inlet flow control is designed to minimize the engine fan face distortion levels and first five Fourier harmonic half amplitudes while maximizing the inlet pressure recovery. Numerical simulations of the BLI inlet are computed using the Reynolds-averaged Navier-Stokes (RANS) flow solver, OVERFLOW, developed at NASA. These simulations are used to generate the numerical experiments for the DOE response surface model. In this investigation, two DOE optimizations were performed using a D-Optimal Response Surface model. The first DOE optimization was performed using four design factors which were vane height and angles-of-attack for two groups of vanes. One group of vanes was placed at the bottom of the inlet and a second group symmetrically on the sides. The DOE design was performed for a BLI inlet with a free-stream Mach number of 0.85 and a Reynolds number of 2 million, based on the length of the fan face diameter, matching an experimental wind tunnel BLI inlet test. The first DOE optimization required a fifth order model having 173 numerical simulation experiments and was able to reduce the DC60 baseline distortion from 64% down to 4.4%, while holding the pressure recovery constant. A second DOE optimization was performed holding the vane heights at a constant value from the first DOE optimization with the two vane angles-of-attack as design factors. This DOE only required a second order model fit with 15 numerical simulation experiments and reduced DC60 to 3.5% with small decreases in the fourth and fifth harmonic amplitudes. The second optimal vane design was tested at the NASA Langley 0.3-Meter Transonic Cryogenic Tunnel in a BLI inlet experiment. The experimental results showed an 80% reduction of DPCPavg, the circumferential distortion level at the engine
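
    To make the response-surface step concrete, the sketch below fits a full second-order model in two coded factors (standing in for the two vane angles of attack) to a small set of responses and then locates the fitted optimum. The synthetic "distortion" response, the factor ranges, and the 15 randomly placed points are assumptions; they are not the OVERFLOW results or the D-optimal point set used in the study.

```python
# Hedged sketch: second-order response surface in two coded factors and its optimum.
# The synthetic "distortion" response and the 15 candidate points are assumptions.
import numpy as np
from scipy.optimize import minimize

def quad_terms(x1, x2):
    """Full quadratic model terms: 1, x1, x2, x1^2, x2^2, x1*x2."""
    return np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1 * x2])

rng = np.random.default_rng(0)
x1, x2 = rng.uniform(-1, 1, 15), rng.uniform(-1, 1, 15)        # coded vane angles
y = (4.0 + 3.0 * (x1 - 0.3) ** 2 + 2.0 * (x2 + 0.2) ** 2
     + 0.5 * x1 * x2 + rng.normal(0, 0.1, 15))                 # synthetic distortion

beta, *_ = np.linalg.lstsq(quad_terms(x1, x2), y, rcond=None)  # least-squares fit

surface = lambda x: (quad_terms(np.array([x[0]]), np.array([x[1]])) @ beta)[0]
opt = minimize(surface, x0=[0.0, 0.0], bounds=[(-1, 1), (-1, 1)])
print("fitted optimum (coded factors):", opt.x, " predicted response:", opt.fun)
```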

  7. The climateprediction.net BBC climate change experiment: design of the coupled model ensemble.

    PubMed

    Frame, D J; Aina, T; Christensen, C M; Faull, N E; Knight, S H E; Piani, C; Rosier, S M; Yamazaki, K; Yamazaki, Y; Allen, M R

    2009-03-13

    Perturbed physics experiments are among the most comprehensive ways to address uncertainty in climate change forecasts. In these experiments, parameters and parametrizations in atmosphere-ocean general circulation models are perturbed across ranges of uncertainty, and results are compared with observations. In this paper, we describe the largest perturbed physics climate experiment conducted to date, the British Broadcasting Corporation (BBC) climate change experiment, in which the physics of the atmosphere and ocean are changed, and run in conjunction with a forcing ensemble designed to represent uncertainty in past and future forcings, under the A1B Special Report on Emissions Scenarios (SRES) climate change scenario. PMID:19087930

  8. Cryogenic design of the liquid helium experiment "critical dynamics in microgravity"

    SciTech Connect

    Moeur, W.A.; Adriaans, M.J.; Boyd, S.T.; Strayer, D.M.; Duncan, R.V.

    1995-10-01

    Although many well controlled experiments have been conducted to measure the static properties of systems near criticality, few experiments have explored the transport properties in systems driven far away from equilibrium as a phase transition occurs. The cryogenic design of an experiment to study the dynamic aspect of critical phenomena is reported here. Measurements of the thermal gradient across the superfluid (He II) -- normal fluid (He I) interface in helium under microgravity conditions will be performed as a heat flux holds the system away from equilibrium. New technologies are under development for this experiment, which is in the definition phase for a space shuttle flight.

  9. Design and Performance of an Automated Bioreactor for Cell Culture Experiments in a Microgravity Environment

    NASA Astrophysics Data System (ADS)

    Kim, Youn-Kyu; Park, Seul-Hyun; Lee, Joo-Hee; Choi, Gi-Hyuk

    2015-03-01

    In this paper, we describe the development of a bioreactor for a cell-culture experiment on the International Space Station (ISS). The bioreactor is an experimental device for culturing mouse muscle cells in a microgravity environment. The purpose of the experiment was to assess the impact of microgravity on the muscles to address the possibility of long-term human residence in space. After investigation of previously developed bioreactors, and analysis of the requirements for microgravity cell culture experiments, a bioreactor design is herein proposed that is able to automatically culture 32 samples simultaneously. This reactor design is capable of automatic control of temperature, humidity, and culture-medium injection rate, and satisfies the interface requirements of the ISS. Since bioreactors are vulnerable to cell contamination, the medium-circulation modules were designed to be completely replaceable, in order to reuse the bioreactor after each experiment. The bioreactor control system is designed to circulate culture media to 32 culture chambers at a maximum speed of 1 ml/min, to maintain the temperature of the reactor at 36°C, and to keep the relative humidity of the reactor above 70%. Because bubbles in the culture media negatively affect cell culture, a de-bubbler unit was provided to eliminate such bubbles. A working model of the reactor was built according to the new design, to verify its performance, and was used to perform a cell culture experiment that confirmed the feasibility of this device.

  10. Designing and Developing Game-Like Learning Experience in Virtual Worlds: Challenges and Design Decisions of Novice Instructional Designers

    ERIC Educational Resources Information Center

    Yilmaz, Turkan Karakus; Cagiltay, Kursat

    2016-01-01

    Many virtual worlds have been adopted for implementation within educational settings because they are potentially useful for building effective learning environments. Because the flexibility of virtual worlds makes it challenging to obtain effective and efficient educational outcomes, the design of such platforms needs more attention. In the present study, the…

  11. Scaling of Thermal-Hydraulic Experiments for a Space Rankine Cycle and Selection of a Preconceptual Scaled Experiment Design

    SciTech Connect

    Sulfredge, CD

    2006-01-27

    To assist with the development of a space-based Rankine cycle power system using liquid potassium as the working fluid, a study has been conducted on possible scaled experiments with simulant fluids. This report will consider several possible working fluids and describe a scaling methodology to achieve thermal-hydraulic similarity between an actual potassium system and scaled representations of the Rankine cycle boiler or condenser. The most practical scaling approach examined is based on the selection of perfluorohexane (FC-72) as the simulant. Using the scaling methodology, a series of possible solutions have been calculated for the FC-72 boiler and condenser. The possible scaled systems will then be compared and preconceptual specifications and drawings given for the most promising design. The preconceptual design concept will also include integrating the scaled boiler and scaled condenser into a single experimental loop. All the preconceptual system specifications appear practical from a fabrication and experimental standpoint, but further work will be needed to arrive at a final experiment design.

  12. All voices matter in experience design: A commitment to action in engaging patient and family voice.

    PubMed

    Wolf, Jason A

    2016-09-01

    This article frames the broader concept of experience design and the engagement of patient and family voice, reinforcing how aligned healthcare professionals are, not only on the value of this work but also in understanding its benefits. When addressing the idea of design, it is important to look at the broadest possible construct and consider the engagement of patient and family voices in healthcare operational efforts, not as passive advisors but as active participants in data gathering, providing input, and actual decision-making. The article argues that engagement is not just part of process, facility, or experience design but must be part of the decisions made in how healthcare organizations today are built, led, and sustained, reinforcing that the opportunity in healthcare is to focus on overall experience with purpose and intention. This commitment is what will lead to the outcomes all ultimately hope to achieve. PMID:27486186

  13. Effects of Spatial Experiences & Cognitive Styles in the Solution Process of Space-Based Design Problems in the First Year of Architectural Design Education

    ERIC Educational Resources Information Center

    Erkan Yazici, Yasemin

    2013-01-01

    There are many factors that influence designers in the architectural design process. Cognitive style, which varies according to the cognitive structure of persons, and spatial experience, which is created with spatial data acquired during life, are two of these factors. Designers usually refer to their spatial experiences in order to find solutions…

  14. The emotional experience of patient care: a case for innovation in health care design.

    PubMed

    Altringer, Beth

    2010-07-01

    This paper considers recent developments in health care facility design and in the psychology literature that support a case for increased design sensitivity to the emotional experience of patient care. The author discusses several examples of innovative patient-centred health care design interventions. These generally resulted in improvements in the patient and staff experience of care, at less cost than major infrastructural interventions. The paper relates these developments in practice with recent neuroscience research, illustrating that the design of the built environment influences patient emotional stress. In turn, patient emotional stress appears to influence patient satisfaction, and in some instances, patient outcomes. This paper highlights the need for further research in this area.

  15. The NASA Langley Laminar-Flow-Control (LFC) experiment on a swept, supercritical airfoil: Design overview

    NASA Technical Reports Server (NTRS)

    Harris, Charles D.; Harvey, William D.; Brooks, Cuyler W., Jr.

    1988-01-01

    A large-chord, swept, supercritical, laminar-flow-control (LFC) airfoil was designed and constructed and is currently undergoing tests in the Langley 8 ft Transonic Pressure Tunnel. The experiment was directed toward evaluating the compatibility of LFC and supercritical airfoils, validating prediction techniques, and generating a data base for future transport airfoil design as part of NASA's ongoing research program to significantly reduce drag and increase aircraft efficiency. Unique features of the airfoil included a high design Mach number with shock free flow and boundary layer control by suction. Special requirements for the experiment included modifications to the wind tunnel to achieve the necessary flow quality and contouring of the test section walls to simulate free air flow about a swept model at transonic speeds. Design of the airfoil with a slotted suction surface, the suction system, and modifications to the tunnel to meet test requirements are discussed.

  16. Neutral Beam Injection Requirements and Design Issues for the National Compact Stellarator Experiment

    SciTech Connect

    H.W. Kugel; H. Neilson; W. Reiersen; M. Zarnstorff

    2002-02-11

    The National Compact Stellarator Experiment (NCSX) will require 6 MW of 50 keV neutral beam injection (NBI) with initial pulse lengths of 500 msec and upgradeable to pulse lengths of 1.5 sec. This paper discusses the NCSX NBI requirements and design issues, and shows how these are provided by the candidate PBX-M [Princeton Beta Experiment-Modification] NBI system.

  17. Radiometric and photometric design for an Acoustic Containerless Experiment System. [for space processing

    NASA Technical Reports Server (NTRS)

    Glavich, T. A.

    1981-01-01

    The design of an optical system for a high temperature Acoustic Containerless Experiment System is examined. The optical system provides two-axis video, cine and infrared images of an acoustically positioned sample over a temperature range of 20 to 1200 C. Emphasis is placed on the radiometric and photometric characterization of the elements in the optical system and the oven to assist image data determination. Sample visibility due to wall radiance is investigated along with visibility due to strobe radiance. The optical system is designed for operation in Spacelab, and is used for a variety of materials processing experiments.

  18. Optimizing ELISAs for precision and robustness using laboratory automation and statistical design of experiments.

    PubMed

    Joelsson, Daniel; Moravec, Phil; Troutman, Matthew; Pigeon, Joseph; DePhillips, Pete

    2008-08-20

    Transferring manual ELISAs to automated platforms requires optimizing the assays for each particular robotic platform. These optimization experiments are often time consuming and difficult to perform using a traditional one-factor-at-a-time strategy. In this manuscript we describe the development of an automated process using statistical design of experiments (DOE) to quickly optimize immunoassays for precision and robustness on the Tecan EVO liquid handler. By using fractional factorials and a split-plot design, five incubation time variables and four reagent concentration variables can be optimized in a short period of time.
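
    The fractional-factorial portion of such a screening study can be sketched as follows: a 16-run half fraction of five two-level factors (a 2^(5-1) design) is generated from the full 2^4 design by setting the fifth factor equal to the product of the first four, i.e. the generator E = ABCD. The factor names and levels below are placeholders, not the actual incubation-time or reagent-concentration settings of the assay.

```python
# Hedged sketch: a 2^(5-1) fractional factorial built with the generator E = ABCD
# (defining relation I = ABCDE).  Factor names and levels are placeholders.
import itertools

base_factors = ["A", "B", "C", "D"]          # four base factors at levels -1 / +1
runs = []
for levels in itertools.product([-1, 1], repeat=len(base_factors)):
    run = dict(zip(base_factors, levels))
    run["E"] = run["A"] * run["B"] * run["C"] * run["D"]   # generated fifth factor
    runs.append(run)

print(len(runs), "runs (half of the 32-run full factorial)")
for run in runs[:4]:
    print(run)
```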

  19. Experiment Design for Complex VTOL Aircraft with Distributed Propulsion and Tilt Wing

    NASA Technical Reports Server (NTRS)

    Murphy, Patrick C.; Landman, Drew

    2015-01-01

    Selected experimental results from a wind tunnel study of a subscale VTOL concept with distributed propulsion and tilt lifting surfaces are presented. The vehicle complexity and automated test facility were ideal for use with a randomized designed experiment. Design of Experiments and Response Surface Methods were invoked to produce run efficient, statistically rigorous regression models with minimized prediction error. Static tests were conducted at the NASA Langley 12-Foot Low-Speed Tunnel to model all six aerodynamic coefficients over a large flight envelope. This work supports investigations at NASA Langley in developing advanced configurations, simulations, and advanced control systems.

  20. Design of high-Reynolds-number flat-plate experiments in the NTF

    NASA Technical Reports Server (NTRS)

    Saric, William S.

    1988-01-01

    The design of an experiment to measure skin friction and turbulent boundary layer characteristics at Reynolds numbers exceeding 1 x 10 to the 9th is described. The experiment will be conducted in a zero-pressure-gradient flow on a flat plate in the National Transonic Facility (NTF). The development of computational codes to analyze the aerodynamic loads and the blockage is documented. Novel instrumentation techniques and models, designed to operate in cryogenic environments, are presented. Special problems associated with aerodynamic loads, surface finish, and hot-wire anemometers are discussed.

  1. Building international experiences into an engineering curriculum - a design project-based approach

    NASA Astrophysics Data System (ADS)

    Maldonado, Victor; Castillo, Luciano; Carbajal, Gerardo; Hajela, Prabhat

    2014-07-01

    This paper is a descriptive account of how short-term international and multicultural experiences can be integrated into early design experiences in an aerospace engineering curriculum. Such approaches are considered as important not only in fostering a student's interest in the engineering curriculum, but also exposing them to a multicultural setting that they are likely to encounter in their professional careers. In the broader sense, this programme is described as a model that can be duplicated in other engineering disciplines as a first-year experience. In this study, undergraduate students from Rensselaer Polytechnic Institute (RPI) and Universidad del Turabo (UT) in Puerto Rico collaborated on a substantial design project consisting of designing, fabricating, and flight-testing radio-controlled model aircraft as a capstone experience in a semester-long course on Fundamentals of Flight. The two-week long experience in Puerto Rico was organised into academic and cultural components designed with the following objectives: (i) to integrate students in a multicultural team-based academic and social environment, (ii) to practise team-building skills and develop students' critical thinking and analytical skills, and finally (iii) to excite students about their engineering major through practical applications of aeronautics and help them decide if it is a right fit for them.

  2. The usefulness of systematic reviews of animal experiments for the design of preclinical and clinical studies.

    PubMed

    de Vries, Rob B M; Wever, Kimberley E; Avey, Marc T; Stephens, Martin L; Sena, Emily S; Leenaars, Marlies

    2014-01-01

    The question of how animal studies should be designed, conducted, and analyzed remains underexposed in societal debates on animal experimentation. This is not only a scientific but also a moral question. After all, if animal experiments are not appropriately designed, conducted, and analyzed, the results produced are unlikely to be reliable and the animals have in effect been wasted. In this article, we focus on one particular method to address this moral question, namely systematic reviews of previously performed animal experiments. We discuss how the design, conduct, and analysis of future (animal and human) experiments may be optimized through such systematic reviews. In particular, we illustrate how these reviews can help improve the methodological quality of animal experiments, make the choice of an animal model and the translation of animal data to the clinic more evidence-based, and implement the 3Rs. Moreover, we discuss which measures are being taken and which need to be taken in the future to ensure that systematic reviews will actually contribute to optimizing experimental design and thereby to meeting a necessary condition for making the use of animals in these experiments justified.

  3. Application of Modern Design of Experiments to CARS Thermometry in a Model Scramjet Engine

    NASA Technical Reports Server (NTRS)

    Danehy, P. M.; DeLoach, R.; Cutler, A. D.

    2002-01-01

    We have applied formal experiment design and analysis to optimize the measurement of temperature in a supersonic combustor at NASA Langley Research Center. We used the coherent anti-Stokes Raman spectroscopy (CARS) technique to map the temperature distribution in the flowfield downstream of an 1160 K, Mach 2 freestream into which supersonic hydrogen fuel is injected at an angle of 30 degrees. CARS thermometry is inherently a single-point measurement technique; it was used to map the flow by translating the measurement volume through the flowfield. The method known as "Modern Design of Experiments" (MDOE) was used to estimate the data volume required, design the test matrix, perform the experiment and analyze the resulting data. MDOE allowed us to match the volume of data acquired to the precision requirements of the customer. Furthermore, one aspect of MDOE, known as response surface methodology, allowed us to develop precise maps of the flowfield temperature, enabling interpolation between measurement points. An analytic function in two spatial variables was fit to the data from a single measurement plane. Fitting with a Cosine Series Bivariate Function allowed the mean temperature to be mapped with 95% confidence interval half-widths of +/- 30 Kelvin, comfortably meeting the confidence of +/- 50 Kelvin specified prior to performing the experiments. We estimate that applying MDOE to the present experiment saved a factor of 5 in data volume acquired, compared to experiments executed in the traditional manner. Furthermore, the precision requirements could have been met with less than half the data acquired.
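
    A hedged sketch of the surface-fitting step described above: a cosine-series bivariate function is fitted to scattered temperature samples by ordinary least squares over the terms cos(iπx)·cos(jπy), with coordinates scaled to [0, 1]. The synthetic temperature field, series order, and scaling are illustrative assumptions, not the CARS data or the exact basis used in the experiment.

```python
# Hedged sketch: least-squares fit of a cosine-series bivariate surface
# T(x, y) ~ sum_ij c_ij * cos(i*pi*x) * cos(j*pi*y) on coordinates scaled to [0, 1].
# The synthetic "temperature" samples and the series order are assumptions.
import numpy as np

def cosine_design_matrix(x, y, order=3):
    cols = [np.cos(i * np.pi * x) * np.cos(j * np.pi * y)
            for i in range(order + 1) for j in range(order + 1)]
    return np.column_stack(cols)

rng = np.random.default_rng(1)
x, y = rng.uniform(0, 1, 200), rng.uniform(0, 1, 200)           # scaled coordinates
temperature = (1200 + 400 * np.cos(np.pi * x) * np.cos(2 * np.pi * y)
               + rng.normal(0, 30, 200))                        # synthetic samples, K

coeffs, *_ = np.linalg.lstsq(cosine_design_matrix(x, y), temperature, rcond=None)

# The fitted surface can then be interpolated anywhere in the measurement plane.
xq, yq = np.array([0.25]), np.array([0.60])
print("fitted T at query point:", cosine_design_matrix(xq, yq) @ coeffs)
```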

  4. Design and analysis of numerical experiments. [applicable to fully nonlinear, global, equivalent-barotropic model

    NASA Technical Reports Server (NTRS)

    Bowman, Kenneth P.; Sacks, Jerome; Chang, Yue-Fang

    1993-01-01

    Methods for the design and analysis of numerical experiments that are especially useful and efficient in multidimensional parameter spaces are presented. The analysis method, which is similar to kriging in the spatial analysis literature, fits a statistical model to the output of the numerical model. The method is applied to a fully nonlinear, global, equivalent-barotropic dynamical model. The statistical model also provides estimates for the uncertainty of predicted numerical model output, which can provide guidance on where in the parameter space to conduct further experiments, if necessary. The method can provide significant improvements in the efficiency with which numerical sensitivity experiments are conducted.
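
    The kriging-like analysis described above can be sketched with a Gaussian-process surrogate: fit the surrogate to numerical-model output at sampled parameter settings, then use the predictive standard deviation to flag regions of parameter space where additional runs would be most informative. The toy response function and kernel choice below are assumptions; the authors' statistical model may differ in detail.

```python
# Hedged sketch: Gaussian-process (kriging-like) surrogate for numerical-model
# output over a 2-D parameter space.  The toy response function is an assumption.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

rng = np.random.default_rng(2)
X = rng.uniform(0, 1, size=(25, 2))                   # sampled parameter settings
y = np.sin(3 * X[:, 0]) * np.cos(2 * X[:, 1])         # stand-in model output

gp = GaussianProcessRegressor(kernel=ConstantKernel() * RBF(length_scale=0.3),
                              normalize_y=True)
gp.fit(X, y)

# Predictive mean and uncertainty on a grid; a large standard deviation suggests
# where additional numerical experiments would be most informative.
grid = np.array([[a, b] for a in np.linspace(0, 1, 10) for b in np.linspace(0, 1, 10)])
mean, std = gp.predict(grid, return_std=True)
print("largest predictive uncertainty at parameters:", grid[np.argmax(std)])
```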

  5. Analytical and experimental investigation of liquid double drop dynamics: Preliminary design for space shuttle experiments

    NASA Technical Reports Server (NTRS)

    1981-01-01

    The preliminary grant assessed the use of laboratory experiments for simulating low-g liquid drop experiments in the space shuttle environment. Investigations were begun of appropriate immiscible liquid systems, design of experimental apparatus, and analyses. The current grant continued these topics, completed construction and preliminary testing of the experimental apparatus, and performed experiments on single and compound liquid drops. A continuing assessment of laboratory capabilities, and the interests of project personnel and available collaborators, led, after consultations with NASA personnel, to a research emphasis on compound drops consisting of hollow plastic or elastic spheroids filled with liquids.

  6. Specifications for and preliminary design of a plant growth chamber for orbital experiments

    NASA Technical Reports Server (NTRS)

    Sweet, H. C.; Simmonds, R. C.

    1976-01-01

    It was proposed that plant experiments be performed on board the space shuttle. To permit the proper execution of most tests, the craft must contain a plant growth chamber which is adequately designed to control those environmental factors which can induce changes in a plant's physiology and morphology. The various needs of, and environmental factors affecting, plants are identified. The permissible design, construction, and performance limits for a plant-growth chamber are set, and tentative designs were prepared for units which are compatible with both the botanical requirements and the constraints imposed by the space shuttle.

  7. Vibration control experiment design for the 15-m hoop/column antenna

    NASA Technical Reports Server (NTRS)

    Ham, F. M.; Hyland, D. C.

    1985-01-01

    A test program is designed for a ground-based vibration control experiment utilizing the 15-m Hoop/Column Antenna as the test article. Overall objectives of the designed ground-based test program include: (1) the validation of large space structure (LSS) control system techniques; (2) the validation of LSS parameter identification techniques; (3) the evaluation of actuator and sensor placement methodology; and (4) the validation of LSS computer models. Critical concerns in LSS controls and dynamics are: low-frequency vibrational modes, close modal spacing, parameter uncertainties, controller software limitations, nonlinearities, and coupling of modes through damping. Analytical results are presented which include compensator designs for varying compensator order.

  8. Robust experiment design for estimating myocardial {beta} adrenergic receptor concentration using PET

    SciTech Connect

    Salinas, Cristian; Muzic, Raymond F. Jr.; Ernsberger, Paul; Saidel, Gerald M.

    2007-01-15

    Myocardial β adrenergic receptor (β-AR) concentration can substantially decrease in congestive heart failure and significantly increase in chronic volume overload, such as in severe aortic valve regurgitation. Positron emission tomography (PET) with an appropriate ligand-receptor model can be used for noninvasive estimation of myocardial β-AR concentration in vivo. An optimal design of the experiment protocol, however, is needed for sufficiently precise estimates of β-AR concentration in a heterogeneous population. Standard methods of optimal design do not account for a heterogeneous population with a wide range of β-AR concentrations and other physiological parameters and consequently are inadequate. To address this, we have developed a methodology to design a robust two-injection protocol that provides reliable estimates of myocardial β-AR concentration in normal and pathologic states. A two-injection protocol of the high affinity β-AR antagonist [18F]-(S)-fluorocarazolol was designed based on a computer-generated (or synthetic) population incorporating a wide range of β-AR concentrations. Timing and dosage of the ligand injections were optimally designed with a minimax criterion to provide the least bad β-AR estimates for the worst case in the synthetic population. This robust experiment design for PET was applied to experiments with pigs before and after β-AR upregulation by chemical sympathectomy. Estimates of β-AR concentration were found by minimizing the difference between the model-predicted and experimental PET data. With this robust protocol, estimates of β-AR concentration showed high precision in both normal and pathologic states. The increase in β-AR concentration after sympathectomy predicted noninvasively with PET is consistent with the increase shown by in vitro assays in pig myocardium. A robust experiment protocol was designed for PET that yields reliable estimates of β-AR concentration.
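
    The minimax idea can be sketched independently of the PET model: for each candidate design, compute a precision criterion for every member of a synthetic population and keep the design whose worst case is least bad. The stand-in uptake model y = A·(1 - exp(-k·t)), the population ranges, and the candidate sampling times below are assumptions, not the ligand-receptor model or injection protocol of the study.

```python
# Hedged sketch: minimax selection of a two-sample-time design that is "least bad"
# across a synthetic population.  The uptake model y = A * (1 - exp(-k * t)), the
# population ranges, and the candidate time grid are stand-in assumptions.
import itertools
import numpy as np

def design_badness(times, A, k):
    """Determinant of the parameter covariance bound for one subject (larger = worse)."""
    t = np.asarray(times, dtype=float)
    dA = 1.0 - np.exp(-k * t)                 # d y / d A
    dk = A * t * np.exp(-k * t)               # d y / d k
    J = np.column_stack([dA, dk])
    return 1.0 / np.linalg.det(J.T @ J)       # inverse of the Fisher determinant

population = [(A, k) for A in np.linspace(0.5, 2.0, 4) for k in np.linspace(0.1, 1.0, 4)]
candidate_times = np.linspace(0.5, 20.0, 40)

# Minimax: choose the design whose worst-case criterion over the population is smallest.
best_design = min(itertools.combinations(candidate_times, 2),
                  key=lambda d: max(design_badness(d, A, k) for A, k in population))
print("minimax two-point design (sketch):", best_design)
```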

  9. Optimizing an experimental design for a CSEM experiment: methodology and synthetic tests

    NASA Astrophysics Data System (ADS)

    Roux, E.; Garcia, X.

    2014-04-01

    Optimizing an experimental design is a compromise between maximizing the information gained about the target and limiting the cost of the experiment, subject to a wide range of constraints. We present a statistical algorithm for experiment design that combines linearized inverse theory with a stochastic optimization technique. Linearized inverse theory is used to quantify the quality of a given experiment design, while a genetic algorithm (GA) enables us to examine a wide range of possible surveys. A distinctive feature of our algorithm is the use of the multi-objective GA NSGA-II, which searches for designs that satisfy several objective functions (OFs) simultaneously. This ability of NSGA-II helps us define an experiment design that focuses on a specified target area. We present a test of our algorithm using a 1-D electrical subsurface structure. The model we use represents a simple but realistic scenario in the context of CO2 sequestration, which motivates this study. Our first synthetic test using a single OF shows that a limited number of well-distributed observations from a chosen design have the potential to resolve the given model. This synthetic test also points out the importance of a well-chosen OF, depending on the target. To improve these results, we show how combining two OFs with the multi-objective GA enables us to determine an experimental design that maximizes information about the reservoir layer. Finally, we present several tests of our statistical algorithm in more challenging environments by exploring the influence of noise and specific site characteristics, as well as its potential for reservoir monitoring.
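
    The sketch below illustrates only the multi-objective selection idea underlying NSGA-II (non-dominated, or Pareto, screening of candidate designs); it does not reproduce the authors' linearized-inversion objective functions or the full genetic algorithm, and the two objectives used here are invented placeholders.

      # Illustrative Pareto screening of candidate survey designs (receiver layouts).
      # The objective functions are placeholders, not the linearized-inversion measures
      # used in the paper; NSGA-II itself is not reproduced, only the non-dominated
      # sorting idea on which it relies.
      import numpy as np

      rng = np.random.default_rng(1)

      def objectives(design):
          """Return two costs to minimize for a design given as receiver positions (km)."""
          spread = design.max() - design.min()
          f1 = 1.0 / (1.0 + spread)            # poor target illumination when receivers cluster
          f2 = len(design) * 0.05              # acquisition cost grows with receiver count
          return np.array([f1, f2])

      designs = [np.sort(rng.uniform(0.0, 10.0, size=rng.integers(4, 12))) for _ in range(60)]
      F = np.array([objectives(d) for d in designs])

      def non_dominated(F):
          """Indices of designs not dominated by any other (both objectives minimized)."""
          keep = []
          for i, fi in enumerate(F):
              dominated = any(np.all(fj <= fi) and np.any(fj < fi) for j, fj in enumerate(F) if j != i)
              if not dominated:
                  keep.append(i)
          return keep

      print("Pareto-optimal candidate designs:", non_dominated(F))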

  10. Real-time PCR probe optimization using design of experiments approach.

    PubMed

    Wadle, S; Lehnert, M; Rubenwolf, S; Zengerle, R; von Stetten, F

    2016-03-01

    Primer and probe sequence designs are among the most critical input factors in real-time polymerase chain reaction (PCR) assay optimization. In this study, we present the use of statistical design of experiments (DOE) approach as a general guideline for probe optimization and more specifically focus on design optimization of label-free hydrolysis probes that are designated as mediator probes (MPs), which are used in reverse transcription MP PCR (RT-MP PCR). The effect of three input factors on assay performance was investigated: distance between primer and mediator probe cleavage site; dimer stability of MP and target sequence (influenza B virus); and dimer stability of the mediator and universal reporter (UR). The results indicated that the latter dimer stability had the greatest influence on assay performance, with RT-MP PCR efficiency increased by up to 10% with changes to this input factor. With an optimal design configuration, a detection limit of 3-14 target copies/10 μl reaction could be achieved. This improved detection limit was confirmed for another UR design and for a second target sequence, human metapneumovirus, with 7-11 copies/10 μl reaction detected in an optimum case. The DOE approach for improving oligonucleotide designs for real-time PCR not only produces excellent results but may also reduce the number of experiments that need to be performed, thus reducing costs and experimental times.

  11. Real-time PCR probe optimization using design of experiments approach.

    PubMed

    Wadle, S; Lehnert, M; Rubenwolf, S; Zengerle, R; von Stetten, F

    2016-03-01

    Primer and probe sequence designs are among the most critical input factors in real-time polymerase chain reaction (PCR) assay optimization. In this study, we present the use of statistical design of experiments (DOE) approach as a general guideline for probe optimization and more specifically focus on design optimization of label-free hydrolysis probes that are designated as mediator probes (MPs), which are used in reverse transcription MP PCR (RT-MP PCR). The effect of three input factors on assay performance was investigated: distance between primer and mediator probe cleavage site; dimer stability of MP and target sequence (influenza B virus); and dimer stability of the mediator and universal reporter (UR). The results indicated that the latter dimer stability had the greatest influence on assay performance, with RT-MP PCR efficiency increased by up to 10% with changes to this input factor. With an optimal design configuration, a detection limit of 3-14 target copies/10 μl reaction could be achieved. This improved detection limit was confirmed for another UR design and for a second target sequence, human metapneumovirus, with 7-11 copies/10 μl reaction detected in an optimum case. The DOE approach for improving oligonucleotide designs for real-time PCR not only produces excellent results but may also reduce the number of experiments that need to be performed, thus reducing costs and experimental times. PMID:27077046

  12. Real-time PCR probe optimization using design of experiments approach

    PubMed Central

    Wadle, S.; Lehnert, M.; Rubenwolf, S.; Zengerle, R.; von Stetten, F.

    2015-01-01

    Primer and probe sequence designs are among the most critical input factors in real-time polymerase chain reaction (PCR) assay optimization. In this study, we present the use of statistical design of experiments (DOE) approach as a general guideline for probe optimization and more specifically focus on design optimization of label-free hydrolysis probes that are designated as mediator probes (MPs), which are used in reverse transcription MP PCR (RT-MP PCR). The effect of three input factors on assay performance was investigated: distance between primer and mediator probe cleavage site; dimer stability of MP and target sequence (influenza B virus); and dimer stability of the mediator and universal reporter (UR). The results indicated that the latter dimer stability had the greatest influence on assay performance, with RT-MP PCR efficiency increased by up to 10% with changes to this input factor. With an optimal design configuration, a detection limit of 3–14 target copies/10 μl reaction could be achieved. This improved detection limit was confirmed for another UR design and for a second target sequence, human metapneumovirus, with 7–11 copies/10 μl reaction detected in an optimum case. The DOE approach for improving oligonucleotide designs for real-time PCR not only produces excellent results but may also reduce the number of experiments that need to be performed, thus reducing costs and experimental times. PMID:27077046
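
    As a generic illustration of the DOE idea in the three records above, the sketch below builds a coded 2^3 full-factorial design over the three probe-design factors named in the abstract and computes their main effects; the response values are fabricated for illustration and are not the published RT-MP PCR data.

      # Minimal 2^3 full-factorial sketch for the three probe-design factors named in
      # the abstract (primer-to-cleavage-site distance, MP/target dimer stability,
      # mediator/UR dimer stability). Responses are fabricated for illustration.
      import itertools
      import numpy as np

      factors = ["distance", "MP_target_stability", "mediator_UR_stability"]
      design = np.array(list(itertools.product([-1, 1], repeat=3)))   # coded low/high levels

      # hypothetical PCR efficiency (%) for each of the 8 runs
      response = np.array([82, 90, 83, 91, 84, 93, 85, 94])

      # main effect of a factor = mean(response at +1) - mean(response at -1)
      for k, name in enumerate(factors):
          effect = response[design[:, k] == 1].mean() - response[design[:, k] == -1].mean()
          print(f"main effect of {name}: {effect:+.2f} % efficiency")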

  13. The relationships among design experiments, invariant measurement scales, and domain theories.

    PubMed

    Bunderson, C Victor; Newby, Van A

    2009-01-01

    In this paper we discuss principled design experiments, a rigorous, experimentally oriented form of design-based research. We show the dependence of design experiments on invariant measurement scales. We discuss four kinds of invariance culminating in interpretive invariance, and how this in turn depends on increasingly adequate theories of a domain. These theories give an account of the dimensions and ordered attainments on a set of dimensions that span a domain appropriately. This account may be called a domain theory or learning theory of progressive attainments (in a local domain). We show the direct and broader benefits of developing and using these descriptive theories of a domain to guide prescriptive design approaches to research. In the process of giving an account of this set of interdependencies, we discuss aspects of the design method we are using, called Validity-Centered Design. This design framework guides the development of instruments based on domain theories, the development of learning opportunities, also based on domain theories, and the construction of a sound validity argument for systems that integrate learning with assessment.

  14. The Scope and Design of Structured Group Learning Experiences at Community Colleges

    ERIC Educational Resources Information Center

    Hatch, Deryl K.; Bohlig, E. Michael

    2015-01-01

    This study explores through descriptive analysis the similarities of structured group learning experiences such as first-year seminars, learning communities, orientation, success courses, and accelerated developmental education programs, in terms of their design features and implementation at community colleges. The study takes as its conceptual…

  15. Space shuttle descent flight control design requirements and experiences

    NASA Technical Reports Server (NTRS)

    Kafer, G.; Wilson, D.

    1983-01-01

    Some of the lessons learned during the development of the Space Shuttle descent flight control system (FCS) are reviewed. Examples confirm the importance of requirements definition, systems-level analyses, and testing. Taken together, these experiences may have implications for future designs or suggest the discipline required in this engineering art.

  16. Small-Scale Design Experiments as Working Space for Larger Mobile Communication Challenges

    ERIC Educational Resources Information Center

    Lowe, Sarah; Stuedahl, Dagny

    2014-01-01

    In this paper, a design experiment using Instagram as a cultural probe is submitted as a method for analyzing the challenges that arise when considering the implementation of social media within a distributed communication space. It outlines how small, iterative investigations can reveal deeper research questions relevant to the education of…

  17. Participating with Experience--A Case Study of Students as Co-Producers of Course Design

    ERIC Educational Resources Information Center

    Reneland-Forsman, Linda

    2016-01-01

    Higher Education (HE) needs to handle a diverse student population. The role of student expectations and previous experience is a key to fully participate. This study investigates student meaning making and interaction in a course designed to stimulate student as co-creators of course content and aims. Results revealed that rich communication…

  18. Kinetic resolution of oxazinones: rational exploration of chemical space through the design of experiments.

    PubMed

    Renzi, Polyssena; Kronig, Christel; Carlone, Armando; Eröksüz, Serap; Berkessel, Albrecht; Bella, Marco

    2014-09-01

    The organocatalytic kinetic resolution of 4-substituted oxazinones has been optimised (selectivity factor S up to 98, chiral oxazinone ee values up to 99.6 % (1 a-g) and product ee values up to 90 % (3 a-g)) in a rational way by applying the Design of Experiments (DoE) approach.

  19. Exploring a Comprehensive Model for Early Childhood Vocabulary Instruction: A Design Experiment

    ERIC Educational Resources Information Center

    Wang, X. Christine; Christ, Tanya; Chiu, Ming Ming

    2014-01-01

    Addressing a critical need for effective vocabulary practices in early childhood classrooms, we conducted a design experiment to achieve three goals: (1) developing a comprehensive model for early childhood vocabulary instruction, (2) examining the effectiveness of this model, and (3) discerning the contextual conditions that hinder or facilitate…

  20. Students' Perceptions of Their Learning Experiences in an Authentic Instructional Design Context

    ERIC Educational Resources Information Center

    Dabbagh, Nada; Blijd, Cecily Williams

    2010-01-01

    The purpose of this case study was to examine students' perceptions of their learning experiences while working on a real world instructional design project in a performance oriented team in the context of a situated and problem-based learning environment. Participants were 11 graduate students enrolled in a learning-by-doing instructional design…

  1. Design a Contract: A Simple Principal-Agent Problem as a Classroom Experiment

    ERIC Educational Resources Information Center

    Gachter, Simon; Konigstein, Manfred

    2009-01-01

    The authors present a simple classroom experiment that can be used as a teaching device to introduce important concepts of organizational economics and incentive contracting. First, students take the role of a principal and design a contract that consists of a fixed payment and an incentive component. Second, students take the role of agents and…

  2. Board Games and Board Game Design as Learning Tools for Complex Scientific Concepts: Some Experiences

    ERIC Educational Resources Information Center

    Chiarello, Fabio; Castellano, Maria Gabriella

    2016-01-01

    In this paper the authors report different experiences in the use of board games as learning tools for complex and abstract scientific concepts such as Quantum Mechanics, Relativity or nano-biotechnologies. In particular we describe "Quantum Race," designed for the introduction of Quantum Mechanical principles, "Lab on a chip,"…

  3. Mobile App Design for Teaching and Learning: Educators' Experiences in an Online Graduate Course

    ERIC Educational Resources Information Center

    Hsu, Yu-Chang; Ching, Yu-Hui

    2013-01-01

    This research explored how educators with limited programming experiences learned to design mobile apps through peer support and instructor guidance. Educators were positive about the sense of community in this online course. They also considered App Inventor a great web-based visual programming tool for developing useful and fully functioning…

  4. Designing Experiments on Thermal Interactions by Secondary-School Students in a Simulated Laboratory Environment

    ERIC Educational Resources Information Center

    Lefkos, Ioannis; Psillos, Dimitris; Hatzikraniotis, Euripides

    2011-01-01

    Background and purpose: The aim of this study was to explore the effect of investigative activities with manipulations in a virtual laboratory on students' ability to design experiments. Sample: Fourteen students in a lower secondary school in Greece attended a teaching sequence on thermal phenomena based on the use of information and…

  5. Students' Sense of Community Based on Experiences with Residence Hall Design

    ERIC Educational Resources Information Center

    Heasley, Christopher L.

    2013-01-01

    This study seeks to determine students' sense of community outcomes based on experiences with different residence hall architectural designs. Sense of community is a "feeling that members have of belonging, a feeling that members matter to one another and to the group, and a shared faith that members' needs will be met through their…

  6. Enhancing Research and Practice in Early Childhood through Formative and Design Experiments

    ERIC Educational Resources Information Center

    Bradley, Barbara A.; Reinking, David

    2011-01-01

    This article describes formative and design experiments and how they can advance research and instructional practices in early childhood education. We argue that this relatively new approach to education research closes the gap between research and practice, and it addresses limitations that have been identified in early childhood research. We…

  7. Design of the Advanced Gas Reactor Fuel Experiments for Irradiation in the Advanced Test Reactor

    SciTech Connect

    S. Blaine Grover

    2005-10-01

    The United States Department of Energy’s Advanced Gas Reactor (AGR) Fuel Development and Qualification Program will be irradiating eight particle fuel tests in the Advanced Test Reactor (ATR) located at the newly formed Idaho National Laboratory (INL) to support development of the next generation Very High Temperature Reactor (VHTR) in the United States. The ATR has a long history of irradiation testing in support of reactor development and the INL has been designated as the new United States Department of Energy’s lead laboratory for nuclear energy development. These AGR fuel experiments will be irradiated over the next ten years to demonstrate and qualify new particle fuel for use in high temperature gas reactors. The experiments will be irradiated in an inert sweep gas atmosphere with on-line temperature monitoring and control combined with on-line fission product monitoring of the sweep gas. The final design phase has just been completed on the first experiment (AGR-1) in this series and the support systems and fission product monitoring system that will monitor and control the experiment during irradiation. This paper discusses the development of the experimental hardware and support system designs and the status of the experiment.

  8. Lattice design of the integrable optics test accelerator and optical stochastic cooling experiment at Fermilab

    NASA Astrophysics Data System (ADS)

    Kafka, Gene

    The Integrable Optics Test Accelerator (IOTA) storage ring at Fermilab will serve as the backbone for a broad spectrum of Advanced Accelerator R&D (AARD) experiments, and as such, must be designed with significant flexibility in mind, but without compromising cost efficiency. The nonlinear experiments at IOTA will include: achievement of a large nonlinear tune shift/spread without degradation of dynamic aperture; suppression of strong lattice resonances; study of stability of nonlinear systems to perturbations; and studies of different variants of nonlinear magnet design. The ring optics control has challenging requirements that reach or exceed the present state of the art. The development of a complete self-consistent design of the IOTA ring optics, meeting the demands of all planned AARD experiments, is presented. Of particular interest are the precise control for nonlinear integrable optics experiments and the transverse-to-longitudinal coupling and phase stability for the Optical Stochastic Cooling Experiment (OSC). Since the beam time-of-flight must be tightly controlled in the OSC section, studies of second order corrections in this section are presented.

  9. Lattice design of the integrable optics test accelerator and optical stochastic cooling experiment at Fermilab

    SciTech Connect

    Kafka, Gene

    2015-05-01

    The Integrable Optics Test Accelerator (IOTA) storage ring at Fermilab will serve as the backbone for a broad spectrum of Advanced Accelerator R&D (AARD) experiments, and as such, must be designed with significant flexibility in mind, but without compromising cost efficiency. The nonlinear experiments at IOTA will include: achievement of a large nonlinear tune shift/spread without degradation of dynamic aperture; suppression of strong lattice resonances; study of stability of nonlinear systems to perturbations; and studies of different variants of nonlinear magnet design. The ring optics control has challenging requirements that reach or exceed the present state of the art. The development of a complete self-consistent design of the IOTA ring optics, meeting the demands of all planned AARD experiments, is presented. Of particular interest are the precise control for nonlinear integrable optics experiments and the transverse-to-longitudinal coupling and phase stability for the Optical Stochastic Cooling Experiment (OSC). Since the beam time-of-flight must be tightly controlled in the OSC section, studies of second order corrections in this section are presented.

  10. Recommended practices in elevated temperature design: A compendium of breeder reactor experiences (1970-1986): An overview

    SciTech Connect

    Wei, B.C.; Cooper, W.L. Jr.; Dhalla, A.K.

    1987-09-01

    Significant experience has been accumulated in the establishment of design methods and criteria applicable to the design of Liquid Metal Fast Breeder Reactor (LMFBR) components. The Subcommittee on Elevated Temperature Design under the Pressure Vessel Research Council (PVRC) has undertaken to collect, on an international basis, the design experience gained and the lessons learned, to provide guidelines for next-generation advanced reactor designs. This paper presents an overview and describes the highlights of the work.

  11. Design Optimization of PZT-Based Piezoelectric Cantilever Beam by Using Computational Experiments

    NASA Astrophysics Data System (ADS)

    Kim, Jihoon; Park, Sanghyun; Lim, Woochul; Jang, Junyong; Lee, Tae Hee; Hong, Seong Kwang; Song, Yewon; Sung, Tae Hyun

    2016-08-01

    Piezoelectric energy harvesting is gaining huge research interest since it provides high power density and has real-life applicability. However, investigative research for the mechanical-electrical coupling phenomenon remains challenging. Many researchers depend on physical experiments to choose devices with the best performance which meet design objectives through case analysis; this involves high design costs. This study aims to develop a practical model using computer simulations and to propose an optimized design for a lead zirconate titanate (PZT)-based piezoelectric cantilever beam which is widely used in energy harvesting. In this study, the commercial finite element (FE) software is used to predict the voltage generated from vibrations of the PZT-based piezoelectric cantilever beam. Because the initial FE model differs from physical experiments, the model is calibrated by multi-objective optimization to increase the accuracy of the predictions. We collect data from physical experiments using the cantilever beam and use these experimental results in the calibration process. Since dynamic analysis in the FE analysis of the piezoelectric cantilever beam with a dense step size is considerably time-consuming, a surrogate model is employed for efficient optimization. Through the design optimization of the PZT-based piezoelectric cantilever beam, a high-performance piezoelectric device was developed. The sensitivity of the variables at the optimum design is analyzed to suggest a further improved device.

  12. Edesign: Primer and Enhanced Internal Probe Design Tool for Quantitative PCR Experiments and Genotyping Assays

    PubMed Central

    Kasahara, Naoko; Delobel, Diane; Hanami, Takeshi; Tanaka, Yuki; de Hoon, Michiel J. L.; Hayashizaki, Yoshihide; Usui, Kengo; Harbers, Matthias

    2016-01-01

    Analytical PCR experiments preferably use internal probes for monitoring the amplification reaction and specific detection of the amplicon. Such internal probes have to be designed in close context with the amplification primers, and may require additional considerations for the detection of genetic variations. Here we describe Edesign, a new online and stand-alone tool for designing sets of PCR primers together with an internal probe for conducting quantitative real-time PCR (qPCR) and genotypic experiments. Edesign can be used for selecting standard DNA oligonucleotides like for instance TaqMan probes, but has been further extended with new functions and enhanced design features for Eprobes. Eprobes, with their single thiazole orange-labelled nucleotide, allow for highly sensitive genotypic assays because of their higher DNA binding affinity as compared to standard DNA oligonucleotides. Using new thermodynamic parameters, Edesign considers unique features of Eprobes during primer and probe design for establishing qPCR experiments and genotyping by melting curve analysis. Additional functions in Edesign allow probe design for effective discrimination between wild-type sequences and genetic variations either using standard DNA oligonucleotides or Eprobes. Edesign can be freely accessed online at http://www.dnaform.com/edesign2/, and the source code is available for download. PMID:26863543

  13. Development of a Plastic Melt Waste Compactor for Space Missions: Experiments and Prototype Design

    NASA Technical Reports Server (NTRS)

    Pace, Gregory; Wignarajah, Kanapathipillai; Pisharody, Suresh; Fisher, John

    2004-01-01

    This paper describes development at NASA Ames Research Center of a heat melt compactor that can be used on both near-term and far-term missions. Experiments have been performed to characterize the behavior of composite wastes that are representative of the types of wastes produced on current and previous space missions such as the International Space Station, Space Shuttle, MIR, and Skylab. Experiments were conducted to characterize the volume reduction, bonding, encapsulation, and biological stability of the waste composite and also to investigate other key design issues such as plastic extrusion, noxious off-gassing, and removal of the plastic waste product from the processor. The experiments provided the data needed to design a prototype plastic melt waste processor, a description of which is included in the paper.

  14. Design and development status of ETS-7, an RVD and space robot experiment satellite

    NASA Technical Reports Server (NTRS)

    Oda, M.; Inagaki, T.; Nishida, M.; Kibe, K.; Yamagata, F.

    1994-01-01

    ETS-7 (Engineering Test Satellite #7) is an experimental satellite for the in-orbit experiment of the Rendezvous Docking (RVD) and the space robot (RBT) technologies. ETS-7 is a set of two satellites, a chaser satellite and a target satellite. Both satellites will be launched together by NASDA's H-2 rocket into a low earth orbit. Development of ETS-7 started in 1990. Basic design and EM (Engineering Model) development are in progress now in 1994. The satellite will be launched in mid 1997 and the above in-orbit experiments will be conducted for 1.5 years. Design of ETS-7 RBT experiment system and development status are described in this paper.

  15. Comparison of Resource Requirements for a Wind Tunnel Test Designed with Conventional vs. Modern Design of Experiments Methods

    NASA Technical Reports Server (NTRS)

    DeLoach, Richard; Micol, John R.

    2011-01-01

    The factors that determine data volume requirements in a typical wind tunnel test are identified. It is suggested that productivity in wind tunnel testing can be enhanced by managing the inference error risk associated with evaluating residuals in a response surface modeling experiment. The relationship between minimum data volume requirements and the factors upon which they depend is described and certain simplifications to this relationship are realized when specific model adequacy criteria are adopted. The question of response model residual evaluation is treated and certain practical aspects of response surface modeling are considered, including inference subspace truncation. A wind tunnel test plan developed by using the Modern Design of Experiments illustrates the advantages of an early estimate of data volume requirements. Comparisons are made with a representative One Factor At a Time (OFAT) wind tunnel test matrix developed to evaluate a surface to air missile.
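
    The sketch below illustrates the response-surface-modeling step behind the data-volume argument above: a low-order polynomial is fitted to invented wind tunnel responses and its residuals are examined against the number of model terms; nothing here comes from the actual test described in the record.

      # Sketch of the response-surface idea behind MDOE test planning: fit a low-order
      # polynomial in the factors and judge adequacy from the residuals.
      # Data and model form are invented for illustration only.
      import numpy as np

      rng = np.random.default_rng(2)
      alpha = rng.uniform(-10, 10, 40)     # angle of attack, deg (hypothetical runs)
      mach = rng.uniform(0.3, 0.9, 40)

      # fabricated "measured" lift coefficient with noise
      cl = 0.1 * alpha - 0.002 * alpha**2 + 0.3 * mach + rng.normal(0, 0.02, alpha.size)

      # quadratic response surface in (alpha, mach)
      X = np.column_stack([np.ones_like(alpha), alpha, mach, alpha**2, mach**2, alpha * mach])
      coef, *_ = np.linalg.lstsq(X, cl, rcond=None)
      residuals = cl - X @ coef

      print(f"{X.shape[0]} runs used to estimate {X.shape[1]} model terms")
      print("fitted coefficients:", np.round(coef, 4))
      print("residual standard deviation (judge against measurement uncertainty):", residuals.std())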

  16. Flight path design issues for the TOPEX mission [Ocean Topography Experiment]

    NASA Technical Reports Server (NTRS)

    Frautnick, J. C.; Cutting, E.

    1983-01-01

    The proposed Ocean Topography Experiment (TOPEX) is an Earth satellite mission currently under consideration by NASA. The primary purpose of the experiment is to determine the general circulation of the oceans and its variability. High-precision, space-based altimeter measurements will be combined with surface measurements and ocean models to accomplish the mission objectives. The paper discusses mission requirements on orbit design, the orbit selection space, derived requirements on navigation, and satellite design issues which impact orbit selection. Unique aspects of the TOPEX orbit design are highlighted, such as high-precision repeating orbits, 'frozen orbit' values of eccentricity and periapsis, precise maneuver and orbit determination requirements, and ensuring crossing arcs over a calibration site.

  17. Spaceflight Holography Investigation in a Virtual Apparatus (SHIVA): Ground Experiments and Concepts for Flight Design

    NASA Technical Reports Server (NTRS)

    Miernik, Janie H.; Trolinger, James D.; Lackey, Jeffrey D.; Milton, Martha E.; Waggoner, Jason; Pope, Regina D.

    2002-01-01

    This paper discusses the development and design of an experimental test cell for ground-based testing to provide requirements for the Spaceflight Holography Investigation in a Virtual Apparatus (SHIVA) experiment. Ground-based testing of a hardware breadboard set-up is being conducted at Marshall Space Flight Center in Huntsville, Alabama. SHIVA objectives are to test and validate new solutions of the general equation of motion of a particle in a fluid, including particle-particle interaction, wall effects, motion at higher Reynolds number, and the motion and dissolution of a crystal moving in a fluid. These objectives will be achieved by recording a large number of holograms of particle motion in the International Space Station (ISS) glove box under controlled conditions, extracting the precise three-dimensional position of all the particles as a function of time, and examining the effects of all parameters on the motion of the particles. This paper describes the mechanistic approach to enabling the SHIVA experiment to be performed in an ISS glove box in microgravity. Because the particles are very small, surface tension becomes a major consideration in designing the mechanical method to meet the experiment's objectives in microgravity. Keeping a particle or particles in the center of the test cell long enough to perform and record the experiment, without contributing to particle motion, requires avoiding any initial velocity during particle placement. A Particle Injection Mechanism (PIM) designed for microgravity has been devised and tested to enable SHIVA imaging. A test cell capture mechanism, which secures the test cell during vibration on a specially designed shaker table for the SHIVA experiment, is also described. Concepts for flight design are also presented.

  18. Design and test of a mechanically pumped two-phase thermal control flight experiment

    NASA Technical Reports Server (NTRS)

    Grote, M. G.; Stark, J. A.; Butler, C. D.; Mcintosh, R.

    1987-01-01

    A flight experiment of a mechanically pumped two-phase ammonia thermal control system, incorporating a number of new component designs, has been assembled and tested in a 1-g environment. Additional microgravity tests are planned on the Space Shuttle when Shuttle flights are resumed. The primary purpose of this experiment is to evaluate the operation of a mechanically pumped two-phase ammonia system, with emphasis on determining the performance of an evaporative Two-Phase Mounting Plate. The experiment also evaluates the performance of other specially designed components, such as the two-phase reservoir for temperature control, condensing radiator/heat sink, spiral tube boiler, and pressure drop experiment. The 1-g tests have shown that start-up of the two-phase experiment is easily accomplished with only a partial fill of ammonia. The experiment maintained a constant mounting plate temperature without flow rate controls over a very wide range of heat loads, flow rates, inlet flow conditions and exit qualities. The tests also showed the successful operation of the mounting plate in the heat sharing condensing mode.

  19. Iterative experiment design guides the characterization of a light-inducible gene expression circuit

    PubMed Central

    Ruess, Jakob; Parise, Francesca; Milias-Argeitis, Andreas; Khammash, Mustafa; Lygeros, John

    2015-01-01

    Systems biology rests on the idea that biological complexity can be better unraveled through the interplay of modeling and experimentation. However, the success of this approach depends critically on the informativeness of the chosen experiments, which is usually unknown a priori. Here, we propose a systematic scheme based on iterations of optimal experiment design, flow cytometry experiments, and Bayesian parameter inference to guide the discovery process in the case of stochastic biochemical reaction networks. To illustrate the benefit of our methodology, we apply it to the characterization of an engineered light-inducible gene expression circuit in yeast and compare the performance of the resulting model with models identified from nonoptimal experiments. In particular, we compare the parameter posterior distributions and the precision to which the outcome of future experiments can be predicted. Moreover, we illustrate how the identified stochastic model can be used to determine light induction patterns that make either the average amount of protein or the variability in a population of cells follow a desired profile. Our results show that optimal experiment design allows one to derive models that are accurate enough to precisely predict and regulate the protein expression in heterogeneous cell populations over extended periods of time. PMID:26085136

  20. Iterative experiment design guides the characterization of a light-inducible gene expression circuit.

    PubMed

    Ruess, Jakob; Parise, Francesca; Milias-Argeitis, Andreas; Khammash, Mustafa; Lygeros, John

    2015-06-30

    Systems biology rests on the idea that biological complexity can be better unraveled through the interplay of modeling and experimentation. However, the success of this approach depends critically on the informativeness of the chosen experiments, which is usually unknown a priori. Here, we propose a systematic scheme based on iterations of optimal experiment design, flow cytometry experiments, and Bayesian parameter inference to guide the discovery process in the case of stochastic biochemical reaction networks. To illustrate the benefit of our methodology, we apply it to the characterization of an engineered light-inducible gene expression circuit in yeast and compare the performance of the resulting model with models identified from nonoptimal experiments. In particular, we compare the parameter posterior distributions and the precision to which the outcome of future experiments can be predicted. Moreover, we illustrate how the identified stochastic model can be used to determine light induction patterns that make either the average amount of protein or the variability in a population of cells follow a desired profile. Our results show that optimal experiment design allows one to derive models that are accurate enough to precisely predict and regulate the protein expression in heterogeneous cell populations over extended periods of time.
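
    The following sketch shows only the shape of the iterate-design/measure/infer loop described above, using a one-parameter toy model, a grid posterior, and a simple predictive-variance design criterion; it is not the authors' stochastic light-induction circuit model or their optimality criterion.

      # Schematic of the design/measure/infer iteration described in the abstract.
      # The one-parameter model, the design criterion, and the simulated "measurements"
      # are all stand-ins, not the light-inducible circuit model from the paper.
      import numpy as np

      rng = np.random.default_rng(3)
      theta_grid = np.linspace(0.1, 5.0, 200)                   # candidate parameter values
      posterior = np.ones_like(theta_grid) / theta_grid.size    # flat prior

      def model(u, theta):
          return theta * u / (1.0 + u)                          # toy dose-response

      true_theta, noise_sd = 2.0, 0.05
      candidate_inputs = np.linspace(0.1, 10.0, 50)             # possible light-induction levels

      for round_ in range(5):
          # pick the input whose predicted outcome varies most under the current posterior
          pred = model(candidate_inputs[:, None], theta_grid[None, :])
          mean_pred = np.sum(posterior * pred, axis=1, keepdims=True)
          pred_var = np.sum(posterior * (pred - mean_pred) ** 2, axis=1)
          u_next = candidate_inputs[np.argmax(pred_var)]

          # run the "experiment" and update the posterior with a Gaussian likelihood
          y = model(u_next, true_theta) + rng.normal(0, noise_sd)
          like = np.exp(-0.5 * ((y - model(u_next, theta_grid)) / noise_sd) ** 2)
          posterior = posterior * like
          posterior /= posterior.sum()
          print(f"round {round_}: input {u_next:.2f}, posterior mean {np.sum(posterior * theta_grid):.3f}")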

  1. Designing Microarray and RNA-seq Experiments for Greater Systems Biology Discovery in Modern Plant Genomics.

    PubMed

    Yang, Chuanping; Wei, Hairong

    2014-11-01

    Microarray and RNA-seq experiments have become an important part of modern genomics and systems biology. Obtaining meaningful biological data from these experiments is an arduous task that demands close attention to many details. Negligence at any step can lead to gene expression data containing inadequate or composite information that is recalcitrant for pattern extraction. Therefore, it is imperative to carefully consider experimental design before launching a time-consuming and costly experiment. Contemporarily, most genomics experiments have two objectives: (1) to generate two or more groups of comparable data for identifying differentially expressed genes, gene families, biological processes, or metabolic pathways under experimental conditions; (2) to build local gene regulatory networks and identify hierarchically important regulators governing biological processes and pathways of interest. Since the first objective aims to identify the active molecular identities and the second provides a basis for understanding the underlying molecular mechanisms through inferring causality relationships mediated by treatment, an optimal experiment is to produce biologically relevant and extractable data to meet both objectives without substantially increasing the cost. This review discusses the major issues that researchers commonly face when embarking on a microarray or RNA-seq experiment and summarizes important aspects of experimental design, which aim to help researchers deliberate how to generate gene expression profiles with low background noise but more interaction to facilitate novel biological knowledge discoveries in modern plant genomics.

  2. Designing microarray and RNA-Seq experiments for greater systems biology discovery in modern plant genomics.

    PubMed

    Yang, Chuanping; Wei, Hairong

    2015-02-01

    Microarray and RNA-seq experiments have become an important part of modern genomics and systems biology. Obtaining meaningful biological data from these experiments is an arduous task that demands close attention to many details. Negligence at any step can lead to gene expression data containing inadequate or composite information that is recalcitrant for pattern extraction. Therefore, it is imperative to carefully consider experimental design before launching a time-consuming and costly experiment. Contemporarily, most genomics experiments have two objectives: (1) to generate two or more groups of comparable data for identifying differentially expressed genes, gene families, biological processes, or metabolic pathways under experimental conditions; (2) to build local gene regulatory networks and identify hierarchically important regulators governing biological processes and pathways of interest. Since the first objective aims to identify the active molecular identities and the second provides a basis for understanding the underlying molecular mechanisms through inferring causality relationships mediated by treatment, an optimal experiment is to produce biologically relevant and extractable data to meet both objectives without substantially increasing the cost. This review discusses the major issues that researchers commonly face when embarking on microarray or RNA-seq experiments and summarizes important aspects of experimental design, which aim to help researchers deliberate how to generate gene expression profiles with low background noise but with more interaction to facilitate novel biological discoveries in modern plant genomics.

  3. Design and Development of a CPCI-Based Electronics Package for Space Station Experiments

    NASA Technical Reports Server (NTRS)

    Kolacz, John S.; Clapper, Randy S.; Wade, Raymond P.

    2006-01-01

    The NASA John H. Glenn Research Center is developing a Compact-PCI (CPCI) based electronics package for controlling space experiment hardware on the International Space Station. Goals of this effort include an easily modified, modular design that allows for changes in experiment requirements. Unique aspects of the experiment package include a flexible circuit used for internal interconnections and a separate enclosure (box in a box) for controlling 1 kW of power for experiment fuel heating requirements. This electronics package was developed as part of the FEANICS (Flow Enclosure Accommodating Novel Investigations in Combustion of Solids) mini-facility which is part of the Fluids and Combustion Facility's Combustion Integrated Rack (CIR). The CIR will be the platform for future microgravity combustion experiments and will reside on the Destiny Module of the International Space Station (ISS). The FEANICS mini-facility will be the primary means for conducting solid fuel combustion experiments in the CIR on ISS. The main focus of many of these solid combustion experiments will be to conduct applied scientific investigations in fire-safety to support NASA's future space missions. A description of the electronics package and the results of functional testing are the subjects of this report. The report concludes that the use of innovative packaging methods combined with readily available COTS hardware can provide a modular electronics package which is easily modified for changing experiment requirements.

  4. Application of Design of Experiments and Surrogate Modeling within the NASA Advanced Concepts Office, Earth-to-Orbit Design Process

    NASA Technical Reports Server (NTRS)

    Zwack, Mathew R.; Dees, Patrick D.; Holt, James B.

    2016-01-01

    Decisions made during early conceptual design have a large impact upon the expected life-cycle cost (LCC) of a new program. It is widely accepted that up to 80% of such cost is committed during these early design phases [1]. Therefore, to help minimize LCC, decisions made during conceptual design must be based upon as much information as possible. To aid in the decision making for new launch vehicle programs, the Advanced Concepts Office (ACO) at NASA Marshall Space Flight Center (MSFC) provides rapid turnaround pre-phase A and phase A concept definition studies. The ACO team utilizes a proven set of tools to provide customers with a full vehicle mass breakdown to tertiary subsystems, preliminary structural sizing based upon worst-case flight loads, and trajectory optimization to quantify integrated vehicle performance for a given mission [2]. Although the team provides rapid turnaround for single vehicle concepts, the scope of the trade space can be limited due to analyst availability and the manpower requirements for manual execution of the analysis tools. In order to enable exploration of a broader design space, the ACO team has implemented an advanced design methods (ADM) based approach. This approach applies the concepts of design of experiments (DOE) and surrogate modeling to more exhaustively explore the trade space and provide the customer with additional design information to inform decision making. This paper will first discuss the automation of the ACO tool set, which represents a majority of the development effort. In order to fit a surrogate model within tolerable error bounds, a number of DOE cases are needed. This number will scale with the number of variable parameters desired and the complexity of the system's response to those variables. For all but the smallest design spaces, the number of cases required cannot be produced within an acceptable timeframe using a manual process. Therefore, automation of the tools was a key enabler for the successful application of this approach.
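
    A minimal sketch of the DOE-plus-surrogate workflow referred to above is given below: a Latin hypercube sample of a two-variable trade space is evaluated with a stand-in "analysis" function and a quadratic surrogate is fitted by least squares. The variable names and the analysis function are hypothetical and do not represent the ACO tool set.

      # Sketch of the DOE + surrogate workflow: sample the trade space, run a
      # (fabricated) analysis at each point, and fit a cheap surrogate that can then
      # be queried exhaustively.
      import numpy as np

      rng = np.random.default_rng(4)

      def latin_hypercube(n_samples, n_vars):
          """One stratified sample per interval in each dimension, randomly paired."""
          u = (rng.random((n_samples, n_vars)) + np.arange(n_samples)[:, None]) / n_samples
          for k in range(n_vars):
              u[:, k] = u[rng.permutation(n_samples), k]
          return u

      def expensive_analysis(x):
          """Stand-in for the vehicle sizing/trajectory tools (inputs: propellant
          fraction, thrust-to-weight ratio; output: a pretend gross-mass metric)."""
          pf, twr = x
          return 100 + 50 * pf**2 + 20 / twr + 5 * pf * twr

      X = latin_hypercube(30, 2)
      y = np.array([expensive_analysis(x) for x in X])

      # quadratic surrogate fitted by least squares
      A = np.column_stack([np.ones(len(X)), X, X**2, X[:, :1] * X[:, 1:]])
      coef, *_ = np.linalg.lstsq(A, y, rcond=None)

      x_new = np.array([0.5, 0.7])
      a_new = np.concatenate([[1.0], x_new, x_new**2, [x_new[0] * x_new[1]]])
      print("surrogate prediction:", a_new @ coef, "direct analysis:", expensive_analysis(x_new))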

  5. Combined Cycle Engine Large-Scale Inlet for Mode Transition Experiments: System Identification Rack Hardware Design

    NASA Technical Reports Server (NTRS)

    Thomas, Randy; Stueber, Thomas J.

    2013-01-01

    The System Identification (SysID) Rack is a real-time hardware-in-the-loop data acquisition (DAQ) and control instrument rack that was designed and built to support inlet testing in the NASA Glenn Research Center 10- by 10-Foot Supersonic Wind Tunnel. This instrument rack is used to support experiments on the Combined-Cycle Engine Large-Scale Inlet for Mode Transition Experiment (CCE-LIMX). The CCE-LIMX is a testbed for an integrated dual flow-path inlet configuration with the two flow paths in an over-and-under arrangement such that the high-speed flow path is located below the low-speed flow path. The CCE-LIMX includes multiple actuators that are designed to redirect airflow from one flow path to the other; this action is referred to as "inlet mode transition." Multiple phases of experiments have been planned to support research that investigates inlet mode transition: inlet characterization (Phase-1) and system identification (Phase-2). The SysID Rack hardware design met the following requirements to support Phase-1 and Phase-2 experiments: safely and effectively move multiple actuators individually or synchronously; sample and save effector control and position sensor feedback signals; automate control of actuator positioning based on a mode transition schedule; sample and save pressure sensor signals; and perform DAQ and control processes operating at 2.5 kHz. This document describes the hardware components used to build the SysID Rack including their function, specifications, and system interface. Furthermore, provided in this document are a SysID Rack effectors signal list (signal flow); system identification experiment setup; illustrations indicating a typical SysID Rack experiment; and a SysID Rack performance overview for Phase-1 and Phase-2 experiments. The SysID Rack described in this document was a useful tool to meet the project objectives.

  6. Experiences with an adaptive design for a dose-finding study in patients with osteoarthritis.

    PubMed

    Miller, Frank; Björnsson, Marcus; Svensson, Ola; Karlsten, Rolf

    2014-03-01

    Dose-finding studies in non-oncology areas are usually conducted in Phase II of the development process of a new potential medicine, and it is key to choose a good design for such a study, as the results will decide if and how to proceed to Phase III. The present article focuses on the design of a dose-finding study for pain in osteoarthritis patients treated with the TRPV1 antagonist AZD1386. We describe different design alternatives in the planning of this study, the reasoning for choosing the adaptive design, and experiences with its conduct and interim analysis. Three alternatives were proposed: a single dose-finding study with a parallel design, a programme with a smaller Phase IIa study followed by a Phase IIb dose-finding study, and an adaptive dose-finding study. We describe these alternatives in detail and explain why the adaptive design was chosen for the study. We give insights into design aspects of the adaptive study that need to be pre-planned, such as interim decision criteria, the statistical analysis method, and the setup of a Data Monitoring Committee. Based on the interim analysis it was recommended to stop the study for futility since AZD1386 showed no significant pain decrease based on the primary variable. We discuss results and experiences from the conduct of the study with the novel design approach. Substantial cost savings were achieved compared with the option of a single dose-finding study for Phase II. However, we point out several challenges with this approach.

  7. Investigation of crew motion disturbances on Skylab-Experiment T-013 [for future manned spacecraft design]

    NASA Technical Reports Server (NTRS)

    Conway, B. A.

    1974-01-01

    Astronaut crew motions can produce some of the largest disturbances acting on a manned spacecraft which can affect vehicle attitude and pointing. Skylab Experiment T-013 was developed to investigate the magnitude and effects of some of these disturbances on the Skylab spacecraft. The methods and techniques used to carry out this experiment are discussed, and preliminary results of data analysis are presented. Initial findings indicate that forces on the order of 300 N were exerted during vigorous soaring activities, and that certain experiment activities produced spacecraft angular rate excursions of 0.03 to 0.07 deg/sec. Results of Experiment T-013 will be incorporated into mathematical models of crew-motion disturbances, and are expected to be of significant aid in the sizing, design, and analysis of stabilization and control systems for future manned spacecraft.

  8. The Role of Formal Experiment Design in Hypersonic Flight System Technology Development

    NASA Technical Reports Server (NTRS)

    McClinton, Charles R.; Ferlemann, Shelly M.; Rock, Ken E.; Ferlemann, Paul G.

    2002-01-01

    Hypersonic airbreathing engine (scramjet) powered vehicles are being considered to replace conventional rocket-powered launch systems. Effective utilization of scramjet engines requires careful integration with the air vehicle. This integration synergistically combines aerodynamic forces with propulsive cycle functions of the engine. Due to the highly integrated nature of the hypersonic vehicle design problem, the large flight envelope, and the large number of design variables, the use of a statistical design approach is effective. Modern Design-of-Experiments (MDOE) has been used throughout the Hyper-X program, for both systems analysis and experimental testing. Applications of MDOE fall into four categories: (1) experimental testing; (2) studies of unit phenomena; (3) refining engine design; and (4) full vehicle system optimization. The MDOE process also provides analytical models, which are used to document lessons learned, supplement low-level design tools, and accelerate future studies. This paper discusses the design considerations for scramjet-powered vehicles and the specifics of MDOE utilized for Hyper-X, and presents highlights from the use of these MDOE methods within the Hyper-X Program.

  9. How scientific experiments are designed: Problem solving in a knowledge-rich, error-rich environment

    NASA Astrophysics Data System (ADS)

    Baker, Lisa M.

    While theory formation and the relation between theory and data have been investigated in many studies of scientific reasoning, researchers have focused less attention on reasoning about experimental design, even though the experimental design process makes up a large part of real-world scientists' reasoning. The goal of this thesis was to provide a cognitive account of the scientific experimental design process by analyzing experimental design as problem-solving behavior (Newell & Simon, 1972). Three specific issues were addressed: the effect of potential error on experimental design strategies, the role of prior knowledge in experimental design, and the effect of characteristics of the space of alternate hypotheses on alternate hypothesis testing. A two-pronged in vivo/in vitro research methodology was employed, in which transcripts of real-world scientific laboratory meetings were analyzed as well as undergraduate science and non-science majors' design of biology experiments in the psychology laboratory. It was found that scientists use a specific strategy to deal with the possibility of error in experimental findings: they include "known" control conditions in their experimental designs both to determine whether error is occurring and to identify sources of error. The known controls strategy had not been reported in earlier studies with science-like tasks, in which participants' responses to error had consisted of replicating experiments and discounting results. With respect to prior knowledge, scientists and undergraduate students drew on several types of knowledge when designing experiments, including theoretical knowledge, domain-specific knowledge of experimental techniques, and domain-general knowledge of experimental design strategies. Finally, undergraduate science students generated and tested alternates to their favored hypotheses when the space of alternate hypotheses was constrained and searchable. This result may help explain findings of confirmation bias.

  10. Statistical models for the analysis and design of digital polymerase chain (dPCR) experiments

    USGS Publications Warehouse

    Dorazio, Robert; Hunter, Margaret

    2015-01-01

    Statistical methods for the analysis and design of experiments using digital PCR (dPCR) have received only limited attention and have been misused in many instances. To address this issue and to provide a more general approach to the analysis of dPCR data, we describe a class of statistical models for the analysis and design of experiments that require quantification of nucleic acids. These models are mathematically equivalent to generalized linear models of binomial responses that include a complementary log-log link function and an offset that is dependent on the dPCR partition volume. These models are both versatile and easy to fit using conventional statistical software. Covariates can be used to specify different sources of variation in nucleic acid concentration, and a model's parameters can be used to quantify the effects of these covariates. For purposes of illustration, we analyzed dPCR data from different types of experiments, including serial dilution, evaluation of copy number variation, and quantification of gene expression. We also showed how these models can be used to help design dPCR experiments, as in selection of sample sizes needed to achieve desired levels of precision in estimates of nucleic acid concentration or to detect differences in concentration among treatments with prescribed levels of statistical power.
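
    A minimal numerical sketch of the binomial model with a complementary log-log link described above: each partition is positive with probability 1 - exp(-lambda*v), so cloglog(p) = log(lambda) + log(v) and log(v) plays the role of the offset. The counts and partition volume below are invented for illustration, and the fit is done by direct maximum likelihood rather than with GLM software.

      # Single-sample dPCR concentration estimate under the cloglog/offset model.
      # Counts and partition volume are hypothetical.
      import numpy as np
      from scipy.optimize import minimize_scalar

      v = 0.85e-3            # partition volume in microliters (hypothetical chip)
      positives, total = 4200, 20000

      def neg_log_lik(log_lambda):
          p = 1.0 - np.exp(-np.exp(log_lambda) * v)     # inverse cloglog with offset log(v)
          return -(positives * np.log(p) + (total - positives) * np.log(1.0 - p))

      fit = minimize_scalar(neg_log_lik, bounds=(-2, 10), method="bounded")
      lam = np.exp(fit.x)
      print(f"estimated concentration: {lam:.1f} copies per microliter")

      # closed-form check for the single-dilution case: lambda = -ln(1 - k/n) / v
      print("closed form:", -np.log(1.0 - positives / total) / v)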

  11. Optimization of Pb(II) biosorption by Robinia tree leaves using statistical design of experiments.

    PubMed

    Zolgharnein, Javad; Shahmoradi, Ali; Sangi, Mohammad Reza

    2008-07-30

    The present study introduces Robinia tree leaves as a novel and efficient biosorbent for removing Pb(II) from aqueous solutions. In order to reduce the large number of experiments and find the highest removal efficiency of Pb(II), a full 2(3) factorial design with two blocks was performed in duplicate (16 experiments). In all experiments, the contact time was fixed at 25 min. The main and interaction effects of the three factors, including sorbent mass, pH, and initial concentration of metal ion, were considered. By using Student's t-test and analysis of variance (ANOVA), the main factors, which had the highest effect on the removal process, were identified. Twenty-six experiments were designed according to a Doehlert response surface design to obtain a mathematical model describing the functional relationship between the response and the main independent variables. The most suitable regression model, which fitted the experimental data extremely well, was chosen according to the lack-of-fit test and adjusted R(2) value. Finally, after checking for possible outliers, the optimum conditions for maximum removal of Pb(II) from aqueous solution were obtained. The best conditions were calculated to be: initial concentration of Pb(II)=40 mg L(-1), pH 4.6, and sorbent concentration equal to 27.3 g L(-1).
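
    As a generic illustration of the response-surface step mentioned above (with fabricated data standing in for the Doehlert design, not the study's measurements), the sketch below fits a quadratic model of removal efficiency in pH and sorbent dose and locates its stationary point.

      # Quadratic response-surface fit and stationary-point location on invented data.
      import numpy as np

      rng = np.random.default_rng(5)
      ph = rng.uniform(2.0, 7.0, 26)
      dose = rng.uniform(5.0, 40.0, 26)          # sorbent concentration, g/L

      # hypothetical removal (%) peaking near pH 4.6 and 27 g/L, plus noise
      removal = 90 - 3.0 * (ph - 4.6) ** 2 - 0.05 * (dose - 27.0) ** 2 + rng.normal(0, 1.0, 26)

      X = np.column_stack([np.ones_like(ph), ph, dose, ph**2, dose**2, ph * dose])
      b, *_ = np.linalg.lstsq(X, removal, rcond=None)

      # stationary point of b0 + b1*x + b2*y + b3*x^2 + b4*y^2 + b5*x*y
      H = np.array([[2 * b[3], b[5]], [b[5], 2 * b[4]]])
      g = -np.array([b[1], b[2]])
      opt = np.linalg.solve(H, g)
      print(f"fitted optimum: pH {opt[0]:.2f}, sorbent {opt[1]:.1f} g/L")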

  12. Conceptual design of a Moving Belt Radiator (MBR) shuttle-attached experiment

    NASA Technical Reports Server (NTRS)

    Aguilar, Jerry L.

    1990-01-01

    The conceptual design of a shuttle-attached Moving Belt Radiator (MBR) experiment is presented. The MBR is an advanced radiator concept in which a rotating belt is used to radiate thermal energy to space. The experiment is developed with the primary focus being the verification of the dynamic characteristics of a rotating belt, with a secondary objective of proving the thermal and sealing aspects in a reduced-gravity, vacuum environment. The mechanical design, selection of the belt material and working fluid, a preliminary test plan, and program plan are presented. The strategy used for selecting the basic sizes and materials of the components is discussed. Shuttle and crew member requirements are presented with some options for increasing or decreasing the demands on the STS. An STS carrier and the criteria used in the selection process are presented. The proposed carrier for the Moving Belt Radiator experiment is the Hitchhiker-M. Safety issues are also listed with possible results. This experiment is designed so that a belt can be deployed, run at steady-state conditions, run with dynamic perturbations imposed, verify the operation of the interface heat exchanger and seals, and finally be retracted into a stowed position for transport back to Earth.

  13. Designing experiments on thermal interactions by secondary-school students in a simulated laboratory environment

    NASA Astrophysics Data System (ADS)

    Lefkos, Ioannis; Psillos, Dimitris; Hatzikraniotis, Euripides

    2011-07-01

    Background and purpose: The aim of this study was to explore the effect of investigative activities with manipulations in a virtual laboratory on students' ability to design experiments. Sample: Fourteen students in a lower secondary school in Greece attended a teaching sequence on thermal phenomena based on the use of information and communication technology, and specifically of the simulated virtual laboratory 'ThermoLab'. Design and methods: A pre-post comparison was applied. Students' design of experiments was rated in eight dimensions; namely, hypothesis forming and verification, selection of variables, initial conditions, device settings, materials and devices used, process and phenomena description. A three-level ranking scheme was employed for the evaluation of students' answers in each dimension. Results: A Wilcoxon signed-rank test revealed a statistically significant difference between the students' pre- and post-test scores. Additional analysis by comparing the pre- and post-test scores using the Hake gain showed high gains in all but one dimension, which suggests that this improvement was almost inclusive. Conclusions: We consider that our findings support the statement that there was an improvement in students' ability to design experiments.
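
    For readers unfamiliar with the two pre/post comparisons named above, the sketch below computes a Wilcoxon signed-rank test and the class-average Hake (normalized) gain on invented pre- and post-test scores; the study's actual ratings are not reproduced.

      # Paired pre/post comparison: Wilcoxon signed-rank test and Hake gain on invented scores.
      import numpy as np
      from scipy.stats import wilcoxon

      pre = np.array([35, 40, 28, 50, 45, 33, 38, 42, 30, 36, 48, 41, 37, 44], dtype=float)
      post = np.array([55, 61, 50, 73, 69, 58, 64, 69, 58, 65, 78, 72, 69, 77], dtype=float)

      stat, p = wilcoxon(pre, post)                 # paired, non-parametric pre/post test
      hake_gain = (post.mean() - pre.mean()) / (100.0 - pre.mean())

      print(f"Wilcoxon statistic {stat:.1f}, p = {p:.4f}")
      print(f"class-average Hake gain: {hake_gain:.2f}")   # gains above about 0.7 are usually called 'high'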

  14. The design and development of a release mechanism for space shuttle life-science experiments

    NASA Technical Reports Server (NTRS)

    Jones, H. M.; Daniell, R. G.

    1984-01-01

    The design, development, and testing of a release mechanism for use in two life science experiments on the Spacelab 1, 4, and D1 missions is described. The mechanism is a self-latching ball-lock device actuated by a linear solenoid. An unusual feature is the tapering of the ball-lock plunger to give it a near-constant breakout force for release under a wide range of loads. The selection of the design, based on the design requirements, is discussed. A number of problems occurred during development and test, including problems caused by human factors that became apparent after initial delivery for crew training sessions. These problems and their solutions are described to assist in the design and testing of similar mechanisms.

  15. Optical design of the Diffuse Infrared Background Experiment for NASA's Cosmic Background Explorer

    NASA Technical Reports Server (NTRS)

    Miller, M. S.; Evans, D. C.; Moseley, H.; Ludwig, U. W.

    1982-01-01

    The conceptual design for a ten-band absolute filter photometer (the Diffuse Infrared Background Experiment) to operate at 2 K and measure galactic and extragalactic infrared radiation in the 1 to 300-micron range and polarization in the 1 to 3.5-micron range is presented, as part of the NASA Cosmic Background Explorer. The telescope optical design, a Gregorian design incorporating baffles and shades to provide high stray-light rejection, is described. Pupil nonuniformity in the detector-assembly optical design has been limited. It is determined that detector sensitivity requirements can be met, and that the problem of radiation-induced responsivity variations can be solved by minimizing detector-assembly size, providing for in situ thermal annealing, and allowing for frequent detector calibration. Limitations on mirror performance are to be met by fabricating mirrors and structure from the same aluminum 6061 ingot.

  16. Robust optimal design of experiments for model discrimination using an interactive software tool.

    PubMed

    Stegmaier, Johannes; Skanda, Dominik; Lebiedz, Dirk

    2013-01-01

    Mathematical modeling of biochemical processes contributes significantly to a better understanding of biological functionality and underlying dynamic mechanisms. To support time-consuming and costly lab experiments, kinetic reaction equations can be formulated as a set of ordinary differential equations, which in turn allows hypothetical models to be simulated and compared in silico. To identify new experimental designs that are able to discriminate between the investigated models, the approach used in this work solves a semi-infinite constrained nonlinear optimization problem using derivative-based numerical algorithms. The method takes parameter variability into account so that new experimental designs are robust against parameter changes while maintaining the optimal potential to discriminate between hypothetical models. In this contribution we present a newly developed software tool that offers a convenient graphical user interface for model discrimination. We demonstrate the beneficial operation of the discrimination approach and the usefulness of the software tool by analyzing a realistic benchmark experiment from the literature. New robust optimal designs that allow discrimination between the investigated model hypotheses of the benchmark experiment are successfully calculated and yield promising results. The robustification approach provides maximally discriminating experiments for the worst parameter configurations, which can be used to estimate the meaningfulness of upcoming experiments. A major benefit of the graphical user interface is the ability to interactively investigate the model behavior and the clear arrangement of numerous variables. In addition to a brief theoretical overview of the discrimination method and the functionality of the software tool, the importance of robustness of experimental designs against parameter variability is demonstrated on a biochemical benchmark problem. The software is licensed under the GNU General Public License.
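
    To make the idea of design for model discrimination concrete, the sketch below picks, from a handful of candidate initial conditions, the one that maximally separates the predictions of two hypothetical kinetic models. The models, parameter values and candidate designs are invented for illustration, and the robustification over parameter uncertainty that is central to the paper is deliberately omitted.

```python
# Pick the candidate initial condition that best separates two rival kinetic models.
import numpy as np
from scipy.integrate import solve_ivp

def michaelis_menten(t, s, vmax=1.0, km=0.5):
    return [-vmax * s[0] / (km + s[0])]

def first_order(t, s, k=1.2):
    return [-k * s[0]]

t_eval = np.linspace(0.0, 5.0, 50)
candidate_s0 = [0.1, 0.5, 1.0, 2.0, 5.0]   # candidate experimental designs

def divergence(s0):
    y1 = solve_ivp(michaelis_menten, (0.0, 5.0), [s0], t_eval=t_eval).y[0]
    y2 = solve_ivp(first_order, (0.0, 5.0), [s0], t_eval=t_eval).y[0]
    return np.sum((y1 - y2) ** 2)          # squared separation of the predictions

best = max(candidate_s0, key=divergence)
print(f"most discriminating initial substrate concentration: {best}")
```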

  17. Proposed ATLAS liner design fabricated for hydrodynamics experiments on Shiva Star

    SciTech Connect

    Anderson, W. E.; Adams, C. D.; Armijo, E. V.; Bartos, J. J.; Cameron, B. J.; Garcia, F.; Henneke, B.; Randolph, B.; Salazar, M. A.; Steckle, W. P. , Jr.; Turchi, Peter J.; Gale, D.

    2001-01-01

    An entirely new cylindrical liner system has been designed and fabricated for use on the Shiva Star capacitor bank. The design incorporates features expected to be applicable to a future power flow channel of the Atlas capacitor bank, with the intention of keeping any required liner design modifications to a minimum when the power flow channel at Atlas becomes available. Four shots were successfully conducted at Shiva Star that continued a series of hydrodynamics physics experiments started on the Los Alamos Pegasus capacitor bank. Departures from the diagnostic suite that had previously been used at Pegasus required new techniques in the fabrication of the experiment insert package. We describe new fabrication procedures that were developed by the Polymers and Coatings Group (MST-7) of the Los Alamos Materials Science Division to fabricate the Shiva Star experiment loads. Continuing MST-7 development of interference-fit processes for liner experiment applications, current joints at the glide planes were assembled by thermal shrink fit using liquid nitrogen as a coolant. The liner material was low-strength, high-conductance 1100 series aluminum. The liner glide plane electrodes were machined from full-hard copper rod with a 10° ramp to maintain liner-to-glide-plane contact as the liner was imploded. The parts were fabricated with a 0.015 mm radial interference fit between the liner inside diameter (ID) and the glide plane outside diameter (OD) to form the static liner current joints. The liner was assembled with some axial clearance at each end to allow slippage if any axial force was generated as the liner assembly cassette was bolted into Shiva Star, a precaution to guard against buckling the liner during installation of the load cassette. Other unique or unusual processes were developed and are described. Minor adaptations of the liner design are now being fabricated for the first Atlas experiments.

  18. Design of a factorial experiment with randomization restrictions to assess medical device performance on vascular tissue

    PubMed Central

    2011-01-01

    Background: Energy-based surgical scalpels are designed to efficiently transect and seal blood vessels using thermal energy to promote protein denaturation and coagulation. Assessment and design improvement of ultrasonic scalpel performance relies on both in vivo and ex vivo testing. The objective of this work was to design and implement a robust experimental test matrix with randomization restrictions and predictive statistical power, which allowed for identification of those experimental variables that may affect the quality of the seal obtained ex vivo. Methods: The design of the experiment included three factors: temperature (two levels); the type of solution used to perfuse the artery during transection (three types); and artery type (two types), resulting in a total of twelve possible treatment combinations. Burst pressures of porcine carotid and renal arteries sealed ex vivo were assigned as the response variable. Results: The experimental test matrix was designed and carried out as a split-plot experiment in order to assess the contributions of several variables and their interactions while accounting for randomization restrictions present in the experimental setup. The statistical software package SAS was utilized, and PROC MIXED was used to account for the randomization restrictions in the split-plot design. The combination of temperature, solution, and vessel type had a statistically significant impact on seal quality. Conclusions: The design and implementation of a split-plot experimental test matrix provided a mechanism for addressing the existing technical randomization restrictions of ex vivo ultrasonic scalpel performance testing, while preserving the ability to examine the potential effects of independent factors or variables. This method for generating the experimental design and the statistical analyses of the resulting data are adaptable to a wide variety of experimental problems involving large-scale tissue-based studies of medical or experimental
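
    The analysis described above was performed in SAS with PROC MIXED; a rough Python analogue is sketched below, in which the split-plot randomization restriction is represented by a random whole-plot effect in a linear mixed model. The data frame, factor levels and effect sizes are synthetic placeholders, not the study's measurements.

```python
# Linear mixed model with a random whole-plot effect (split-plot structure),
# fitted with statsmodels; data are synthetic placeholders.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(42)
rows = []
for plot in range(8):                                # whole plots = randomization units
    temperature = "high" if plot % 2 else "low"      # hard-to-change factor
    plot_effect = rng.normal(0.0, 15.0)              # random whole-plot deviation
    for solution in ["saline", "albumin", "none"]:
        for vessel in ["carotid", "renal"]:
            burst = (300.0 + 40.0 * (temperature == "high")
                     - 25.0 * (vessel == "renal") + 15.0 * (solution == "albumin")
                     + plot_effect + rng.normal(0.0, 10.0))
            rows.append((plot, temperature, solution, vessel, burst))

df = pd.DataFrame(rows, columns=["whole_plot", "temperature",
                                 "solution", "vessel", "burst_pressure"])

model = smf.mixedlm("burst_pressure ~ temperature * solution * vessel",
                    data=df, groups=df["whole_plot"])
print(model.fit().summary())
```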

  19. Motivation for proposed experimentation in the realm of accelerated E. M. systems: A preliminary design for an experiment

    NASA Technical Reports Server (NTRS)

    Post, E. J.

    1970-01-01

    An experiment, designed to determine the difference between the fields (magnetic and electric) surrounding a uniformly moving charge as contrasted with the fields surrounding an accelerated charge, is presented. A thought experiment is presented to illustrate the process.

  20. Lost in space: design of experiments and scientific exploration in a Hogarth Universe.

    PubMed

    Lendrem, Dennis W; Lendrem, B Clare; Woods, David; Rowland-Jones, Ruth; Burke, Matthew; Chatfield, Marion; Isaacs, John D; Owen, Martin R

    2015-11-01

    A Hogarth, or 'wicked', universe is an irregular environment generating data to support erroneous beliefs. Here, we argue that development scientists often work in such a universe. We demonstrate that exploring these multidimensional spaces using small experiments guided by scientific intuition alone, gives rise to an illusion of validity and a misplaced confidence in that scientific intuition. By contrast, design of experiments (DOE) permits the efficient mapping of such complex, multidimensional spaces. We describe simulation tools that enable research scientists to explore these spaces in relative safety.

  1. Design of experiments for measuring heat-transfer coefficients with a lumped-parameter calorimeter

    NASA Technical Reports Server (NTRS)

    Vanfossen, G. J., Jr.

    1975-01-01

    A theoretical investigation was conducted to determine optimum experimental conditions for using a lumped-parameter calorimeter to measure heat-transfer coefficients and heating rates. A mathematical model of the transient temperature response of the calorimeter was used with the measured temperature response to predict the heat-transfer coefficient and the rate of heating. A sensitivity analysis was used to determine the optimum transient experiment for simultaneously measuring the heat addition during heating and the convective heat-transfer coefficient during heating and cooling of a lumped-parameter calorimeter. Optimum experiments were also designed for measuring the convective heat-transfer coefficient during both heating and cooling, and during cooling only.
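
    The lumped-parameter model behind this kind of experiment treats the calorimeter as a single thermal mass, m c dT/dt = q + h A (T_inf - T), so the convective coefficient can be recovered from the measured transient. The sketch below, with purely illustrative property values and synthetic noisy data, shows the cooling-only case.

```python
# Recover the convective coefficient h from a simulated lumped-capacitance cooling
# transient; all property values and the noise level are illustrative.
import numpy as np

m, c, A = 0.05, 900.0, 0.01          # mass [kg], specific heat [J/(kg K)], area [m^2]
T_inf, h_true = 300.0, 25.0          # ambient temperature [K], "true" h [W/(m^2 K)]

t = np.linspace(0.0, 600.0, 121)     # cooling phase only (q = 0)
tau = m * c / (h_true * A)           # lumped-parameter time constant
T = T_inf + (380.0 - T_inf) * np.exp(-t / tau)                    # analytic response
T_meas = T + np.random.default_rng(1).normal(0.0, 0.2, t.size)    # add sensor noise

# h follows from the slope of ln(T - T_inf) versus time.
slope = np.polyfit(t, np.log(T_meas - T_inf), 1)[0]
h_est = -slope * m * c / A
print(f"recovered h = {h_est:.1f} W/(m^2 K)  (true value {h_true})")
```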

  2. Advanced Test Reactor In-Canal Ultrasonic Scanner: Experiment Design and Initial Results on Irradiated Plates

    SciTech Connect

    D. M. Wachs; J. M. Wight; D. T. Clark; J. M. Williams; S. C. Taylor; D. J. Utterbeck; G. L. Hawkes; G. S. Chang; R. G. Ambrosek; N. C. Craft

    2008-09-01

    An irradiation test device has been developed to support testing of prototypic scale plate type fuels in the Advanced Test Reactor. The experiment hardware and operating conditions were optimized to provide the irradiation conditions necessary to conduct performance and qualification tests on research reactor type fuels for the RERTR program. The device was designed to allow disassembly and reassembly in the ATR spent fuel canal so that interim inspections could be performed on the fuel plates. An ultrasonic scanner was developed to perform dimensional and transmission inspections during these interim investigations. Example results from the AFIP-2 experiment are presented.

  3. Study of low loss Mn Zn ferrite by design of experiment

    NASA Astrophysics Data System (ADS)

    Nien, H. H.; Liang, T. J.; Huang, C. K.; Changchien, S. K.

    2006-09-01

    This paper presents a study of low-loss manganese-zinc (Mn-Zn) ferrite by design of experiment. A two-stage optimization method is used to reduce the number of trials and allow the inclusion of more factors and levels. The output responses, permeability and resistivity, are used to evaluate the experiments when optimizing the control factors. The power loss of the Mn-Zn ferrite is 350 mW/cm³ at 300 kHz, 80 °C and 100 mT.

  4. Design and Preparation of a Particle Dynamics Space Flight Experiment, SHIVA

    NASA Technical Reports Server (NTRS)

    Trolinger, James; L'Esperance, Drew; Rangel, Roger; Coimbra, Carlos; Wiltherow, William

    2003-01-01

    This paper describes the flight experiment, supporting ground science, and the design rationale for project SHIVA (Spaceflight Holography Investigation in a Virtual Apparatus). SHIVA is a fundamental study of particle dynamics in fluids in microgravity. Gravity often dominates the equations of motion of a particle in a fluid, so microgravity provides an ideal environment to study the other forces, such as pressure, viscous drag and especially the Basset history force. We have developed diagnostic recording methods using holography to save all of the particle field optical characteristics, essentially allowing the experiment to be transferred from space back to Earth in what we call the "virtual apparatus" for on-earth microgravity experimentation. We can quantify precisely the three-dimensional motion of sets of particles, allowing us to test and apply new analytical solutions developed by members of the team, as reported at the 2001 Conference (Banff, Canada). In addition to employing microgravity to augment the fundamental study of these forces, the resulting data will allow us to quantify and understand the ISS environment with great accuracy. This paper shows how we used both experiment and theory to identify and resolve critical issues and produce an optimal study. We examined the response of particles of specific gravity from 0.1 to 20, with radii from 0.2 to 2 mm, to fluid oscillation at frequencies up to 80 Hz with amplitudes up to 200 microns. Observing some of the interesting effects predicted by the new solutions requires precisely locating the position of a particle in three dimensions. To this end we have developed digital holography algorithms that enable particle position location to a small fraction of a pixel in a CCD array. The spaceflight system will record holograms both on film and electronically. The electronic holograms can be downlinked, providing real-time data and essentially acting as a remote window into the ISS.
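
    As a toy illustration of the parameter range quoted above, the sketch below integrates the motion of a single particle driven by an oscillating fluid with Stokes drag only; the Basset history and added-mass forces that SHIVA is actually designed to isolate are deliberately omitted, and all parameter values are illustrative.

```python
# Particle in an oscillating fluid with Stokes drag only; Basset history and
# added-mass forces are omitted, and all parameter values are illustrative.
import numpy as np
from scipy.integrate import solve_ivp

rho_p = 2000.0                        # particle density [kg/m^3]
a, mu = 1.0e-3, 1.0e-3                # particle radius [m], fluid viscosity [Pa s]
freq, amp = 40.0, 200e-6              # fluid oscillation: 40 Hz, 200 micron amplitude
m_p = rho_p * 4.0 / 3.0 * np.pi * a**3

def fluid_velocity(t):
    return 2.0 * np.pi * freq * amp * np.cos(2.0 * np.pi * freq * t)

def rhs(t, y):
    x, v = y
    drag = 6.0 * np.pi * mu * a * (fluid_velocity(t) - v)   # Stokes drag
    return [v, drag / m_p]

sol = solve_ivp(rhs, (0.0, 0.5), [0.0, 0.0], max_step=1.0e-4)
print(f"final particle velocity: {sol.y[1, -1] * 1e3:.3f} mm/s")
```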

  5. The Design of Useful Mix Characterization Experiments for the LLNL Reshock Platform

    NASA Astrophysics Data System (ADS)

    Islam, Tanim

    2015-11-01

    The NIF Re-shock platform has been extensively engineered to minimize boundary effects and polluting shocks. It is capable of comprehensively and reproducibly exploring a large parameter space important in mix experiments: the strength and timing of shocks and reshocks; the amplitude and wavelength of Richtmyer-Meshkov-unstable interfaces; the Atwood number of these mixing layers; and, using a technique developed in experiments at the Omega laser, the simultaneous visualization of spike and bubble fronts. In this work, I explore multimodal and roughened-surface designs, and combinations of light and heavy materials, that may illuminate our understanding of mix in plasmas.
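
    For context, the Atwood number referred to above is the standard dimensionless density contrast of an interface, and the classical inviscid Rayleigh-Taylor growth rate for a small-amplitude perturbation of wavenumber k scales with it as shown below. This is a textbook relation quoted for orientation, not a result reported in the abstract.

```latex
A = \frac{\rho_{\mathrm{heavy}} - \rho_{\mathrm{light}}}{\rho_{\mathrm{heavy}} + \rho_{\mathrm{light}}},
\qquad 0 \le A \le 1,
\qquad \gamma_{\mathrm{RT}} \simeq \sqrt{A\, g\, k}.
```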

  6. Design and implementation of the protective cap/biobarrier experiment at the Idaho National Engineering Laboratory

    SciTech Connect

    Limbach, W.E.; Ratzlaff, T.D.; Anderson, J.E.; Reynolds, T.D.; Laundre, J.W.

    1994-12-31

    The Protective Cap/Biobarrier Experiment (PCBE), initiated in 1993 at the Idaho National Engineering Laboratory (INEL), is a strip-split plot experiment with three replications designed to rigorously test a 2.0-m loessal soil cap against a cap recommended by the US Environmental Protection Agency and two caps with biological intrusion barriers. Past research at INEL indicates that it should be possible to exclude water from buried wastes using natural materials and natural processes in arid environments rather than expensive materials (geotextiles) and highly engineered caps. The PCBE will also test the effects of two vegetal covers and three irrigation levels on cap performance. Drainage pans, located at the bottom of each plot, will monitor cap failure. Soil water profiles will be monitored biweekly by neutron probe and continuously by time domain reflectometry. The performance of each cap design will be monitored under a variety of conditions through 1998. From 1994 to 1996, the authors will assess plant establishment, rooting depths, patterns of moisture extraction and their interactions among caps, vegetal covers, and irrigation levels. In 1996, they will introduce ants and burrowing mammals to test the structural integrity of each cap design. In 1998, the authors will apply sufficient water to determine the failure limit for each cap design. The PCBE should provide reliable knowledge of the performances of the four cap designs under a variety of conditions and aid in making hazardous-waste management decisions at INEL and at disposal sites in similar environments.

  7. Galileo Optical Experiment (GOPEX) optical train: Design and validation at the Table Mountain Facility

    NASA Technical Reports Server (NTRS)

    Yu, J.; Shao, M.

    1993-01-01

    The Galileo Optical Experiment (GOPEX) has demonstrated the first laser communications uplink to a deep space vehicle. The optical design and validation tests performed at the Table Mountain Facility (TMF) transmitter site are described. The system used a 0.6-m telescope and an optical system at coude focus to produce the uplink beam. The optical system used a pulsed neodymium:yttrium-aluminum-garnet (Nd:YAG) laser and beam diverger optics to produce the required optical output. In order to validate the optical design, a number of uplinks were performed on Earth-orbiting satellites (e.g., Lageos 1 and 2).

  8. Design and implementation of an experiment scheduling system for the ACTS satellite

    NASA Technical Reports Server (NTRS)

    Ringer, Mark J.

    1994-01-01

    The Advanced Communication Technology Satellite (ACTS) was launched on the 12th of September 1993 aboard STS-51. All events since that time have proceeded as planned with user operations commencing on December 6th, 1993. ACTS is a geosynchronous satellite designed to extend the state of the art in communication satellite design and is available to experimenters on a 'time/bandwidth available' basis. The ACTS satellite requires the advance scheduling of experimental activities based upon a complex set of resource, state, and activity constraints in order to ensure smooth operations. This paper describes the software system developed to schedule experiments for ACTS.

  9. Experience with simplified inelastic analysis of piping designed for elevated temperature service

    SciTech Connect

    Severud, L.K.

    1980-03-01

    Screening rules and preliminary design of FFTF piping were developed in 1974 based on expected behavior and engineering judgment, approximate calculations, and a few detailed inelastic analyses of pipelines. This paper provides findings from six additional detailed inelastic analyses with correlations to the simplified analysis screening rules. In addition, simplified analysis methods for treating weldment local stresses and strains as well as fabrication induced flaws are described. Based on the FFTF experience, recommendations for future Code and technology work to reduce design analysis costs are identified.

  10. Rotational fluid flow experiment: WPI/MITRE advanced space design GASCAN 2

    NASA Technical Reports Server (NTRS)

    Daly, Walter F.; Harr, Lee; Paduano, Rocco; Yee, Tony; Eubbani, Eddy; Delprado, Jaime; Khanna, Ajay

    1991-01-01

    The design and implementation is examined of an electro-mechanical system for studying vortex behavior in a microgravity environment. Most of the existing equipment was revised and redesigned as necessary. Emphasis was placed on the documentation and integration of the mechanical and electrical subsystems. Project results include the reconfiguration and thorough testing of all the hardware subsystems, the implementation of an infrared gas entrainment detector, new signal processing circuitry for the ultrasonic fluid circulation device, improved prototype interface circuits, and software for overall control of experiment design operation.

  11. Design/build/mockup of the Waste Isolation Pilot Plant gas generation experiment glovebox

    SciTech Connect

    Rosenberg, K.E.; Benjamin, W.W.; Knight, C.J.; Michelbacher, J.A.

    1996-10-01

    A glovebox was designed, fabricated, and mocked-up for the WIPP Gas Generation Experiments (GGE) being conducted at ANL-W. GGE will determine the gas generation rates from materials in contact handled transuranic waste at likely long term repository temperature and pressure conditions. Since the customer's schedule did not permit time for performing R&D of the support systems, designing the glovebox, and fabricating the glovebox in a serial fashion, a parallel approach was undertaken. As R&D of the sampling system and other support systems was initiated, a specification was written concurrently for contracting a manufacturer to design and build the glovebox and support equipment. The contractor understood that the R&D being performed at ANL-W would add additional functional requirements to the glovebox design. Initially, the contractor had sufficient information to design the glovebox shell. Once the shell design was approved, ANL-W built a full scale mockup of the shell out of plywood and metal framing; support systems were mocked up and resultant information was forwarded to the glovebox contractor to incorporate into the design. This approach resulted in a glovebox being delivered to ANL-W on schedule and within budget.

  12. Design-of-experiments to Reduce Life-cycle Costs in Combat Aircraft Inlets

    NASA Technical Reports Server (NTRS)

    Anderson, Bernhard H.; Baust, Henry D.; Agrell, Johan

    2003-01-01

    It is the purpose of this study to demonstrate the viability and economy of Design-of-Experiments (DOE) to arrive at micro-secondary flow control installation designs that achieve optimal inlet performance for different mission strategies. These statistical design concepts were used to investigate the properties of "low unit strength" micro-effector installations. "Low unit strength" micro-effectors are micro-vanes, set at a very low angle of incidence, with very long chord lengths. They are designed to influence the near-wall inlet flow over an extended streamwise distance. In this study, however, the long chord lengths were replicated by a series of short-chord-length effectors arranged in series over multiple bands of effectors. In order to properly evaluate the performance differences between the single-band extended-chord-length installation designs and the segmented multiband short-chord-length designs, both sets of installations must be optimal. Critical to achieving optimal micro-secondary flow control installation designs is an understanding of the factor interactions that occur between the multiple bands of micro-scale vane effectors. These factor interactions are best understood and brought together in an optimal manner through a structured DOE process, or more specifically Response Surface Methods (RSM).
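
    A generic sketch of the Response Surface Method step mentioned at the end of this abstract is shown below: a second-order polynomial is fitted to DOE results in two coded factors and the predicted optimum is located. The factor settings and response values are hypothetical and not taken from the study.

```python
# Fit a second-order response surface to hypothetical DOE runs in two coded factors
# and locate the predicted optimum inside the coded design region.
import numpy as np
from scipy.optimize import minimize

# Hypothetical runs: coded factor settings and a measured response to be minimized.
X = np.array([[-1, -1], [1, -1], [-1, 1], [1, 1], [0, 0],
              [0, 0], [-1, 0], [1, 0], [0, -1], [0, 1]], dtype=float)
y = np.array([0.42, 0.35, 0.38, 0.30, 0.28, 0.29, 0.33, 0.31, 0.34, 0.32])

def basis(x):
    x1, x2 = x
    return np.array([1.0, x1, x2, x1 * x2, x1**2, x2**2])

B = np.array([basis(row) for row in X])
coef = np.linalg.lstsq(B, y, rcond=None)[0]       # second-order model coefficients

res = minimize(lambda x: basis(x) @ coef, x0=[0.0, 0.0], bounds=[(-1, 1), (-1, 1)])
print("predicted optimum (coded units):", res.x, " predicted response:", res.fun)
```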

  13. A design of experiment study of plasma sprayed alumina-titania coatings

    SciTech Connect

    Steeper, T.J.; Varacalle, D.J. Jr.; Wilson, G.C.; Riggs, W.L. II; Rotolico, A.J.; Nerz, J.E.

    1992-08-01

    An experimental study of the plasma spraying of alumina-titania powder is presented in this paper. This powder system is being used to fabricate heater tubes that emulate nuclear fuel tubes for use in thermal-hydraulic testing. Coating experiments were conducted using a Taguchi fractional-factorial design parametric study. Operating parameters were varied around the typical spray parameters in a systematic design of experiments in order to display the range of plasma processing conditions and their effect on the resultant coating. The coatings were characterized by hardness and electrical tests, image analysis, and optical metallography. Coating qualities are discussed with respect to dielectric strength, hardness, porosity, surface roughness, deposition efficiency, and microstructure. The attributes of the coatings are correlated with the changes in operating parameters.

  14. A design of experiment study of plasma sprayed alumina-titania coatings

    SciTech Connect

    Steeper, T.J.; Varacalle, D.J. Jr.; Wilson, G.C.; Riggs, W.L. II; Rotolico, A.J.; Nerz, J.E.

    1992-01-01

    An experimental study of the plasma spraying of alumina-titania powder is presented in this paper. This powder system is being used to fabricate heater tubes that emulate nuclear fuel tubes for use in thermal-hydraulic testing. Coating experiments were conducted using a Taguchi fractional-factorial design parametric study. Operating parameters were varied around the typical spray parameters in a systematic design of experiments in order to display the range of plasma processing conditions and their effect on the resultant coating. The coatings were characterized by hardness and electrical tests, image analysis, and optical metallography. Coating qualities are discussed with respect to dielectric strength, hardness, porosity, surface roughness, deposition efficiency, and microstructure. The attributes of the coatings are correlated with the changes in operating parameters.

  15. Statistically designing microarrays and microarray experiments to enhance sensitivity and specificity.

    PubMed

    Hsu, Jason C; Chang, Jane; Wang, Tao; Steingrímsson, Eiríkur; Magnússon, Magnús Karl; Bergsteinsdottir, Kristin

    2007-01-01

    Gene expression signatures from microarray experiments promise to provide important prognostic tools for predicting disease outcome or response to treatment. A number of microarray studies in various cancers have reported such gene signatures. However, the overlap of gene signatures in the same disease has been limited so far, and some reported signatures have not been reproduced in other populations. Clearly, the methods used for verifying novel gene signatures need improvement. In this article, we describe an experiment in which microarrays and sample hybridization are designed according to the statistical principles of randomization, replication and blocking. Our results show that such designs provide unbiased estimation of differential expression levels as well as powerful tests for them.

  16. Precision Pointing Control System (PPCS) system design and analysis. [for gimbaled experiment platforms

    NASA Technical Reports Server (NTRS)

    Frew, A. M.; Eisenhut, D. F.; Farrenkopf, R. L.; Gates, R. F.; Iwens, R. P.; Kirby, D. K.; Mann, R. J.; Spencer, D. J.; Tsou, H. S.; Zaremba, J. G.

    1972-01-01

    The precision pointing control system (PPCS) is an integrated system for precision attitude determination and orientation of gimbaled experiment platforms. The PPCS concept configures the system to perform orientation of up to six independent gimbaled experiment platforms to design goal accuracy of 0.001 degrees, and to operate in conjunction with a three-axis stabilized earth-oriented spacecraft in orbits ranging from low altitude (200-2500 n.m., sun synchronous) to 24 hour geosynchronous, with a design goal life of 3 to 5 years. The system comprises two complementary functions: (1) attitude determination where the attitude of a defined set of body-fixed reference axes is determined relative to a known set of reference axes fixed in inertial space; and (2) pointing control where gimbal orientation is controlled, open-loop (without use of payload error/feedback) with respect to a defined set of body-fixed reference axes to produce pointing to a desired target.

  17. DOE's effort to reduce truck aerodynamic drag : joint experiments and computations lead to smart design.

    SciTech Connect

    Yaste, David M; Salari, Kambiz; Hammache, Mustapha; Browand, Fred; Pointer, W. David; Ortega, Jason M.; McCallen, Rose; Walker, Stephen M; Heineck, James T; Hassan, Basil; Roy, Christopher John; Storms, B.; Satran, D.; Ross, James; Englar, Robert; Chatalain, Philippe; Rubel, Mike; Leonard, Anthony; Hsu, Tsu-Ya; DeChant, Lawrence Justin.

    2004-06-01

    At 70 miles per hour, overcoming aerodynamic drag represents about 65% of the total energy expenditure for a typical heavy truck vehicle. The goal of this US Department of Energy supported consortium is to establish a clear understanding of the drag producing flow phenomena. This is being accomplished through joint experiments and computations, leading to the smart design of drag reducing devices. This paper will describe our objective and approach, provide an overview of our efforts and accomplishments, and discuss our future direction.

  18. Conceptual design of an orbital propellant transfer experiment. Volume 2: Study results

    NASA Technical Reports Server (NTRS)

    Drake, G. L.; Bassett, C. E.; Merino, F.; Siden, L. E.; Bradley, R. E.; Carr, E. J.; Parker, R. E.

    1980-01-01

    The OTV configurations, operations and requirements planned for the period from the 1980's to the 1990's were reviewed and a propellant transfer experiment was designed that would support the needs of these advanced OTV operational concepts. An overall integrated propellant management technology plan for all NASA centers was developed. The preliminary cost estimate (for planning purposes only) is $56.7 M, of which approximately $31.8 M is for shuttle user costs.

  19. Design, Implementation, and Experiences of Third-Party Software Administration at the ORNL NCCS

    SciTech Connect

    Jones, Nicholas A; Fahey, Mark R

    2008-01-01

    At the ORNL NCCS, the structure and policy surrounding how we install third-party applications have recently changed. This change is most notable for its effect on our quad-core Cray XT4 (Jaguar) computer. Of particular interest is the addition of many scripts to automate installing and testing system software, as well as the addition of automated reporting mechanisms. We will present an overview of the design and implementation, and also present our experiences to date.

  20. Magnetostrictive wire-bonding clamp for semiconductor packaging: initial prototype design, modeling, and experiments

    NASA Astrophysics Data System (ADS)

    Dozor, David M.

    1998-06-01

    A magnetostrictive wire-bonding clamp for use in semiconductor packaging applications has been developed by Mechatronic Technology Co. Semiconductor industry trends, requiring high process throughput on increasing lead count packaging, make the magnetostrictive material Terfenol-D a candidate for this application. To construct this small, lightweight device, small samples of Terfenol-D were prepared by ETREMA Products, Inc. This paper reports the initial design, mathematical modeling, and experiments related to this initial prototype.

  1. Design, construction, and operations experience with the SWSA 6 (Solid Waste Storage Area) Tumulus Disposal Demonstration

    SciTech Connect

    Van Hoesen, S.D.; Van Cleve, J.E.; Wylie, A.N.; Williams, L.C.; Bolinsky, J.

    1988-01-01

    Efforts are underway at the Department of Energy facilities in Oak Ridge to improve the performance of radioactive waste disposal facilities. An engineered disposal concept demonstration involving placement of concrete encased waste on a monitored concrete pad with an earthen cover is being conducted. The design, construction, and operations experience with this project, the SWSA 6 Tumulus Disposal Demonstration, is described. 1 fig., 1 tab.

  2. DOE's Effort to Reduce Truck Aerodynamic Drag-Joint Experiments and Computations Lead to Smart Design

    SciTech Connect

    McCallen, R; Salari, K; Ortega, J; DeChant, L; Hassan, B; Roy, C; Pointer, W; Browand, F; Hammache, M; Hsu, T; Leonard, A; Rubel, M; Chatalain, P; Englar, R; Ross, J; Satran, D; Heineck, J; Walker, S; Yaste, D; Storms, B

    2004-06-17

    At 70 miles per hour, overcoming aerodynamic drag represents about 65% of the total energy expenditure for a typical heavy truck vehicle. The goal of this US Department of Energy supported consortium is to establish a clear understanding of the drag producing flow phenomena. This is being accomplished through joint experiments and computations, leading to the 'smart' design of drag reducing devices. This paper will describe our objective and approach, provide an overview of our efforts and accomplishments, and discuss our future direction.

  3. Design, installation and operating experience of 20 photovoltaic medical refrigerator systems on four continents

    NASA Astrophysics Data System (ADS)

    Hein, G. F.

    The NASA Lewis Research Center, in cooperation with the World Health Organization, USAID, the Pan American Health Organization and national government agencies in some developing countries, sponsored the installation of twenty photovoltaic-powered medical vaccine storage refrigerator-freezer (R/F) systems. The Solar Power Corporation was selected as the contractor to perform the design, development and installation of these twenty units. Solar Power's experiences are described herein.

  4. Directed Design of Experiments (DOE) for Determining Probability of Detection (POD) Capability of NDE Systems (DOEPOD)

    NASA Technical Reports Server (NTRS)

    Generazio, Ed

    2007-01-01

    This viewgraph presentation reviews some of the issues that specialists in nondestructive evaluation (NDE) face in determining the statistics of the probability of detection. There is discussion of the use of the binomial distribution and of the probability of a hit. The presentation then reviews the concepts of Directed Design of Experiments for Validating Probability of Detection of Inspection Systems (DOEPOD). Several cases are reviewed and discussed. The concept of false calls is also reviewed.
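
    The binomial reasoning behind such POD demonstrations can be sketched in a few lines. The snippet below computes the one-sided Clopper-Pearson lower confidence bound on POD from a hit count, reproducing the classic result that 29 hits in 29 trials demonstrate 90% POD at 95% confidence; it is a generic illustration, not the DOEPOD methodology itself.

```python
# Lower confidence bound on probability of detection from a binomial demonstration.
from scipy.stats import beta

def pod_lower_bound(hits, trials, confidence=0.95):
    """One-sided Clopper-Pearson lower confidence bound on POD."""
    if hits == 0:
        return 0.0
    return beta.ppf(1.0 - confidence, hits, trials - hits + 1)

# Classic result: 29 hits out of 29 trials demonstrates 90% POD at 95% confidence.
print(f"{pod_lower_bound(29, 29):.3f}")   # ~0.902
```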

  5. Design, installation and operating experience of 20 photovoltaic medical refrigerator systems on four continents

    NASA Technical Reports Server (NTRS)

    Hein, G. F.

    1982-01-01

    The NASA Lewis Research Center, in cooperation with the World Health Organization, USAID, the Pan American Health Organization and national government agencies in some developing countries, sponsored the installation of twenty photovoltaic-powered medical vaccine storage refrigerator-freezer (R/F) systems. The Solar Power Corporation was selected as the contractor to perform the design, development and installation of these twenty units. Solar Power's experiences are described herein.

  6. Design, fabrication and testing of the gas analysis system for the tritium recovery experiment, TRIO-01

    SciTech Connect

    Finn, P.A.; Reedy, G.T.; Homa, M.I.; Clemmer, R.G.; Pappas, G.; Slawecki, M.A.; Graczyk, D.G.; Bowers, D.L.; Clemmer, E.D.

    1983-01-01

    The tritium recovery experiment, TRIO-01, required a gas analysis system which detected the form of tritium, the amount of tritium (differential and integral), and the presence and amount of other radioactive species. The system had to handle all contingencies and function for months at a time unattended during weekend operation. The designed system, described herein, consisted of a train of components which could be grouped as desired to match tritium release behavior.

  7. Gas-grain simulation experiment module conceptual design and gas-grain simulation facility breadboard development

    NASA Technical Reports Server (NTRS)

    Zamel, James M.; Petach, Michael; Gat, Nahum; Kropp, Jack; Luong, Christina; Wolff, Michael

    1993-01-01

    This report delineates the Option portion of the Phase A Gas-Grain Simulation Facility study. The conceptual design of a Gas-Grain Simulation Experiment Module (GGSEM) for Space Shuttle Middeck is discussed. In addition, a laboratory breadboard was developed during this study to develop a key function for the GGSEM and the GGSF, specifically, a solid particle cloud generating device. The breadboard design and test results are discussed and recommendations for further studies are included. The GGSEM is intended to fly on board a low earth orbit (LEO), manned platform. It will be used to perform a subset of the experiments planned for the GGSF for Space Station Freedom, as it can partially accommodate a number of the science experiments. The outcome of the experiments performed will provide an increased understanding of the operational requirements for the GGSF. The GGSEM will also act as a platform to accomplish technology development and proof-of-principle experiments for GGSF hardware, and to verify concepts and designs of hardware for GGSF. The GGSEM will allow assembled subsystems to be tested to verify facility level operation. The technology development that can be accommodated by the GGSEM includes: GGSF sample generation techniques, GGSF on-line diagnostics techniques, sample collection techniques, performance of various types of sensors for environmental monitoring, and some off-line diagnostics. Advantages and disadvantages of several LEO platforms available for GGSEM applications are identified and discussed. Several of the anticipated GGSF experiments require the deagglomeration and dispensing of dry solid particles into an experiment chamber. During the GGSF Phase A study, various techniques and devices available for the solid particle aerosol generator were reviewed. As a result of this review, solid particle deagglomeration and dispensing were identified as key undeveloped technologies in the GGSF design. A laboratory breadboard version of a solid

  8. Designing of Multi-Interface Diverging Experiments to Model Rayleigh-Taylor Growth in Supernovae

    NASA Astrophysics Data System (ADS)

    Grosskopf, Michael; Drake, R.; Kuranz, C.; Plewa, T.; Hearn, N.; Meakin, C.; Arnett, D.; Miles, A.; Robey, H.; Hansen, J.; Hsing, W.; Edwards, M.

    2008-05-01

    In previous experiments on the Omega Laser, researchers studying blast-wave-driven instabilities have observed the growth of Rayleigh-Taylor instabilities under conditions scaled to the He/H interface of SN1987A. Most of these experiments have been planar experiments, as the energy available proved unable to accelerate enough mass in a diverging geometry. With the advent of the NIF laser, which can deliver hundreds of kJ to an experiment, it is possible to produce 3D, blast-wave-driven, multiple-interface explosions and to study the mixing that develops. We report scaling simulations to model the interface dynamics of a multilayered, diverging Rayleigh-Taylor experiment for NIF using CALE, a hybrid adaptive Lagrangian-Eulerian code developed at LLNL. Specifically, we looked both qualitatively and quantitatively at the Rayleigh-Taylor growth and multi-interface interactions in mass-scaled, spherically divergent systems using different materials. The simulations will assist in the target design process and help choose diagnostics to maximize the information we receive in a particular shot. Simulations are critical for experimental planning, especially for experiments on large-scale facilities. *This research was sponsored by LLNL through contract LLNL B56128 and by the NNSA through DOE Research Grant DE-FG52-04NA00064.

  9. Two-phase reduced gravity experiments for a space reactor design

    NASA Technical Reports Server (NTRS)

    Antoniak, Zenen I.

    1987-01-01

    Researchers envision future space missions using large nuclear reactors with either a single-phase or a two-phase alkali-metal working fluid. The design and analysis of such reactors require state-of-the-art computer codes that can properly treat alkali-metal flow and heat transfer in a reduced-gravity environment. New flow regime maps, models, and correlations are required if the codes are to be successfully applied to reduced-gravity flow and heat transfer. General plans are put forth for the reduced-gravity experiments, which will have to be performed at NASA facilities with benign fluids. Data from the reduced-gravity experiments with innocuous fluids are to be combined with normal-gravity data from two-phase alkali-metal experiments. Because these reduced-gravity experiments will be very basic, and will employ small test loops of simple geometry, a large measure of commonality exists between them and experiments planned by other organizations. It is recommended that a committee be formed to coordinate all ongoing and planned reduced-gravity flow experiments.

  10. Design of a Virtual Reality Navigational (VRN) experiment for assessment of egocentric spatial cognition.

    PubMed

    Byagowi, Ahmad; Moussavi, Zahra

    2012-01-01

    Virtual reality (VR) experiments are commonly used to assess human brain functions. We orient ourselves in an environment by computing precise self-to-object spatial relations (egocentric orientation) as well as object-to-object spatial relations (allocentric orientation). Egocentric orientation involves cues that depend on the position of the observer (i.e. left-right, front-behind), whereas allocentric orientation is maintained through the use of environmental features such as landmarks. As such, allocentric orientation involves short-term memory, whereas egocentric orientation does not. This paper presents a Virtual Reality Navigational (VRN) experiment specifically designed to assess egocentric spatial cognition. The design aimed to minimize the effect of spatial cues or landmarks for human navigation in a naturalistic VR environment. The VRN experiment designed for this study, called the Virtual House, is a symmetric three-story cubic building with 3 windows on each side of every floor and one entrance on each side of the building. In each trial, a window is marked as the objective by a pseudo-random sequence. The marked window is shown to the participant from an outdoor view. The task is to reach the objective window using the shortest path through the building. The experiment entails 2 sets of 8 trials to cover all possibilities. The participants' performance error is measured as the difference between their traversed distance trajectory and the shortest natural distance (calculated using the VR engine), normalized by the shortest distance, in each trial. Fifty-two cognitively healthy adults participated in the study. The results show no learning effect during the 16 trials, implying that the experiment does not rely on short-term memory. Furthermore, the subjects' normalized performance error showed an almost linear increase with age, implying that egocentric spatial cognition ability declines with age.
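
    The performance metric described above reduces to a short computation on logged trajectories; the sketch below uses made-up way-points rather than data from the Virtual House.

```python
# Normalized path error: (traversed length - shortest length) / shortest length.
import numpy as np

def path_length(points):
    pts = np.asarray(points, dtype=float)
    return np.sum(np.linalg.norm(np.diff(pts, axis=0), axis=1))

traversed = [(0, 0), (4, 0), (4, 3), (6, 3), (6, 6)]   # participant trajectory
shortest  = [(0, 0), (6, 6)]                           # shortest natural path

error = (path_length(traversed) - path_length(shortest)) / path_length(shortest)
print(f"normalized performance error: {error:.2f}")
```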

  11. Design, development, and fabrication of a prototype ice pack heat sink subsystem. Flight experiment physical phenomena experiment chest

    NASA Technical Reports Server (NTRS)

    Roebelen, G. J., Jr.; Dean, W. C., II

    1975-01-01

    The concept of a flight experiment physical phenomena experiment chest, to be used eventually for investigating and demonstrating ice pack heat sink subsystem physical phenomena during a zero gravity flight experiment, is described.

  12. Optimization of Capacitive Acoustic Resonant Sensor Using Numerical Simulation and Design of Experiment

    PubMed Central

    Haque, Rubaiyet Iftekharul; Loussert, Christophe; Sergent, Michelle; Benaben, Patrick; Boddaert, Xavier

    2015-01-01

    Optimization of the acoustic resonant sensor requires a clear understanding of how the output responses of the sensor are affected by the variation of different factors. During this work, output responses of a capacitive acoustic transducer, such as membrane displacement, quality factor, and capacitance variation, are considered to evaluate the sensor design. The six device parameters taken into consideration are membrane radius, backplate radius, cavity height, air gap, membrane tension, and membrane thickness. The effects of factors on the output responses of the transducer are investigated using an integrated methodology that combines numerical simulation and design of experiments (DOE). A series of numerical experiments are conducted to obtain output responses for different combinations of device parameters using finite element methods (FEM). Response surface method is used to identify the significant factors and to develop the empirical models for the output responses. Finally, these results are utilized to calculate the optimum device parameters using multi-criteria optimization with desirability function. Thereafter, the validating experiments are designed and deployed using the numerical simulation to crosscheck the responses. PMID:25894937
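
    The multi-criteria optimization with a desirability function mentioned above can be illustrated generically: individual responses are mapped onto [0, 1] desirabilities and combined through a geometric mean, which is then maximized over the design variables. The response models, limits and factor names below are invented placeholders, not the fitted models from the paper.

```python
# Combine two fitted responses through Derringer-Suich desirabilities and maximize
# the overall (geometric-mean) desirability over two coded design variables.
import numpy as np
from scipy.optimize import minimize

def d_maximize(y, lo, hi):
    """Desirability for a larger-is-better response, clipped to [0, 1]."""
    return float(np.clip((y - lo) / (hi - lo), 0.0, 1.0))

# Hypothetical fitted response models in coded variables x = (x1, x2).
displacement  = lambda x: 1.0 - 0.3 * x[0]**2 - 0.2 * x[1]**2 + 0.1 * x[0]
cap_variation = lambda x: 0.8 + 0.2 * x[0] - 0.25 * x[1]**2

def overall_desirability(x):
    d1 = d_maximize(displacement(x), lo=0.2, hi=1.2)
    d2 = d_maximize(cap_variation(x), lo=0.3, hi=1.0)
    return (d1 * d2) ** 0.5

res = minimize(lambda x: -overall_desirability(x), x0=[0.0, 0.0],
               bounds=[(-1, 1), (-1, 1)])
print("optimum (coded):", res.x, " overall desirability:", overall_desirability(res.x))
```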

  13. Design of a high-lift experiment in water including active flow control

    NASA Astrophysics Data System (ADS)

    Beutel, T.; Sattler, S.; El Sayed, Y.; Schwerter, M.; Zander, M.; Büttgenbach, S.; Leester-Schädel, M.; Radespiel, R.; Sinapius, M.; Wierach, P.

    2014-07-01

    This paper describes the structural design of an active flow-control experiment. The aim of the experiment is to investigate the increase in efficiency of an internally blown Coanda flap using unsteady blowing. The system uses tailor-made microelectromechanical (MEMS) pressure sensors to determine the state of the oncoming flow and an actuated lip to regulate the mass flow and velocity of a stream near a wall over the internally blown flap. Sensors and actuators are integrated into a highly loaded system that is extremely compact. The sensors are connected to a bus system that feeds the data into a real-time control system. The piezoelectric actuators, which use the d33 effect at a comparatively low voltage of 120 V, are integrated into a lip that controls the blowout slot height. The system is designed for closed-loop control that efficiently avoids flow separation on the Coanda flap. The setup is designed for water-tunnel experiments in order to reduce the free-stream velocity and the system's control frequency by a factor of 10 compared with that in air. This paper outlines the function and verification of the system's main components and their development.

  14. Role of modeling in the design of experiments in carbohydrate metabolism

    SciTech Connect

    Foster, D.M.; Hetenyi, G. Jr. )

    1991-05-01

    Most publications on modeling present only the final product without describing the details of how it was developed and tested. It is, however, through model development and testing that the true power of modeling as a research tool reveals itself. The purpose of this paper is to present a behind-the-scenes look at a set of experiments designed to study carbon atom transport in gluconeogenesis. In particular, it is shown how the development of one model led to hypotheses for which another set of experiments was designed. The model which resulted from the second study contained in turn a number of new hypotheses for which further experiments remain to be designed. The second model supported the findings of the first and yielded deeper insights into the exchange of carbon atoms among three metabolites. It is hoped this illustration will encourage other investigators to take advantage of the utilitarian value of modeling, not only as a parameter-generating tool but also as a true research tool that can significantly aid in extracting more information from available data.

  15. Complete factorial design experiment for 3D load cell instrumented crank validation.

    PubMed

    Omar, Valle-Casas; Rafael, Dalazen; Vinicius, Cene; Alexandre, Balbinot

    2015-08-01

    Developing instrumentation systems for sports medicine is a promising area; this research therefore evaluates the design of a new instrumented crank arm prototype for a racing bicycle and plans an experiment for indoor/outdoor comparison. The study investigated the viability of a 3D load cell instrumented crank for force measurement by implementing a design of experiment. A complete factorial design experiment was developed for data validation, with an analysis of variance (ANOVA) yielding significant results for the controlled factors with the response variables rms, mean and variance. A software routine produced system metrics for symmetry and cadence analysis, derived from bilateral comparison of effective force and from speed computation. Characterization yielded calibration curves that were used for data conversion in the force projection channels, with linearity errors of 0.29% (perpendicular), 0.55% (parallel) and 0.10% (lateral). Factor interactions were significant mainly for symmetry in the indoor tests, whereas for cadence the interactions were generally significant in the outdoor tests. The implemented system was able to generate effective-force plots for 3D symmetry analysis, as well as torque and power symmetry for specialist analysis.

  16. Optimal fed batch experiment design for estimation of monod kinetics of Azospirillum brasilense: from theory to practice.

    PubMed

    Cappuyns, Astrid M; Bernaerts, Kristel; Smets, Ilse Y; Ona, Ositadinma; Prinsen, Els; Vanderleyden, Jos; Van Impe, Jan F

    2007-01-01

    In this paper the problem of reliable and accurate parameter estimation for unstructured models is considered. It is illustrated how a theoretically optimal design can be successfully translated into a practically feasible, robust, and informative experiment. The well-known parameter estimation problem of Monod kinetic parameters is used as a vehicle to illustrate our approach. As known for a long time, noisy batch measurements do not allow for unique and accurate estimation of the kinetic parameters of the Monod model. Techniques of optimal experiment design are, therefore, exploited to design informative experiments and to improve the parameter estimation accuracy. During the design process, practical feasibility has to be kept in mind. The designed experiments are easy to implement in practice and do not require additional monitoring equipment. Both design and experimental validation of informative fed batch experiments are illustrated with a case study, namely, the growth of the nitrogen-fixing bacteria Azospirillum brasilense.
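
    For orientation, the fed-batch Monod model that this kind of optimal experiment design targets can be written as a small ODE system for biomass, substrate and volume; the sketch below uses invented parameter values and a constant feed profile, not the conditions identified in the study.

```python
# Fed-batch Monod model: biomass X, substrate S, and volume V under constant feeding.
from scipy.integrate import solve_ivp

mu_max, Ks, Yxs = 0.3, 0.1, 0.5        # Monod parameters and biomass yield
S_in, F = 10.0, 0.02                   # feed concentration [g/L] and flow rate [L/h]

def fed_batch(t, y):
    X, S, V = y
    mu = mu_max * S / (Ks + S)         # Monod specific growth rate
    D = F / V                          # dilution rate caused by feeding
    return [mu * X - D * X,
            -mu * X / Yxs + D * (S_in - S),
            F]

sol = solve_ivp(fed_batch, (0.0, 24.0), [0.1, 5.0, 1.0])   # X0, S0 [g/L], V0 [L]
print(f"final biomass concentration: {sol.y[0, -1]:.2f} g/L")
```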

  17. Cellular changes in microgravity and the design of space radiation experiments.

    PubMed

    Morrison, D R

    1994-10-01

    Cell metabolism, secretion and cell-cell interactions can be altered during space flight. Early radiobiology experiments have demonstrated synergistic effects of radiation and microgravity as indicated by increased mutagenesis, increased chromosome aberrations, inhibited development, and retarded growth. Microgravity-induced changes in immune cell functions include reduced blastogenesis and cell-mediated, delayed-type hypersensitivity responses, increased cytokine secretions, but inhibited cytotoxic effects and macrophage differentiation. These effects are important because of the high radiosensitivity of immune cells. It is difficult to compare ground studies with space radiation biology experiments because of the complexity of the space radiation environment, types of radiation damage and repair mechanisms. Altered intracellular functions and molecular mechanisms must be considered in the design and interpretation of space radiation experiments. Critical steps in radiocarcinogenesis could be affected. New cell systems and hardware are needed to determine the biological effectiveness of the low dose rate, isotropic, multispectral space radiation and the potential usefulness of radioprotectants during space flight.

  18. Thermomechanical simulation of the DIAMINO irradiation experiment using the LICOS fuel design code

    SciTech Connect

    Bejaoui, S.; Helfer, T.; Brunon, E.; Lambert, T.; Bendotti, S.; Neyroud, C.

    2013-07-01

    Two separate-effect experiments in the HFR and OSIRIS Material Test Reactors (MTRs) are currently under Post-Irradiation Examination (MARIOS) and under preparation (DIAMINO), respectively. The main goal of these experiments is to investigate gaseous release and swelling of Am-bearing UO2-x fuels as a function of temperature, fuel microstructure and gas production rate. First, a brief description of the MARIOS and DIAMINO irradiations is provided. Then, the innovative experimental in-pile device specifically developed for the DIAMINO experiment is described. Finally, the thermo-mechanical computations performed using the LICOS code are presented. These simulations support the DIAMINO experimental design and highlight some of the capabilities of the code. (authors)

  19. Design and Predictions for a High-Altitude (Low-Reynolds-Number) Aerodynamic Flight Experiment

    NASA Technical Reports Server (NTRS)

    Greer, Donald; Hamory, Phil; Krake, Keith; Drela, Mark

    1999-01-01

    A sailplane being developed at NASA Dryden Flight Research Center will support a high-altitude flight experiment. The experiment will measure the performance parameters of an airfoil at high altitudes (70,000 to 100,000 ft), low Reynolds numbers (200,000 to 700,000), and high subsonic Mach numbers (0.5 and 0.65). The airfoil section lift and drag are determined from pitot and static pressure measurements. The locations of the separation bubble, Tollmien-Schlichting boundary layer instability frequencies, and vortex shedding are measured from a hot-film strip. The details of the planned flight experiment are presented. Several predictions of the airfoil performance are also presented. Mark Drela from the Massachusetts Institute of Technology designed the APEX-16 airfoil, using the MSES code. Two-dimensional Navier-Stokes analyses were performed by Mahidhar Tatineni and Xiaolin Zhong from the University of California, Los Angeles, and by the authors at NASA Dryden.

  20. Hot rocket plume experiment - Survey and conceptual design. [of rhenium-iridium bipropellants

    NASA Technical Reports Server (NTRS)

    Millard, Jerry M.; Luan, Taylor W.; Dowdy, Mack W.

    1992-01-01

    Attention is given to a space-borne engine plume experiment study to fly an experiment which will both verify and quantify the reduced contamination from advanced rhenium-iridium earth-storable bipropellant rockets (hot rockets) and provide a correlation between high-fidelity, in-space measurements and theoretical plume and surface contamination models. The experiment conceptual design is based on survey results from plume and contamination technologists throughout the U.S. With respect to shuttle use, cursory investigations validate Hitchhiker availability and adaptability, adequate remote manipulator system (RMS) articulation and dynamic capability, acceptable RMS attachment capability, adequate power and telemetry capability, and adequate flight altitude and attitude/orbital capability.

  1. Design of Field Experiments for Adaptive Sampling of the Ocean with Autonomous Vehicles

    NASA Astrophysics Data System (ADS)

    Zheng, H.; Ooi, B. H.; Cho, W.; Dao, M. H.; Tkalich, P.; Patrikalakis, N. M.

    2010-05-01

    Due to the highly non-linear and dynamical nature of oceanic phenomena, the predictive capability of various ocean models depends on the availability of operational data. A practical method to improve the accuracy of the ocean forecast is to use a data assimilation methodology to combine in-situ measured and remotely acquired data with numerical forecast models of the physical environment. Autonomous surface and underwater vehicles with various sensors are economic and efficient tools for exploring and sampling the ocean for data assimilation; however there is an energy limitation to such vehicles, and thus effective resource allocation for adaptive sampling is required to optimize the efficiency of exploration. In this paper, we use physical oceanography forecasts of the coastal zone of Singapore for the design of a set of field experiments to acquire useful data for model calibration and data assimilation. The design process of our experiments relied on the oceanography forecast including the current speed, its gradient, and vorticity in a given region of interest for which permits for field experiments could be obtained and for time intervals that correspond to strong tidal currents. Based on these maps, resources available to our experimental team, including Autonomous Surface Craft (ASC) are allocated so as to capture the oceanic features that result from jets and vortices behind bluff bodies (e.g., islands) in the tidal current. Results are summarized from this resource allocation process and field experiments conducted in January 2009.

  2. Design and Characterization of a Neutralized-Transport Experiment for Heavy-Ion Fusion

    SciTech Connect

    Henestroza, E; Eylon, S; Roy, P; Yu, S S; Anders, A; Bieniosek, F M; Greenway, W G; Logan, B G; MacGill, R A; Shuman, D B; Vanecek, D L; Waldron, W L; Sharp, W M; Houck, T L; Davidson, R C; Efthimion, P C; Gilson, E P; Sefkow, A B; Welch, D R; Rose, D V; Olson, C L

    2004-05-24

    In heavy-ion inertial-confinement fusion systems, intense beams of ions must be transported from the exit of the final focus magnet system through the fusion chamber to hit millimeter-sized spots on the target. Effective plasma neutralization of intense ion beams in this final transport is essential for a heavy-ion fusion power plant to be economically competitive. The physics of neutralized drift has been studied extensively with particle-in-cell simulations. To provide quantitative comparisons of theoretical predictions with experiment, the Virtual National Laboratory for Heavy Ion Fusion has completed the construction and has begun experimentation with the Neutralized Transport Experiment (NTX). The experiment consists of three main sections, each with its own physics issues. The injector is designed to generate a very high-brightness, space-charge-dominated potassium beam while still allowing variable perveance by a beam aperturing technique. The magnetic-focusing section, consisting of four pulsed magnetic quadrupoles, permits the study of beam tuning, as well as the effects of phase space dilution due to higher-order nonlinear fields. In the final section, a converging ion beam exiting the magnetic section is transported through a drift region with plasma sources for beam neutralization, and the final spot size is measured under various conditions of neutralization. In this paper, we discuss the design and characterization of the three sections in detail and present the first results from the experiment.

  3. Design and characterization of a neutralized-transport experiment for heavy-ion fusion

    SciTech Connect

    Henestroza, E.; Eylon, S.; Roy, P.K.; Yu, S.S.; Anders, A.; Bieniosek, F.M.; Greenway, W.G.; Logan, B.G.; MacGill, R.A.; Shuman, D.B.; Vanecek, D.L.; Waldron, W.L.; Sharp, W.M.; Houck, T.L.; Davidson, R.C.; Efthimion, P.C.; Gilson, E.P.; Sefkow, A.B.; Welch, D.R.; Rose, D.V.; Olson, C.L.

    2004-03-14

    In heavy-ion inertial-confinement fusion systems, intense beams of ions must be transported from the exit of the final focus magnet system through the fusion chamber to hit millimeter-sized spots on the target. Effective plasma neutralization of intense ion beams in this final transport is essential for a heavy-ion fusion power plant to be economically competitive. The physics of neutralized drift has been studied extensively with particle-in-cell simulations. To provide quantitative comparisons of theoretical predictions with experiment, the Virtual National Laboratory for Heavy Ion Fusion has completed the construction and has begun experimentation with the Neutralized Transport Experiment (NTX). The experiment consists of three main sections, each with its own physics issues. The injector is designed to generate a very high-brightness, space-charge-dominated potassium beam while still allowing variable perveance by a beam aperturing technique. The magnetic-focusing section, consisting of four pulsed magnetic quadrupoles, permits the study of beam tuning, as well as the effects of phase space dilution due to higher-order nonlinear fields. In the final section, the converging ion beam exiting the magnetic section is transported through a drift region with plasma sources for beam neutralization, and the final spot size is measured under various conditions of neutralization. In this paper, we discuss the design and characterization of the three sections in detail and present initial results from the experiment.

  4. EDITORIAL Wireless sensor networks: design for real-life deployment and deployment experiences

    NASA Astrophysics Data System (ADS)

    Gaura, Elena; Roedig, Utz; Brusey, James

    2010-12-01

    modalities and (iv) system solutions with high end-user added value and cost benefits. The common thread is deployment and deployment evaluation. In particular, satisfaction of application requirements, involvement of the end-user in the design and deployment process, satisfactory system performance, and user acceptance are concerns addressed in many of the contributions. The contributions form a valuable set, which helps to identify the priorities for research in this burgeoning area: Robust, reliable, and efficient data collection in embedded wireless multi-hop networks is an essential element in creating a true deploy-and-forget user experience. Maintaining full connectivity within a WSN, in a real-world environment populated by other WSNs, WiFi networks, or Bluetooth devices that constitute sources of interference, is a key element in any application, but more so for those that are safety-critical, such as disaster response. Awareness of the effects of the wireless channel, physical position, and line-of-sight on received signal strength in real-world, outdoor environments will shape the design of many outdoor applications. Thus, the quantification of such effects is valuable knowledge for designers. Sensor failure detection, scalability, and commercialization are common challenges in many long-term monitoring applications; transferable solutions are evidenced here in the context of pollutant detection and water quality. Innovative, alternative thinking is often needed to achieve the desired long-lived networks when power-hungry sensors are foreseen as components; in some instances, the very problems of wireless technology, such as RF irregularity, can be transformed into advantages. The importance of an iterative design and evaluation methodology—from analysis to simulation to real-life deployment—should be well understood by all WSN developers. The value of this is highlighted in the context of a challenging WPAN video-surveillance application based on a novel Nomadic Access

  5. On the Use of Statistics in Design and the Implications for Deterministic Computer Experiments

    NASA Technical Reports Server (NTRS)

    Simpson, Timothy W.; Peplinski, Jesse; Koch, Patrick N.; Allen, Janet K.

    1997-01-01

    Perhaps the most prevalent use of statistics in engineering design is through Taguchi's parameter and robust design -- using orthogonal arrays to compute signal-to-noise ratios in a process of design improvement. In our view, however, there is an equally exciting use of statistics in design that could become just as prevalent: it is the concept of metamodeling whereby statistical models are built to approximate detailed computer analysis codes. Although computers continue to get faster, analysis codes always seem to keep pace so that their computational time remains non-trivial. Through metamodeling, approximations of these codes are built that are orders of magnitude cheaper to run. These metamodels can then be linked to optimization routines for fast analysis, or they can serve as a bridge for integrating analysis codes across different domains. In this paper we first review metamodeling techniques that encompass design of experiments, response surface methodology, Taguchi methods, neural networks, inductive learning, and kriging. We discuss their existing applications in engineering design and then address the dangers of applying traditional statistical techniques to approximate deterministic computer analysis codes. We conclude with recommendations for the appropriate use of metamodeling techniques in given situations and how common pitfalls can be avoided.
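
    To make the metamodeling idea concrete, here is a minimal sketch: a cheap analytic function stands in for an expensive analysis code, a handful of design samples are taken, and a Gaussian radial-basis interpolator (a simplified stand-in for the kriging models reviewed above) provides inexpensive predictions. All function names and numbers are illustrative.

```python
# Minimal metamodeling sketch (assumptions: the "expensive" code is stood in
# for by a cheap analytic function; the surrogate is a Gaussian RBF interpolator,
# a simplified stand-in for kriging).
import numpy as np

def expensive_analysis(x):
    """Placeholder for a detailed analysis code."""
    return np.sin(3.0 * x) + 0.5 * x**2

# 1) Design of experiments: a small set of sample points.
x_train = np.linspace(0.0, 2.0, 8)
y_train = expensive_analysis(x_train)

# 2) Fit a Gaussian RBF interpolator (weights solve K w = y).
def rbf_kernel(a, b, length=0.4):
    return np.exp(-((a[:, None] - b[None, :]) / length) ** 2)

K = rbf_kernel(x_train, x_train) + 1e-10 * np.eye(x_train.size)
w = np.linalg.solve(K, y_train)

# 3) Cheap predictions from the metamodel, e.g. inside an optimizer loop.
x_new = np.linspace(0.0, 2.0, 5)
y_hat = rbf_kernel(x_new, x_train) @ w
print(np.round(y_hat, 3), np.round(expensive_analysis(x_new), 3))
```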

  6. Design and test of a compact optics system for the pool boiling experiment

    NASA Technical Reports Server (NTRS)

    Ling, Jerri S.; Laubenthal, James R.

    1990-01-01

    The experiment described seeks to improve the understanding of the fundamental mechanisms that constitute nucleate pool boiling. The vehicle for accomplishing this is an investigation, including tests to be conducted in microgravity coupled with appropriate analyses, of the heat transfer and vapor bubble dynamics associated with nucleation, bubble growth/collapse, and subsequent motion. The investigation considers the interrelations between buoyancy, momentum, and surface tension, which govern the motion of the vapor and surrounding liquid, as functions of the heating rate at the heat transfer surface and of the temperature level and distribution in the bulk liquid. The experiment is designed to be contained within the confines of a Get-Away-Special Canister (GAS Can) installed in the bay of the space shuttle. When the shuttle reaches orbit, the experiment will be turned on and testing will proceed automatically. In the proposed Pool Boiling Experiment, a pool of liquid, initially at a precisely defined pressure and temperature, will be subjected to a step-imposed heat flux from a semitransparent thin-film heater forming part of one wall of the container, such that boiling is initiated and maintained for a defined period of time at a constant pressure level. Transient measurements of the heater surface and of the fluid temperatures near the surface will be made, noting especially the conditions at the onset of boiling, along with motion photography of the boiling process in two simultaneous views: from beneath the heating surface and from the side. The conduct of the experiment and the data acquisition will be completely automated and self-contained. For the initial flight, a total of nine tests are proposed, with three levels of heat flux and three levels of subcooling. The design process used in the development and checkout of the compact photographic/optics system for the Pool Boiling Experiment is documented.

  7. Recent experience with multidisciplinary analysis and optimization in advanced aircraft design

    NASA Technical Reports Server (NTRS)

    Dollyhigh, Samuel M.; Sobieszczanski-Sobieski, Jaroslaw

    1990-01-01

    The task of modern aircraft design has always been complicated by the number of intertwined technical factors from the various engineering disciplines. Furthermore, this complexity has increased rapidly with the development of such technologies as aeroelastically tailored materials and structures, active control systems, integrated propulsion/airframe controls, thrust vectoring, and so on. Successful designs that achieve maximum advantage from these new technologies require a thorough understanding of the physical phenomena and the interactions among them. A study commissioned by the Aeronautical Sciences and Evaluation Board of the National Research Council has gone so far as to identify technology integration as a new discipline from which many future aeronautical advancements will arise. Regardless of whether one considers integration a new discipline or not, it is clear to all engineers involved in aircraft design and analysis that better methods are required. In the past, designers conducted parametric studies in which a relatively small number of principal characteristics were varied to determine the effect on design requirements, which were themselves often diverse and contradictory. Once a design was chosen, it then passed through the various engineering disciplines, whose principal task was to make the chosen design workable. Working in a limited design space, the discipline expert sometimes improved the concept, but more often than not, the result was a penalty to make the original concept workable. If an insurmountable problem was encountered, the process began over. Most design systems that attempt to account for disciplinary interactions have large empirical elements, and reliance on past experience is a poor guide to obtaining maximum utilization of new technologies. Further compounding the difficulty of design is that, as the aeronautical sciences have matured, the discipline specialist's area of research has generally

  8. A new paradigm on battery powered embedded system design based on User-Experience-Oriented method

    NASA Astrophysics Data System (ADS)

    Wang, Zhuoran; Wu, Yue

    2014-03-01

    Battery sustainable time has recently been an active research topic for battery-powered embedded products such as tablets and smartphones; it is determined by the battery capacity and the power consumption. Despite numerous efforts to improve battery capacity in the field of materials engineering, power consumption also plays an important role in delivering a desirable user experience and is easier to improve, especially considering the moderate advancement of batteries over recent decades. In this study, a new top-down modelling method, the User-Experience-Oriented Battery Powered Embedded System Design Paradigm, is proposed to estimate the target average power consumption, to guide the hardware and software design, and eventually to approach the theoretical lowest power consumption at which the application can still provide full functionality. Starting from the 10-hour sustainable-time standard, the average working current is defined from the battery design capacity and set as a target. An implementation is then illustrated from the hardware perspective, summarized as Auto-Gating power management, and from the software perspective, which introduces a new algorithm, SleepVote, to guide system task design and scheduling.
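
    The 10-hour target described above implies a simple current budget derived from the pack's design capacity. A back-of-the-envelope sketch follows; the capacity value is hypothetical and not taken from the paper.

```python
# Back-of-the-envelope target current from the 10-hour criterion described above.
# The battery capacity value is hypothetical, not taken from the paper.
battery_capacity_mah = 3000.0      # design capacity of the pack (mAh)
target_runtime_h = 10.0            # sustainable-time standard used as the goal

target_avg_current_ma = battery_capacity_mah / target_runtime_h
print(f"Average current budget: {target_avg_current_ma:.0f} mA")  # 300 mA
```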

  9. A Bayesian Approach to the Design and Analysis of Computer Experiments

    SciTech Connect

    Currin, C.

    1988-01-01

    We consider the problem of designing and analyzing experiments for prediction of the function y(t), t ∈ T, where y is evaluated by means of a computer code (typically by solving complicated equations that model a physical system), and T represents the domain of inputs to the code. We use a Bayesian approach, in which uncertainty about y is represented by a spatial stochastic process (random function); here we restrict attention to stationary Gaussian processes. The posterior mean function can be used as an interpolating function, with uncertainties given by the posterior standard deviations. Instead of completely specifying the prior process, we consider several families of priors and suggest some cross-validational methods for choosing one that performs relatively well on the function at hand. As a design criterion, we use the expected reduction in the entropy of the random vector y(T*), where T* ⊂ T is a given finite set of "sites" (input configurations) at which predictions are to be made. We describe an exchange algorithm for constructing designs that are optimal with respect to this criterion. To demonstrate the use of these design and analysis methods, several examples are given, including one experiment on a computer model of a thermal energy storage device and another on an integrated circuit simulator.
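
    A compact sketch of the approach described above, using a zero-mean stationary Gaussian process with a squared-exponential covariance: the posterior mean interpolates the code output, and new design sites are chosen greedily where the predictive variance is largest, an entropy-motivated simplification of the paper's exchange algorithm. The test function and numbers are illustrative, not from the paper.

```python
# Gaussian-process sketch: posterior mean as interpolator, greedy
# maximum-variance site selection (a simplification of the entropy criterion).
import numpy as np

def cov(a, b, length=0.3):
    return np.exp(-0.5 * ((a[:, None] - b[None, :]) / length) ** 2)

def code(t):                      # stand-in for the computer code y(t)
    return np.sin(2 * np.pi * t)

sites = np.array([0.1, 0.9])      # current design
candidates = np.linspace(0.0, 1.0, 101)

for _ in range(4):                # add 4 sites greedily
    K = cov(sites, sites) + 1e-9 * np.eye(sites.size)
    k_star = cov(candidates, sites)
    var = 1.0 - np.einsum('ij,ij->i', k_star @ np.linalg.inv(K), k_star)
    sites = np.append(sites, candidates[np.argmax(var)])

K = cov(sites, sites) + 1e-9 * np.eye(sites.size)
alpha = np.linalg.solve(K, code(sites))
t_test = np.array([0.25, 0.5, 0.75])
posterior_mean = cov(t_test, sites) @ alpha    # interpolating predictor
print(np.round(sites, 2), np.round(posterior_mean, 3))
```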

  10. Development of oral sustained release rifampicin loaded chitosan nanoparticles by design of experiment.

    PubMed

    Patel, Bhavin K; Parikh, Rajesh H; Aboti, Pooja S

    2013-01-01

    Objective. The main objective of the present investigation was to develop and optimize oral sustained-release chitosan nanoparticles (CNs) of rifampicin by design of experiments (DOE). Methodology. CNs were prepared by a modified emulsion ionic gelation technique, in which a hydrophobic drug is included in the hydrophilic polymer matrix for rifampicin delivery. A 2^3 full-factorial design was employed, with chitosan concentration (X1), tripolyphosphate concentration (X2), and homogenization speed (X3) as independent variables, in order to achieve the desired particle size with maximum percent entrapment efficiency and drug loading. The design was validated by checkpoint analysis, and the formulation was optimized using the desirability function. Results. Particle size, drug entrapment efficiency, and drug loading for the optimized batch were found to be 221.9 nm, 44.17 ± 1.98% w/w, and 42.96 ± 2.91% w/w, respectively. In vitro release data of the optimized formulation showed an initial burst followed by slow sustained drug release. Kinetic drug release from CNs was best fitted by the Higuchi model. Conclusion. Design of experiments is an important tool for obtaining the desired characteristics of rifampicin-loaded CNs. The in vitro study suggests that oral sustained-release CNs might be an effective drug delivery system for tuberculosis.
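
    For readers unfamiliar with the 2^3 layout used above, the sketch below generates the eight coded runs and estimates main effects for a response. The factor labels follow the abstract, but the response values are invented for illustration.

```python
# Illustrative 2^3 full-factorial layout for three coded factors
# (X1 chitosan conc., X2 TPP conc., X3 homogenization speed) and a
# main-effect estimate for a response; response values are hypothetical.
import itertools
import numpy as np

runs = np.array(list(itertools.product([-1, 1], repeat=3)))  # 8 coded runs
particle_size = np.array([310, 280, 295, 260, 250, 230, 240, 215.0])  # fake data (nm)

# Main effect of each factor = mean(high level) - mean(low level).
for name, col in zip(("X1", "X2", "X3"), runs.T):
    effect = particle_size[col == 1].mean() - particle_size[col == -1].mean()
    print(f"{name} main effect: {effect:+.1f} nm")
```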

  11. Optimal experiment design for quantum state tomography: Fair, precise, and minimal tomography

    SciTech Connect

    Nunn, J.; Smith, B. J.; Puentes, G.; Walmsley, I. A.; Lundeen, J. S.

    2010-04-15

    Given an experimental setup and a fixed number of measurements, how should one take data to optimally reconstruct the state of a quantum system? The problem of optimal experiment design (OED) for quantum state tomography was first broached by Kosut et al.[R. Kosut, I. Walmsley, and H. Rabitz, e-print arXiv:quant-ph/0411093 (2004)]. Here we provide efficient numerical algorithms for finding the optimal design, and analytic results for the case of 'minimal tomography'. We also introduce the average OED, which is independent of the state to be reconstructed, and the optimal design for tomography (ODT), which minimizes tomographic bias. Monte Carlo simulations confirm the utility of our results for qubits. Finally, we adapt our approach to deal with constrained techniques such as maximum-likelihood estimation. We find that these are less amenable to optimization than cruder reconstruction methods, such as linear inversion.
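
    As a point of reference for the "cruder reconstruction methods" mentioned above, here is a minimal single-qubit linear-inversion sketch that rebuilds a density matrix from made-up measurement frequencies in the three Pauli bases; it is illustrative only and does not implement the paper's optimal designs.

```python
# Minimal single-qubit linear-inversion tomography (illustrative frequencies).
import numpy as np

I = np.eye(2)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]])
Z = np.array([[1, 0], [0, -1]], dtype=complex)

# Hypothetical fraction of "+1" outcomes in each Pauli basis.
p_plus = {"X": 0.71, "Y": 0.52, "Z": 0.93}
r = {k: 2 * p - 1 for k, p in p_plus.items()}   # Bloch-vector components

rho = 0.5 * (I + r["X"] * X + r["Y"] * Y + r["Z"] * Z)
print(np.round(rho, 3))
print("trace =", np.trace(rho).real)            # equals 1 by construction
```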

  12. Randomized block experimental designs can increase the power and reproducibility of laboratory animal experiments.

    PubMed

    Festing, Michael F W

    2014-01-01

    Randomized block experimental designs have been widely used in agricultural and industrial research for many decades. Usually they are more powerful, have higher external validity, are less subject to bias, and produce more reproducible results than the completely randomized designs typically used in research involving laboratory animals. Reproducibility can be further increased by using time as a blocking factor. These benefits can be achieved at no extra cost. A small experiment investigating the effect of an antioxidant on the activity of a liver enzyme in four inbred mouse strains, which had two replications (blocks) separated by a period of two months, illustrates this approach. The widespread failure to use these designs more widely in research involving laboratory animals has probably led to a substantial waste of animals, money, and scientific resources and slowed down the development of new treatments for human and animal diseases.
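
    A toy illustration of the blocking principle described above: a simulated block-to-block shift (for example, a drift between two replicate runs months apart) cancels out of the treatment comparison because treatments are randomized and compared within each block. The data are simulated, not taken from the paper.

```python
# Randomized block sketch: block effects cancel out of within-block comparisons.
import numpy as np

rng = np.random.default_rng(1)
treatments = ["control", "antioxidant"]
block_effect = {1: 0.0, 2: 3.0}        # e.g. a drift between the two replicate periods

records = []
for block, shift in block_effect.items():
    order = rng.permutation(treatments)            # randomize within the block
    for t in order:
        true_mean = 10.0 + (1.5 if t == "antioxidant" else 0.0)
        records.append((block, t, true_mean + shift + rng.normal(0, 0.5)))

# Treatment effect estimated from within-block differences (block effect cancels).
diffs = []
for block in block_effect:
    vals = {t: y for b, t, y in records if b == block}
    diffs.append(vals["antioxidant"] - vals["control"])
print(f"estimated treatment effect: {np.mean(diffs):+.2f} (true +1.50)")
```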

  13. Note: Design and development of wireless controlled aerosol sampling network for large scale aerosol dispersion experiments

    NASA Astrophysics Data System (ADS)

    Gopalakrishnan, V.; Subramanian, V.; Baskaran, R.; Venkatraman, B.

    2015-07-01

    A wireless-based, custom-built aerosol sampling network has been designed, developed, and implemented for environmental aerosol sampling. These aerosol sampling systems are used in a field measurement campaign in which sodium aerosol dispersion experiments have been conducted as part of environmental impact studies related to the sodium-cooled fast reactor. The sampling network contains 40 aerosol sampling units, each with a custom-built sampling head and wireless control networking designed with a Programmable System on Chip (PSoC™) and Xbee Pro RF modules. The base station control is designed using the graphical programming language LabVIEW. The sampling network is programmed to operate at a preset time, and the running status of the samplers in the network is visualized from the base station. The system is developed in such a way that it can be used for any other environmental sampling system deployed over a wide area and uneven terrain, where manual operation is difficult because of the requirement for simultaneous operation and status logging.

  14. Designs for highly nonlinear ablative Rayleigh-Taylor experiments on the National Ignition Facility

    NASA Astrophysics Data System (ADS)

    Casner, A.; Smalyuk, V. A.; Masse, L.; Igumenshchev, I.; Liberatore, S.; Jacquet, L.; Chicanne, C.; Loiseau, P.; Poujade, O.; Bradley, D. K.; Park, H. S.; Remington, B. A.

    2012-08-01

    We present two designs relevant to ablative Rayleigh-Taylor instability in transition from weakly nonlinear to highly nonlinear regimes at the National Ignition Facility [E. I. Moses, J. Phys.: Conf. Ser. 112, 012003 (2008)]. The sensitivity of nonlinear Rayleigh-Taylor instability physics to ablation velocity is addressed with targets driven by indirect drive, with stronger ablative stabilization, and by direct drive, with weaker ablative stabilization. The indirect drive design demonstrates the potential to reach a two-dimensional bubble-merger regime with a 20 ns duration drive at moderate radiation temperature. The direct drive design achieves a 3 to 5 times increased acceleration distance for the sample in comparison to previous experiments allowing at least 2 more bubble generations when starting from a three-dimensional broadband spectrum.

  15. Interim Service ISDN Satellite (ISIS) simulator development for advanced satellite designs and experiments

    NASA Technical Reports Server (NTRS)

    Pepin, Gerard R.

    1992-01-01

    The simulation development associated with the network models of both the Interim Service Integrated Services Digital Network (ISDN) Satellite (ISIS) and the Full Service ISDN Satellite (FSIS) architectures is documented. The ISIS Network Model design represents satellite systems like the Advanced Communications Technology Satellite (ACTS) orbiting switch. The FSIS architecture, the ultimate aim of this element of the Satellite Communications Applications Research (SCAR) Program, moves all control and switching functions on-board the next generation ISDN communications satellite. The technical and operational parameters for the advanced ISDN communications satellite design will be obtained from the simulation of ISIS and FSIS engineering software models for their major subsystems. Discrete event simulation experiments will be performed with these models using various traffic scenarios, design parameters, and operational procedures. The data from these simulations will be used to determine the engineering parameters for the advanced ISDN communications satellite.

  16. Note: Design and development of wireless controlled aerosol sampling network for large scale aerosol dispersion experiments

    SciTech Connect

    Gopalakrishnan, V.; Subramanian, V.; Baskaran, R.; Venkatraman, B.

    2015-07-15

    A wireless-based, custom-built aerosol sampling network has been designed, developed, and implemented for environmental aerosol sampling. These aerosol sampling systems are used in a field measurement campaign in which sodium aerosol dispersion experiments have been conducted as part of environmental impact studies related to the sodium-cooled fast reactor. The sampling network contains 40 aerosol sampling units, each with a custom-built sampling head and wireless control networking designed with a Programmable System on Chip (PSoC™) and Xbee Pro RF modules. The base station control is designed using the graphical programming language LabVIEW. The sampling network is programmed to operate at a preset time, and the running status of the samplers in the network is visualized from the base station. The system is developed in such a way that it can be used for any other environmental sampling system deployed over a wide area and uneven terrain, where manual operation is difficult because of the requirement for simultaneous operation and status logging.

  17. Design optimization of RF lines in vacuum environment for the MITICA experiment.

    PubMed

    De Muri, Michela; Pavei, Mauro; Rossetto, Federico; Marcuzzi, Diego; Miorin, Enrico; Deambrosis, Silvia M

    2016-02-01

    This contribution concerns the Radio Frequency (RF) transmission line of the Megavolt ITER Injector and Concept Advancement (MITICA) experiment. The original design considered 1-5/8″ copper coaxial lines, but thermal simulations under operating conditions showed steady-state maximum line temperatures not compatible with the prescriptions of the component manufacturer. Hence, an optimization of the design was necessary. Enhancing thermal radiation and increasing the conductor size were considered for design optimization: thermal analyses were carried out to calculate the temperature of the MITICA RF lines during operation, as a function of the emissivity value and of other geometrical parameters. Five coating products intended to increase the conductor surface emissivity were tested, measuring the outgassing behavior of the selected products and the emissivity values obtained.
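
    To give a feel for the emissivity lever discussed above, the sketch below applies a bare Stefan-Boltzmann balance to the outer conductor of a 1-5/8″ line; the temperatures, wall assumption, and per-metre geometry are hypothetical and not MITICA design values.

```python
# Illustrative estimate of how surface emissivity changes radiative heat
# rejection from a coaxial outer conductor (Stefan-Boltzmann balance only;
# all numbers are hypothetical, not MITICA design values).
import math

SIGMA = 5.670374419e-8            # W m^-2 K^-4
d_outer = 0.0413                  # 1-5/8 in line outer diameter, m
length = 1.0                      # per metre of line
area = math.pi * d_outer * length

T_line, T_wall = 400.0, 320.0     # conductor and surrounding wall temperatures, K
for eps in (0.05, 0.3, 0.8):      # bare copper vs. high-emissivity coatings
    q = eps * SIGMA * area * (T_line**4 - T_wall**4)
    print(f"emissivity {eps:.2f}: ~{q:.0f} W radiated per metre")
```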

  18. Optimizing Mass Spectrometry Analyses: A Tailored Review on the Utility of Design of Experiments

    NASA Astrophysics Data System (ADS)

    Hecht, Elizabeth S.; Oberg, Ann L.; Muddiman, David C.

    2016-05-01

    Mass spectrometry (MS) has emerged as a tool that can analyze nearly all classes of molecules, with its scope rapidly expanding in the areas of post-translational modifications, MS instrumentation, and many others. Yet integration of novel analyte preparatory and purification methods with existing or novel mass spectrometers can introduce new challenges for MS sensitivity. The mechanisms that govern detection by MS are particularly complex and interdependent, including ionization efficiency, ion suppression, and transmission. Performance of both off-line and MS methods can be optimized separately or, when appropriate, simultaneously through statistical designs, broadly referred to as "design of experiments" (DOE). The following review provides a tutorial-like guide into the selection of DOE for MS experiments, the practices for modeling and optimization of response variables, and the available software tools that support DOE implementation in any laboratory. This review comes 3 years after the latest DOE review (Hibbert DB, 2012), which provided a comprehensive overview on the types of designs available and their statistical construction. Since that time, new classes of DOE, such as the definitive screening design, have emerged and new calls have been made for mass spectrometrists to adopt the practice. Rather than exhaustively cover all possible designs, we have highlighted the three most practical DOE classes available to mass spectrometrists. This review further differentiates itself by providing expert recommendations for experimental setup and defining DOE entirely in the context of three case-studies that highlight the utility of different designs to achieve different goals. A step-by-step tutorial is also provided.

  19. Pressurization Risk Assessment of CO2 Reservoirs Utilizing Design of Experiments and Response Surface Methods

    NASA Astrophysics Data System (ADS)

    Guyant, E.; Han, W. S.; Kim, K. Y.; Park, E.; Han, K.

    2015-12-01

    Monitoring of pressure buildup can provide explicit information on reservoir integrity and is an appealing tool; however, pressure variation depends on a variety of factors, causing high uncertainty in pressure predictions. This work evaluated pressurization of a reservoir system in the presence of leakage pathways, as well as the effects of compartmentalization of the reservoir, utilizing design of experiments (Definitive Screening, Box-Behnken, Central Composite, and Latin Hypercube designs) and response surface methods. Two models were developed: 1) an idealized injection scenario to evaluate the performance of multiple designs, and 2) a complex injection scenario implementing the best-performing design to investigate pressurization of the reservoir system. A holistic evaluation of scenario 1 determined that the Central Composite design would be used for the complex injection scenario. The complex scenario evaluated five risk factors: reservoir, seal, leakage pathway, and fault permeabilities, and the horizontal position of the pathway. A total of 60 response surface models (RSMs) were developed for the complex scenario, with an average R^2 of 0.95 and an NRMSE of 0.067. Sensitivity to the input factors was dynamic through space and time; at the earliest time (0.05 years) the reservoir permeability was dominant, and for later times (>0.5 years) the fault permeability became dominant for all locations. The RSMs were then used to conduct a Monte Carlo analysis to further analyze pressurization risks, identifying the P10, P50, and P90 values. This identified the in-zone (lower) P90 values as 2.16, 1.77, and 1.53 MPa and the above-zone values as 1.35, 1.23, and 1.09 MPa for monitoring locations 1, 2, and 3, respectively. In summary, the design of experiments and response surface methods allowed an efficient sensitivity and uncertainty analysis to be conducted, permitting a complete evaluation of the pressurization across the entire parameter space.
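
    A stripped-down sketch of the workflow described above: design runs of a stand-in "simulator", a quadratic response surface fitted by least squares, and a Monte Carlo sweep of the cheap surface to read off P10/P50/P90 of predicted pressure buildup. The simulator, factors, and coefficients are invented for illustration.

```python
# Response-surface + Monte Carlo sketch (all inputs and coefficients are made up).
import numpy as np

rng = np.random.default_rng(0)

def simulator(logk_res, logk_fault):
    """Stand-in for a reservoir simulation returning pressure buildup (MPa)."""
    return 2.0 - 0.4 * logk_res + 0.6 * logk_fault + 0.05 * logk_res * logk_fault

# DOE runs (here a simple random design over coded [-1, 1] factors).
X = rng.uniform(-1, 1, size=(30, 2))
y = simulator(X[:, 0], X[:, 1])

# Quadratic response surface via least squares.
def features(X):
    a, b = X[:, 0], X[:, 1]
    return np.column_stack([np.ones(len(X)), a, b, a*b, a**2, b**2])

beta, *_ = np.linalg.lstsq(features(X), y, rcond=None)

# Monte Carlo on the cheap RSM and percentile summary.
Xmc = rng.uniform(-1, 1, size=(100_000, 2))
pred = features(Xmc) @ beta
p10, p50, p90 = np.percentile(pred, [10, 50, 90])
print(f"P10={p10:.2f}  P50={p50:.2f}  P90={p90:.2f} MPa")
```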

  20. Initial Scaling Studies and Conceptual Thermal Fluids Experiments for the Prismatic NGNP Point Design

    SciTech Connect

    D. M. McEligot; G. E. McCreery

    2004-09-01

    The objective of this report is to document the initial high temperature gas reactor scaling studies and conceptual experiment design for gas flow and heat transfer. The general approach of the project is to develop new benchmark experiments for assessment in parallel with CFD and coupled CFD/ATHENA/RELAP5-3D calculations for the same geometry. Two aspects of the complex flow in an NGNP are being addressed: (1) flow and thermal mixing in the lower plenum ("hot streaking" issue) and (2) turbulence and resulting temperature distributions in reactor cooling channels ("hot channel" issue). Current prismatic NGNP concepts are being examined to identify their proposed flow conditions and geometries over the range from normal operation to decay heat removal in a pressurized cooldown. Approximate analyses are being applied to determine key non-dimensional parameters and their magnitudes over this operating range. For normal operation, the flow in the coolant channels can be considered to be dominant forced convection with slight transverse property variation. The flow in the lower plenum can locally be considered to be a situation of multiple buoyant jets into a confined density-stratified crossflow -- with obstructions. Experiments are needed for the combined features of the lower plenum flows. Missing from the typical jet experiments are interactions with nearby circular posts and with vertical posts in the vicinity of vertical walls - with near stagnant surroundings at one extreme and significant crossflow at the other. Two heat transfer experiments are being considered. One addresses the "hot channel" problem, if necessary. The second experiment will treat heated jets entering a model plenum. Unheated MIR (Matched-Index-of-Refraction) experiments are first steps when the geometry is complicated. One does not want to use a computational technique which will not even handle constant properties properly. The MIR experiment will simulate flow features of the paths of jets

  1. Graphic design and scientific research: the experience of the INGV Laboratorio Grafica e Immagini

    NASA Astrophysics Data System (ADS)

    Riposati, Daniela; D'Addezio, Giuliana; Chesi, Angela; Di Laura, Francesca; Palone, Sabrina

    2016-04-01

    The Laboratorio Grafica e Immagini is the INGV reference structure for graphic and visual communication supporting institutional and research activities. Part of its activity is focused on producing materials for the INGV Educational and Outreach projects on the main themes of geophysics and natural hazards. The forefront results of research activity are, in fact, periodically transferred to the public through an intense and comprehensive plan of scientific dissemination. In 10 years of activity, the Laboratorio has become an essential point of reference for this production, widely known within the scientific community. These positive experiences are the result of a close relationship between graphic design and scientific research, in particular the collaborative work between designers and researchers. In projects such as the realization of a museum exhibition or the production of illustrative brochures, generally designed for a broad public, the goal is to make understanding easier and to support the scientific message, making concepts enjoyable through the emotional involvement that visual images can arouse. Our graphic and editorial products combine signs and images, using different tools on different media (colors, lettering, graphic design, visual design, web design, etc.), to create a strong "INGV style" identity that makes them easily recognizable in Educational and Outreach projects: in one word, branding. For example, a project product package might include a logo or other artwork, organized text, and pure design elements such as shapes and colour, which unify the piece. Colour is used not only to help the "brand" stand out internationally but also, in our case, to achieve a unifying outcome across all the INGV sections. We also analysed the restyling project of different materials, one of the most important features of graphic design

  2. Application of Design of Experiments and Surrogate Modeling within the NASA Advanced Concepts Office, Earth-to-Orbit Design Process

    NASA Technical Reports Server (NTRS)

    Zwack, Mathew R.; Dees, Patrick D.; Holt, James B.

    2016-01-01

    Decisions made during early conceptual design have a large impact upon the expected life-cycle cost (LCC) of a new program. It is widely accepted that up to 80% of such cost is committed during these early design phases.1 Therefore, to help minimize LCC, decisions made during conceptual design must be based upon as much information as possible. To aid in the decision making for new launch vehicle programs, the Advanced Concepts Office (ACO) at NASA Marshall Space Flight Center (MSFC) provides rapid turnaround pre-phase A and phase A concept definition studies. The ACO team utilizes a proven set of tools to provide customers with a full vehicle mass breakdown to tertiary subsystems, preliminary structural sizing based upon worst-case flight loads, and trajectory optimization to quantify integrated vehicle performance for a given mission.2 Although the team provides rapid turnaround for single vehicle concepts, the scope of the trade space can be limited due to analyst availability and the manpower requirements for manual execution of the analysis tools. In order to enable exploration of a broader design space, the ACO team has implemented an Advanced Design Methods (ADM) based approach. This approach applies the concepts of Design of Experiments (DOE) and surrogate modeling to more exhaustively explore the trade space and provide the customer with additional design information to inform decision making. This paper will first discuss the automation of the ACO tool set, which represents a majority of the development effort. In order to fit a surrogate model within tolerable error bounds, a number of DOE cases are needed. This number will scale with the number of variable parameters desired and the complexity of the system's response to those variables. For all but the smallest design spaces, the number of cases required cannot be produced within an acceptable timeframe using a manual process. Therefore, automation of the tools was a key enabler for the successful
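
    One common way to generate the DOE cases mentioned above is Latin hypercube sampling; the sketch below is a minimal generator over a hypothetical three-variable trade space (the paper does not state which design type or variable ranges were used).

```python
# Minimal Latin hypercube sampler for generating DOE cases over a trade space
# (a common choice for surrogate fitting; bounds below are made up).
import numpy as np

def latin_hypercube(n_cases, bounds, rng):
    """One stratified sample per interval in each dimension, randomly paired."""
    d = len(bounds)
    u = (rng.random((n_cases, d)) + np.arange(n_cases)[:, None]) / n_cases
    for j in range(d):                       # shuffle strata independently per dim
        u[:, j] = rng.permutation(u[:, j])
    lo = np.array([b[0] for b in bounds])
    hi = np.array([b[1] for b in bounds])
    return lo + u * (hi - lo)

rng = np.random.default_rng(42)
bounds = [(6.0, 10.0),      # e.g. core stage diameter, m    (hypothetical)
          (0.25, 0.40),     # propellant mass fraction slack (hypothetical)
          (1.0e6, 2.5e6)]   # booster thrust, N              (hypothetical)
cases = latin_hypercube(200, bounds, rng)
print(cases.shape, cases[:2].round(2))
```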

  3. Design of physical cloud seeding experiments for the Arizona atmospheric modification research program

    NASA Astrophysics Data System (ADS)

    Super, A. B.; Medina, J. G.; McPartland, J. T.

    1991-02-01

    Cloud seeding experiments were designed by the Bureau of Reclamation for winter orographic cloud systems over the Mogollon Rim of Arizona. The experiments are intended to test whether key physical processes proceed as hypothesized during both ground-based and aircraft seeding with silver iodide. The experiments are also intended to document each significant link in the chain of physical events following release of seeding material up to, and including, snowfall at the ground at a small research area about 60 km south-southeast of Flagstaff. The physical experimentation should lead to a substantially improved understanding of winter seeding potential in clouds over Arizona's higher terrain. Such understanding and documentation are a logical prelude to any future experimentation intended to determine seeding impacts over a large area during several winters. Several analysis approaches are suggested to evaluate the physical experiments, ranging from detailed case-study examination to exploratory statistical analysis of experiments pooled into similar classes. Experimental coordination and organization are addressed, and budgets are presented for a five-year program.

  4. Efficient Optimization of Stimuli for Model-Based Design of Experiments to Resolve Dynamical Uncertainty

    PubMed Central

    Mdluli, Thembi; Buzzard, Gregery T.; Rundell, Ann E.

    2015-01-01

    This model-based design of experiments (MBDOE) method determines the magnitudes of the experimental stimuli to apply and the associated measurements that should be taken to optimally constrain the uncertain dynamics of a biological system under study. The ideal global solution for this experiment design problem is generally computationally intractable because of parametric uncertainties in the mathematical model of the biological system. Others have addressed this issue by limiting the solution to a local estimate of the model parameters. Here we present an approach that is independent of the local parameter constraint. This approach is made computationally efficient and tractable by the use of: (1) sparse grid interpolation that approximates the biological system dynamics, (2) representative parameters that uniformly represent the data-consistent dynamical space, and (3) probability weights of the represented experimentally distinguishable dynamics. Our approach identifies data-consistent representative parameters using sparse grid interpolants, constructs the optimal input sequence from a greedy search, and defines the associated optimal measurements using a scenario tree. We explore the optimality of this MBDOE algorithm using a 3-dimensional Hes1 model and a 19-dimensional T-cell receptor model. The 19-dimensional T-cell model also demonstrates the MBDOE algorithm's scalability to higher dimensions. In both cases, the dynamical uncertainty region that bounds the trajectories of the target system states was reduced by as much as 86% and 99%, respectively, after completing the designed experiments in silico. Our results suggest that, for resolving dynamical uncertainty, the ability to design an input sequence paired with its associated measurements is particularly important when limited by the number of measurements. PMID:26379275

  5. EDITORIAL Wireless sensor networks: design for real-life deployment and deployment experiences

    NASA Astrophysics Data System (ADS)

    Gaura, Elena; Roedig, Utz; Brusey, James

    2010-12-01

    modalities and (iv) system solutions with high end-user added value and cost benefits. The common thread is deployment and deployment evaluation. In particular, satisfaction of application requirements, involvement of the end-user in the design and deployment process, satisfactory system performance, and user acceptance are concerns addressed in many of the contributions. The contributions form a valuable set, which helps to identify the priorities for research in this burgeoning area: Robust, reliable, and efficient data collection in embedded wireless multi-hop networks is an essential element in creating a true deploy-and-forget user experience. Maintaining full connectivity within a WSN, in a real-world environment populated by other WSNs, WiFi networks, or Bluetooth devices that constitute sources of interference, is a key element in any application, but more so for those that are safety-critical, such as disaster response. Awareness of the effects of the wireless channel, physical position, and line-of-sight on received signal strength in real-world, outdoor environments will shape the design of many outdoor applications. Thus, the quantification of such effects is valuable knowledge for designers. Sensor failure detection, scalability, and commercialization are common challenges in many long-term monitoring applications; transferable solutions are evidenced here in the context of pollutant detection and water quality. Innovative, alternative thinking is often needed to achieve the desired long-lived networks when power-hungry sensors are foreseen as components; in some instances, the very problems of wireless technology, such as RF irregularity, can be transformed into advantages. The importance of an iterative design and evaluation methodology—from analysis to simulation to real-life deployment—should be well understood by all WSN developers. The value of this is highlighted in the context of a challenging WPAN video-surveillance application based on a novel Nomadic Access

  6. Improving Proteome Coverage on a LTQ-Orbitrap Using Design of Experiments

    NASA Astrophysics Data System (ADS)

    Andrews, Genna L.; Dean, Ralph A.; Hawkridge, Adam M.; Muddiman, David C.

    2011-04-01

    Design of experiments (DOE) was used to determine improved settings for an LTQ-Orbitrap XL to maximize proteome coverage of Saccharomyces cerevisiae. A total of nine instrument parameters were evaluated, with the best values affording an increase of approximately 60% in proteome coverage. Utilizing JMP software, two DOE screening design tables were generated and used to specify parameter values for instrument methods. DOE 1, a fractional factorial design, required 32 methods and fully resolved the investigation of six instrument parameters in only half the time necessary for a full factorial design of the same resolution. It was advantageous to complete a full factorial design for the analysis of three additional instrument parameters. Measured with a maximum of 1% false discovery rate, protein groups, unique peptides, and spectral counts gauged instrument performance. Randomized triplicate nanoLC-LTQ-Orbitrap XL MS/MS analysis of the S. cerevisiae digest demonstrated that the following five parameters significantly influenced proteome coverage of the sample: (1) maximum ion trap ionization time; (2) monoisotopic precursor selection; (3) number of MS/MS events; (4) capillary temperature; and (5) tube lens voltage. Minimal influence on the proteome coverage was observed for the remaining four parameters (dynamic exclusion duration, resolving power, minimum count threshold to trigger an MS/MS event, and normalized collision energy). The DOE approach represents a time- and cost-effective method for empirically optimizing MS-based proteomics workflows, including sample preparation, LC conditions, and multiple instrument platforms.
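
    To illustrate the kind of half-fraction referred to above (32 runs resolving six two-level factors), the sketch below builds a 2^(6-1) design by generating the sixth factor from the product of the first five; this is a generic construction, not the specific JMP tables used in the study.

```python
# Generic 2^(6-1) fractional factorial: full 2^5 design plus generated factor F = ABCDE.
import itertools
import numpy as np

base = np.array(list(itertools.product([-1, 1], repeat=5)))  # 32 runs, factors A-E
F = np.prod(base, axis=1, keepdims=True)                     # generated sixth factor
design = np.hstack([base, F])                                # 32 x 6 coded matrix

print(design.shape)                          # (32, 6): half of the 64-run full factorial
print(np.all(np.prod(design, axis=1) == 1))  # defining relation I = ABCDEF holds
```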

  7. Optimizing the experimental design of soil columns in saturated and unsaturated transport experiments.

    PubMed

    Lewis, Jeffrey; Sjöstrom, Jan

    2010-06-25

    Soil column experiments in both the saturated and unsaturated regimes are widely used for applied and theoretical studies in such diverse fields as transport model evaluation, fate and transport of pesticides, explosives, microbes, heavy metals and non aqueous phase liquids, and for evapotranspiration studies. The apparent simplicity of constructing soil columns conceals a number of technical issues which can seriously affect the outcome of an experiment, such as the presence or absence of macropores, artificial preferential flow paths, non-ideal infiltrate injection and unrealistic moisture regimes. This review examines the literature to provide an analysis of the state of the art for constructing both saturated and unsaturated soil columns. Common design challenges are discussed and best practices for potential solutions are presented. This article discusses both basic principles and the practical advantages and disadvantages of various experimental approaches. Both repacked and monolith-type columns are discussed. The information in this review will assist soil scientists, hydrogeologists and environmental professionals in optimizing the construction and operation of soil column experiments in order to achieve their objectives, while avoiding serious design flaws which can compromise the integrity of their results. PMID:20452088

  8. Anvil cell gasket design for high pressure nuclear magnetic resonance experiments beyond 30 GPa

    SciTech Connect

    Meier, Thomas; Haase, Jürgen

    2015-12-15

    Nuclear magnetic resonance (NMR) experiments are reported at up to 30.5 GPa of pressure using radiofrequency (RF) micro-coils with anvil cell designs. These are the highest pressures ever reported with NMR, and are made possible through an improved gasket design based on nano-crystalline powders embedded in epoxy resin. Cubic boron-nitride (c-BN), corundum (α-Al2O3), or diamond based composites have been tested, also in NMR experiments. These composite gaskets lose about 1/2 of their initial height up to 30.5 GPa, allowing for larger sample quantities and preventing damages to the RF micro-coils compared to precipitation hardened CuBe gaskets. It is shown that NMR shift and resolution are less affected by the composite gaskets as compared to the more magnetic CuBe. The sensitivity can be as high as at normal pressure. The new, inexpensive, and simple to engineer gaskets are thus superior for NMR experiments at high pressures.

  9. Assessment of the recycling potential of fresh concrete waste using a factorial design of experiments.

    PubMed

    Correia, S L; Souza, F L; Dienstmann, G; Segadães, A M

    2009-11-01

    Recycling of industrial wastes and by-products can help reduce the cost of waste treatment prior to disposal and eventually preserve natural resources and energy. To assess the recycling potential of a given waste, it is important to select a tool capable of giving clear indications either way, with the least time and work consumption, as is the case of modelling the system properties using the results obtained from statistical design of experiments. In this work, the aggregate reclaimed from the mud that results from washout and cleaning operations of fresh concrete mixer trucks (fresh concrete waste, FCW) was recycled into new concrete with various water/cement ratios, as replacement of natural fine aggregates. A 3^2 factorial design of experiments was used to model fresh concrete consistency index and hardened concrete water absorption and 7- and 28-day compressive strength, as functions of FCW content and water/cement ratio, and the resulting regression equations and contour plots were validated with confirmation experiments. The results showed that the fresh concrete workability worsened with the increase in FCW content but the water absorption (5-10 wt.%), 7-day compressive strength (26-36 MPa) and 28-day compressive strength (32-44 MPa) remained within the specified ranges, thus demonstrating that the aggregate reclaimed from FCW can be recycled into new concrete mixtures with lower natural aggregate content. PMID:19596189

  10. Optimizing the experimental design of soil columns in saturated and unsaturated transport experiments.

    PubMed

    Lewis, Jeffrey; Sjöstrom, Jan

    2010-06-25

    Soil column experiments in both the saturated and unsaturated regimes are widely used for applied and theoretical studies in such diverse fields as transport model evaluation, fate and transport of pesticides, explosives, microbes, heavy metals and non aqueous phase liquids, and for evapotranspiration studies. The apparent simplicity of constructing soil columns conceals a number of technical issues which can seriously affect the outcome of an experiment, such as the presence or absence of macropores, artificial preferential flow paths, non-ideal infiltrate injection and unrealistic moisture regimes. This review examines the literature to provide an analysis of the state of the art for constructing both saturated and unsaturated soil columns. Common design challenges are discussed and best practices for potential solutions are presented. This article discusses both basic principles and the practical advantages and disadvantages of various experimental approaches. Both repacked and monolith-type columns are discussed. The information in this review will assist soil scientists, hydrogeologists and environmental professionals in optimizing the construction and operation of soil column experiments in order to achieve their objectives, while avoiding serious design flaws which can compromise the integrity of their results.

  11. Anvil cell gasket design for high pressure nuclear magnetic resonance experiments beyond 30 GPa.

    PubMed

    Meier, Thomas; Haase, Jürgen

    2015-12-01

    Nuclear magnetic resonance (NMR) experiments are reported at up to 30.5 GPa of pressure using radiofrequency (RF) micro-coils with anvil cell designs. These are the highest pressures ever reported with NMR, and are made possible through an improved gasket design based on nano-crystalline powders embedded in epoxy resin. Cubic boron-nitride (c-BN), corundum (α-Al2O3), or diamond based composites have been tested, also in NMR experiments. These composite gaskets lose about 1/2 of their initial height up to 30.5 GPa, allowing for larger sample quantities and preventing damages to the RF micro-coils compared to precipitation hardened CuBe gaskets. It is shown that NMR shift and resolution are less affected by the composite gaskets as compared to the more magnetic CuBe. The sensitivity can be as high as at normal pressure. The new, inexpensive, and simple to engineer gaskets are thus superior for NMR experiments at high pressures. PMID:26724046

  12. Anvil cell gasket design for high pressure nuclear magnetic resonance experiments beyond 30 GPa

    NASA Astrophysics Data System (ADS)

    Meier, Thomas; Haase, Jürgen

    2015-12-01

    Nuclear magnetic resonance (NMR) experiments are reported at up to 30.5 GPa of pressure using radiofrequency (RF) micro-coils with anvil cell designs. These are the highest pressures ever reported with NMR, and are made possible through an improved gasket design based on nano-crystalline powders embedded in epoxy resin. Cubic boron-nitride (c-BN), corundum (α-Al2O3), or diamond based composites have been tested, also in NMR experiments. These composite gaskets lose about 1/2 of their initial height up to 30.5 GPa, allowing for larger sample quantities and preventing damages to the RF micro-coils compared to precipitation hardened CuBe gaskets. It is shown that NMR shift and resolution are less affected by the composite gaskets as compared to the more magnetic CuBe. The sensitivity can be as high as at normal pressure. The new, inexpensive, and simple to engineer gaskets are thus superior for NMR experiments at high pressures.

  13. Mechanical Design and Development of TES Bolometer Detector Arrays for the Advanced ACTPol Experiment

    NASA Technical Reports Server (NTRS)

    Ward, Jonathan T.; Austermann, Jason; Beall, James A.; Choi, Steve K.; Crowley, Kevin T.; Devlin, Mark J.; Duff, Shannon M.; Gallardo, Patricio M.; Henderson, Shawn W.; Ho, Shuay-Pwu Patty; Hilton, Gene; Hubmayr, Johannes; Khavari, Niloufar; Klein, Jeffrey; Koopman, Brian J.; Li, Dale; McMahon, Jeffrey; Mumby, Grace; Nati, Federico; Wollack, Edward J.

    2016-01-01

    The next generation Advanced ACTPol (AdvACT) experiment is currently underway and will consist of four Transition Edge Sensor (TES) bolometer arrays, with three operating together, totaling 5800 detectors on the sky. Building on experience gained with the ACTPol detector arrays, AdvACT will utilize various new technologies, including 150 mm detector wafers equipped with multichroic pixels, allowing for a more densely packed focal plane. Each set of detectors includes a feedhorn array of stacked silicon wafers which form a spline profile leading to each pixel. This is then followed by a waveguide interface plate, detector wafer, back short cavity plate, and backshort cap. Each array is housed in a custom designed structure manufactured from high purity copper and then gold plated. In addition to the detector array assembly, the array package also encloses cryogenic readout electronics. We present the full mechanical design of the AdvACT high frequency (HF) detector array package along with a detailed look at the detector array stack assemblies. This experiment will also make use of extensive hardware and software previously developed for ACT, which will be modified to incorporate the new AdvACT instruments. Therefore, we discuss the integration of all AdvACT arrays with pre-existing ACTPol infrastructure.

  14. Optimization of biomolecule separation by combining microscale filtration and design-of-experiment methods.

    PubMed

    Kazemi, Amir S; Kawka, Karina; Latulippe, David R

    2016-10-01

    There is considerable interest in developing microscale (i.e., high-throughput) methods that enable multiple filtration experiments to be run in parallel with smaller sample amounts and thus reduce the overall required time and associated cost to run the filtration tests. Previous studies to date have focused on simply evaluating the filtration capacity, not the separation performance. In this work, the stirred-well filtration (SWF) method was used in combination with design-of-experiment (DOE) methods to optimize the separation performance for three binary mixtures of bio-molecules: protein-protein, protein-polysaccharide, and protein-DNA. Using the parallel based format of the SWF method, eight constant-flux ultrafiltration experiments were conducted at once to study the effects of stirring conditions, permeate flux, and/or solution conditions (pH, ionic strength). Four separate filtration tests were conducted for each combination of process variables; in total, over 100 separate tests were conducted. The sieving coefficient and selectivity results are presented to match the DOE design format and enable a greater understanding of the effects of the different process variables that were studied. The method described herein can be used to rapidly determine the optimal combination of process factors that give the best separation performance for a range of membrane-based separations applications and thus obviate the need to run a large number of traditional lab-scale tests. Biotechnol. Bioeng. 2016;113: 2131-2139. © 2016 Wiley Periodicals, Inc. PMID:27563852
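
    For context on the reported metrics, here is a minimal sketch of how an observed sieving coefficient and selectivity are typically computed for a binary mixture; the concentrations are hypothetical, not data from this study.

```python
# Sieving coefficient and selectivity as typically defined for ultrafiltration
# (illustrative concentrations; not data from the study above).
def sieving_coefficient(c_permeate, c_feed):
    """S = permeate concentration / feed concentration (0 = fully retained)."""
    return c_permeate / c_feed

# Hypothetical binary protein mixture at one set of process conditions.
s_product  = sieving_coefficient(c_permeate=0.80, c_feed=1.00)   # passes the membrane
s_impurity = sieving_coefficient(c_permeate=0.05, c_feed=1.00)   # mostly retained

selectivity = s_product / s_impurity
print(f"S_product={s_product:.2f}, S_impurity={s_impurity:.2f}, selectivity={selectivity:.0f}")
```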

  16. The Role of Design-of-Experiments in Managing Flow in Compact Air Vehicle Inlets

    NASA Technical Reports Server (NTRS)

    Anderson, Bernhard H.; Miller, Daniel N.; Gridley, Marvin C.; Agrell, Johan

    2003-01-01

    The purpose of this study is to demonstrate the viability and economy of Design-of-Experiments methodologies for arriving at microscale secondary flow control array designs that maintain optimal inlet performance over a wide range of the mission variables, and to explore how these statistical methods provide a better understanding of the management of flow in compact air vehicle inlets. These statistical design concepts were used to investigate the robustness properties of low-unit-strength micro-effector arrays. Low-unit-strength micro-effectors are micro-vanes set at very low angles of incidence with very long chord lengths. They were designed to influence the near-wall inlet flow over an extended streamwise distance, and their advantage lies in low total pressure loss and high effectiveness in managing engine face distortion. The term robustness is used in this paper in the same sense as in the industrial problem-solving community: it refers to minimizing the effects of the hard-to-control factors that influence the development of a product or process. In robustness engineering, the effects of the hard-to-control factors are often called noise, and the hard-to-control factors themselves are referred to as the environmental variables or sometimes as the Taguchi noise variables. Hence robust optimization refers to minimizing the effects of the environmental or noise variables on the development (design) of a product or process. In the management of flow in compact inlets, the environmental or noise variables can be identified with the mission variables. This paper therefore formulates a statistical design methodology that minimizes the impact of variations in the mission variables on inlet performance and demonstrates that these statistical design concepts can lead to simpler inlet flow management systems.
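
    The robust-optimization idea described above, treating the mission variables as Taguchi noise variables and preferring designs whose performance is least sensitive to them, can be sketched as follows. The candidate designs, the mission-variable levels, and the evaluate_inlet() placeholder are assumptions made for illustration; only the larger-is-better Taguchi signal-to-noise ratio is a standard formula.

```python
# Illustrative sketch only: Taguchi-style robust ranking of candidate
# micro-effector array designs across the "noise" (mission) variables.
# Candidate designs, mission points, and evaluate_inlet() are hypothetical.
import math
from itertools import product

candidate_designs = ["array_A", "array_B", "array_C"]
mission_variables = {                       # assumed noise-variable levels
    "mach": (0.6, 0.8),
    "angle_of_attack_deg": (0.0, 6.0),
    "mass_flow_ratio": (0.85, 1.0),
}

def evaluate_inlet(design, mach, aoa, mfr):
    """Placeholder for a CFD run or table lookup returning total-pressure
    recovery (larger is better) for one design at one mission point."""
    return 0.95  # dummy value

def taguchi_sn_larger_is_better(values):
    # S/N = -10 * log10( mean(1 / y_i^2) )
    return -10.0 * math.log10(sum(1.0 / v**2 for v in values) / len(values))

ranked = []
for design in candidate_designs:
    recoveries = [evaluate_inlet(design, *point)
                  for point in product(*mission_variables.values())]
    ranked.append((taguchi_sn_larger_is_better(recoveries), design))

# Higher S/N means performance that is less sensitive to the mission variables.
for sn, design in sorted(ranked, reverse=True):
    print(f"{design}: S/N = {sn:.2f} dB")
```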

  17. Enhanced ergonomics approaches for product design: a user experience ecosystem perspective and case studies.

    PubMed

    Xu, Wei

    2014-01-01

    This paper first discusses the major inefficiencies of current human factors and ergonomics (HFE) approaches in (1) delivering an optimal end-to-end user experience (UX) to users of a solution across its lifecycle stages; (2) strategically influencing the product business and technology capability roadmaps from a UX perspective; and (3) proactively identifying new market opportunities and influencing the platform architecture capabilities on which the UX of end products relies. In response to these challenges, three case studies are presented to demonstrate how enhanced ergonomics design approaches have effectively addressed them. The enhanced ergonomics design approaches are then conceptualised within a user-experience ecosystem (UXE) framework, from a UX ecosystem perspective. Finally, evidence supporting the UXE, its advantages, the formalised process for executing UXE, and methodological considerations are discussed. Practitioner Summary: This paper presents enhanced ergonomics approaches to product design via three case studies that effectively address current HFE challenges by leveraging a systematic end-to-end UX approach, UX roadmaps, and emerging UX associated with prioritised user needs and usages. Thus, HFE professionals can be more strategic, creative and influential. PMID:24405167

  19. The Design and Monitoring of the Timing and Synchronization System at the NOvA Experiment

    NASA Astrophysics Data System (ADS)

    Vasel, Justin; NOvA Collaboration

    2016-03-01

    NOvA is an accelerator-based, long-baseline neutrino oscillation experiment designed to probe the mass hierarchy and mixing structure of the neutrino sector. The experiment consists of a near detector at Fermilab and a far detector 810 km away in northern Minnesota positioned to receive neutrinos from Fermilab's NuMI beam. A GPS-based timing system has been designed and built to synchronize the 344,064 far detector readout elements and 20,192 near detector readout elements to an absolute timing precision that provides a channel-to-channel variation of less than 10 ns. This is done while simultaneously synchronizing the readout timing of the near and far detectors to the Fermilab accelerator complex to allow for the detection of the individual neutrino beam spills in each of the detectors. This presentation will outline the design of NOvA's timing system and discuss the means by which we monitor its performance to ensure the quality of the physics data being collected.
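
    A minimal sketch of the kind of bookkeeping such a timing system implies, applying per-channel calibration offsets to GPS-referenced timestamps and selecting hits that fall inside a beam-spill window, is shown below. It is purely illustrative and not NOvA software; the channel offsets, the assumed ~10 microsecond spill window, and the Hit structure are hypothetical.

```python
# Illustrative sketch only (not NOvA code): apply per-channel timing
# calibration offsets to raw readout timestamps and select hits that fall
# inside a beam-spill window. All names and numbers are assumptions.
from dataclasses import dataclass

SPILL_WINDOW_NS = 10_000          # assumed ~10 us spill duration

@dataclass
class Hit:
    channel: int
    raw_time_ns: int              # GPS-referenced timestamp from the readout

def calibrated_time(hit, channel_offsets_ns):
    """Correct a raw timestamp by the channel's measured offset so that
    channel-to-channel variation stays within a tight (here ~10 ns) budget."""
    return hit.raw_time_ns - channel_offsets_ns.get(hit.channel, 0)

def hits_in_spill(hits, spill_start_ns, channel_offsets_ns):
    """Return the hits whose calibrated time lies inside the spill window."""
    return [h for h in hits
            if spill_start_ns <= calibrated_time(h, channel_offsets_ns)
            < spill_start_ns + SPILL_WINDOW_NS]

# Example: two hits, one inside and one outside an (assumed) spill window.
offsets = {12: 3, 47: -2}                      # ns, from calibration
hits = [Hit(12, 1_000_000_005), Hit(47, 1_000_050_000)]
print(hits_in_spill(hits, spill_start_ns=1_000_000_000, channel_offsets_ns=offsets))
```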

  20. Design and Testing of a Breadboard Electrical Power Control Unit for the Fluid Combustion Facility Experiment

    NASA Technical Reports Server (NTRS)

    Kimnach, Greg L.; Lebron, Ramon C.

    1999-01-01

    The Fluid Combustion Facility (FCF) Project and the Power Technology Division at the NASA Glenn Research Center (GRC) at Lewis Field in Cleveland, OH, along with the Sundstrand Corporation in Rockford, IL, are jointly developing an Electrical Power Control Unit (EPCU) for the Fluid Combustion Facility to be flown on the International Space Station (ISS). The FCF experiment facility contains three racks: a core rack, a combustion rack, and a fluids rack. The EPCU will be used as the interface between the ISS 120-Vdc power distribution system and each FCF experiment rack, which requires 28 Vdc. The EPCU is a modular design containing three 120-Vdc-to-28-Vdc full-bridge power converters rated at 1 kWe each; bus-transferring input relays and solid-state, current-limiting input switches; 48 current-limiting, solid-state output switches; and control and telemetry hardware. The EPCU has all of the controls required to autonomously share load demand between the power feeds and, if absolutely necessary, shed loads. The EPCU, which maximizes the use of allocated ISS power and minimizes loss of power to loads, can be paralleled with other EPCUs. This paper gives an overview of the electrical design and operating characteristics of the EPCU and presents test data from the breadboard design.
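
    The autonomous load-sharing and load-shedding behaviour described above can be illustrated with the sketch below, which allocates load demand against the combined rating of the three 1-kW converters and sheds only the lowest-priority loads when demand exceeds capacity. The load names, wattages, and priority scheme are hypothetical and are not taken from the EPCU design.

```python
# Illustrative sketch only (not the actual EPCU control logic): allocate load
# demand against the combined converter rating and shed the lowest-priority
# loads only when total demand exceeds available capacity.

CONVERTER_RATING_W = 1000.0
NUM_CONVERTERS = 3

def allocate(loads, available_w=CONVERTER_RATING_W * NUM_CONVERTERS):
    """loads: list of (name, demand_w, priority), lower priority number
    meaning more critical. Returns (powered, shed) name lists."""
    powered, shed, used = [], [], 0.0
    for name, demand, _prio in sorted(loads, key=lambda l: l[2]):
        if used + demand <= available_w:
            powered.append(name)
            used += demand
        else:
            shed.append(name)          # shed only if absolutely necessary
    return powered, shed

# Example with hypothetical rack loads (watts) and priorities.
rack_loads = [("avionics", 400, 0), ("diagnostics", 900, 1),
              ("camera", 800, 2), ("heater", 1200, 3)]
print(allocate(rack_loads))   # -> (['avionics', 'diagnostics', 'camera'], ['heater'])
```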