Science.gov

Sample records for experiment cepex design

  1. Central Equatorial Pacific Experiment (CEPEX). Design document

    SciTech Connect

    Not Available

    1993-04-01

    The Earth's climate has varied significantly in the past, yet climate records reveal that in the tropics, sea surface temperatures seem to have been remarkably stable, varying by less than a few degrees Celsius over geologic time. Today, the large warm pool of the western Pacific shows similar characteristics. Its surface temperature always exceeds 27°C, but never 31°C. Heightened interest in this observation has been stimulated by questions of global climate change and the exploration of stabilizing climate feedback processes. Efforts to understand the observed weak sensitivity of tropical sea surface temperatures to climate forcing have led to a number of competing ideas about the nature of this apparent thermostat. Although there remains disagreement on the processes that regulate tropical sea surface temperature, most agree that further progress in resolving these differences requires comprehensive field observations of three-dimensional water vapor concentrations, solar and infrared radiative fluxes, surface fluxes of heat and water vapor, and cloud microphysical properties. This document describes the Central Equatorial Pacific Experiment (CEPEX) plan to collect such observations over the central equatorial Pacific Ocean during March of 1993.

  2. Central Equatorial Pacific Experiment (CEPEX)

    SciTech Connect

    Not Available

    1993-01-01

    The Earth's climate has varied significantly in the past, yet climate records reveal that in the tropics, sea surface temperatures seem to have been remarkably stable, varying by less than a few degrees Celsius over geologic time. Today, the large warm pool of the western Pacific shows similar characteristics. Its surface temperature always exceeds 27°C, but never 31°C. Heightened interest in this observation has been stimulated by questions of global climate change and the exploration of stabilizing climate feedback processes. Efforts to understand the observed weak sensitivity of tropical sea surface temperatures to climate forcing have led to a number of competing ideas about the nature of this apparent thermostat. Although there remains disagreement on the processes that regulate tropical sea surface temperature, most agree that further progress in resolving these differences requires comprehensive field observations of three-dimensional water vapor concentrations, solar and infrared radiative fluxes, surface fluxes of heat and water vapor, and cloud microphysical properties. This document describes the Central Equatorial Pacific Experiment (CEPEX) plan to collect such observations over the central equatorial Pacific Ocean during March of 1993.

  3. SEDS experiment design definition

    NASA Technical Reports Server (NTRS)

    Carroll, Joseph A.; Alexander, Charles M.; Oldson, John C.

    1990-01-01

    The Small Expendable-tether Deployment System (SEDS) was developed to design, build, integrate, fly, and safely deploy and release an expendable tether. A suitable concept for an on-orbit test of SEDS was developed. The following tasks were performed: (1) Define experiment objectives and requirements; (2) Define experiment concepts to reach those objectives; (3) Support NASA in experiment concept selection and definition; (4) Perform analyses and tests of SEDS hardware; (5) Refine the selected SEDS experiment concept; and (6) Support interactive SEDS system definition process. Results and conclusions are given.

  4. Design Experiments in Educational Research.

    ERIC Educational Resources Information Center

    Cobb, Paul; Confrey, Jere; diSessa, Andrea; Lehrer, Richard; Schauble, Leona

    2003-01-01

    Indicates the range of purposes and variety of settings in which design experiments have been conducted, delineating five crosscutting features that collectively differentiate design experiments from other methodologies. Clarifies what is involved in preparing for and carrying out a design experiment and in conducting a retrospective analysis of…

  5. Exploratory designs for computational experiments

    SciTech Connect

    Morris, M.X.; Mitchell, T.J.

    1992-10-01

    Recent work by Johnson, Moore and Ylvisaker (1990) establishes equivalence of the maximin distance design criterion and an entropy criterion motivated by function prediction in a Bayesian setting. The latter criterion has been used by Currin, Mitchell, Morris, and Ylvisaker (1991) to design experiments for which the motivating application is approximation of a complex deterministic computer model. Because computer experiments often have a large number of controlled variables (inputs), maximin designs of moderate size are often concentrated in the corners of the cuboidal design region, i.e. each input is represented at only two levels. Here we will examine some maximin distance designs constructed within the class of Latin hypercube arrangements. The goal of this is to find designs which offer a compromise between the entropy/maximin criterion, and good projective properties in each dimension (as guaranteed by Latin hypercubes). A simulated annealing search algorithm is presented for constructing these designs, and patterns apparent in the optimal designs are discussed.
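
    The construction this abstract outlines, searching within the class of Latin hypercube arrangements for a maximin-distance design via simulated annealing, can be sketched as follows. This is an illustrative reimplementation, not the authors' algorithm; the cooling schedule, move set, and function names are assumptions.

```python
import math
import random

def min_pairwise_dist(design):
    """Smallest Euclidean distance between any two design points."""
    best = float("inf")
    for i in range(len(design)):
        for j in range(i + 1, len(design)):
            best = min(best, math.dist(design[i], design[j]))
    return best

def maximin_lhs(n_points, n_dims, n_iters=20000, seed=0):
    """Search Latin hypercube arrangements for a maximin-distance design
    by annealed swaps of two levels within a single column."""
    rng = random.Random(seed)
    # Random Latin hypercube: each column is a permutation of 0..n_points-1,
    # so every input is represented once at each of n_points levels.
    cols = [rng.sample(range(n_points), n_points) for _ in range(n_dims)]
    design = [tuple(cols[d][i] for d in range(n_dims)) for i in range(n_points)]
    score = min_pairwise_dist(design)
    temp = 1.0
    for _ in range(n_iters):
        d = rng.randrange(n_dims)
        i, j = rng.sample(range(n_points), 2)
        # Swapping two entries of one column preserves the Latin hypercube property.
        new = [list(p) for p in design]
        new[i][d], new[j][d] = new[j][d], new[i][d]
        new = [tuple(p) for p in new]
        new_score = min_pairwise_dist(new)
        # Always accept improvements; accept worse moves with annealing probability.
        if new_score >= score or rng.random() < math.exp((new_score - score) / temp):
            design, score = new, new_score
        temp *= 0.9995
    return design, score
```

    Because every column stays a permutation, each input retains full one-dimensional projective coverage, which is exactly the compromise the abstract describes.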

  6. Designing experiments through compressed sensing.

    SciTech Connect

    Young, Joseph G.; Ridzal, Denis

    2013-06-01

    In the following paper, we discuss how to design an ensemble of experiments through the use of compressed sensing. Specifically, we show how to conduct a small number of physical experiments and then use compressed sensing to reconstruct a larger set of data. In order to accomplish this, we organize our results into four sections. We begin by extending the theory of compressed sensing to a finite product of Hilbert spaces. Then, we show how these results apply to experiment design. Next, we develop an efficient reconstruction algorithm that allows us to reconstruct experimental data projected onto a finite element basis. Finally, we verify our approach with two computational experiments.
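
    The workflow described here, a small number of measurements followed by sparse reconstruction, is commonly realized with greedy solvers such as orthogonal matching pursuit. Below is a generic NumPy sketch of that idea, not the paper's Hilbert-space or finite-element formulation; the demo dimensions and sparsity level are arbitrary choices for illustration.

```python
import numpy as np

def omp(A, y, sparsity):
    """Orthogonal matching pursuit: recover a sparse x with y ~= A @ x
    from m << n measurements, assuming x has few nonzero entries."""
    m, n = A.shape
    residual = y.copy()
    support = []
    for _ in range(sparsity):
        # Pick the column most correlated with the current residual.
        idx = int(np.argmax(np.abs(A.T @ residual)))
        if idx not in support:
            support.append(idx)
        # Re-fit on the chosen support by least squares.
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ coef
    x = np.zeros(n)
    x[support] = coef
    return x

# Demo: a 3-sparse "experimental profile" in 50 dimensions, sampled with
# only 20 random measurements and then reconstructed.
rng = np.random.default_rng(0)
n, m = 50, 20
A = rng.standard_normal((m, n)) / np.sqrt(m)
x_true = np.zeros(n)
x_true[[4, 17, 33]] = [1.5, -2.0, 0.7]
y = A @ x_true
x_hat = omp(A, y, sparsity=3)
```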

  7. Introduction to Statistically Designed Experiments

    SciTech Connect

    Heaney, Mike

    2016-09-13

    Statistically designed experiments can save researchers time and money by reducing the number of necessary experimental trials, while resulting in more conclusive experimental results. Surprisingly, many researchers are still not aware of this efficient and effective experimental methodology. As reported in a 2013 article from Chemical & Engineering News, there has been a resurgence of this methodology in recent years (http://cen.acs.org/articles/91/i13/Design-Experiments-Makes-Comeback.html?h=2027056365). This presentation will provide a brief introduction to statistically designed experiments. The main advantages will be reviewed along with some basic concepts such as factorial and fractional factorial designs. The recommended sequential approach to experiments will be introduced, and finally a case study will be presented to demonstrate this methodology.
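
    The factorial and fractional factorial designs mentioned here are straightforward to generate. A minimal sketch with the conventional -1/+1 level coding; the half-fraction uses the standard highest-order-interaction defining relation, and the function names are illustrative:

```python
from itertools import product

def full_factorial(k):
    """All 2**k runs of a two-level factorial, levels coded -1/+1."""
    return [list(run) for run in product((-1, 1), repeat=k)]

def half_fraction(k):
    """2**(k-1) fractional factorial: a full factorial in the first k-1
    factors, with the k-th column set from the defining relation
    I = 12...k (the k-th factor aliased with the highest-order
    interaction of the others)."""
    runs = []
    for base in product((-1, 1), repeat=k - 1):
        last = 1
        for level in base:
            last *= level
        runs.append(list(base) + [last])
    return runs
```

    The half fraction trades run count for aliasing: main effects become confounded with high-order interactions, which is acceptable under the effect-sparsity assumption that screening experiments rely on.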

  8. Designing for Entertaining Everyday Experiences

    NASA Astrophysics Data System (ADS)

    Inakage, Masa; Arakawa, Takahiro; Iguchi, Kenji; Katsumoto, Yuichiro; Katsura, Makoto; Osawa, Takeshi; Tokuhisa, Satoru; Ueki, Atsuro

    Entertainment is an essential element of human society. It includes the "fun" in our everyday activities, from meeting friends to relaxing at hot spas. Everyday artifacts can become entertaining media if those artifacts and their environment are designed to be responsive. This chapter discusses research on entertaining artifacts to show how to design responsive artifacts for entertaining experiences in everyday life.

  9. Elements of Design of Experiments

    DTIC Science & Technology

    2007-06-01

    Briefing-slide text fragments (recovered from the original presentation): Design of Experiments in Operations Research; Sverdrup Technology; TMG OA and DOE Champion, 53rd Wing; 15 years of Design of Experiments applications, including the Green Flag exercise ('79), F-16C IOT&E, manufacturing engineering, CV-22 AGM simulation, WMD effectiveness, Army BDA, Army operator effectiveness, 53rd TMG consulting, and wind tunnel testing (Air Force, NASA).

  10. Designing Effective Undergraduate Research Experiences

    NASA Astrophysics Data System (ADS)

    Severson, S.

    2010-12-01

    I present a model for designing student research internships that is informed by the best practices of the Center for Adaptive Optics (CfAO) Professional Development Program. The dual strands of the CfAO education program include: the preparation of early-career scientists and engineers in effective teaching; and changing the learning experiences of students (e.g., undergraduate interns) through inquiry-based "teaching laboratories." This paper will focus on the carry-over of these ideas into the design of laboratory research internships such as the CfAO Mainland internship program as well as NSF REU (Research Experiences for Undergraduates) and senior-thesis or "capstone" research programs. Key ideas in maximizing student learning outcomes and generating productive research during internships include: defining explicit content, scientific process, and attitudinal goals for the project; assessment of student prior knowledge and experience, then following up with formative assessment throughout the project; setting reasonable goals with timetables and addressing motivation; and giving students ownership of the research by implementing aspects of the inquiry process within the internship.

  11. Role-Based Design: Design Experiences

    ERIC Educational Resources Information Center

    Miller, Charles; Hokanson, Brad; Doering, Aaron; Brandt, Tom

    2010-01-01

    This is the fourth and final installment in a series of articles presenting a new outlook on the methods of instructional design. These articles examine the nature of the process of instructional design and are meant to stimulate discussion about the roles of designers in the fields of instructional design, the learning sciences, and interaction…

  12. Social Design Experiments: Toward Equity by Design

    ERIC Educational Resources Information Center

    Gutiérrez, Kris D.; Jurow, A. Susan

    2016-01-01

    In this article, we advance an approach to design research that is organized around a commitment to transforming the educational and social circumstances of members of non-dominant communities as a means of promoting social equity and learning. We refer to this approach as social design experimentation. The goals of social design experiments…

  13. Tokamak Physics Experiment divertor design

    SciTech Connect

    Anderson, P.M.

    1995-12-31

    The Tokamak Physics Experiment (TPX) tokamak requires a symmetric up/down double-null divertor capable of operation with steady-state heat flux as high as 7.5 MW/m². The divertor is designed to operate in the radiative mode and employs a deep slot configuration with gas puffing lines to enhance radiative divertor operation. Pumping is provided by cryopumps that pump through eight vertical ports in the floor and ceiling of the vessel. The plasma facing surface is made of carbon-carbon composite blocks (macroblocks) bonded to multiple parallel copper tubes oriented vertically. Water flowing at 6 m/s is used, with the critical heat flux (CHF) margin improved by the use of enhanced heat transfer surfaces. In order to extend the operating period where hands on maintenance is allowed and to also reduce dismantling and disposal costs, the TPX design emphasizes the use of low activation materials. The primary materials used in the divertor are titanium, copper, and carbon-carbon composite. The low activation material selection and the planned physics operation will allow personnel access into the vacuum vessel for the first 2 years of operation. The remote handling system requires that all plasma facing components (PFCs) are configured as modular components of restricted dimensions with special provisions for lifting, alignment, mounting, attachment, and connection of cooling lines, and instrumentation and diagnostics services.

  14. Experimenting with Science Facility Design.

    ERIC Educational Resources Information Center

    Butterfield, Eric

    1999-01-01

    Discusses the modern school science facility and how computers and teaching methods are changing their design. Issues include power, lighting, and space requirements; funding for planning; architect assessment; materials requirements for work surfaces; and classroom flexibility. (GR)

  15. Experiment Design and Analysis Guide - Neutronics & Physics

    SciTech Connect

    Misti A Lillo

    2014-06-01

    The purpose of this guide is to provide a consistent, standardized approach to performing neutronics/physics analysis for experiments inserted into the Advanced Test Reactor (ATR). This document provides neutronics/physics analysis guidance to support experiment design and analysis needs for experiments irradiated in the ATR. This guide addresses neutronics/physics analysis in support of experiment design, experiment safety, and experiment program objectives and goals. The intent of this guide is to provide a standardized approach for performing typical neutronics/physics analyses. Deviation from this guide is allowed provided that neutronics/physics analysis details are properly documented in an analysis report.

  16. Nova pulse power design and operational experience

    NASA Astrophysics Data System (ADS)

    Whitham, K.; Larson, D.; Merritt, B.; Christie, D.

    1987-01-01

    Nova is a 100 TW Nd:glass solid state laser designed for experiments with laser fusion at Lawrence Livermore National Laboratory (LLNL). The pulsed power for Nova includes a 58 MJ capacitor bank driving 5336 flashlamps with millisecond pulses and subnanosecond high voltages for electro optics. This paper summarizes the pulsed power designs and the operational experience to date.

  17. Design of satellite flexibility experiments

    NASA Technical Reports Server (NTRS)

    Kaplan, M. H.; Hillard, S. E.

    1977-01-01

    A preliminary study has been completed to begin development of a flight experiment to measure spacecraft control/flexible structure interaction. The work reported consists of two phases: identification of appropriate structural parameters which can be associated with flexibility phenomena, and suggestions for the development of an experiment for a satellite configuration typical of near-future vehicles which are sensitive to such effects. Recommendations are made with respect to the type of data to be collected and instrumentation associated with these data. The approach consists of developing the equations of motion for a vehicle possessing a flexible solar array, then linearizing about some nominal motion of the craft. A set of solutions is assumed for array deflection using a continuous normal mode method, and important parameters are exposed. Inflight and ground based measurements are distinguished. Interrelationships between these parameters, measurement techniques, and input requirements are discussed which assure minimization of special vehicle maneuvers and optimization of data to be obtained during the normal flight sequence.

  18. Spaceflight payload design flight experience G-408

    NASA Technical Reports Server (NTRS)

    Durgin, William W.; Looft, Fred J.; Sacco, Albert, Jr.; Thompson, Robert; Dixon, Anthony G.; Roberti, Dino; Labonte, Robert; Moschini, Larry

    1992-01-01

    Worcester Polytechnic Institute's first payload of spaceflight experiments flew aboard Columbia, STS-40, during June of 1991 and culminated eight years of work by students and faculty. The Get Away Special (GAS) payload was installed on the GAS bridge assembly at the aft end of the cargo bay behind the Spacelab Life Sciences (SLS-1) laboratory. The Experiments were turned on by astronaut signal after reaching orbit and then functioned for 72 hours. Environmental and experimental measurements were recorded on three cassette tapes which, together with zeolite crystals grown on orbit, formed the basis of subsequent analyses. The experiments were developed over a number of years by undergraduate students meeting their project requirements for graduation. The experiments included zeolite crystal growth, fluid behavior, and microgravity acceleration measurement in addition to environmental data acquisition. Preparation also included structural design, thermal design, payload integration, and experiment control. All of the experiments functioned on orbit and the payload system performed within design estimates.

  19. Cage allocation designs for rodent carcinogenicity experiments

    PubMed Central

    Herzberg, Agnes M.; Lagakos, Stephen W.

    1991-01-01

    Cage allocation designs for rodent carcinogenicity experiments are discussed and presented with the goal of avoiding dosage group biases related to cage location. Considerations in selecting a cage design are first discussed in general terms. Specific designs are presented for use in experiments involving three, four, and five dose groups and with one, four, and five rodents per cage. Priorities for balancing treatment groups include horizontal position on shelf and shelf of rack, nearest neighbor balance, and male–female balance. It is proposed that these balance criteria be considered together with practical issues, such as the ability to accurately conform to a design and to determine a sensible and efficient design for each experiment. PMID:17539183
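
    One classical way to balance dose groups over shelf and horizontal position, in the spirit of the designs discussed here (a generic sketch, not the authors' specific tables), is a cyclic Latin square:

```python
def latin_square_allocation(n_groups):
    """Cyclic Latin square: entry [shelf][slot] is the dose group housed
    at that cage position; each group appears exactly once on every
    shelf (row) and in every horizontal position (column), so no group
    is systematically favored by cage location."""
    return [[(shelf + slot) % n_groups for slot in range(n_groups)]
            for shelf in range(n_groups)]
```

    For four dose groups this yields shelves [0, 1, 2, 3], [1, 2, 3, 0], and so on; in practice the rows, columns, and group labels would be randomized, and the square replicated to cover more cages than groups.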

  20. Cage allocation designs for rodent carcinogenicity experiments.

    PubMed Central

    Herzberg, A M; Lagakos, S W

    1992-01-01

    Cage allocation designs for rodent carcinogenicity experiments are discussed and presented with the goal of avoiding dosage group biases related to cage location. Considerations in selecting a cage design are first discussed in general terms. Specific designs are presented for use in experiments involving three, four, and five dose groups and with one, four, and five rodents per cage. Priorities for balancing treatment groups include horizontal position on shelf and shelf of rack, nearest neighbor balance, and male-female balance. It is proposed that these balance criteria be considered together with practical issues, such as the ability to accurately conform to a design and to determine a sensible and efficient design for each experiment. PMID:1295494

  1. Design for Engaging Experience and Social Interaction

    ERIC Educational Resources Information Center

    Harteveld, Casper; ten Thij, Eleonore; Copier, Marinka

    2011-01-01

    One of the goals of game designers is to design for an engaging experience and for social interaction. The question is how. We know that games can be engaging and allow for social interaction, but how do we achieve this or even improve on it? This article provides an overview of several scientific approaches that deal with this question. It…

  2. Tractable Experiment Design via Mathematical Surrogates

    SciTech Connect

    Williams, Brian J.

    2016-02-29

    This presentation summarizes the development and implementation of quantitative design criteria motivated by targeted inference objectives for identifying new, potentially expensive computational or physical experiments. The first application is concerned with estimating features of quantities of interest arising from complex computational models, such as quantiles or failure probabilities. A sequential strategy is proposed for iterative refinement of the importance distributions used to efficiently sample the uncertain inputs to the computational model. In the second application, effective use of mathematical surrogates is investigated to help alleviate the analytical and numerical intractability often associated with Bayesian experiment design. This approach allows for the incorporation of prior information into the design process without the need for gross simplification of the design criterion. Illustrative examples of both design problems will be presented as an argument for the relevance of these research problems.

  3. Optimal design of isotope labeling experiments.

    PubMed

    Yang, Hong; Mandy, Dominic E; Libourel, Igor G L

    2014-01-01

    Stable isotope labeling experiments (ILE) constitute a powerful methodology for estimating metabolic fluxes. An optimal label design for such an experiment is necessary to maximize the precision with which fluxes can be determined. But often, precision gained in the determination of one flux comes at the expense of the precision of other fluxes, and an appropriate label design therefore depends foremost on the question the investigator wants to address. One could liken ILE to shadows that metabolism casts on products. Optimal label design is the placement of the lamp; creating clear shadows for some parts of metabolism and obscuring others. An optimal isotope label design is influenced by: (1) the network structure; (2) the true flux values; (3) the available label measurements; and, (4) commercially available substrates. The first two aspects are dictated by nature and constrain any optimal design. The second two aspects are suitable design parameters. To create an optimal label design, an explicit optimization criterion needs to be formulated. This usually is a property of the flux covariance matrix, which can be augmented by weighting label substrate cost. An optimal design is found by using such a criterion as an objective function for an optimizer. This chapter uses a simple elementary metabolite units (EMU) representation of the TCA cycle to illustrate the process of experimental design of isotope labeled substrates.

  4. Analysis of designed experiments with complex aliasing

    SciTech Connect

    Hamada, M.; Wu, C.F.J.

    1992-07-01

    Traditionally, Plackett-Burman (PB) designs have been used in screening experiments for identifying important main effects. The PB designs whose run sizes are not a power of two have been criticized for their complex aliasing patterns, which according to conventional wisdom give confusing results. This paper goes beyond the traditional approach by proposing an analysis strategy that entertains interactions in addition to main effects. Based on the precepts of effect sparsity and effect heredity, the proposed procedure exploits the designs' complex aliasing patterns, thereby turning their 'liability' into an advantage. Demonstration of the procedure on three real experiments shows the potential for extracting important information available in the data that has, until now, been missed. Some limitations are discussed, and extensions to overcome them are given. The proposed procedure also applies to more general mixed level designs that have become increasingly popular. 16 refs.

  5. Affective loop experiences: designing for interactional embodiment

    PubMed Central

    Höök, Kristina

    2009-01-01

    Involving our corporeal bodies in interaction can create strong affective experiences. Systems that both can be influenced by and influence users corporeally exhibit a use quality we name an affective loop experience. In an affective loop experience, (i) emotions are seen as processes, constructed in the interaction, starting from everyday bodily, cognitive or social experiences; (ii) the system responds in ways that pull the user into the interaction, touching upon end users' physical experiences; and (iii) throughout the interaction the user is an active, meaning-making individual choosing how to express themselves—the interpretation responsibility does not lie with the system. We have built several systems that attempt to create affective loop experiences with more or less successful results. For example, eMoto lets users send text messages between mobile phones, but in addition to text, the messages also have colourful and animated shapes in the background chosen through emotion-gestures with a sensor-enabled stylus pen. Affective Diary is a digital diary with which users can scribble their notes, but it also allows for bodily memorabilia to be recorded from body sensors mapping to users' movement and arousal and placed along a timeline. Users can see patterns in their bodily reactions and relate them to various events going on in their lives. The experiences of building and deploying these systems gave us insights into design requirements for addressing affective loop experiences, such as how to design for turn-taking between user and system, how to create for ‘open’ surfaces in the design that can carry users' own meaning-making processes, how to combine modalities to create for a ‘unity’ of expression, and the importance of mirroring user experience in familiar ways that touch upon their everyday social and corporeal experiences. But a more important lesson gained from deploying the systems is how emotion processes are co-constructed and…

  6. Affective loop experiences: designing for interactional embodiment.

    PubMed

    Höök, Kristina

    2009-12-12

    Involving our corporeal bodies in interaction can create strong affective experiences. Systems that both can be influenced by and influence users corporeally exhibit a use quality we name an affective loop experience. In an affective loop experience, (i) emotions are seen as processes, constructed in the interaction, starting from everyday bodily, cognitive or social experiences; (ii) the system responds in ways that pull the user into the interaction, touching upon end users' physical experiences; and (iii) throughout the interaction the user is an active, meaning-making individual choosing how to express themselves; the interpretation responsibility does not lie with the system. We have built several systems that attempt to create affective loop experiences with more or less successful results. For example, eMoto lets users send text messages between mobile phones, but in addition to text, the messages also have colourful and animated shapes in the background chosen through emotion-gestures with a sensor-enabled stylus pen. Affective Diary is a digital diary with which users can scribble their notes, but it also allows for bodily memorabilia to be recorded from body sensors mapping to users' movement and arousal and placed along a timeline. Users can see patterns in their bodily reactions and relate them to various events going on in their lives. The experiences of building and deploying these systems gave us insights into design requirements for addressing affective loop experiences, such as how to design for turn-taking between user and system, how to create for 'open' surfaces in the design that can carry users' own meaning-making processes, how to combine modalities to create for a 'unity' of expression, and the importance of mirroring user experience in familiar ways that touch upon their everyday social and corporeal experiences. But a more important lesson gained from deploying the systems is how emotion processes are co-constructed and experienced…

  7. OSM's cone design and installation experience

    SciTech Connect

    Van Dyke, M.W.

    1983-06-29

    The concrete filled steel cone offers an alternative solution in sealing vertical mine shafts. This paper gives the design and installation experiences of the Office of Surface Mining when dealing with abandoned coal mines. This same solution can also be used with other types of shaft closures. 4 figures.

  8. Power and replication - designing powerful experiments

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Biological research is expensive, with monetary costs to granting agencies and emotional costs to researchers. As such, biological researchers should always follow the mantra, "failure is not an option." A failed experimental design is generally manifested as an experiment with high P-values, leavin...

  9. Designing a Curriculum for Clinical Experiences

    ERIC Educational Resources Information Center

    Henning, John E.; Erb, Dorothy J.; Randles, Halle Schoener; Fults, Nanette; Webb, Kathy

    2016-01-01

    The purpose of this article is to describe a collaborative effort among five teacher preparation programs to create a conceptual tool designed to put clinical experiences at the center of our programs. The authors refer to the resulting product as a clinical curriculum. The clinical curriculum describes a developmental sequence of clinical…

  10. Response surface designs for experiments in bioprocessing.

    PubMed

    Gilmour, Steven G

    2006-06-01

    Many processes in the biological industries are studied using response surface methodology. The use of biological materials, however, means that run-to-run variation is typically much greater than that in many experiments in mechanical or chemical engineering and so the designs used require greater replication. The data analysis which is performed may involve some variable selection, as well as fitting polynomial response surface models. This implies that designs should allow the parameters of the model to be estimated nearly orthogonally. A class of three-level response surface designs is introduced which allows all except the quadratic parameters to be estimated orthogonally, as well as having a number of other useful properties. These subset designs are obtained by using two-level factorial designs in subsets of the factors, with the other factors being held at their middle level. This allows their properties to be easily explored. Replacing some of the two-level designs with fractional replicates broadens the class of useful designs, especially with five or more factors, and sometimes incomplete subsets can be used. It is very simple to include a few two- and four-level factors in these designs by excluding subsets with these factors at the middle level. Subset designs can be easily modified to include factors with five or more levels by allowing a different pair of levels to be used in different subsets.
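
    The subset-design construction this abstract describes can be sketched directly: run a two-level factorial in each subset of factors while holding the remaining factors at their middle level. The sketch below assumes -1/0/+1 level coding and an illustrative function name; the fractional replicates, incomplete subsets, and mixed-level extensions the abstract mentions are omitted.

```python
from itertools import combinations, product

def subset_design(n_factors, subset_size):
    """Three-level subset design: a two-level full factorial (levels
    -1/+1) in every subset of `subset_size` factors, with all other
    factors held at their middle level (0)."""
    runs = []
    for subset in combinations(range(n_factors), subset_size):
        for levels in product((-1, 1), repeat=subset_size):
            run = [0] * n_factors  # everything starts at the middle level
            for pos, lev in zip(subset, levels):
                run[pos] = lev
            runs.append(run)
    return runs
```

    For 4 factors and subsets of size 2 this gives C(4,2) x 2**2 = 24 runs, each exercising exactly two factors away from their middle level, which is what makes all but the quadratic parameters estimable orthogonally.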

  11. Conceptual design for spacelab pool boiling experiment

    NASA Technical Reports Server (NTRS)

    Lienhard, J. H.; Peck, R. E.

    1978-01-01

    A pool boiling heat transfer experiment to be incorporated with a larger two-phase flow experiment on Spacelab was designed to confirm (or alter) the results of earth-normal gravity experiments which indicate that the hydrodynamic peak and minimum pool boiling heat fluxes vanish at very low gravity. Twelve small sealed test cells containing water, methanol or Freon 113 and cylindrical heaters of various sizes are to be built. Each cell will be subjected to one or more 45 sec tests in which the surface heat flux on the heaters is increased linearly until the surface temperature reaches a limiting value of 500 C. The entire boiling process will be photographed in slow-motion. Boiling curves will be constructed from thermocouple and electric input data, for comparison with the motion picture records. The conduct of the experiment will require no more than a few hours of operator time.

  12. Conceptual design of Dipole Research Experiment (DREX)

    NASA Astrophysics Data System (ADS)

    Qingmei, XIAO; Zhibin, WANG; Xiaogang, WANG; Chijie, XIAO; Xiaoyi, YANG; Jinxing, ZHENG

    2017-03-01

    A new terrella-like device for laboratory simulation of inner-magnetosphere plasmas, the Dipole Research Experiment (DREX), is scheduled to be built at the Harbin Institute of Technology (HIT), China, as a major state scientific research facility for space physics studies. It is designed to provide a ground experimental platform that reproduces the inner magnetosphere, simulating the trapping, acceleration, and transport of energetic charged particles confined in a dipole magnetic field configuration. The hydromagnetic scaling relation between the laboratory plasma of the device and the geomagnetospheric plasma is applied to resemble geospace processes in the DREX plasma. Multiple plasma sources, different kinds of coils with specific functions, and advanced diagnostics are designed to be equipped in the facility for multiple functions. The motivation, the design criteria for the DREX experiments, and the means applied to generate plasma of the desired parameters in the laboratory are also described. Supported by National Natural Science Foundation of China (Nos. 11505040, 11261140326 and 11405038), China Postdoctoral Science Foundation (Nos. 2016M591518, 2015M570283) and the Natural Scientific Research Innovation Foundation of Harbin Institute of Technology (No. 2017008).

  13. Advanced ISDN satellite designs and experiments

    NASA Technical Reports Server (NTRS)

    Pepin, Gerard R.

    1992-01-01

    The research performed by GTE Government Systems and the University of Colorado in support of the NASA Satellite Communications Applications Research (SCAR) Program is summarized. Two levels of research were undertaken. The first dealt with providing interim service Integrated Services Digital Network (ISDN) satellite (ISIS) capabilities that accented basic rate ISDN with a ground control similar to that of the Advanced Communications Technology Satellite (ACTS). The ISIS Network Model development represents satellite systems like the ACTS orbiting switch. The ultimate aim is to move these ACTS ground control functions on-board the next generation of ISDN communications satellite to provide full-service ISDN satellite (FSIS) capabilities. The technical and operational parameters for the advanced ISDN communications satellite design are obtainable from the simulation of ISIS and FSIS engineering software models of the major subsystems of the ISDN communications satellite architecture. Discrete event simulation experiments would generate data for analysis against NASA SCAR performance measures and against the data obtained from the ISDN satellite terminal adapter (ISTA) hardware experiments, also developed in the program. The Basic and Option 1 phases of the program are also described and include the following: literature search, traffic model, network model, scenario specifications, performance measure definitions, hardware experiment design, hardware experiment development, simulator design, and simulator development.

  14. Statistical considerations in design of spacelab experiments

    NASA Technical Reports Server (NTRS)

    Robinson, J.

    1978-01-01

    After making an analysis of experimental error sources, statistical models were developed for the design and analysis of potential Space Shuttle experiments. Guidelines for statistical significance and/or confidence limits of expected results were also included. The models were then tested out on the following proposed Space Shuttle biomedical experiments: (1) bone density by computer tomography; (2) basal metabolism; and (3) total body water. Analysis of those results and therefore of the models proved inconclusive due to the lack of previous research data and statistical values. However, the models were seen as possible guides to making some predictions and decisions.

  15. Simulator design for advanced ISDN satellite design and experiments

    NASA Technical Reports Server (NTRS)

    Pepin, Gerald R.

    1992-01-01

    This simulation design task completion report documents the simulation techniques associated with the network models of both the Interim Service ISDN (integrated services digital network) Satellite (ISIS) and the Full Service ISDN Satellite (FSIS) architectures. The ISIS network model design represents satellite systems like the Advanced Communication Technology Satellite (ACTS) orbiting switch. The FSIS architecture, the ultimate aim of this element of the Satellite Communications Applications Research (SCAR) program, moves all control and switching functions on-board the next generation ISDN communication satellite. The technical and operational parameters for the advanced ISDN communications satellite design will be obtained from the simulation of ISIS and FSIS engineering software models for their major subsystems. Discrete event simulation experiments will be performed with these models using various traffic scenarios, design parameters, and operational procedures. The data from these simulations will be used to determine the engineering parameters for the advanced ISDN communications satellite.
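
The discrete event simulation approach used for these network models can be sketched with a simple event-queue loop. The single-channel queue and the arrival/service rates below are illustrative assumptions, not the ISIS/FSIS model parameters.

```python
# Minimal discrete-event simulation sketch of single-channel call
# traffic, in the spirit of the ISIS/FSIS network model experiments.
# The arrival/service rates are illustrative assumptions, not SCAR
# program parameters.
import heapq
import random

def simulate(arrival_rate, service_rate, horizon, seed=1):
    """M/M/1-style queue: return the time-averaged number of calls in system."""
    rng = random.Random(seed)
    events = [(rng.expovariate(arrival_rate), "arrive")]
    in_system, area, last_t = 0, 0.0, 0.0
    while events:
        t, kind = heapq.heappop(events)
        if t > horizon:
            break
        area += in_system * (t - last_t)  # accumulate time-weighted occupancy
        last_t = t
        if kind == "arrive":
            in_system += 1
            heapq.heappush(events, (t + rng.expovariate(arrival_rate), "arrive"))
            if in_system == 1:  # channel was idle: begin service immediately
                heapq.heappush(events, (t + rng.expovariate(service_rate), "depart"))
        else:
            in_system -= 1
            if in_system > 0:  # next queued call begins service
                heapq.heappush(events, (t + rng.expovariate(service_rate), "depart"))
    return area / last_t

mean_in_system = simulate(arrival_rate=0.5, service_rate=1.0, horizon=10000.0)
print(mean_in_system)  # ~1.0 at 50% loading, matching rho/(1 - rho)
```

Varying the rates and the scheduling discipline is the simulation analogue of the "various traffic scenarios, design parameters and operational procedures" described above.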

  16. CMM Interim Check Design of Experiments (U)

    SciTech Connect

    Montano, Joshua Daniel

    2015-07-29

    Coordinate Measuring Machines (CMM) are widely used in industry, throughout the Nuclear Weapons Complex, and at Los Alamos National Laboratory (LANL) to verify part conformance to design definition. Calibration cycles for CMMs at LANL are predominantly one year in length and include a weekly interim check to reduce risk. The CMM interim check makes use of Renishaw's Machine Checking Gauge, an off-the-shelf product that simulates a large sphere within a CMM's measurement volume and allows for error estimation. As verification of the interim check process, a design of experiments investigation was proposed to test two key factors (location and inspector). The results from the two-factor factorial experiment showed that location influenced results more than either the inspector or the location-inspector interaction.
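
A two-factor factorial analysis of this kind can be sketched in a few lines. The factor levels and error readings below are hypothetical illustrations, not the LANL data.

```python
# Sketch of a two-factor factorial analysis (factors: location,
# inspector) like the CMM interim-check study. The error values (um)
# below are hypothetical, not the LANL data.
from statistics import mean

# measurements[(location, inspector)] = replicate sphere-error readings
measurements = {
    ("center", "A"): [1.2, 1.1], ("center", "B"): [1.3, 1.2],
    ("corner", "A"): [2.0, 2.1], ("corner", "B"): [2.2, 2.0],
}

def level_mean(factor_index, level):
    """Mean of all readings taken at one level of one factor."""
    return mean(v for key, reps in measurements.items()
                if key[factor_index] == level for v in reps)

# Main effects: change in the mean response between factor levels
loc_effect = level_mean(0, "corner") - level_mean(0, "center")
insp_effect = level_mean(1, "B") - level_mean(1, "A")
# Interaction: half the difference of the location effect across inspectors
inter = ((mean(measurements[("corner", "B")]) - mean(measurements[("center", "B")]))
         - (mean(measurements[("corner", "A")]) - mean(measurements[("center", "A")]))) / 2

print(loc_effect, insp_effect, inter)
```

With these toy numbers the location effect dominates the inspector effect and the interaction, the same qualitative pattern the study reports.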

  17. Design of A Microgravity Spray Cooling Experiment

    DTIC Science & Technology

    2006-07-01

    bubbles will coalesce into a large bubble on the surface of the heater in reduced gravity. During subcooled boiling, thermocapillary flows can ... flights, and in-orbit experiments. Two-phase, one-component flow with heat transfer in microgravity is seen in many thermal management systems such ... to predict the behavior of, and to design, prototypes for microgravity. Microgravity research on pool boiling with and without subcooling has been

  18. Design Calculations For NIF Convergent Ablator Experiments

    SciTech Connect

    Olson, R E; Hicks, D G; Meezan, N B; Callahan, D A; Landen, O L; Jones, O S; Langer, S H; Kline, J L; Wilson, D C; Rinderknecht, H; Zylstra, A; Petrasso, R D

    2011-10-25

    The NIF convergent ablation tuning effort is underway. In the early experiments, we have discovered that the design code simulations over-predict the capsule implosion velocity and shock flash ρR (areal density), but under-predict the hohlraum x-ray flux measurements. The apparent inconsistency between the x-ray flux and radiography data implies that there are important unexplained aspects of the hohlraum and/or capsule behavior.

  19. Design Calculations for NIF Convergent Ablator Experiments

    NASA Astrophysics Data System (ADS)

    Olson, R. E.; Callahan, D. A.; Hicks, D. G.; Landen, O. L.; Langer, S. H.; Meezan, N. B.; Spears, B. K.; Widmann, K.; Kline, J. L.; Wilson, D. C.; Petrasso, R. D.; Leeper, R. J.

    2010-11-01

    Design calculations for NIF convergent ablator experiments will be described. The convergent ablator experiments measure the implosion trajectory, velocity, and ablation rate of an x-ray driven capsule and are an important component of the U.S. National Ignition Campaign at NIF. The design calculations are post-processed to provide simulations of the key diagnostics -- 1) Dante measurements of hohlraum x-ray flux and spectrum, 2) streaked radiographs of the imploding ablator shell, 3) wedge range filter measurements of D-He3 proton output spectra, and 4) GXD measurements of the imploded core. The simulated diagnostics will be compared to the experimental measurements to provide an assessment of the accuracy of the design code predictions of hohlraum radiation temperature, capsule ablation rate, implosion velocity, shock flash areal density, and x-ray bang time. Post-shot versions of the design calculations are used to enhance the understanding of the experimental measurements and will assist in choosing parameters for subsequent shots and the path towards optimal ignition capsule tuning. *SNL, LLNL, and LANL are operated under US DOE contracts DE-AC04-94AL85000, DE-AC52-07NA27344, and DE-AC04-94AL85000.

  20. Design of a water electrolysis flight experiment

    NASA Technical Reports Server (NTRS)

    Lee, M. Gene; Grigger, David J.; Thompson, C. Dean; Cusick, Robert J.

    1993-01-01

    Supply of oxygen (O2) and hydrogen (H2) by electrolyzing water in space will play an important role in meeting the National Aeronautics and Space Administration's (NASA's) needs and goals for future space missions. Both O2 and H2 are envisioned to be used in a variety of processes including crew life support, spacecraft propulsion, extravehicular activity, and electrical power generation/storage, as well as in scientific experiments and manufacturing processes. The Electrolysis Performance Improvement Concept Study (EPICS) flight experiment described herein is sponsored by NASA Headquarters as a part of the In-Space Technology Experiment Program (IN-STEP). The objective of the EPICS is to further contribute to the improvement of the SFE technology, specifically by demonstrating and validating the SFE electromechanical process in microgravity as well as investigating performance improvements projected to be possible in a microgravity environment. This paper defines the experiment objective and presents the results of the preliminary design of the EPICS. The experiment will include testing three subscale self-contained SFE units: one containing baseline components, and two units having variations in key component materials. Tests will be conducted at varying current and thermal conditions.

  1. JASMINE project Instrument design and centroiding experiment

    NASA Astrophysics Data System (ADS)

    Yano, Taihei; Gouda, Naoteru; Kobayashi, Yukiyasu; Yamada, Yoshiyuki

    JASMINE will study the fundamental structure and evolution of the Milky Way Galaxy. To accomplish these objectives, JASMINE will measure trigonometric parallaxes, positions and proper motions of about 10 million stars with a precision of 10 μarcsec at z = 14 mag. In this paper the instrument design (optics, detectors, etc.) of JASMINE is presented. We also show a CCD centroiding experiment for estimating positions of star images. The experimental result shows that the accuracy of estimated distances has a variance of less than 0.01 pixel.
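
The centroiding step tested in the experiment can be illustrated with an intensity-weighted mean over pixel values. The Gaussian star image below is synthetic; JASMINE's actual point-spread function and pixel scale are not modeled.

```python
# Sketch of CCD centroiding: estimate a star-image position as the
# intensity-weighted mean over pixels. The Gaussian star image is
# synthetic; JASMINE's actual PSF and pixel scale are not modeled.
import math

def make_star(nx, ny, x0, y0, sigma=1.5):
    """Synthetic Gaussian star image sampled on an nx-by-ny pixel grid."""
    return [[math.exp(-((x - x0) ** 2 + (y - y0) ** 2) / (2 * sigma ** 2))
             for x in range(nx)] for y in range(ny)]

def centroid(image):
    """Intensity-weighted mean position (x, y) in pixel units."""
    total = sx = sy = 0.0
    for y, row in enumerate(image):
        for x, v in enumerate(row):
            total += v
            sx += x * v
            sy += y * v
    return sx / total, sy / total

cx, cy = centroid(make_star(16, 16, x0=7.3, y0=8.6))
print(cx, cy)  # recovers the sub-pixel position (7.3, 8.6) closely
```

Sub-pixel repeatability of this kind of estimate is what the quoted variance of less than 0.01 pixel refers to.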

  2. Preliminary design of the redundant software experiment

    NASA Technical Reports Server (NTRS)

    Campbell, Roy; Deimel, Lionel; Eckhardt, Dave, Jr.; Kelly, John; Knight, John; Lauterbach, Linda; Lee, Larry; Mcallister, Dave; Mchugh, John

    1985-01-01

    The goal of the present experiment is to characterize the fault distributions of highly reliable software replicates, constructed using techniques and environments which are similar to those used in contemporary industrial software facilities. The fault distributions and their effect on the reliability of fault tolerant configurations of the software will be determined through extensive life testing of the replicates against carefully constructed, randomly generated test data. Each detected error will be carefully analyzed to provide insight into its nature and cause. A direct objective is to develop techniques for reducing the intensity of coincident errors, thus increasing the reliability gain which can be achieved with fault tolerance. Data on the reliability gains realized, and the cost of the fault tolerant configurations, can be used to design a companion experiment to determine the cost effectiveness of the fault tolerant strategy. Finally, the data and analysis produced by this experiment will be valuable to the software engineering community as a whole because they will provide useful insight into the nature and cause of hard-to-find, subtle faults which escape standard software engineering validation techniques and thus persist far into the software life cycle.

  3. Vision Guided Intelligent Robot Design And Experiments

    NASA Astrophysics Data System (ADS)

    Slutzky, G. D.; Hall, E. L.

    1988-02-01

    The concept of an intelligent robot is an important topic combining sensors, manipulators, and artificial intelligence to design a useful machine. Vision systems, tactile sensors, proximity switches, and other sensors provide the elements necessary for simple game playing as well as industrial applications. These sensors permit adaptation to a changing environment. The AI techniques permit advanced forms of decision making, adaptive responses, and learning, while the manipulator provides the ability to perform various tasks. Computer languages such as LISP and OPS5 have been utilized to achieve expert systems approaches in solving real world problems. The purpose of this paper is to describe several examples of visually guided intelligent robots, including both stationary and mobile robots. Demonstrations will be presented of a system for constructing and solving a popular peg game, a robot lawn mower, and a box stacking robot. The experience gained from these and other systems provides insight into what may be realistically expected from the next generation of intelligent machines.

  4. Judgement post-stratification for designed experiments.

    PubMed

    Du, Juan; MacEachern, Steven N

    2008-06-01

    In many scientific studies, information that is not easily translated into covariates is ignored in the analysis. However, this type of information may significantly improve inference. In this research, we apply the idea of judgment post-stratification to utilize such information. Specifically, we consider experiments that are conducted under a completely randomized design. Sets of experimental units are formed, and the units in a set are ranked. Estimation is performed conditional on the sets and ranks. We propose a new estimator for a treatment contrast. We improve the new estimator by Rao-Blackwellization. Asymptotic distribution theory and corresponding inferential procedures for both estimators are developed. Simulation studies quantify the superiority of the new estimators and show their desirable properties for small and moderate sample sizes. The impact of the new techniques is illustrated with data from a clinical trial.
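
The idea can be illustrated with a toy simulation: units in a completely randomized two-arm study are grouped into sets, ranked on judgment information that is hard to encode as a covariate, and the treatment contrast is then estimated rank by rank. The data model below is an assumption for illustration only, not the estimator developed in the paper.

```python
# Toy simulation in the spirit of judgment post-stratification for a
# completely randomized two-arm design: units are grouped into sets,
# ranked on auxiliary judgment information, and the treatment contrast
# is estimated rank by rank. The data model is an illustrative
# assumption, not the estimator of the paper.
import random
from statistics import mean

rng = random.Random(0)
TRUE_EFFECT = 2.0
H = 3  # set size used for ranking

def draw_unit(treated):
    latent = rng.gauss(0, 1)             # information hard to encode as a covariate
    y = TRUE_EFFECT * treated + latent + rng.gauss(0, 0.5)
    judged = latent + rng.gauss(0, 0.3)  # imperfect judgment score
    return judged, y

def stratified_arm(treated, n):
    """Group an arm's units into sets of H and stratify responses by rank."""
    units = [draw_unit(treated) for _ in range(n)]
    strata = [[] for _ in range(H)]
    for i in range(0, n - H + 1, H):
        for rank, (_, y) in enumerate(sorted(units[i:i + H])):
            strata[rank].append(y)
    return strata

n = 3000
t, c = stratified_arm(1, n), stratified_arm(0, n)
naive = mean(y for s in t for y in s) - mean(y for s in c for y in s)
ps = mean(mean(t[r]) - mean(c[r]) for r in range(H))  # rank-by-rank contrast
print(naive, ps)  # both near the true contrast of 2.0
```

The point of the technique is not a different point estimate but sharper inference: conditioning on ranks lets the within-stratum comparisons exploit the judgment information.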

  5. Design and implementation of the STAR experiment's DAQ

    SciTech Connect

    Ljubicic, A. Jr.; Botlo, M.; Heistermann, F.

    1997-12-01

    The STAR experiment is one of the two large detectors currently being built at the Relativistic Heavy Ion Collider (RHIC) at Brookhaven National Laboratory, Upton, New York, USA. The major issue for STAR's DAQ is the large amount of data that has to be processed as fast as possible. The required data rate is of the order of 90 Gbits/s, which has to be processed and scaled down to about 15 MBytes/s and stored to tape or other permanent archiving media. To be able to do so, the STAR DAQ uses a custom built ASIC which preprocesses the raw data for later use by a software Level 3 trigger. The Level 3 trigger selects events to be archived depending on physics criteria based upon the particle track information extracted during Level 3 processing. The design presented is a massively parallel multi-processor system which consists of front end microprocessors hierarchically organized within a VME crate system. Each VME crate contains 6 custom made Receiver Boards with 3 Intel I960HD processors per board, for a total of 18 processors per crate. STAR's TPC detector uses 24 such crates and the SVT detector will use 4 crates, for a total of 504 microprocessors.

  6. Interim Service ISDN Satellite (ISIS) hardware experiment design for advanced ISDN satellite design and experiments

    NASA Technical Reports Server (NTRS)

    Pepin, Gerard R.

    1992-01-01

    The Interim Service Integrated Services Digital Network (ISDN) Satellite (ISIS) Hardware Experiment Design for Advanced Satellite Designs describes the design of the ISDN Satellite Terminal Adapter (ISTA) capable of translating ISDN protocol traffic into time division multiple access (TDMA) signals for use by a communications satellite. The ISTA connects the Type 1 Network Termination (NT1) via the U-interface on the line termination side of the CPE to the V.35 interface for satellite uplink. The same ISTA converts in the opposite direction the V.35 to U-interface data with a simple switch setting.

  7. Design Point for a Spheromak Compression Experiment

    NASA Astrophysics Data System (ADS)

    Woodruff, Simon; Romero-Talamas, Carlos A.; O'Bryan, John; Stuber, James; Darpa Spheromak Team

    2015-11-01

    Two principal issues for the spheromak concept remain to be addressed experimentally: formation efficiency and confinement scaling. We are therefore developing a design point for a spheromak experiment that will be heated by adiabatic compression, utilizing the CORSICA and NIMROD codes as well as analytic modeling, with target parameters R_initial = 0.3 m, R_final = 0.1 m, T_initial = 0.2 keV, T_final = 1.8 keV, n_initial = 10^19 m^-3 and n_final = 10^21 m^-3, with a radial convergence of C = 3. This low convergence differentiates the concept from MTF with C = 10 or more, since the plasma will be held in equilibrium throughout compression. We present results from CORSICA showing the placement of coils and passive structure to ensure stability during compression, and the design of the capacitor bank needed to both form the target plasma and compress it. We specify target parameters for the compression in terms of plasma beta, formation efficiency, and energy confinement. Work performed under DARPA grant N66001-14-1-4044.

  8. Design and commissioning experience of SALT facility

    NASA Astrophysics Data System (ADS)

    De Kock, Mariana; Venter, Sarel

    2004-09-01

    The commissioning experience of the facility for SALT is compared to the results of the analysis done during the design. A false steel floor incorporating forced ventilation that extends around the telescope azimuth pier is installed to prevent heat radiating from the concrete surfaces on nights when the ambient temperature drops below room temperature. An infrared scan was done on this floor to verify that no heat is radiated into the telescope chamber from either the concrete or the warmer rooms underneath. The SALT site is windy all year round, and in order to utilize this natural resource and get better ventilation, adjustable louvers are used for natural ventilation. The control system automatically adjusts the louver openings depending on the wind speed and the relative direction of the dome opening, to achieve enough air changes on wind-still nights. The louvers are throttled to limit windshake on the structure on windy nights. Results of the computational fluid dynamics (CFD) analysis and actual measurements are presented, showing adequate temperature distribution at low wind speeds. The correlation between the CFD and actual measurements is discussed, with reference to the surface and air temperatures in the telescope chamber under different ambient conditions. The telescope chamber and dome are built out of insulation panels to limit energy losses during the day when the chamber is air conditioned. This also ensures that the thermal inertia of the building is low and consequently allows its temperature to react quickly to changes in external ambient temperature. The correlation between the expected and actual capacity of the air conditioners is also discussed.

  9. Shear wall experiments and design in Japan

    SciTech Connect

    Park, Y.J.; Hofmayer, C.

    1994-12-01

    This paper summarizes the results of recent survey studies on the available experimental databases and design codes/standards for reinforced concrete (RC) shear wall structures in Japan. Information related to the seismic design of RC reactor buildings and containment structures was emphasized in the survey. The seismic requirements for concrete structures, particularly those related to shear strength design, are outlined. Detailed descriptions are presented on the development of Japanese shear wall equations, design requirements for containment structures, and ductility requirements.

  10. Distributed Design and Analysis of Computer Experiments

    SciTech Connect

    Doak, Justin

    2002-11-11

    DDACE is a C++ object-oriented software library for the design and analysis of computer experiments. DDACE can be used to generate samples from a variety of sampling techniques. These samples may be used as input to an application code. DDACE also contains statistical tools such as response surface models and correlation coefficients to analyze input/output relationships between variables in an application code. DDACE can generate input values for uncertain variables within a user's application. For example, a user might like to vary a temperature variable as well as some material variables in a series of simulations. Through the series of simulations the user might be looking for optimal settings of parameters based on some user criteria, or the user may be interested in the sensitivity to input variability shown by an output variable. In either case, the user may provide information about the suspected ranges and distributions of a set of input variables, along with a sampling scheme, and DDACE will generate input points based on these specifications. The input values generated by DDACE and the one or more outputs computed through the user's application code can be analyzed with a variety of statistical methods. This can lead to a wealth of information about the relationships between the variables in the problem. While statistical and mathematical packages may be employed to carry out the analysis of the input/output relationships, DDACE also contains some tools for analyzing the simulation data. DDACE incorporates a software package called MARS (Multivariate Adaptive Regression Splines), developed by Jerome Friedman. MARS is used for generating a spline surface fit of the data. With MARS, a model simplification may be calculated using the input and corresponding output values for the user's application problem. The MARS grid data may be used for generating 3-dimensional response surface plots of the simulation data.
DDACE also contains an implementation of an
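
The kind of input-sample generation described, for example a Latin hypercube over user-specified variable ranges, can be sketched in a few lines. This is an independent Python illustration of the technique, not the DDACE C++ API.

```python
# Independent sketch of the kind of input-sample generation DDACE
# provides: a Latin hypercube over user-specified variable ranges.
# Illustrative Python only, not the DDACE C++ API.
import random

def latin_hypercube(ranges, n, seed=0):
    """Return n design points, one per stratum in each dimension."""
    rng = random.Random(seed)
    columns = []
    for lo, hi in ranges:
        # one uniform draw inside each of n equal strata, then shuffle
        col = [lo + (hi - lo) * (i + rng.random()) / n for i in range(n)]
        rng.shuffle(col)
        columns.append(col)
    return list(zip(*columns))

# e.g. a temperature variable and two material variables (hypothetical ranges)
design = latin_hypercube([(300.0, 400.0), (0.1, 0.9), (1.0, 5.0)], n=8)
for point in design:
    print(point)
```

Each variable's range is cut into n equal strata with exactly one sample per stratum, which is what gives the Latin hypercube its space-filling property for small run budgets.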

  11. Designing Effective Research Experiences for Undergraduates (Invited)

    NASA Astrophysics Data System (ADS)

    Jones Whyte, P.; Dalbotten, D. M.

    2009-12-01

    The undergraduate research experience has been recognized as a valuable component of preparation for graduate study. As competition for spaces in graduate schools becomes keener, students benefit from a formal introduction to the life of a scholar. Over the last twenty years, a model of preparing students for graduate study with the research experience as the base has been refined at the University of Minnesota. The experience includes assignment with a faculty member and a series of seminars that support the experience. The seminars cover topics including academic writing, scholarly literature review, writing of the abstract, research subject protection protocols, GRE test preparation, opportunities to interact with graduate students, preparing the graduate school application, and preparation of a poster to demonstrate the results of the research. The next phase of the process is to determine the role of the undergraduate research experience in the graduate school admission process.

  12. An Architectural Experience for Interface Design

    ERIC Educational Resources Information Center

    Gong, Susan P.

    2016-01-01

    The problem of human-computer interface design was brought to the foreground with the emergence of the personal computer, the increasing complexity of electronic systems, and the need to accommodate the human operator in these systems. With each new technological generation discovering the interface design problems of its own technologies, initial…

  13. The Unstructured Student-Designed Research Type of Laboratory Experiment.

    ERIC Educational Resources Information Center

    Macias-Machin, Agustin; And Others

    1990-01-01

    Discusses the use of an individually designed experiment for a chemical engineering laboratory course. Lists main ideas of the method. Provides an example of the experiment including ways to answer the questions and extensions. (YP)

  14. A Photogate Design for Air Track Experiments.

    ERIC Educational Resources Information Center

    Hinrichsen, P. F.

    1988-01-01

    Introduces a photogate arrangement using a photo-reflective sensor for air track experiments. Reports that the sensitivity to sunlight can be eliminated and a mechanically more convenient package produced. Shows the mounting, circuit, and usage of the photogate. (YP)

  15. Statistical design of a uranium corrosion experiment

    SciTech Connect

    Wendelberger, Joanne R; Moore, Leslie M

    2009-01-01

    This work supports an experiment being conducted by Roland Schulze and Mary Ann Hill to study hydride formation, one of the most important forms of corrosion observed in uranium and uranium alloys. The study goals and objectives are described in Schulze and Hill (2008), and the work described here focuses on development of a statistical experiment plan being used for the study. The results of this study will contribute to the development of a uranium hydriding model for use in lifetime prediction models. A parametric study of the effect of hydrogen pressure, gap size and abrasion on hydride initiation and growth is being planned where results can be analyzed statistically to determine individual effects as well as multi-variable interactions. Input to ESC from this experiment will include expected hydride nucleation, size, distribution, and volume on various uranium surface situations (geometry) as a function of age. This study will also address the effect of hydrogen threshold pressure on corrosion nucleation and the effect of oxide abrasion/breach on hydriding processes. Statistical experiment plans provide for efficient collection of data that aids in understanding the impact of specific experiment factors on initiation and growth of corrosion. The experiment planning methods used here also allow for robust data collection accommodating other sources of variation such as the density of inclusions, assumed to vary linearly along the cast rods from which samples are obtained.

  16. Hybrid Rocket Experiment Station for Capstone Design

    NASA Technical Reports Server (NTRS)

    Conley, Edgar; Hull, Bethanne J.

    2012-01-01

    Portable hybrid rocket motors and test stands can be seen in many papers, but none have reported on a design or instrumentation at so small a scale. This hybrid rocket and test stand are designed to be small and portable (suitcase size). The basic apparatus will be used for demonstrations in rocket propulsion. The design had to include all of the hardware needed to operate the hybrid rocket unit (with the exception of the external oxygen tank). The design of this project includes making the correlation between the rocket's thrust and its size, selecting appropriate transducers (physical size, resolution, range, and cost), compatibility with a laptop analog card, ease of setup, and portability.

  17. Thermal Characterization of Functionally Graded Materials: Design of Optimum Experiments

    NASA Technical Reports Server (NTRS)

    Cole, Kevin D.

    2003-01-01

    This paper is a study of optimal experiment design applied to the measurement of thermal properties in functionally graded materials. As a first step, a material with linearly-varying thermal properties is analyzed, and several different transient experimental designs are discussed. An optimality criterion, based on sensitivity coefficients, is used to identify the best experimental design. Simulated experimental results are analyzed to verify that the identified best experiment design has the smallest errors in the estimated parameters. This procedure is general and can be applied to design of experiments for a variety of materials.
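
A sensitivity-based optimality criterion of this general kind can be illustrated for a simple two-parameter model y = a + b*t, whose sensitivity coefficients are 1 and t: candidate measurement schedules are ranked by det(XᵀX), the D-optimality criterion. The model and the candidate time points are illustrative, not those of the paper.

```python
# Illustration of a sensitivity-based optimality criterion for choosing
# among candidate experiment designs. The model y = a + b*t (sensitivity
# coefficients 1 and t) and the candidate measurement times are
# illustrative, not those of the paper.
def d_criterion(times):
    """det(X^T X) for the two-parameter model; larger is better (D-optimality)."""
    s11 = float(len(times))
    s12 = sum(times)
    s22 = sum(t * t for t in times)
    return s11 * s22 - s12 * s12

candidates = {
    "clustered": [4.0, 4.5, 5.0, 5.5],
    "spread":    [0.0, 2.0, 6.0, 10.0],
}
best = max(candidates, key=lambda k: d_criterion(candidates[k]))
print(best)  # spreading the measurement times wins
```

A larger determinant corresponds to smaller confidence regions for the estimated parameters, which is why the identified best design shows the smallest errors in simulation.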

  18. Designing a successful HMD-based experience

    NASA Technical Reports Server (NTRS)

    Pierce, J. S.; Pausch, R.; Sturgill, C. B.; Christiansen, K. D.; Kaiser, M. K. (Principal Investigator)

    1999-01-01

    For entertainment applications, a successful virtual experience based on a head-mounted display (HMD) needs to overcome some or all of the following problems: entering a virtual world is a jarring experience, people do not naturally turn their heads or talk to each other while wearing an HMD, putting on the equipment is hard, and people do not realize when the experience is over. In the Electric Garden at SIGGRAPH 97, we presented the Mad Hatter's Tea Party, a shared virtual environment experienced by more than 1,500 SIGGRAPH attendees. We addressed these HMD-related problems with a combination of back story, see-through HMDs, virtual characters, continuity of real and virtual objects, and the layout of the physical and virtual environments.

  19. Hypersonic drone vehicle design: A multidisciplinary experience

    NASA Technical Reports Server (NTRS)

    1988-01-01

    UCLA's Advanced Aeronautic Design group focused their efforts on design problems of an unmanned hypersonic vehicle. It is felt that a scaled hypersonic drone is necessary to bridge the gap between present theory on hypersonics and the future reality of the National Aerospace Plane (NASP) for two reasons: (1) to fulfill a need for experimental data in the hypersonic regime, and (2) to provide a testbed for the scramjet engine which is to be the primary mode of propulsion for the NASP. The group concentrated on three areas of great concern to NASP design: propulsion, thermal management, and flight systems. Problem solving in these areas was directed toward design of the drone with the idea that the same design techniques could be applied to the NASP. A 70 deg swept double-delta wing configuration, developed in the 1970s at NASA Langley, was chosen as the aerodynamic and geometric model for the drone. This vehicle would be air launched from a B-1 at Mach 0.8 and 48,000 feet, rocket boosted by two internal engines to Mach 10 and 100,000 feet, and allowed to cruise under power of the scramjet engine until burnout. It would then return to base for an unpowered landing. Preliminary energy calculations based on flight requirements give the drone a gross launch weight of 134,000 pounds and an overall length of 85 feet.

  20. Batch sequential designs for computer experiments

    SciTech Connect

    Moore, Leslie M; Williams, Brian J; Loeppky, Jason L

    2009-01-01

    Computer models simulating a physical process are used in many areas of science. Due to the complex nature of these codes it is often necessary to approximate the code, which is typically done using a Gaussian process. In many situations the number of code runs available to build the Gaussian process approximation is limited. When the initial design is small or the underlying response surface is complicated, this can lead to poor approximations of the code output. In order to improve the fit of the model, sequential design strategies must be employed. In this paper we introduce two simple distance-based metrics that can be used to augment an initial design in a batch sequential manner. In addition we propose a sequential updating strategy for an orthogonal array based Latin hypercube sample. We show via various real and simulated examples that the distance metrics and the extension of the orthogonal array based Latin hypercubes work well in practice.
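
Distance-based batch augmentation of an initial design can be sketched with a greedy maximin rule: repeatedly add the candidate point farthest from the current design. The Euclidean metric and the candidate grid below are illustrative, not the paper's exact metrics or its orthogonal-array Latin hypercube extension.

```python
# Sketch of distance-based batch augmentation of an initial design:
# greedily add the candidate point with the largest minimum distance to
# the current design. The metric and candidate grid are illustrative,
# not the paper's exact criteria.
import math

def min_dist(p, design):
    return min(math.dist(p, q) for q in design)

def augment(design, candidates, batch):
    """Add `batch` points one at a time by the maximin-distance rule."""
    design = list(design)
    for _ in range(batch):
        design.append(max(candidates, key=lambda p: min_dist(p, design)))
    return design

initial = [(0.1, 0.1), (0.9, 0.9)]
grid = [(i / 4, j / 4) for i in range(5) for j in range(5)]  # candidate points
augmented = augment(initial, grid, batch=3)
print(augmented[len(initial):])  # fills the empty corners, then the center
```

Each batch fills the largest remaining gaps in the input space, which is exactly where a Gaussian process fit to few runs is least trustworthy.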

  1. Modal identification experiment design for large space structures

    NASA Technical Reports Server (NTRS)

    Kim, Hyoung M.; Doiron, Harold H.

    1991-01-01

    This paper describes an on-orbit modal identification experiment design for large space structures. Space Station Freedom (SSF) systems design definition and structural dynamic models were used as representative large space structures for optimizing the experiment design. Important structural modes of the study models were selected to provide a guide for experiment design and used to assess the design performance. A pulsed random excitation technique using propulsion jets was developed to identify closely-spaced modes. A measurement location selection approach was developed to estimate accurate mode shapes as well as frequencies and damping factors. The data acquisition system and operational scenarios were designed to have minimal impacts on the SSF. A comprehensive simulation was conducted to assess the overall performance of the experiment design.
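
The core identification task, estimating a frequency and damping factor from measured response data, can be illustrated on a synthetic single-mode free decay using peak picking and the logarithmic decrement. The mode parameters are assumed for illustration; real on-orbit data would be multi-modal and noisy, which is why the paper's excitation and sensor-placement design matters.

```python
# Illustration of estimating a modal frequency and damping factor from
# free-decay response data by peak picking and the logarithmic
# decrement. The single mode below is synthetic; real on-orbit data
# would be multi-modal and noisy.
import math

f_true, zeta_true = 1.5, 0.02  # Hz, damping ratio (assumed mode)
dt = 0.001
w = 2 * math.pi * f_true
n_samples = int(10 / f_true / dt)  # about ten cycles of decay
signal = [math.exp(-zeta_true * w * i * dt)
          * math.cos(w * math.sqrt(1 - zeta_true ** 2) * i * dt)
          for i in range(n_samples)]

# local maxima give the damped period and the decay per cycle
peaks = [(i * dt, signal[i]) for i in range(1, n_samples - 1)
         if signal[i - 1] < signal[i] > signal[i + 1]]
periods = [t2 - t1 for (t1, _), (t2, _) in zip(peaks, peaks[1:])]
f_est = len(periods) / sum(periods)
# logarithmic decrement between successive peaks -> damping ratio
deltas = [math.log(a / b) for (_, a), (_, b) in zip(peaks, peaks[1:])]
delta = sum(deltas) / len(deltas)
zeta_est = delta / math.sqrt(4 * math.pi ** 2 + delta ** 2)
print(f_est, zeta_est)  # close to the assumed 1.5 Hz and 0.02
```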

  2. Some experiences in aircraft aeroelastic design using Preliminary Aeroelastic Design of Structures (PAD)

    NASA Technical Reports Server (NTRS)

    Radovcich, N. A.

    1984-01-01

    The design experience associated with a benchmark aeroelastic design of an out of production transport aircraft is discussed. Current work being performed on a high aspect ratio wing design is reported. The Preliminary Aeroelastic Design of Structures (PADS) system is briefly summarized and some operational aspects of generating the design in an automated aeroelastic design environment are discussed.

  3. Designing Learning Experiences for Deeper Understanding

    ERIC Educational Resources Information Center

    Stripling, Barbara K.; Harada, Violet H.

    2012-01-01

    Planning is the less visible part of the teaching and learning process; however, it serves as the blueprint for student learning. To conceptualize the unit or project as a holistic learning experience, the authors created the C.L.E.A.R. G.O.A.L.S. guidelines that address the major elements of unit planning. An essential step is identifying the…

  4. Bi-Component Droplet Combustion Experiment Designed

    NASA Technical Reports Server (NTRS)

    Dietrich, Daniel L.

    2002-01-01

    The combustion of liquid fuels is a major source of energy in the world today, and the majority of these fuels are burned in the form of a spray. The research at the NASA Glenn Research Center in droplet combustion has the overall goal of providing a better understanding of spray combustion by studying the smallest element in a spray, the single droplet. The Bi-Component Droplet Combustion Experiment (BCDCE) extends the work at Glenn from pure, or single-component, fuels to an idealized liquid fuel composed of two completely miscible components. The project is a collaborative effort between Glenn and Prof. B.D. Shaw of the University of California, Davis. The BCDCE project is planned to fly onboard the International Space Station in the Multi-User Droplet Combustion Apparatus. The unique feature of this experiment is that it will be the first droplet combustion experiment to perform a detailed characterization of the flow inside a liquid fuel droplet. The experiment will use a relatively new technique called Digital Particle Image Velocimetry (DPIV) to characterize the liquid flow. In this technique, very small (approx. 5-μm diameter) particles are dispersed throughout a liquid droplet. These particles are illuminated by a thin laser sheet. Images of the particle motion are recorded on a computer, which then tracks the motion of the particles to determine the flow characteristics.

  5. Experience in Constructions: Designing a Wall

    ERIC Educational Resources Information Center

    Glenn, Barbara

    1978-01-01

    Viewing a contemporary artist's works to learn about the artist and his/her personal vision is one thing for elementary school students. Adding an actual experience of doing makes the exposure much more alive. Students at Snail Lake Elementary School in Moundsview, Minnesota, viewed a Louise Nevelson exhibit and were inspired to new uses of art…

  6. Hypersonic drone design: A multidisciplinary experience

    NASA Technical Reports Server (NTRS)

    1988-01-01

    Efforts were focused on design problems of an unmanned hypersonic vehicle. It is felt that a scaled hypersonic drone is necessary to bridge the gap between present theory on hypersonics and the future reality of the National Aerospace Plane (NASP) for two reasons: to fulfill a need for experimental data in the hypersonic regime, and to provide a testbed for the scramjet engine which is to be the primary mode of propulsion for the NASP. Three areas of great concern to NASP design were examined: propulsion, thermal management, and flight systems. Problem solving in these areas was directed towards design of the drone with the idea that the same design techniques could be applied to the NASP. A seventy-degree swept double delta wing configuration, developed in the 1970s at NASA Langley, was chosen as the aerodynamic and geometric model for the drone. This vehicle would be air-launched from a B-1 at Mach 0.8 and 48,000 feet, rocket boosted by two internal engines to Mach 10 and 100,000 feet, and allowed to cruise under power of the scramjet engine until burnout. It would then return to base for an unpowered landing. Preliminary energy calculations based upon the flight requirements give the drone a gross launch weight of 134,000 lb. and an overall length of 85 feet.

  7. Principles of Designing Interpretable Optogenetic Behavior Experiments

    ERIC Educational Resources Information Center

    Allen, Brian D.; Singer, Annabelle C.; Boyden, Edward S.

    2015-01-01

    Over the last decade, there has been much excitement about the use of optogenetic tools to test whether specific cells, regions, and projection pathways are necessary or sufficient for initiating, sustaining, or altering behavior. However, the use of such tools can result in side effects that can complicate experimental design or interpretation.…

  8. Learning Experience as Transaction: A Framework for Instructional Design

    ERIC Educational Resources Information Center

    Parrish, Patrick E.; Wilson, Brent G.; Dunlap, Joanna C.

    2011-01-01

    This article presents a framework for understanding learning experience as an object for instructional design--as an object for design as well as research and understanding. Compared to traditional behavioral objectives or discrete cognitive skills, the object of experience is more holistic, requiring simultaneous attention to cognition, behavior,…

  9. Student designed experiments to learn fluids

    NASA Astrophysics Data System (ADS)

    Stern, Catalina

    2013-11-01

    Lasers and high speed cameras are a wonderful tool to visualize the very complex behavior of fluids, and to help students grasp concepts like turbulence, surface tension and vorticity. In this work we present experiments done by physics students in their senior year at the School of Science of the National University of Mexico as a final project in the continuum mechanics course. Every semester, the students make an oral presentation of their work and videos and images are kept in the web page ``Pasión por los Fluidos''. I acknowledge support from the Physics Department of Facultad de Ciencias, Universidad Nacional Autónoma de México.

  10. Experiment to measure vacuum birefringence: Conceptual design

    NASA Astrophysics Data System (ADS)

    Mueller, Guido; Tanner, David; Doebrich, Babette; Poeld, Jan; Lindner, Axel; Willke, Benno

    2016-03-01

    Vacuum birefringence is another lingering challenge which will soon become accessible to experimental verification. The effect was first calculated by Euler and Heisenberg in 1936 and is these days described as a one-loop correction to the differential index of refraction between light which is polarized parallel and perpendicular to an external magnetic field. Our plan is to realize (and slightly modify) an idea which was originally published by Hall, Ye, and Ma using advanced LIGO and LISA technology and the infrastructure of the ALPS light-shining-through-walls experiment following the ALPS IIc science run. This work is supported by the Deutsche Forschungsgemeinschaft and the Heising-Simons Foundation.

  11. Design for a High Energy Density Kelvin-Helmholtz Experiment

    SciTech Connect

    Hurricane, O A

    2007-10-29

    While many high energy density physics (HEDP) Rayleigh-Taylor and Richtmyer-Meshkov instability experiments have been fielded as part of basic HEDP and astrophysics studies, not one HEDP Kelvin-Helmholtz (KH) experiment has been successfully performed. Herein, a design for a novel HEDP x-ray driven KH experiment is presented along with supporting radiation-hydrodynamic simulation and theory.

  12. Roles in Innovative Software Teams: A Design Experiment

    NASA Astrophysics Data System (ADS)

    Aaen, Ivan

    With inspiration from role-play and improvisational theater, we are developing a framework for innovation in software teams called Essence. Based on agile principles, Essence is designed for teams of developers and an onsite customer. This paper reports from teaching experiments inspired by design science, where we tried to assign differentiated roles to team members. The experiments provided valuable insights into the design of roles in Essence. These insights are used for redesigning how roles are described and conveyed in Essence.

  13. Proper battery system design for GAS experiments

    NASA Technical Reports Server (NTRS)

    Calogero, Stephen A.

    1992-01-01

    The purpose of this paper is to help the GAS experimenter to design a battery system that meets mission success requirements while at the same time reducing the hazards associated with the battery system. Lead-acid, silver-zinc and alkaline chemistry batteries will be discussed. Lithium batteries will be briefly discussed with emphasis on back-up power supply capabilities. The hazards associated with different battery configurations will be discussed along with the controls necessary to make the battery system two-fault tolerant.

  14. JASMINE Project --Instrument Design and Centroiding Experiment--

    NASA Astrophysics Data System (ADS)

    Yano, T.; Gouda, N.; Kobayashi, Y.; Yamada, Y.; Jasmine Working Group

    JASMINE is the acronym of the Japan Astrometry Satellite Mission for INfrared (z-band, 0.9 micron) Exploration and is planned to be launched around 2015. The main objective of JASMINE is to study the fundamental structure and evolution of the Milky Way Galaxy; another important objective is to investigate stellar physics. In order to accomplish these objectives, JASMINE will measure trigonometric parallaxes, positions, and proper motions of about ten million stars during the observational program with a precision of 10 microarcseconds at z = 14 mag. We present the instrument design of JASMINE (optics, detectors, etc.) and techniques for estimating the centroids of star images to accomplish the objectives. In order to obtain measurements of astrometric parameters with high accuracy, optics with a long focal length and a wide focal plane are required. The Korsch (3-mirror) system is one of the convincing models; however, the center of the field is totally vignetted because of the fold mirror. Therefore we consider an improved Korsch system in which the center of the field is not vignetted, and we obtain a diffraction-limited optical design with small distortion. We place dozens of CCD arrays with high quantum efficiency at z-band on the focal plane. This new type of detector is now being developed mainly at the National Astronomical Observatory of Japan. In order to accomplish the objectives, we must estimate the positions of star images on the CCD array with sub-pixel accuracy; therefore we need a technique to obtain precise positions of star images.

  15. On the design of closed recapture experiments.

    PubMed

    Alunni Fegatelli, Danilo; Farcomeni, Alessio

    2016-11-01

    We propose a method to plan the number of occasions of recapture experiments for population size estimation. We do so by fixing the smallest number of capture occasions so that the expected length of the profile confidence interval is less than or equal to a fixed threshold. In some cases, we solve the optimization problem in closed form. For more complex models we use numerical optimization. We detail models assuming homogeneous, time-varying, subject-specific capture probabilities, behavioral response to capture, and combining behavioral response with subject-specific effects. The principle we propose can be extended to plan any other model specification. We formally show the validity of the approach by proving distributional convergence. We illustrate with simulations and challenging examples in epidemiology and ecology. We report that in many cases adding as few as two sampling occasions may substantially reduce the length of confidence intervals.
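    The paper's criterion (smallest number of occasions such that the expected profile confidence interval length meets a threshold) requires model-specific computation, but the flavor of the calculation can be sketched with a much simpler planning heuristic under the homogeneous model: choose the smallest number of occasions t so that an individual with per-occasion capture probability p is caught at least once with a target probability. This heuristic is an illustrative assumption, not the authors' method.

```python
import math

def min_occasions(p, coverage=0.95):
    """Smallest number of capture occasions t such that an individual
    with per-occasion capture probability p is captured at least once
    with probability >= coverage, i.e. 1 - (1-p)**t >= coverage."""
    return math.ceil(math.log(1 - coverage) / math.log(1 - p))

print(min_occasions(0.3))  # 9 occasions for 95% coverage at p = 0.3
```

    As in the paper's examples, small changes in t matter: at p = 0.3, dropping the coverage target from 95% to 90% removes two occasions from the plan.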

  16. Design of a Microgravity Spray Cooling Experiment

    NASA Technical Reports Server (NTRS)

    Baysinger, Kerri M.; Yerkes, Kirk L.; Michalak, Travis E.; Harris, Richard J.; McQuillen, John

    2004-01-01

    An analytical and experimental study was conducted for the application of spray cooling in a microgravity and high-g environment. Experiments were carried out aboard the NASA KC-135 reduced gravity aircraft, which provided the microgravity and high-g environments. In reduced gravity, surface tension flow was observed around the spray nozzle, due to unconstrained liquid in the test chamber and flow reversal at the heat source. A transient analytical model was developed to predict the temperature and the spray heat transfer coefficient within the heated region. Comparison of the experimental transient temperature variation with analytical results showed good agreement for low heat input values. The transient analysis also verified that thermal equilibrium within the heated region could be reached during the 20-25s reduced gravity portion of the flight profile.

  17. Hypersonic Wind Tunnel Calibration Using the Modern Design of Experiments

    NASA Technical Reports Server (NTRS)

    Rhode, Matthew N.; DeLoach, Richard

    2005-01-01

    A calibration of a hypersonic wind tunnel has been conducted using formal experiment design techniques and response surface modeling. Data from a compact, highly efficient experiment were used to create a regression model of the pitot pressure as a function of the facility operating conditions as well as the longitudinal location within the test section. The new calibration utilized far fewer design points than prior experiments, but covered a wider range of the facility's operating envelope while revealing interactions between factors not captured in previous calibrations. A series of points chosen randomly within the design space was used to verify the accuracy of the response model. The development of the experiment design is discussed along with tactics used in the execution of the experiment to defend against systematic variation in the results. Trends in the data are illustrated, and comparisons are made to earlier findings.
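    The response-surface idea can be sketched as an ordinary least-squares fit of a full quadratic model in two factors. The factor names and true coefficients below are invented for illustration; they are not values from the actual tunnel calibration.

```python
import numpy as np

# Synthetic "calibration" data: two normalized factors on [-1, 1].
rng = np.random.default_rng(0)
x1 = rng.uniform(-1, 1, 40)  # e.g. normalized reservoir pressure (illustrative)
x2 = rng.uniform(-1, 1, 40)  # e.g. normalized axial station (illustrative)
# True surface: 2.0 + 0.5*x1 - 0.3*x2 + 0.2*x1*x2, plus small noise.
y = 2.0 + 0.5 * x1 - 0.3 * x2 + 0.2 * x1 * x2 + rng.normal(0, 0.01, 40)

# Full quadratic model matrix: 1, x1, x2, x1*x2, x1^2, x2^2
X = np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1**2, x2**2])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
print(np.round(coef, 1))  # close to the true values [2., 0.5, -0.3, 0.2, 0., 0.]
```

    The interaction term x1*x2 is the kind of factor interaction the calibration above reports; a one-factor-at-a-time calibration cannot estimate it.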

  18. Design and Analysis of a Static Aeroelastic Experiment

    NASA Astrophysics Data System (ADS)

    Hou, Ying-Yu; Yuan, Kai-Hua; Lv, Ji-Nan; Liu, Zi-Qiang

    2016-06-01

    Static aeroelastic experiments are very common in the United States and Russia. The objective of a static aeroelastic experiment is to investigate the deformation and loads of an elastic structure in a flow field; generally speaking, a prerequisite is that the stiffness distribution of the structure is known. This paper describes a method for designing experimental models in the case where both the stiffness distribution and the boundary conditions of the real aircraft are uncertain. The stiffness distribution of the structure is calculated via finite element modeling and simulation, and F141 steel and rigid foam are used to fabricate the elastic model. The design and manufacturing process of static aeroelastic models is presented: a set of experimental models was designed to simulate the stiffness of the designed wings, and a set of experiments was designed to check the results. The test results show that the experimental method can effectively complete the design of the elastic model. The paper introduces the whole process of the static aeroelastic experiment and analyzes the experimental results, developing a static aeroelasticity experiment technique and establishing an experiment model targeting the swept wing of a large-aspect-ratio aircraft.

  19. Block designs in method transfer experiments.

    PubMed

    Altan, Stan; Shoung, Jyh-Ming

    2008-01-01

    Method transfer is a part of the pharmaceutical development process in which an analytical (chemical) procedure developed in one laboratory (typically the research laboratory) is about to be adopted by one or more recipient laboratories (production or commercial operations). The objective is to show that the recipient laboratory is capable of performing the procedure in an acceptable manner. In the course of carrying out a method transfer, other questions may arise related to fixed or random factors of interest, such as analyst, apparatus, batch, supplier of analytical reagents, and so forth. Estimates of reproducibility and repeatability may also be of interest. This article focuses on the application of various block designs that have been found useful in the comprehensive study of method transfer beyond the laboratory effect alone. An equivalence approach to the comparison of laboratories can still be carried out on either the least squares means or subject-specific means of the laboratories to justify a method transfer or to compare analytical methods.

  20. Divertor design for the Tokamak Physics Experiment

    SciTech Connect

    Hill, D.N.; Braams, B.; Brooks, J.N.

    1994-05-01

    In this paper we discuss the present divertor design for the planned TPX tokamak, which will explore the physics and technology of steady-state (1000-s pulse) heat and particle removal in high-confinement (2-4× L-mode), high-beta (β_N ≥ 3) divertor plasmas sustained by non-inductive current drive. The TPX device will operate in the double-null divertor configuration, with actively cooled graphite targets forming a deep (0.5 m) slot at the outer strike point. The peak heat flux on the highly tilted (74° from normal), re-entrant targets (shaped to recycle ions back toward the separatrix) will be in the range of 4-6 MW/m² with 18 MW of neutral beam and RF heating power. The combination of active pumping and gas puffing (deuterium plus impurities), along with higher heating power (45 MW maximum), will allow testing of radiative divertor concepts at ITER-like power densities.

  1. Recent experience with design and manufacture of cine lenses

    NASA Astrophysics Data System (ADS)

    Thorpe, Michael D.; Dalzell, Kristen E.

    2015-09-01

    Modern cine lenses require a high degree of aberration correction over a large and ever expanding image size. At low to medium volume production levels, these highly corrected designs also require a workable tolerance set and compensation scheme for successful manufacture. In this paper we discuss the design and manufacture of cine lenses with reference to current designs both internal and in the patent literature and some experience in design, tolerancing and manufacturing these lenses in medium volume production.

  2. Optimal experiment design for identification of large space structures

    NASA Technical Reports Server (NTRS)

    Bayard, D. S.; Hadaegh, F. Y.; Meldrum, D. R.

    1988-01-01

    The optimal experiment design for on-orbit identification of modal frequency and damping parameters in large flexible space structures is discussed. The main result is a separation principle for D-optimal design which states that under certain conditions the sensor placement problem is decoupled from the input design problem. This decoupling effect significantly simplifies the overall optimal experiment design determination for large MIMO structural systems with many unknown modal parameters. The error from using the uncoupled design is estimated in terms of the inherent damping of the structure. A numerical example is given, demonstrating the usefulness of the simplified criteria in determining optimal designs for on-orbit Space Station identification experiments.

  3. User experience interaction design for digital educational games

    NASA Astrophysics Data System (ADS)

    Yuan, Jiugen; Zhang, Wenting; Xing, Ruonan

    2014-04-01

    Bringing the elements of games into education is one of the newest teaching concepts in the field of educational technology: healthy games are used to stimulate and sustain the learner's motivation, improve learning efficiency, and provide an experience of learning by playing. This article introduces the concepts of digital games and user experience, brings the essence of digital games to light, constructs a framework of user experience interaction design for digital educational games, and offers a design approach for the development of related products, in the hope that digital games will bring us continuous innovation in the learning experience.

  4. Electromechanical co-design and experiment of structurally integrated antenna

    NASA Astrophysics Data System (ADS)

    Zhou, Jinzhu; Huang, Jin; Song, Liwei; Zhang, Dan; Ma, Yunchao

    2015-03-01

    This paper proposes an electromechanical co-design method for a structurally integrated antenna to simultaneously meet mechanical and electrical requirements. The method consists of three stages. The first stage involves finishing an initial design of the microstrip antenna without a facesheet or honeycomb, according to some predefined performances. Subsequently, the facesheet and honeycomb of the structurally integrated antenna are designed using an electromechanical co-design optimization. Based on the results from the first and second stages, a fine full-wave electromagnetic model is developed and the coarse design results are further optimized to meet the electrical performance. The co-design method is applied to the design of a 2.5 GHz structurally integrated antenna, and the designed antenna is then fabricated. Experiments on the mechanical and electrical performance are conducted, and the results confirm the effectiveness of the co-design method. This method shows great promise for the multidisciplinary design of structurally integrated antennas.

  5. Design Considerations for Large Mass Ultra-Low Background Experiments

    SciTech Connect

    Aguayo Navarrete, Estanislao; Reid, Douglas J.; Fast, James E.; Orrell, John L.

    2011-07-01

    The objective of this document is to present the designers of the next generation of large-mass, ultra-low-background experiments with lessons learned and design strategies from previous experimental work. Design issues, divided by topic into mechanical, thermal, and electrical requirements, are addressed. Large-mass low-background experiments have been recognized by the scientific community as appropriate tools to aid in the refinement of the standard model. The design of these experiments is very costly, and a rigorous engineering review is required for their success. The extreme conditions that the components of the experiment must withstand (heavy shielding, vacuum/pressure, and temperature gradients), in combination with unprecedented noise levels, necessitate engineering guidance to support quality construction and safe operating conditions. Physical properties and analytical results for typical construction materials are presented. Design considerations for achieving ultra-low-noise data acquisition systems are addressed. Five large-mass, low-background conceptual designs for the one-tonne-scale germanium experiment are proposed and analyzed. The result is a series of recommendations for future experiment engineering and for the Majorana simulation task group to evaluate the different design approaches.

  6. Electrical design of payload G-534: The Pool Boiling Experiment

    NASA Technical Reports Server (NTRS)

    Francisco, David R.

    1992-01-01

    Payload G-534, the Pool Boiling Experiment (PBE), is a Get Away Special that is scheduled to fly on the shuttle in 1992. This paper will give a brief overall description of the experiment with the main discussion being the electrical design with a detailed description of the power system and interface to the GAS electronics. The batteries used and their interface to the experiment Power Control Unit (PCU) and GAS electronics will be examined. The design philosophy for the PCU will be discussed in detail. The criteria for selection of fuses, relays, power semiconductors and other electrical components along with grounding and shielding policy for the entire experiment will be presented. The intent of this paper is to discuss the use of military tested parts and basic design guidelines to build a quality experiment for minimal additional cost.

  7. Selecting the best design for nonstandard toxicology experiments.

    PubMed

    Webb, Jennifer M; Smucker, Byran J; Bailer, A John

    2014-10-01

    Although many experiments in environmental toxicology use standard statistical experimental designs, there are situations that arise where no such standard design is natural or applicable because of logistical constraints. For example, the layout of a laboratory may suggest that each shelf serve as a block, with the number of experimental units per shelf either greater than or less than the number of treatments in a way that precludes the use of a typical block design. In such cases, an effective and powerful alternative is to employ optimal experimental design principles, a strategy that produces designs with precise statistical estimates. Here, a D-optimal design was generated for an experiment in environmental toxicology that has 2 factors, 16 treatments, and constraints similar to those described above. After initial consideration of a randomized complete block design and an intuitive cyclic design, it was decided to compare a D-optimal design and a slightly more complicated version of the cyclic design. Simulations were conducted generating random responses under a variety of scenarios that reflect conditions motivated by a similar toxicology study, and the designs were evaluated via D-efficiency as well as by a power analysis. The cyclic design performed well compared to the D-optimal design.
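    A standard way to compare candidate designs like those above is the D-criterion, det(X'X), normalized into a relative D-efficiency. A minimal sketch for a linear model follows; the two-factor layout is illustrative, not the paper's 16-treatment design.

```python
import numpy as np

def d_efficiency(X_a, X_b):
    """Relative D-efficiency of design matrix X_a vs X_b for a linear
    model: (|X_a'X_a| / |X_b'X_b|)**(1/p), p = number of parameters."""
    p = X_a.shape[1]
    det_a = np.linalg.det(X_a.T @ X_a)
    det_b = np.linalg.det(X_b.T @ X_b)
    return (det_a / det_b) ** (1.0 / p)

# Main-effects model (intercept + 2 two-level factors) in 4 runs:
full = np.array([[1, -1, -1], [1, -1, 1], [1, 1, -1], [1, 1, 1]], float)  # 2^2 factorial
poor = np.array([[1, -1, -1], [1, -1, 1], [1, 1, -1], [1, -1, -1]], float)  # unbalanced
print(d_efficiency(full, poor))  # ~1.26: the factorial is ~26% more D-efficient
```

    Simulation-based comparison, as in the paper, layers a power analysis on top of this kind of efficiency calculation.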

  8. Adaptive multibeam phased array design for a Spacelab experiment

    NASA Technical Reports Server (NTRS)

    Noji, T. T.; Fass, S.; Fuoco, A. M.; Wang, C. D.

    1977-01-01

    The parametric tradeoff analyses and design of an Adaptive Multibeam Phased Array (AMPA) for a Spacelab experiment are described. This AMPA Experiment System was designed with particular emphasis on maximizing channel capacity and minimizing implementation and cost impacts for future austere maritime and aeronautical users, operating with a low-gain hemispherical-coverage antenna element, low effective radiated power, and a low antenna gain-to-system-noise-temperature ratio.

  9. 2011 AERA Presidential Address: Designing Resilient Ecologies--Social Design Experiments and a New Social Imagination

    ERIC Educational Resources Information Center

    Gutiérrez, Kris D.

    2016-01-01

    This article is about designing for educational possibilities--designs that in their inception, social organization, and implementation squarely address issues of cultural diversity, social inequality, and robust learning. I discuss an approach to design-based research, social design experiments, that privileges a social scientific inquiry…

  10. A Model for Designing Adaptive Laboratory Evolution Experiments.

    PubMed

    LaCroix, Ryan A; Palsson, Bernhard O; Feist, Adam M

    2017-04-15

    The occurrence of mutations is a cornerstone of the evolutionary theory of adaptation, capitalizing on the rare chance that a mutation confers a fitness benefit. Natural selection is increasingly being leveraged in laboratory settings for industrial and basic science applications. Despite increasing deployment, there are no standardized procedures available for designing and performing adaptive laboratory evolution (ALE) experiments. Thus, there is a need to optimize the experimental design, specifically for determining when to consider an experiment complete and for balancing outcomes with available resources (i.e., laboratory supplies, personnel, and time). To design and to better understand ALE experiments, a simulator, ALEsim, was developed, validated, and applied to the optimization of ALE experiments. The effects of various passage sizes were experimentally determined and subsequently evaluated with ALEsim, to explain differences in experimental outcomes. Furthermore, a beneficial mutation rate of 10^-6.9 to 10^-8.4 mutations per cell division was derived. A retrospective analysis of ALE experiments revealed that passage sizes typically employed in serial passage batch culture ALE experiments led to inefficient production and fixation of beneficial mutations. ALEsim and the results described here will aid in the design of ALE experiments to fit the exact needs of a project while taking into account the resources required and will lower the barriers to entry for this experimental technique. IMPORTANCE: ALE is a widely used scientific technique to increase scientific understanding, as well as to create industrially relevant organisms. The manner in which ALE experiments are conducted is highly manual and uniform, with little optimization for efficiency. Such inefficiencies result in suboptimal experiments that can take multiple months to complete. With the availability of automation and computer simulations, we can now perform these experiments in an optimized
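    The passage-size effect reported above can be illustrated with a toy expectation: beneficial mutants arise during growth at rate mu per cell division, and each new mutant survives the dilution bottleneck with probability roughly equal to the passage fraction. ALEsim is far more detailed than this; the population sizes and mutation rate below are illustrative assumptions.

```python
def surviving_mutants_per_passage(final_pop, passage_size, mu):
    """Expected beneficial mutants arising in one growth cycle that
    also survive the dilution bottleneck (crude well-mixed model).
    Divisions per cycle ~= final_pop - passage_size; each new mutant
    survives the passage with probability passage_size / final_pop."""
    divisions = final_pop - passage_size
    return mu * divisions * (passage_size / final_pop)

# Larger passage fractions waste fewer newly arisen beneficial mutations:
for frac in (0.001, 0.01, 0.1):
    n = surviving_mutants_per_passage(1e9, frac * 1e9, mu=1e-7)
    print(frac, n)  # roughly 0.1, 1, and 9 surviving mutants per cycle
```

    This is why very small passage sizes can leave an ALE experiment waiting many cycles for a single beneficial mutation to establish.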

  11. Application of optimal design methodologies in clinical pharmacology experiments.

    PubMed

    Ogungbenro, Kayode; Dokoumetzidis, Aristides; Aarons, Leon

    2009-01-01

    Pharmacokinetics and pharmacodynamics data are often analysed by mixed-effects modelling techniques (also known as population analysis), which have become a standard tool in the pharmaceutical industry for drug development. The last 10 years have witnessed considerable interest in the application of experimental design theories to population pharmacokinetic and pharmacodynamic experiments. Design of population pharmacokinetic experiments involves selection and a careful balance of a number of design factors. Optimal design theory uses prior information about the model and parameter estimates to optimize a function of the Fisher information matrix to obtain the best combination of the design factors. This paper provides a review of the different approaches that have been described in the literature for optimal design of population pharmacokinetic and pharmacodynamic experiments. It describes the options that are available and highlights some of the issues that could be of concern in practical application. It also discusses areas of application of optimal design theories in clinical pharmacology experiments. It is expected that as awareness of the benefits of this approach increases, more people will embrace it, ultimately leading to more efficient population pharmacokinetic and pharmacodynamic experiments and helping to reduce both cost and time during drug development.

  12. Factor Analysis of Dichotomous Memory Items from a Designed Experiment.

    ERIC Educational Resources Information Center

    Hofacker, Charles F.

    Recently, confirmatory factor analysis has been extended to the case of dichotomous data (e.g., Muthen, 1978). In this study, confirmatory factor analysis was applied to all-or-none recall data from a designed experiment. In the experiment, subjects read pairs of English nouns and then tried to recall the right hand member of the pair when…

  13. Shuttle wave experiments. [space plasma investigations: design and instrumentation

    NASA Technical Reports Server (NTRS)

    Calvert, W.

    1976-01-01

    Wave experiments on shuttle are needed to verify dispersion relations, to study nonlinear and exotic phenomena, to support other plasma experiments, and to test engineering designs. Techniques based on coherent detection and bistatic geometry are described. New instrumentation required to provide modules for a variety of missions and to incorporate advanced signal processing and control techniques is discussed. An experiment on Z-to-O mode coupling is included.

  14. Thermal design, analysis and testing of the Halogen Occultation Experiment

    NASA Technical Reports Server (NTRS)

    Foss, Richard A.; Smith, Dewey M.

    1987-01-01

    This paper briefly introduces the Halogen Occultation Experiment (HALOE) and describes the thermal requirements in some detail. The thermal design of the HALOE is described, together with the design process and the analytical techniques used to arrive at this design. The flight hardware has undergone environmental testing in a thermal vacuum chamber to validate the thermal design. The HALOE is a unique problem in thermal control due to its variable solar loading, its extremely sensitive optical components and the high degree of pointing accuracy required. This paper describes the flight hardware, the design process and its verification.

  15. Building a Framework for Engineering Design Experiences in High School

    ERIC Educational Resources Information Center

    Denson, Cameron D.; Lammi, Matthew

    2014-01-01

    In this article, Denson and Lammi put forth a conceptual framework that will help promote the successful infusion of engineering design experiences into high school settings. When considering a conceptual framework of engineering design in high school settings, it is important to consider the complex issue at hand. For the purposes of this…

  16. Design of spatial experiments: Model fitting and prediction

    SciTech Connect

    Fedorov, V.V.

    1996-03-01

    The main objective of the paper is to describe and develop model oriented methods and algorithms for the design of spatial experiments. Unlike many other publications in this area, the approach proposed here is essentially based on the ideas of convex design theory.

  17. Working Theory into and out of Design Experiments

    ERIC Educational Resources Information Center

    Palincsar, Annemarie Sullivan

    2005-01-01

    In this response, I advocate for the value of considering theory in the design-based research that Gersten describes in Behind the Scenes of an Intervention Research Study. I argue that such an emphasis: is consistent with the literature on design experiments, is integral to advancing knowledge building within domains, serves to advance the work…

  18. On Design Experiment Teaching in Engineering Quality Cultivation

    ERIC Educational Resources Information Center

    Chen, Xiao

    2008-01-01

    A design experiment is one designed and conducted independently by students, and it is an important method for cultivating students' comprehensive quality. In light of the development and requirements of experimental teaching, this article presents a study and analysis of the purpose, significance, denotation, connotation and…

  19. Resolution of an Orbital Issue: A Designed Experiment

    NASA Technical Reports Server (NTRS)

    Huddleston, Lisa L.

    2011-01-01

    Design of Experiments (DOE) is a systematic approach to investigation of a system or process. A series of structured tests are designed in which planned changes are made to the input variables of a process or system. The effects of these changes on a pre-defined output are then assessed. DOE is a formal method of maximizing information gained while minimizing resources required.

  20. Thinking about "Design Thinking": A Study of Teacher Experiences

    ERIC Educational Resources Information Center

    Retna, Kala S.

    2016-01-01

    Schools are continuously looking for new ways of enhancing student learning to equip students with skills that would enable them to cope with twenty-first century demands. One promising approach focuses on design thinking. This study examines teachers' perceptions, experiences and challenges faced in adopting design thinking. There is a lack of…

  1. Factorial Design: An Eight Factor Experiment Using Paper Helicopters

    NASA Technical Reports Server (NTRS)

    Kozma, Michael

    1996-01-01

    The goal of this paper is to present the analysis of the multi-factor experiment (factorial design) conducted in EG490, Junior Design at Loyola College in Maryland. The discussion of this paper concludes the experimental analysis and ties the individual class papers together.
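
    A full two-level factorial for eight factors, as in this paper-helicopter experiment, enumerates every combination of low/high settings. The factor names below are hypothetical stand-ins (the abstract does not list the actual factors):

```python
from itertools import product

# Hypothetical factor names for a paper-helicopter study
factors = ["wing length", "body length", "body width", "paper weight",
           "wing tape", "body tape", "paper clip", "fold direction"]

# Full 2^8 factorial: one run per combination of low (-1) / high (+1) levels
runs = list(product([-1, +1], repeat=len(factors)))
print(len(runs))  # -> 256

# Balance check: each factor is at its high level in exactly half the runs
highs = [sum(run[i] == +1 for run in runs) for i in range(len(factors))]
```

    In practice, 256 helicopter drops are rarely affordable, which is why fractional factorials (e.g. a 16-run 2^(8-4) design) are the usual classroom choice; the full enumeration above is the starting point from which such fractions are selected.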

  2. Teaching Optimal Design of Experiments Using a Spreadsheet

    ERIC Educational Resources Information Center

    Goos, Peter; Leemans, Herlinde

    2004-01-01

    In this paper, we present an interactive teaching approach to introduce the concept of optimal design of experiments to students. Our approach is based on the use of spreadsheets. One advantage of this approach is that no complex mathematical theory is needed nor that any design construction algorithm has to be discussed at the introductory stage.…

  3. Advances in Experiment Design for High Performance Aircraft

    NASA Technical Reports Server (NTRS)

    Morelli, Eugene A.

    1998-01-01

    A general overview and summary of recent advances in experiment design for high performance aircraft is presented, along with results from flight tests. General theoretical background is included, with some discussion of various approaches to maneuver design. Flight test examples from the F-18 High Alpha Research Vehicle (HARV) are used to illustrate applications of the theory. Input forms are compared using Cramer-Rao bounds for the standard errors of estimated model parameters. Directions for future research in experiment design for high performance aircraft are identified.

  4. Optimisation of sampling windows design for population pharmacokinetic experiments.

    PubMed

    Ogungbenro, Kayode; Aarons, Leon

    2008-08-01

    This paper describes an approach for optimising sampling windows for population pharmacokinetic experiments. Sampling windows designs are more practical in late phase drug development where patients are enrolled in many centres and in out-patient clinic settings. Collection of samples under the uncontrolled environment at these centres at fixed times may be problematic and can result in uninformative data. Population pharmacokinetic sampling windows design provides an opportunity to control when samples are collected by allowing some flexibility and yet provide satisfactory parameter estimation. This approach uses information obtained from previous experiments about the model and parameter estimates to optimise sampling windows for population pharmacokinetic experiments within a space of admissible sampling windows sequences. The optimisation is based on a continuous design and in addition to sampling windows the structure of the population design in terms of the proportion of subjects in elementary designs, number of elementary designs in the population design and number of sampling windows per elementary design is also optimised. The results obtained showed that optimal sampling windows designs obtained using this approach are very efficient for estimating population PK parameters and provide greater flexibility in terms of when samples are collected. The results obtained also showed that the generalized equivalence theorem holds for this approach.

  5. Functional design to support CDTI/DABS flight experiments

    NASA Technical Reports Server (NTRS)

    Goka, T.

    1982-01-01

    The objectives of this project are to: (1) provide a generalized functional design of CDTI avionics using the FAA-developed DABS/ATARS ground system as the 'traffic sensor', (2) specify software modifications and/or additions to the existing DABS/ATARS ground system to support CDTI avionics, (3) assess the existing avionics of a NASA research aircraft in terms of CDTI applications, and (4) apply the generalized functional design to provide research flight experiment capability. DABS Data Link Formats are first specified for CDTI flight experiments. The set of CDTI/DABS Format specifications becomes a vehicle to coordinate the CDTI avionics and ground system designs, and hence, to develop overall system requirements. The report is the first iteration of a system design and development effort to support eventual CDTI flight test experiments.

  6. Vestibular Function Research (VFR) experiment. Phase B: Design definition study

    NASA Technical Reports Server (NTRS)

    1978-01-01

    The Vestibular Functions Research (VFR) Experiment was established to investigate the neurosensory and related physiological processes believed to be associated with the space flight nausea syndrome and to develop logical means for its prediction, prevention and treatment. The VFR Project consists of ground and spaceflight experimentation using frogs as specimens. The Phase B Preliminary Design Study provided for the preliminary design of the experiment hardware, preparation of performance and hardware specifications and a Phase C/D development plan, establishment of STS (Space Transportation System) interfaces and mission operations, and the study of a variety of hardware, experiment and mission options. The study consists of three major tasks: (1) mission mode trade-off; (2) conceptual design; and (3) preliminary design.

  7. Conceptual design for high mass imploding liner experiments

    SciTech Connect

    Reinovsky, R.E.; Clark, D.A.; Ekdahl, C.A.

    1996-12-31

    We have summarized some of the motivation behind high energy liner experiments. We have identified the 100-cm-diameter Disk Explosive Magnetic Generator (DEMG) as a promising candidate for powering such experiments and described a phenomenological modeling approach used to understand the limits of DEMG operation. We have explored the liner implosion parameter space that can be addressed by such systems and have selected a design point from which to develop a conceptual experiment. We have applied the phenomenological model to the point design parameters and used 1D MHD tools to assess the behavior of the liner for parameters at the design point. We have not optimized the choice of pulse power or liner parameters. We conclude that, operating in the velocity range of 10-20 km/s, kinetic energies around 100 MJ are practical with currents approaching 200 MA in the liner. Higher velocities (up to almost 40 km/s) are achieved on the inner surface of a thick liner when the liner collapses to 1-cm radius. At 6-cm radius the non-optimized liners explored here are attractive drivers for experiments exploring the compression of magnetized plasmas, and at 1 cm they are equally attractive drivers for shock wave experiments in the pressure range of 30-100 Mbar. An experiment based on this design concept is scheduled to be conducted at VNIIEF in August 1996.

  8. Design of Orion Soil Impact Study using the Modern Design of Experiments

    NASA Technical Reports Server (NTRS)

    DeLoach, Richard

    2010-01-01

    Two conventional One Factor At a Time (OFAT) test matrices under consideration for an Orion Landing System subscale soil impact study are reviewed. Certain weaknesses in the designs, systemic to OFAT experiment designs generally, are identified. An alternative test matrix is proposed that is based in the Modern Design of Experiments (MDOE), which achieves certain synergies by combining the original two test matrices into one. The attendant resource savings are quantified and the impact on uncertainty is discussed.

  9. Design of Experiments, Model Calibration and Data Assimilation

    SciTech Connect

    Williams, Brian J.

    2014-07-30

    This presentation provides an overview of emulation, calibration and experiment design for computer experiments. Emulation refers to building a statistical surrogate from a carefully selected and limited set of model runs to predict unsampled outputs. The standard kriging approach to emulation of complex computer models is presented. Calibration refers to the process of probabilistically constraining uncertain physics/engineering model inputs to be consistent with observed experimental data. An initial probability distribution for these parameters is updated using the experimental information. Markov chain Monte Carlo (MCMC) algorithms are often used to sample the calibrated parameter distribution. Several MCMC algorithms commonly employed in practice are presented, along with a popular diagnostic for evaluating chain behavior. Space-filling approaches to experiment design for selecting model runs to build effective emulators are discussed, including Latin Hypercube Design and extensions based on orthogonal array skeleton designs and imposed symmetry requirements. Optimization criteria that further enforce space-filling, possibly in projections of the input space, are mentioned. Designs to screen for important input variations are summarized and used for variable selection in a nuclear fuels performance application. This is followed by illustration of sequential experiment design strategies for optimization, global prediction, and rare event inference.
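
    The Latin Hypercube Design mentioned in this abstract can be sketched in a few lines: stratify each input dimension into n equal bins, and place exactly one point per bin per dimension. This is a minimal stdlib implementation for illustration, not the presentation's own code:

```python
import random

def latin_hypercube(n, d, seed=0):
    """n points in d dimensions on [0,1): each dimension gets exactly one
    point in each of its n equal-width bins (a space-filling property)."""
    rng = random.Random(seed)
    cols = []
    for _ in range(d):
        bins = list(range(n))
        rng.shuffle(bins)                      # random bin order per dimension
        # jitter each point uniformly within its assigned bin
        cols.append([(b + rng.random()) / n for b in bins])
    return [tuple(col[i] for col in cols) for i in range(n)]

pts = latin_hypercube(5, 2)
```

    Extensions such as the orthogonal-array and symmetry-constrained variants mentioned above add structure to which bins may pair up across dimensions, but the one-point-per-bin marginal property is the same.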

  10. Trade-offs in the design of experiments.

    PubMed

    Wiley, R Haven

    2009-11-01

    This comment supplements and clarifies issues raised by J. C. Schank and T. J. Koehnle (2009) in their critique of experimental design. First, the pervasiveness of trade-offs in the design of experiments is emphasized (Wiley, 2003). Particularly germane to Schank and Koehnle's discussion are the inevitable trade-offs in any decisions to include blocking or to standardize conditions in experiments. Second, the interpretation of multiple tests of a hypothesis is clarified. Only when interest focuses on any, rather than each, of N possible responses is it appropriate to adjust criteria for statistical significance of the results. Finally, a misunderstanding is corrected about a disadvantage of large experiments (Wiley, 2003). Experiments with large samples raise the possibility of small, but statistically significant, biases even after randomization of treatments. Because these small biases are difficult for experimenters and readers to notice, large experiments demonstrating small effects require special scrutiny. Such experiments are justified only when they involve minimal human intervention and maximal standardization. Justifications for the inevitable trade-offs in experimental design require careful attention when reporting any experiment.
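
    The adjustment the abstract refers to, tightening the per-test significance criterion when any of N responses would count as a positive result, is commonly done with a Bonferroni or Šidák correction. A small illustrative example (the comment itself does not prescribe a specific method):

```python
def bonferroni(alpha, n_tests):
    # Per-test threshold so the family-wise error rate stays <= alpha
    return alpha / n_tests

def sidak(alpha, n_tests):
    # Exact under independence: solves 1 - (1 - a)^n = alpha for a
    return 1 - (1 - alpha) ** (1 / n_tests)

p_values = [0.003, 0.02, 0.04, 0.30]   # made-up p-values for N = 4 responses
threshold = bonferroni(0.05, len(p_values))   # 0.0125
significant = [p for p in p_values if p < threshold]
```

    Here only p = 0.003 survives, whereas at the unadjusted 0.05 level three of the four tests would; this is exactly the any-vs-each distinction the comment draws.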

  11. Structural Design Feasibility Study for the Global Climate Experiment

    SciTech Connect

    Lewin,K.F.; Nagy, J.

    2008-12-01

    NEON, Inc. is proposing to establish a Global Change Experiment (GCE) Facility to increase our understanding of how ecological systems differ in their vulnerability to changes in climate and other relevant global change drivers, as well as provide the mechanistic basis for forecasting ecological change in the future. The experimental design was initially envisioned to consist of two complementary components: (A) a multi-factor experiment manipulating CO{sub 2}, temperature and water availability and (B) a water balance experiment. As the design analysis and cost estimates progressed, it became clear that (1) the technical difficulties of obtaining tight temperature control and maintaining elevated atmospheric carbon dioxide levels within an enclosure were greater than had been expected and (2) the envisioned study would not fit into the expected budget envelope if this was done in a partially or completely enclosed structure. After discussions between NEON management, the GCE science team, and Keith Lewin, NEON, Inc. asked Keith Lewin to expand the scope of this design study to include open-field exposure systems. In order to develop the GCE design to the point where it can be presented within a proposal for funding, a feasibility study of climate manipulation structures must be conducted to determine design approaches and rough cost estimates, and to identify advantages and disadvantages of these approaches, including the associated experimental artifacts. NEON, Inc. requested this design study in order to develop concepts for the climate manipulation structures to support the NEON Global Climate Experiment. This study summarizes the design concepts considered for constructing and operating the GCE Facility and their associated construction, maintenance and operations costs. Comparisons and comments about experimental artifacts, construction challenges and operational uncertainties are provided to assist in selecting the final facility design.

  12. Experience-based design, co-design and experience-based co-design in palliative and end-of-life care.

    PubMed

    Borgstrom, Erica; Barclay, Stephen

    2017-02-16

    Experience-based design, co-design, and experience-based co-design can be used within healthcare to design services that improve the patient, carer and staff experience of the services. As palliative and end-of-life care centrally value person-centred care, we believe that service designers, commissioners and those tasked with making quality improvements will be interested in this growing field. This paper outlines these approaches, with a particular emphasis on experience-based co-design, and describes how they are and can be used within palliative and end-of-life care. Based on a rapid review and several case studies, this article highlights the key lessons learnt from previous projects using these approaches and discusses areas for improvement in current reporting of service design projects.

  13. A Bubble Mixture Experiment Project for Use in an Advanced Design of Experiments Class

    ERIC Educational Resources Information Center

    Steiner, Stefan H.; Hamada, Michael; White, Bethany J.Giddings; Kutsyy, Vadim; Mosesova, Sofia; Salloum, Geoffrey

    2007-01-01

    This article gives an example of how student-conducted experiments can enhance a course in the design of experiments. We focus on a project whose aim is to find a good mixture of water, soap and glycerin for making soap bubbles. This project is relatively straightforward to implement and understand. At its most basic level the project introduces…

  14. Designing for our future selves: the Swedish experience.

    PubMed

    Benktzon, M

    1993-02-01

    The social context of Sweden provides a good environment for research and development of products and technical aids for the disabled and elderly. However, the model used by Swedish ergonomists and designers in Ergonomi Design Gruppen emphasizes how the application of experience gained from designing such aids can lead to better products for everyone. Three main examples are given to demonstrate how ergonomics studies and prototype/model evaluation by the target users have led to new designs for familiar objects: eating implements, walking sticks and coffee pots. Addressing particular aspects of design for people with specific difficulties, and problems associated with the use of everyday items, has led to designs which are acceptable to a broader range of users.

  15. Photon detection system designs for the Deep Underground Neutrino Experiment

    NASA Astrophysics Data System (ADS)

    Whittington, D.

    2016-05-01

    The Deep Underground Neutrino Experiment (DUNE) will be a premier facility for exploring long-standing questions about the boundaries of the standard model. Acting in concert with the liquid argon time projection chambers underpinning the far detector design, the DUNE photon detection system will capture ultraviolet scintillation light in order to provide valuable timing information for event reconstruction. To maximize the active area while maintaining a small photocathode coverage, the experiment will utilize a design based on plastic light guides coated with a wavelength-shifting compound, along with silicon photomultipliers, to collect and record scintillation light from liquid argon. This report presents recent preliminary performance measurements of this baseline design and several alternative designs which promise significant improvements in sensitivity to low-energy interactions.

  16. Photon Detection System Designs for the Deep Underground Neutrino Experiment

    SciTech Connect

    Whittington, Denver

    2015-11-19

    The Deep Underground Neutrino Experiment (DUNE) will be a premier facility for exploring long-standing questions about the boundaries of the standard model. Acting in concert with the liquid argon time projection chambers underpinning the far detector design, the DUNE photon detection system will capture ultraviolet scintillation light in order to provide valuable timing information for event reconstruction. To maximize the active area while maintaining a small photocathode coverage, the experiment will utilize a design based on plastic light guides coated with a wavelength-shifting compound, along with silicon photomultipliers, to collect and record scintillation light from liquid argon. This report presents recent preliminary performance measurements of this baseline design and several alternative designs which promise significant improvements in sensitivity to low-energy interactions.

  17. Operational experience and design recommendations for teleoperated flight hardware

    NASA Technical Reports Server (NTRS)

    Burgess, T. W.; Kuban, D. P.; Hankins, W. W.; Mixon, R. W.

    1988-01-01

    Teleoperation (remote manipulation) will someday supplement/minimize astronaut extravehicular activity in space to perform such tasks as satellite servicing and repair, and space station construction and servicing. This technology is being investigated by NASA with teleoperation of two space-related tasks having been demonstrated at the Oak Ridge National Lab. The teleoperator experiments are discussed and the results of these experiments are summarized. The related equipment design recommendations are also presented. In addition, a general discussion of equipment design for teleoperation is also presented.

  18. Preliminary Design Program: Vapor Compression Distillation Flight Experiment Program

    NASA Technical Reports Server (NTRS)

    Schubert, F. H.; Boyda, R. B.

    1995-01-01

    This document provides a description of the results of a program to prepare a preliminary design of a flight experiment to demonstrate the function of a Vapor Compression Distillation (VCD) Wastewater Processor (WWP) in microgravity. This report describes the test sequence to be performed and the hardware, control/monitor instrumentation and software designs prepared to perform the defined tests. The purpose of the flight experiment is to significantly reduce the technical and programmatic risks associated with implementing a VCD-based WWP on board the International Space Station Alpha.

  19. Linear design considerations for TO-10 candidate experiment

    SciTech Connect

    Atchison, Walter A; Rousculp, Christopher L

    2011-01-12

    As part of the LANL/VNIIEF collaboration a high velocity cylindrical liner driven Hugoniot experiment is being designed to be driven by a VNIIEF Disk Explosive Magnetic (flux compression) Generator (DEMG). Several variations in drive current and liner thickness have been proposed. This presentation will describe the LANL 1D and 2D simulations used to evaluate those designs. The presentation will also propose an analysis technique to assess a high current drive system's ability to stably and optimally drive a cylindrical aluminum liner for this type of experiment.

  20. Designing a Hybrid Laminar-Flow Control Experiment: The CFD-Experiment Connection

    NASA Technical Reports Server (NTRS)

    Streett, C. L.

    2003-01-01

    The NASA/Boeing hybrid laminar flow control (HLFC) experiment, designed during 1993-1994 and conducted in the NASA LaRC 8-foot Transonic Pressure Tunnel in 1995, utilized computational fluid dynamics and numerical simulation of complex fluid mechanics to an unprecedented extent for the design of the test article and measurement equipment. CFD was used in: the design of the test wing, which was carried from definition of desired disturbance growth characteristics, through to the final airfoil shape that would produce those growth characteristics; the design of the suction-surface perforation pattern that produced enhanced crossflow-disturbance growth; and in the design of the hot-wire traverse system that produced minimal influence on measured disturbance growth. These and other aspects of the design of the test are discussed, after the historical and technical context of the experiment is described.

  1. Design of a microwave calorimeter for the microwave tokamak experiment

    SciTech Connect

    Marinak, M.

    1988-10-07

    The initial design of a microwave calorimeter for the Microwave Tokamak Experiment is presented. The design is optimized to measure the refraction and absorption of millimeter rf microwaves as they traverse the toroidal plasma of the Alcator C tokamak. Techniques utilized can be adapted for use in measuring high intensity pulsed output from a microwave device in an environment of ultra high vacuum, intense fields of ionizing and non-ionizing radiation and intense magnetic fields. 16 refs.

  2. Radiometer experiment for the aeroassist flight experiment. [Thermal protection data for Orbital Transfer Vehicle design

    NASA Technical Reports Server (NTRS)

    Davy, W. C.; Park, C.; Arnold, J. O.; Balakrishnan, A.

    1985-01-01

    A forthcoming NASA flight experiment is described that provides an opportunity to obtain a large base of radiometric data for high-altitude, high-velocity thermochemically nonequilibrated-flow conditions. As a preliminary to the design of a radiometer for this experiment, an approximate method for predicting both equilibrium and nonequilibrium radiative surface fluxes is described. Spectral results for one trajectory state, a velocity of 10 km/sec at an altitude of 85 km, are presented. These results are then used to develop some of the instrument parameters that will be needed for the design of the three genres of radiometers that are proposed for this experiment.

  3. Optimizing an experimental design for an electromagnetic experiment

    NASA Astrophysics Data System (ADS)

    Roux, E.; Garcia, X.

    2012-04-01

    Most of geophysical studies focus on data acquisition and analysis, but another aspect which is gaining importance is the discussion on acquisition of suitable datasets. This can be done through the design of an optimal experiment. An optimal experiment maximizes geophysical information while maintaining the cost of the experiment as low as possible. This requires a careful selection of recording parameters as source and receivers locations or range of periods needed to image the target. We are developing a method to design an optimal experiment in the context of detecting and monitoring a CO2 reservoir using controlled-source electromagnetic (CSEM) data. Using an algorithm for a simple one-dimensional (1D) situation, we look for the most suitable locations for source and receivers and optimum characteristics of the source to image the subsurface. One main advantage of this kind of technique to design an experiment is that it does not require the acquisition of any data and can thus be easily conducted before any geophysical survey. Our algorithm is based on a genetic algorithm which has been proved to be an efficient technique to examine a wide range of possible surveys and select the one that gives superior resolution. Each configuration is associated to one value of the objective function that characterizes the quality of this particular design. Here, we describe the method used to optimize an experimental design. Then, we validate this new technique and explore the different issues of experimental design by simulating a CSEM survey with a realistic 1D layered model.

  4. Optimizing an experimental design for an electromagnetic experiment

    NASA Astrophysics Data System (ADS)

    Roux, Estelle; Garcia, Xavier

    2013-04-01

    Most of geophysical studies focus on data acquisition and analysis, but another aspect which is gaining importance is the discussion on acquisition of suitable datasets. This can be done through the design of an optimal experiment. Optimizing an experimental design implies a compromise between maximizing the information we get about the target and reducing the cost of the experiment, considering a wide range of constraints (logistical, financial, experimental …). We are currently developing a method to design an optimal controlled-source electromagnetic (CSEM) experiment to detect a potential CO2 reservoir and monitor this reservoir during and after CO2 injection. Our statistical algorithm combines the use of linearized inverse theory (to evaluate the quality of one given design via the objective function) and stochastic optimization methods like genetic algorithms (to examine a wide range of possible surveys). The particularity of our method is that it uses a multi-objective genetic algorithm that searches for designs that fit several objective functions simultaneously. One main advantage of this kind of technique to design an experiment is that it does not require the acquisition of any data and can thus be easily conducted before any geophysical survey. Our new experimental design algorithm has been tested with a realistic one-dimensional resistivity model of the Earth in the region of study (northern Spain CO2 sequestration test site). We show that a small number of well distributed observations have the potential to resolve the target. This simple test also points out the importance of a well-chosen objective function. Finally, in the context of CO2 sequestration that motivates this study, we might be interested in maximizing the information we get about the reservoir layer. In that case, we show how the combination of two different objective functions considerably improves its resolution.
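
    The genetic-algorithm survey search described in this abstract can be sketched as follows. Everything here is a toy stand-in: a binary chromosome marks which candidate receiver positions are used, and the fitness function rewards spatial spread while penalizing receiver count. A real CSEM design study would instead score each candidate design with a model-resolution objective from linearized inverse theory:

```python
import random

def fitness(design, positions):
    # Toy objective: spread of the selected receivers minus a cost term
    chosen = [x for x, keep in zip(positions, design) if keep]
    if len(chosen) < 2:
        return 0.0
    return (max(chosen) - min(chosen)) - 0.5 * len(chosen)

def evolve(positions, pop_size=30, gens=40, seed=1):
    rng = random.Random(seed)
    n = len(positions)
    pop = [[rng.randint(0, 1) for _ in range(n)] for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=lambda d: fitness(d, positions), reverse=True)
        parents = pop[: pop_size // 2]          # truncation selection (elitist)
        children = []
        while len(parents) + len(children) < pop_size:
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, n)
            child = a[:cut] + b[cut:]           # one-point crossover
            if rng.random() < 0.2:              # occasional bit-flip mutation
                i = rng.randrange(n)
                child[i] = 1 - child[i]
            children.append(child)
        pop = parents + children
    return max(pop, key=lambda d: fitness(d, positions))

best = evolve([float(i) for i in range(10)])
```

    The multi-objective variant the authors use replaces the single fitness value with a vector of objectives and a Pareto-ranking selection step, but the generate-select-recombine loop is the same.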

  5. Preliminary design of two Space Shuttle fluid physics experiments

    NASA Technical Reports Server (NTRS)

    Gat, N.; Kropp, J. L.

    1984-01-01

    The mid-deck lockers of the STS and the requirements for operating an experiment in this region are described. The design of the surface tension induced convection and the free surface phenomenon experiments use a two locker volume with an experiment unique structure as a housing. A manual mode is developed for the Surface Tension Induced Convection experiment. The fluid is maintained in an accumulator pre-flight. To begin the experiment, a pressurized gas drives the fluid into the experiment container. The fluid is an inert silicone oil and the container material is selected to be compatible. A wound wire heater, located axisymmetrically above the fluid, can deliver three wattages to a spot on the fluid surface. These wattages vary from 1-15 watts. Fluid flow is observed through the motion of particles in the fluid. A 5 mw He/Ne laser illuminates the container. Scattered light is recorded by a 35mm camera. The free surface phenomena experiment consists of a trapezoidal cell which is filled from the bottom. The fluid is photographed at high speed using a 35mm camera which incorporates the entire cell length in the field of view. The assembly can incorporate four cells in one flight. For each experiment, an electronics block diagram is provided. A control panel concept is given for the surface induced convection. Both experiments are within the mid-deck locker weight and c.g. limits.

  6. Experiences of Computer Science Curriculum Design: A Phenomenological Study

    ERIC Educational Resources Information Center

    Sloan, Arthur; Bowe, Brian

    2015-01-01

    This paper presents a qualitative study of 12 computer science lecturers' experiences of curriculum design of several degree programmes during a time of transition from year-long to semesterised courses, due to institutional policy change. The background to the study is outlined, as are the reasons for choosing the research methodology. The main…

  7. Transducer Design Experiments for Ground-Penetrating Acoustic Systems

    DTIC Science & Technology

    2007-11-02

    subsurface imaging experiments have utilized a source (Tx) and receiver (Rx) configuration in which signals produced by a transmitter at the soil surface...development in the field of acoustic subsurface imaging are as follows. First, a transmitter designed to minimize the emission of surface waves, while

  8. Art & Design Software Development Using IBM Handy (A Personal Experience).

    ERIC Educational Resources Information Center

    McWhinnie, Harold J.

    This paper presents some of the results from a course in art and design. The course involved the use of simple computer programs for the arts. Attention was geared to the development of graphic components for educational software. The purpose of the course was to provide, through lectures and extensive hands on experience, a basic introduction to…

  9. From Content to Context: Videogames as Designed Experience

    ERIC Educational Resources Information Center

    Squire, Kurt

    2006-01-01

    Interactive immersive entertainment, or videogame playing, has emerged as a major entertainment and educational medium. As research and development initiatives proliferate, educational researchers might benefit by developing more grounded theories about them. This article argues for framing game play as a "designed experience." Players'…

  10. A system for designing and simulating particle physics experiments

    NASA Astrophysics Data System (ADS)

    Żelazny, Roman; Strzałkowski, Piotr

    1987-01-01

    In view of the rapid development of experimental facilities and their costs, the systematic design and preparation of particle physics experiments have become crucial. A software system is proposed as an aid for the experiment designer, mainly for experimental geometry analysis and experiment simulation. The following model is adopted: the description of an experiment is formulated in a language (here called XL) and put by its processor into a database. The language is based on the entity-relationship-attribute approach. The information contained in the database can be reported and analysed by an analyser (called XA), and modifications can be made at any time. In particular, Monte Carlo methods can be used in experiment simulation, both for the physical phenomena in the experimental set-up and for detection analysis. The general idea of the system is based on the design concept of ISDOS project information systems. The characteristics of the simulation module are similar to those of the CERN Geant system, but some extensions are proposed. The system could be treated as a component of a larger, integrated software environment for the design of particle physics experiments, their monitoring and data processing.

  11. Materials Experience as a Foundation for Materials and Design Education

    ERIC Educational Resources Information Center

    Pedgley, Owain; Rognoli, Valentina; Karana, Elvin

    2016-01-01

    An important body of research has developed in recent years, explaining ways in which product materials influence user experiences. A priority now is to ensure that the research findings are adopted within an educational context to deliver contemporary curricula for students studying the subject of materials and design. This paper reports on an…

  12. Designing Nursing Simulation Clinical Experiences to Promote Critical Inquiry

    ERIC Educational Resources Information Center

    Beattie, Bev; Koroll, Donna; Price, Susan

    2010-01-01

    The use of high fidelity simulation (HFS) learning opportunities in nursing education has received increased attention in the literature. This article describes the design of a systematic framework used to promote critical inquiry and provide meaningful simulation clinical experiences for second year nursing students. Critical inquiry, as defined…

  13. Designing Undergraduate Research Experiences: A Multiplicity of Options

    NASA Astrophysics Data System (ADS)

    Manduca, C. A.

    2001-12-01

    Research experiences for undergraduate students can serve many goals, including: developing student understanding of the process of science; providing opportunities for students to develop professional skills or test career plans; completing publishable research; enabling faculty professional development; or enhancing the visibility of a science program. The large range of choices made in the design of an undergraduate research program or opportunity must reflect the goals of the program, the needs and abilities of the students and faculty, and the available resources, including both time and money. Effective program design, execution, and evaluation can all be enhanced if the goals of the program are clearly articulated. Student research experiences can be divided into four components: 1) defining the research problem; 2) developing the research plan or experiment design; 3) collecting and interpreting data; and 4) communicating results. In each of these components, the program can be structured in a wide variety of ways, and students can be given more or less guidance or freedom. While a feeling of ownership of the research project appears to be very important, examples of successful projects displaying a wide range of design decisions are available. Work with the Keck Geology Consortium suggests that four strategies can enhance the likelihood of successful student experiences: 1) students are well prepared for the research experience (project design must match student preparation); 2) timelines and events are structured to move students through intermediate goals to project completion; 3) support for the emotional, financial, academic and technical challenges of a research project is in place; and 4) strong communications between students and faculty set clear expectations and enable mid-course corrections in the program or project design. Creating a research culture for the participants or embedding a project in an existing research culture can also assist students in

  14. Electromagnetic sunscreen model: design of experiments on particle specifications.

    PubMed

    Lécureux, Marie; Deumié, Carole; Enoch, Stefan; Sergent, Michelle

    2015-10-01

    We report a numerical study on sunscreen design and optimization. Thanks to the combined use of electromagnetic modeling and design of experiments, we are able to screen the most relevant parameters of mineral filters and to optimize sunscreens. Several electromagnetic modeling methods are used, depending on the type of particles, density of particles, etc. Both the sun protection factor (SPF) and the UVB/UVA ratio are considered. We show that the design-of-experiments model should include interactions between materials and other parameters. We conclude that the material of the particles is a key parameter for both the SPF and the UVB/UVA ratio. Among the materials considered, none is optimal for both. The SPF is also highly dependent on the size of the particles.
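The interaction screening described in the abstract can be illustrated with a two-level full-factorial sketch. This is not the authors' code; the factors (particle material, particle size) and the response values are hypothetical stand-ins for an SPF measurement, shown only to make the "interactions between materials and other parameters" point concrete:

```python
from itertools import product

# Hypothetical two-level factors: particle material and particle size,
# coded -1/+1 as in a standard 2^2 full-factorial design.
levels = [-1, +1]
runs = list(product(levels, levels))  # four runs: (material, size)

# Hypothetical measured responses (e.g. SPF) for each run.
response = {(-1, -1): 10.0, (-1, +1): 12.0, (+1, -1): 20.0, (+1, +1): 30.0}

def effect(runs, response, pick):
    """Average contrast: mean response at +1 minus mean at -1 for `pick`."""
    hi = [response[r] for r in runs if pick(r) == +1]
    lo = [response[r] for r in runs if pick(r) == -1]
    return sum(hi) / len(hi) - sum(lo) / len(lo)

material_effect = effect(runs, response, lambda r: r[0])
size_effect = effect(runs, response, lambda r: r[1])
interaction = effect(runs, response, lambda r: r[0] * r[1])  # material x size

print(material_effect, size_effect, interaction)  # 14.0 6.0 4.0
```

A non-zero interaction contrast, as here, is exactly the situation where a main-effects-only model would mislead the optimization.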

  15. Optimal Experiment Design for Thermal Characterization of Functionally Graded Materials

    NASA Technical Reports Server (NTRS)

    Cole, Kevin D.

    2003-01-01

    The purpose of the project was to investigate methods to accurately verify that designed materials meet thermal specifications. The project involved heat transfer calculations and optimization studies, and no laboratory experiments were performed. One part of the research involved the study of materials in which conduction heat transfer predominates. Results include techniques to choose among several experimental designs, and protocols for determining the optimum experimental conditions for the determination of thermal properties. Metal foam materials, in which both conduction and radiation heat transfer are present, were also studied. Results of this work include procedures to optimize the design of experiments to accurately measure both conductive and radiative thermal properties. Detailed results in the form of three journal papers have been appended to this report.

  16. Design of a proof of principle high current transport experiment

    SciTech Connect

    Lund, S.M.; Bangerter, R.O.; Barnard, J.J.; Celata, C.M.; Faltens, A.; Friedman, A.; Kwan, J.W.; Lee, E.P.; Seidl, P.A.

    2000-01-15

    Preliminary designs of an intense heavy-ion beam transport experiment to test issues for Heavy Ion Fusion (HIF) are presented. This transport channel will represent a single high-current-density beam at full driver scale and will evaluate practical issues such as aperture filling factors, electrons, halo, imperfect vacuum, etc., that cannot be fully tested using scaled experiments. Various machine configurations are evaluated in the context of the range of physics and technology issues that can be explored in a manner relevant to a full-scale driver. It is anticipated that results from this experiment will allow confident construction of next-generation ''Integrated Research Experiments'' leading to a full-scale driver for energy production.

  17. Explorations in Teaching Sustainable Design: A Studio Experience in Interior Design/Architecture

    ERIC Educational Resources Information Center

    Gurel, Meltem O.

    2010-01-01

    This article argues that a design studio can be a dynamic medium to explore the creative potential of the complexity of sustainability from its technological to social ends. The study seeks to determine the impact of an interior design/architecture studio experience that was initiated to teach diverse meanings of sustainability and to engage the…

  18. Developing Conceptual Hypersonic Airbreathing Engines Using Design of Experiments Methods

    NASA Technical Reports Server (NTRS)

    Ferlemann, Shelly M.; Robinson, Jeffrey S.; Martin, John G.; Leonard, Charles P.; Taylor, Lawrence W.; Kamhawi, Hilmi

    2000-01-01

    Designing a hypersonic vehicle is a complicated process due to the multi-disciplinary synergy that is required. The greatest challenge involves propulsion-airframe integration. In the past, a two-dimensional flowpath was generated based on the engine performance required for a proposed mission. A three-dimensional CAD geometry was produced from the two-dimensional flowpath for aerodynamic analysis, structural design, and packaging. The aerodynamics, engine performance, and mass properties are inputs to the vehicle performance tool used to determine if the mission goals were met. If the mission goals were not met, then a flowpath and vehicle redesign would begin. This design process might have to be performed several times to produce a "closed" vehicle. This paper describes an attempt to design a hypersonic cruise vehicle propulsion flowpath using a Design of Experiments method to reduce the resources necessary to produce a conceptual design with fewer iterations of the design cycle. These methods also allow for more flexible mission analysis and the incorporation of additional design constraints at any point. A design system was developed using an object-based software package that would quickly generate each flowpath in the study given the values of the geometric independent variables. These flowpath geometries were put into a hypersonic propulsion code and the engine performance was generated. The propulsion results were loaded into statistical software to produce regression equations that were combined with an aerodynamic database to optimize the flowpath at the vehicle performance level. For this example, the design process was executed twice. The first pass was a cursory look at the independent variables selected, to determine which variables are the most important and to test all of the inputs to the optimization process. The second cycle is a more in-depth study with more cases and higher-order equations representing the design space.

  19. Fuel and Core Design Experiences in Cofrentes NPP

    SciTech Connect

    Garcia-Delgado, L.; Lopez-Carbonell, M.T.; Gomez-Bernal, I.

    2002-07-01

    The electricity market deregulation in Spain is increasing the need for innovations in nuclear power generation, which can be achieved in the fuel area by improving fuel and core designs and by introducing vendor competition. Iberdrola has developed the GIRALDA methodology for the design and licensing of Cofrentes reloads, and has introduced mixed cores with fuel from different vendors. The application of GIRALDA is giving satisfactory results and is showing its capability to adequately reproduce the core behaviour. The nuclear design team is acquiring invaluable experience and a deep knowledge of the core, very useful for supporting cycle operation. Continuous improvements are expected in the future, in design strategies as well as in the application of new technologies to redesign the methodology processes. (authors)

  20. Optimizing an experimental design for an electromagnetic experiment

    NASA Astrophysics Data System (ADS)

    Roux, E.; Garcia, X. A.

    2011-12-01

    Most geophysical studies focus on data acquisition and analysis, but another aspect which is gaining importance is the discussion of how to acquire suitable datasets. This can be done through the design of an optimal experiment. An optimal experiment maximizes geophysical information while keeping the cost of the experiment as low as possible. This requires a careful selection of recording parameters, such as source and receiver locations or the range of periods needed to image the target. We are developing a method to design an optimal experiment in the context of detecting and monitoring a CO2 reservoir using controlled-source electromagnetic (CSEM) data. Using an algorithm for a simple one-dimensional (1D) situation, we look for the most suitable locations for the source and receivers and the optimum characteristics of the source to image the subsurface. One main advantage of this kind of design technique is that it does not require the acquisition of any data and can thus easily be applied before any geophysical survey. Our algorithm is based on a genetic algorithm, which has been proved to be an efficient technique for examining a wide range of possible surveys and selecting the one that gives superior resolution. Each particular design needs to be quantified. Different quantities have been used to estimate the "goodness" of a model, most of them being sensitive to the eigenvalues of the corresponding inversion problem. Here we show a comparison of results obtained using different objective functions. Then, we simulate a CSEM survey with a realistic 1D structure and discuss the optimum recording parameters determined by our method.
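The genetic-algorithm search the abstract describes can be sketched in miniature. The objective function below is a hypothetical stand-in (coverage of a 1D profile by receiver positions), not the eigenvalue-based measures the authors use; it only illustrates the select/crossover/mutate loop of survey-design optimization:

```python
import random

random.seed(0)

# Hypothetical surrogate objective: reward receiver layouts that leave no
# large uninstrumented gap along a 0..100 profile (a stand-in for the
# eigenvalue-based "goodness" measures discussed in the abstract).
def fitness(design):
    s = sorted(design)
    gaps = [s[0]] + [b - a for a, b in zip(s, s[1:])] + [100 - s[-1]]
    return -max(gaps)  # best design minimizes the largest uncovered gap

def mutate(design):
    d = list(design)
    i = random.randrange(len(d))
    d[i] = min(100, max(0, d[i] + random.randint(-10, 10)))  # jitter one receiver
    return d

def crossover(a, b):
    cut = random.randrange(1, len(a))  # single-point crossover
    return a[:cut] + b[cut:]

# Evolve a population of candidate 4-receiver surveys.
pop = [[random.randint(0, 100) for _ in range(4)] for _ in range(30)]
for _ in range(200):
    pop.sort(key=fitness, reverse=True)
    parents = pop[:10]  # elitist selection: keep the 10 best surveys
    pop = parents + [mutate(crossover(random.choice(parents),
                                      random.choice(parents)))
                     for _ in range(20)]

best = max(pop, key=fitness)
print(sorted(best), fitness(best))
```

A real implementation would replace `fitness` with a quantity computed from the 1D CSEM forward problem, but the evolutionary loop has the same shape.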

  1. Optimized design and analysis of sparse-sampling FMRI experiments.

    PubMed

    Perrachione, Tyler K; Ghosh, Satrajit S

    2013-01-01

    Sparse-sampling is an important methodological advance in functional magnetic resonance imaging (fMRI), in which silent delays are introduced between MR volume acquisitions, allowing for the presentation of auditory stimuli without contamination by acoustic scanner noise and for overt vocal responses without motion-induced artifacts in the functional time series. As such, the sparse-sampling technique has become a mainstay of principled fMRI research into the cognitive and systems neuroscience of speech, language, hearing, and music. Despite being in use for over a decade, there has been little systematic investigation of the acquisition parameters, experimental design considerations, and statistical analysis approaches that bear on the results and interpretation of sparse-sampling fMRI experiments. In this report, we examined how design and analysis choices related to the duration of repetition time (TR) delay (an acquisition parameter), stimulation rate (an experimental design parameter), and model basis function (an analysis parameter) act independently and interactively to affect the neural activation profiles observed in fMRI. First, we conducted a series of computational simulations to explore the parameter space of sparse design and analysis with respect to these variables; second, we validated the results of these simulations in a series of sparse-sampling fMRI experiments. Overall, these experiments suggest the employment of three methodological approaches that can, in many situations, substantially improve the detection of neurophysiological response in sparse fMRI: (1) Sparse analyses should utilize a physiologically informed model that incorporates hemodynamic response convolution to reduce model error. (2) The design of sparse fMRI experiments should maintain a high rate of stimulus presentation to maximize effect size. 
(3) TR delays of short to intermediate length can be used between acquisitions of sparse-sampled functional image volumes to increase

  2. SSSFD manipulator engineering using statistical experiment design techniques

    NASA Technical Reports Server (NTRS)

    Barnes, John

    1991-01-01

    The Satellite Servicer System Flight Demonstration (SSSFD) program is a series of Shuttle flights designed to verify major on-orbit satellite servicing capabilities, such as rendezvous and docking of free flyers, Orbital Replacement Unit (ORU) exchange, and fluid transfer. A major part of this system is the manipulator system that will perform the ORU exchange. The manipulator must possess adequate toolplate dexterity to maneuver a variety of EVA-type tools into position to interface with ORU fasteners, connectors, latches, and handles on the satellite, and to move workpieces and ORUs through 6-degree-of-freedom (dof) space from the Target Vehicle (TV) to the Support Module (SM) and back. Two cost-efficient tools were combined to perform a study of robot manipulator design parameters: graphical computer simulations and Taguchi Design of Experiment methods. Using a graphics platform, an off-the-shelf robot simulation software package, and an experiment designed with Taguchi's approach, the sensitivities of various manipulator kinematic design parameters to performance characteristics are determined at minimal cost.

  3. Design of experiments applications in bioprocessing: concepts and approach.

    PubMed

    Kumar, Vijesh; Bhalla, Akriti; Rathore, Anurag S

    2014-01-01

    Most biotechnology unit operations are complex in nature, with numerous process variables, feed material attributes, and raw material attributes that can have a significant impact on the performance of the process. A design of experiments (DOE)-based approach offers a solution to this conundrum and allows for an efficient estimation of the main effects and the interactions with a minimal number of experiments. Numerous publications illustrate the application of DOE towards the development of different bioprocessing unit operations. However, a systematic approach for evaluating the different DOE designs and choosing the optimal design for a given application has not yet been published. Through this work we have compared the I-optimal and D-optimal designs to the commonly used central composite and Box-Behnken designs for bioprocess applications. A systematic methodology is proposed for construction of the model and for precise prediction of the responses for three case studies involving some of the commonly used unit operations in downstream processing. Use of the Akaike information criterion for model selection has been examined and found to be suitable for the applications under consideration.
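The Akaike information criterion used above for model selection trades goodness of fit against model size; for Gaussian errors it reduces to AIC = n·ln(RSS/n) + 2k. A minimal stdlib-Python sketch (the data and the two candidate models are hypothetical, not the paper's case studies) comparing a linear and a quadratic response model:

```python
import math

# Hypothetical response data from a designed experiment (x = coded factor).
xs = [-1.0, -0.5, 0.0, 0.5, 1.0]
ys = [1.1, 0.3, 0.1, 0.2, 0.9]  # roughly quadratic in x

def fit_poly(xs, ys, degree):
    """Least-squares polynomial fit via the normal equations (stdlib only)."""
    n, k = len(xs), degree + 1
    # X^T X and X^T y for the Vandermonde design matrix.
    xtx = [[sum(x ** (i + j) for x in xs) for j in range(k)] for i in range(k)]
    xty = [sum((x ** i) * y for x, y in zip(xs, ys)) for i in range(k)]
    # Gauss-Jordan elimination with partial pivoting.
    for col in range(k):
        piv = max(range(col, k), key=lambda r: abs(xtx[r][col]))
        xtx[col], xtx[piv] = xtx[piv], xtx[col]
        xty[col], xty[piv] = xty[piv], xty[col]
        for r in range(k):
            if r != col:
                f = xtx[r][col] / xtx[col][col]
                xtx[r] = [a - f * b for a, b in zip(xtx[r], xtx[col])]
                xty[r] -= f * xty[col]
    coeffs = [xty[i] / xtx[i][i] for i in range(k)]
    rss = sum((y - sum(c * x ** i for i, c in enumerate(coeffs))) ** 2
              for x, y in zip(xs, ys))
    return coeffs, rss

def aic(rss, n, k):
    # Gaussian-likelihood AIC: k coefficients plus one variance parameter.
    return n * math.log(rss / n) + 2 * (k + 1)

n = len(xs)
_, rss1 = fit_poly(xs, ys, 1)  # linear model, k = 2 coefficients
_, rss2 = fit_poly(xs, ys, 2)  # quadratic model, k = 3 coefficients
print(aic(rss1, n, 2), aic(rss2, n, 3))
```

Here the quadratic model wins: its fit improvement more than pays for the extra coefficient, which is the trade-off AIC formalizes.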

  4. Experimental Design and Power Calculation for RNA-seq Experiments.

    PubMed

    Wu, Zhijin; Wu, Hao

    2016-01-01

    Power calculation is a critical component of RNA-seq experimental design. The flexibility of RNA-seq experiments and the wide dynamic range of transcription they measure make the technology attractive for whole-transcriptome analysis. These features, in addition to the high dimensionality of RNA-seq data, bring complexity to experimental design, making an analytical power calculation no longer realistic. In this chapter we review the major factors that influence the statistical power of detecting differential expression, and give examples of power assessment using the R package PROPER.
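PROPER itself is an R package; as a language-neutral illustration of the simulation-based power assessment the chapter describes, the sketch below estimates power for a simple two-group comparison by Monte Carlo. The sample sizes, effect size, and the rough critical value of 2.0 are hypothetical, and the RNA-seq setting adds count models and multiple-testing correction that this omits:

```python
import math
import random
import statistics

random.seed(1)

def welch_t(a, b):
    """Welch's t statistic for two independent samples."""
    va, vb = statistics.variance(a), statistics.variance(b)
    return (statistics.mean(a) - statistics.mean(b)) / math.sqrt(
        va / len(a) + vb / len(b))

def power(n, effect, sims=2000, crit=2.0):
    """Fraction of simulated experiments whose |t| exceeds `crit`."""
    hits = 0
    for _ in range(sims):
        a = [random.gauss(0.0, 1.0) for _ in range(n)]     # control group
        b = [random.gauss(effect, 1.0) for _ in range(n)]  # shifted group
        if abs(welch_t(a, b)) > crit:
            hits += 1
    return hits / sims

p10 = power(10, 1.0)  # small experiment
p40 = power(40, 1.0)  # larger experiment, same effect size
print(p10, p40)
```

The same loop, with a negative-binomial count model and a per-gene test in place of `welch_t`, is the core of what dedicated RNA-seq power tools automate.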

  5. Skylab Medical Experiments Altitude Test /SMEAT/ facility design and operation.

    NASA Technical Reports Server (NTRS)

    Hinners, A. H., Jr.; Correale, J. V.

    1973-01-01

    This paper presents the design approaches and test facility operation methods used to successfully accomplish a 56-day test for Skylab to permit evaluation of selected Skylab medical experiments in a ground test simulation of the Skylab environment with an astronaut crew. The systems designed for this test include the two-gas environmental control system, the fire suppression and detection system, equipment transfer lock, ground support equipment, safety systems, potable water system, waste management system, lighting and power system, television monitoring, communications and recreation systems, and food freezer.

  6. Aircraft integrated design and analysis: A classroom experience

    NASA Technical Reports Server (NTRS)

    Weisshaar, Terrence A.

    1989-01-01

    AAE 451 is the capstone course required of all senior undergraduates in the School of Aeronautics and Astronautics at Purdue University. During the past year the first steps of a long evolutionary process were taken to change the content and expectations of this course. These changes are the result of the availability of advanced computational capabilities and sophisticated electronic media at Purdue. This presentation describes both the long-range objectives and this year's experience using the High Speed Commercial Transport design, the AIAA Long Duration Aircraft design, and an RPV design proposal as project objectives. The central goal of these efforts is to provide a user-friendly, computer-software-based environment to supplement traditional design course methodology. The Purdue University Computer Center (PUCC), the Engineering Computer Network (ECN), and stand-alone PCs are being used for this development. This year's accomplishments center primarily on aerodynamics software obtained from NASA/Langley and its integration into the classroom. Word-processor capability for oral and written work and computer graphics were also blended into the course. A total of ten HSCT designs were generated, ranging from twin-fuselage and forward-swept-wing aircraft to the more traditional delta and double-delta wing aircraft. Four Long Duration Aircraft designs were submitted, together with one RPV design tailored for photographic surveillance.

  7. Scaling studies and conceptual experiment designs for NGNP CFD assessment

    SciTech Connect

    D. M. McEligot; G. E. McCreery

    2004-11-01

    The objective of this report is to document scaling studies and conceptual designs for flow and heat transfer experiments intended to assess CFD codes and their turbulence models proposed for application to prismatic NGNP concepts. The general approach of the project is to develop new benchmark experiments for assessment in parallel with CFD and coupled CFD/systems code calculations for the same geometry. Two aspects of the complex flow in an NGNP are being addressed: (1) flow and thermal mixing in the lower plenum ("hot streaking" issue) and (2) turbulence and resulting temperature distributions in reactor cooling channels ("hot channel" issue). Current prismatic NGNP concepts are being examined to identify their proposed flow conditions and geometries over the range from normal operation to decay heat removal in a pressurized cooldown. Approximate analyses have been applied to determine key non-dimensional parameters and their magnitudes over this operating range. For normal operation, the flow in the coolant channels can be considered to be dominantly turbulent forced convection with slight transverse property variation. In a pressurized cooldown (LOFA) simulation, the flow quickly becomes laminar with some possible buoyancy influences. The flow in the lower plenum can locally be considered to be a situation of multiple hot jets into a confined crossflow -- with obstructions. Flow is expected to be turbulent, with momentum-dominated turbulent jets entering; buoyancy influences are estimated to be negligible in normal full-power operation. Experiments are needed for the combined features of the lower plenum flows. Missing from the typical jet experiments available are interactions with nearby circular posts and with vertical posts in the vicinity of vertical walls - with near-stagnant surroundings at one extreme and significant crossflow at the other. Two types of heat transfer experiments are being considered. One addresses the "hot channel" problem, if necessary

  8. Design of the NASA Lewis 4-Port Wave Rotor Experiment

    NASA Technical Reports Server (NTRS)

    Wilson, Jack

    1997-01-01

    Pressure exchange wave rotors, used in a topping stage, are currently being considered as a possible means of increasing the specific power, and reducing the specific fuel consumption of gas turbine engines. Despite this interest, there is very little information on the performance of a wave rotor operating on the cycle (i.e., set of waves) appropriate for use in a topping stage. One such cycle, which has the advantage of being relatively easy to incorporate into an engine, is the four-port cycle. Consequently, an experiment to measure the performance of a four-port wave rotor for temperature ratios relevant to application as a topping cycle for a gas turbine engine has been designed and built at NASA Lewis. The design of the wave rotor is described, together with the constraints on the experiment.

  9. Thermal design support for the Explorer gamma ray experiment telescope

    NASA Technical Reports Server (NTRS)

    Almgren, D. W.; Lee, W. D.; Mathias, S.

    1975-01-01

    The results of a thermal design definition study for the GSFC Explorer Gamma Ray Experiment Telescope (EGRET) were documented. A thermal computer model of EGRET with 241 nodes was developed and used to analyze the thermal performance of the experiment for a range of orbits, payload orientations, and internal power dissipations. The recommended thermal design utilizes a small radiator with an area of 1.78 square feet on the anti-sun side of the mission adaptor and circumferential heat pipes on the interior of the same adaptor to transfer heat from the electronics compartments to the single radiator. Fifty watts of thermostatically controlled heater power are used to control the temperature level to 10 C +/- 20 C inside the insulated dome structure.

  10. Design and development of PROBA-3 rendezvous experiment

    NASA Astrophysics Data System (ADS)

    Bastante, Juan C.; Vasconcelos, José; Hagenfeldt, Miguel; Peñín, Luis F.; Dinis, João; Rebordão, José

    2014-09-01

    PROBA-3 is a technology demonstration mission with the objective, among others, of raising Formation Flying (FF) technology to Technology Readiness Level (TRL) 8 or 9. The context of this mission has strong synergies with the knowledge areas of Rendezvous (RV), namely the fields of GNC, metrology, actuator systems, etc. This common ground between FF and RV allowed a dedicated Rendezvous Experiment (RVX) to be performed within the scope of PROBA-3. The RVX is based only on camera measurements, and is designed for highly elliptical orbits with strong constraints on relative position and attitude. This paper presents the design and development of the RVX experiment, whose goal is to demonstrate the feasibility of vision-based RV and to increase the associated TRL.

  11. Design and experiment performances of an inchworm type rotary actuator.

    PubMed

    Li, Jianping; Zhao, Hongwei; Shao, Mingkun; Zhou, Xiaoqin; Huang, Hu; Fan, Zunqiang

    2014-08-01

    A piezo-driven rotary actuator based on the inchworm principle is proposed in this paper. Six piezo-stacks and flexure hinges are used to realize large rotation ranges with high accuracy in both the forward and backward motions. Four right-angle flexure hinges and two right-circular flexure hinges are applied in the stator. The motion principle and theoretical analysis of the designed actuator are discussed. In order to investigate its working characteristics, a prototype actuator was manufactured and a series of experimental tests was carried out. The test results indicate that the maximum rotation velocity is 71,300 μrad/s and the maximum output torque is 19.6 N mm. The experimental results confirm that the designed actuator can achieve large rotation ranges with relatively high output torques and different rotation speeds under different driving voltages and frequencies.

  12. Statistical design of quantitative mass spectrometry-based proteomic experiments.

    PubMed

    Oberg, Ann L; Vitek, Olga

    2009-05-01

    We review the fundamental principles of statistical experimental design, and their application to quantitative mass spectrometry-based proteomics. We focus on class comparison using Analysis of Variance (ANOVA), and discuss how randomization, replication and blocking help avoid systematic biases due to the experimental procedure, and help optimize our ability to detect true quantitative changes between groups. We also discuss the issues of pooling multiple biological specimens for a single mass analysis, and calculation of the number of replicates in a future study. When applicable, we emphasize the parallels between designing quantitative proteomic experiments and experiments with gene expression microarrays, and give examples from that area of research. We illustrate the discussion using theoretical considerations, and using real-data examples of profiling of disease.
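The roles of blocking and randomization reviewed above can be sketched as a run-order assignment. The specimen counts and batch structure below are hypothetical; the point is that blocking balances each biological group across mass-spec batches, while randomization scrambles run order within each batch so batch effects do not masquerade as group differences:

```python
import random

random.seed(42)

# Hypothetical study: 4 disease and 4 control specimens to be measured in
# 2 mass-spec batches ("blocks").
specimens = ([("disease", i) for i in range(4)] +
             [("control", i) for i in range(4)])

def blocked_randomization(specimens, n_blocks):
    groups = {}
    for s in specimens:
        groups.setdefault(s[0], []).append(s)
    blocks = [[] for _ in range(n_blocks)]
    for members in groups.values():
        random.shuffle(members)
        for i, s in enumerate(members):
            blocks[i % n_blocks].append(s)  # spread each group across blocks
    for b in blocks:
        random.shuffle(b)  # randomize run order within each block
    return blocks

blocks = blocked_randomization(specimens, 2)
for b in blocks:
    print(b)
```

With this layout, a batch-to-batch shift affects disease and control specimens equally, which is precisely the bias protection the abstract attributes to blocking.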

  13. The design and analysis of state-trace experiments.

    PubMed

    Prince, Melissa; Brown, Scott; Heathcote, Andrew

    2012-03-01

    State-trace analysis (Bamber, 1979) addresses a question of interest in many areas of psychological research: Does 1 or more than 1 latent (i.e., not directly observed) variable mediate an interaction between 2 experimental manipulations? There is little guidance available on how to design an experiment suited to state-trace analysis, despite its increasing use, and existing statistical methods for state-trace analysis are problematic. We provide a framework for designing and refining a state-trace experiment and statistical procedures for the analysis of accuracy data using Klugkist, Kato, and Hoijtink's (2005) method of estimating Bayes factors. The statistical procedures provide estimates of the evidence favoring 1 versus more than 1 latent variable, as well as evidence that can be used to refine experimental methodology.

  14. Analysis of Variance in the Modern Design of Experiments

    NASA Technical Reports Server (NTRS)

    Deloach, Richard

    2010-01-01

    This paper is a tutorial introduction to the analysis of variance (ANOVA), intended as a reference for aerospace researchers who are being introduced to the analytical methods of the Modern Design of Experiments (MDOE), or who may have other opportunities to apply this method. One-way and two-way fixed-effects ANOVA, as well as random effects ANOVA, are illustrated in practical terms that will be familiar to most practicing aerospace researchers.
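A one-way fixed-effects ANOVA of the kind the tutorial illustrates reduces to partitioning total variation into between-group and within-group sums of squares and forming F = MS_between / MS_within. A stdlib-Python sketch with hypothetical measurements from three test configurations (not data from the paper):

```python
from statistics import mean

# Hypothetical data: one response measured under three configurations.
groups = [
    [4.1, 3.9, 4.3, 4.0],
    [5.0, 5.2, 4.8, 5.1],
    [4.4, 4.6, 4.5, 4.3],
]

grand = mean(x for g in groups for x in g)
n = sum(len(g) for g in groups)  # total observations
k = len(groups)                  # number of treatment groups

# Partition the total sum of squares.
ss_between = sum(len(g) * (mean(g) - grand) ** 2 for g in groups)
ss_within = sum((x - mean(g)) ** 2 for g in groups for x in g)

ms_between = ss_between / (k - 1)  # between-groups mean square
ms_within = ss_within / (n - k)    # within-groups (error) mean square
F = ms_between / ms_within
print(round(F, 2))
```

A large F (compared against the F distribution with k-1 and n-k degrees of freedom) indicates that group-to-group differences exceed what within-group scatter alone would produce.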

  15. DIME Students Discuss Final Drop Tower Experiment Design

    NASA Technical Reports Server (NTRS)

    2002-01-01

    Students discuss fine points of their final design for the Drop Tower experiment during the second Dropping in a Microgravity Environment (DIME) competition held April 23-25, 2002, at NASA's Glenn Research Center. Competitors included two teams from Sycamore High School, Cincinnati, OH, and one each from Bay High School, Bay Village, OH, and COSI Academy, Columbus, OH. DIME is part of NASA's education and outreach activities. Details are on line at http://microgravity.grc.nasa.gov/DIME_2002.html.

  16. Capstone Dichotomies: A Proposed Framework for Characterizing Capstone Design Experiences

    DTIC Science & Technology

    2015-03-18

discipline has freedom in how they achieve these outcomes, so long as it is a deliberate and traceable approach back to the desired outcomes. This freedom...allows each discipline to tailor their capstone design experience to those appropriate to their domains. When students are developed fully within a...single discipline program that also offers their capstone, the structure promotes alignment of the student, instructor, and advisor expectations. However

  17. The Modern Design of Experiments: A Technical and Marketing Framework

    NASA Technical Reports Server (NTRS)

    DeLoach, R.

    2000-01-01

    A new wind tunnel testing process under development at NASA Langley Research Center, called Modern Design of Experiments (MDOE), differs from conventional wind tunnel testing techniques on a number of levels. Chief among these is that MDOE focuses on the generation of adequate prediction models rather than high-volume data collection. Some cultural issues attached to this and other distinctions between MDOE and conventional wind tunnel testing are addressed in this paper.

  18. Optimizing the design of geophysical experiments: Is it worthwhile?

    NASA Astrophysics Data System (ADS)

    Curtis, Andrew; Maurer, Hansruedi

Determining the structure, composition, and state of the Earth's subsurface from measured data is the principal task of many geophysical experiments and surveys. Standard procedures involve the recording of appropriate data sets followed by the application of data analysis techniques to extract the desired information. While the importance of new tools for the analysis stage of an experiment is well recognized, much less attention seems to be paid to improving the data acquisition. A measure of the effort allocated to data analysis research relative to that devoted to data acquisition research is presented in Figure 1. Since 1955 there have been more than 10,000 publications on inversion methods alone, but in the same period only 100 papers on experimental design have appeared in journals. Considering that the acquisition component of an experiment defines what information will be contained in the data, and that no amount of data analysis can compensate for the lack of such information, we suggest that greater effort be made to improve survey planning techniques. Furthermore, given that logistical and financial constraints are often stringent and that relationships between geophysical data and model parameters describing the Earth's subsurface are generally complicated, optimizing the design of an experiment may be quite challenging. Here we review experimental design procedures that optimize the benefit of a field survey, such that maximum information about the target structures is obtained at minimum cost. We also announce a new Web site and e-mail group set up as a forum for communication on survey design research and application.

  19. Recent developments in virtual experience design and production

    NASA Astrophysics Data System (ADS)

    Fisher, Scott S.

    1995-03-01

    Today, the media of VR and Telepresence are in their infancy and the emphasis is still on technology and engineering. But, it is not the hardware people might use that will determine whether VR becomes a powerful medium--instead, it will be the experiences that they are able to have that will drive its acceptance and impact. A critical challenge in the elaboration of these telepresence capabilities will be the development of environments that are as unpredictable and rich in interconnected processes as an actual location or experience. This paper will describe the recent development of several Virtual Experiences including: `Menagerie', an immersive Virtual Environment inhabited by virtual characters designed to respond to and interact with its users; and `The Virtual Brewery', an immersive public VR installation that provides multiple levels of interaction in an artistic interpretation of the brewing process.

  20. Design of the new METAS watt balance experiment Mark II

    NASA Astrophysics Data System (ADS)

    Baumann, H.; Eichenberger, A.; Cosandier, F.; Jeckelmann, B.; Clavel, R.; Reber, D.; Tommasini, D.

    2013-06-01

    The kilogram is the last unit of the international system of units (SI) still based on a material artefact, the international prototype of the kilogram (IPK). The comparisons made in the last hundred years have clearly revealed a long-term relative drift between the IPK and the official copies kept under similar conditions at the Bureau International des Poids et Mesures. A promising route towards a new definition of the kilogram based on a fundamental constant is represented by the watt balance experiment which links the mass unit to the Planck constant h. For more than ten years, the Federal Institute of Metrology METAS has been actively working in the conception and development of a watt balance experiment. This paper describes the new design of the Mark II METAS watt balance. The metrological characteristics of the different components of the experiment are described and discussed.

  1. Optimal color design of psychological counseling room by design of experiments and response surface methodology.

    PubMed

    Liu, Wenjuan; Ji, Jianlin; Chen, Hua; Ye, Chenyu

    2014-01-01

Color is one of the most powerful aspects of a psychological counseling environment. Little scientific research has been conducted on color design, and much of the existing literature is based on observational studies. Using design of experiments and response surface methodology, this paper proposes an optimal color design approach for transforming patients' perception into color elements. Six indices, pleasant-unpleasant, interesting-uninteresting, exciting-boring, relaxing-distressing, safe-fearful, and active-inactive, were used to assess patients' impressions. A total of 75 patients participated, including 42 in Experiment 1 and 33 in Experiment 2. In Experiment 1, 27 representative color samples were designed, and the color sample (L = 75, a = 0, b = -60) was the most preferred one. In Experiment 2, this color sample was set as the 'central point', and three color attributes were optimized to maximize the patients' satisfaction. The experimental results show that the proposed method can obtain an optimal solution for the color design of a counseling room.
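A central composite design of the kind used in such a response-surface study can be enumerated directly. In this sketch only the center point (L = 75, a = 0, b = -60) comes from the abstract; the attribute ranges are illustrative assumptions:

```python
import itertools

# Sketch of a face-centered central composite design for three color attributes.
# Center point is from the abstract; the half-widths are illustrative assumptions.
center = {"L": 75.0, "a": 0.0, "b": -60.0}
half_width = {"L": 10.0, "a": 10.0, "b": 15.0}
names = list(center)

# Factorial corners (+/-1 on every axis), axial points (+/-1 on one axis),
# and the center point itself
corners = [dict(zip(names, [center[n] + s * half_width[n] for n, s in zip(names, signs)]))
           for signs in itertools.product((-1, 1), repeat=3)]
axial = [dict(center, **{n: center[n] + s * half_width[n]})
         for n in names for s in (-1, 1)]
design = corners + axial + [dict(center)]
# 8 corners + 6 axial + 1 center = 15 runs to fit a quadratic response surface
```

Each of the 15 runs is one color sample to be rated by patients; the quadratic model fitted to the ratings can then be maximized to find the optimal attribute setting.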

  2. Design of a Magnetic Reconnection Experiment in the Collisionless Regime

    NASA Astrophysics Data System (ADS)

    Egedal, J.; Le, A.; Daughton, W. S.

    2012-12-01

A new model for effective heating of electrons during reconnection is now gaining support from spacecraft observations, theoretical considerations and kinetic simulations [1]. The key ingredient in the model is the physics of trapped electrons whose dynamics causes the electron pressure tensor to be strongly anisotropic [2]. The heating mechanism becomes highly efficient for geometries with low upstream electron pressure, conditions relevant to the magnetotail. We propose a new experiment that will be optimized for the study of kinetic reconnection including the dynamics of trapped electrons and associated pressure anisotropy. This requires an experiment that accesses plasmas with much lower collisionality and lower plasma beta than are available in present reconnection experiments. The new experiment will be designed such that a large variety of magnetic configurations can be established and tailored for continuation of our ongoing study of spontaneous 3D reconnection [3]. The flexible design will also allow for configurations suitable for the study of merging magnetic islands, which may be a source of super thermal electrons in naturally occurring plasmas. [1] Egedal J et al., Nature Physics, 8, 321 (2012). [2] Le A et al., Phys. Rev. Lett. 102, 085001 (2009). [3] Katz N et al., Phys. Rev. Lett. 104, 255004 (2010).

  3. Optimal experiment design for model selection in biochemical networks

    PubMed Central

    2014-01-01

    Background Mathematical modeling is often used to formalize hypotheses on how a biochemical network operates by discriminating between competing models. Bayesian model selection offers a way to determine the amount of evidence that data provides to support one model over the other while favoring simple models. In practice, the amount of experimental data is often insufficient to make a clear distinction between competing models. Often one would like to perform a new experiment which would discriminate between competing hypotheses. Results We developed a novel method to perform Optimal Experiment Design to predict which experiments would most effectively allow model selection. A Bayesian approach is applied to infer model parameter distributions. These distributions are sampled and used to simulate from multivariate predictive densities. The method is based on a k-Nearest Neighbor estimate of the Jensen Shannon divergence between the multivariate predictive densities of competing models. Conclusions We show that the method successfully uses predictive differences to enable model selection by applying it to several test cases. Because the design criterion is based on predictive distributions, which can be computed for a wide range of model quantities, the approach is very flexible. The method reveals specific combinations of experiments which improve discriminability even in cases where data is scarce. The proposed approach can be used in conjunction with existing Bayesian methodologies where (approximate) posteriors have been determined, making use of relations that exist within the inferred posteriors. PMID:24555498
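The design criterion's key ingredient, a divergence between the predictive densities of competing models, can be illustrated with a deliberately simplified sketch. The paper uses a k-nearest-neighbor estimator on multivariate predictive densities; the histogram-based 1-D version below is only meant to show what the Jensen-Shannon divergence measures:

```python
import math
from collections import Counter

# Simplified sketch: Jensen-Shannon divergence between two sets of 1-D
# predictive samples, via histograms. (The paper's method uses a k-nearest-
# neighbor estimator on multivariate densities; this is an illustration only.)
def js_divergence(samples_p, samples_q, bin_width=0.5):
    to_bin = lambda xs: Counter(math.floor(x / bin_width) for x in xs)
    p, q = to_bin(samples_p), to_bin(samples_q)
    n_p, n_q = sum(p.values()), sum(q.values())
    bins = set(p) | set(q)

    def kl_to_mixture(c1, n1, c2, n2):
        # KL divergence of one empirical density to the 50/50 mixture
        total = 0.0
        for b in bins:
            f1 = c1[b] / n1
            m = 0.5 * (c1[b] / n1 + c2[b] / n2)
            if f1 > 0:
                total += f1 * math.log(f1 / m)
        return total

    return 0.5 * kl_to_mixture(p, n_p, q, n_q) + 0.5 * kl_to_mixture(q, n_q, p, n_p)
```

Identical predictive samples give a divergence of 0, while fully disjoint samples give log(2); an experiment is most informative where this divergence between competing models' predictions is largest.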

  4. An industrial approach to design compelling VR and AR experience

    NASA Astrophysics Data System (ADS)

    Richir, Simon; Fuchs, Philippe; Lourdeaux, Domitile; Buche, Cédric; Querrec, Ronan

    2013-03-01

The convergence of technologies currently observed in the fields of VR, AR, robotics and consumer electronics reinforces the trend of new applications appearing every day. But when transferring knowledge acquired from research to businesses, research laboratories are often at a loss because of a lack of knowledge of the design and integration processes involved in creating an industrial-scale product. In fact, the innovation approaches that take a good idea from the laboratory to a successful industrial product are often little known to researchers. The objective of this paper is to present the results of the work of several research teams that have finalized a working method for researchers and manufacturers that allows them to design virtual or augmented reality systems and enables their users to enjoy "a compelling VR experience". That approach, called "the I2I method", presents 11 phases from "Establishing technological and competitive intelligence and industrial property" to "Improvements" through the "Definition of the Behavioral Interface, Virtual Environment and Behavioral Software Assistance". As a result of the experience gained by various research teams, this design approach benefits from contributions from current VR and AR research. Our objective is to validate and continuously move such multidisciplinary design team methods forward.

  5. Optimal serial dilutions designs for drug discovery experiments.

    PubMed

    Donev, Alexander N; Tobias, Randall D

    2011-05-01

    Dose-response studies are an essential part of the drug discovery process. They are typically carried out on a large number of chemical compounds using serial dilution experimental designs. This paper proposes a method of selecting the key parameters of these designs (maximum dose, dilution factor, number of concentrations and number of replicated observations for each concentration) depending on the stage of the drug discovery process where the study takes place. This is achieved by employing and extending results from optimal design theory. Population D- and D(S)-optimality are defined and used to evaluate the precision of estimating the potency of the tested compounds. The proposed methodology is easy to use and creates opportunities to reduce the cost of the experiments without compromising the quality of the data obtained in them.
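The design parameters the abstract names (maximum dose, dilution factor, number of concentrations) determine the concentration series directly. A minimal sketch with illustrative values, not taken from the paper:

```python
# Sketch: concentrations tested in a serial dilution design (highest first).
# All parameter values are illustrative, not from the paper.
def serial_dilution(max_dose, dilution_factor, n_concentrations):
    """Each concentration is the previous one divided by the dilution factor."""
    return [max_dose / dilution_factor ** i for i in range(n_concentrations)]

doses = serial_dilution(max_dose=10.0, dilution_factor=3.0, n_concentrations=5)
# doses: 10.0, 10/3, 10/9, 10/27, 10/81
```

Choosing these parameters (plus the number of replicates per concentration) so that the resulting doses bracket the expected potency is what the optimal design criteria in the paper formalize.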

  6. Influence analysis on crossover design experiment in bioequivalence studies.

    PubMed

    Huang, Yufen; Ke, Bo-Shiang

    2014-01-01

    Crossover designs are commonly used in bioequivalence studies. However, the results can be affected by some outlying observations, which may lead to the wrong decision on bioequivalence. Therefore, it is essential to investigate the influence of aberrant observations. Chow and Tse in 1990 discussed this issue by considering the methods based on the likelihood distance and estimates distance. Perturbation theory provides a useful tool for the sensitivity analysis on statistical models. Hence, in this paper, we develop the influence functions via the perturbation scheme proposed by Hampel as an alternative approach on the influence analysis for a crossover design experiment. Moreover, the comparisons between the proposed approach and the method proposed by Chow and Tse are investigated. Two real data examples are provided to illustrate the results of these approaches. Our proposed influence functions show excellent performance on the identification of outlier/influential observations and are suitable for use with small sample size crossover designs commonly used in bioequivalence studies.

  7. Thermal Design and Analysis for the Cryogenic MIDAS Experiment

    NASA Technical Reports Server (NTRS)

    Amundsen, Ruth McElroy

    1997-01-01

    The Materials In Devices As Superconductors (MIDAS) spaceflight experiment is a NASA payload which launched in September 1996 on the Shuttle, and was transferred to the Mir Space Station for several months of operation. MIDAS was developed and built at NASA Langley Research Center (LaRC). The primary objective of the experiment was to determine the effects of microgravity and spaceflight on the electrical properties of high-temperature superconductive (HTS) materials. The thermal challenge on MIDAS was to maintain the superconductive specimens at or below 80 K for the entire operation of the experiment, including all ground testing and 90 days of spaceflight operation. Cooling was provided by a small tactical cryocooler. The superconductive specimens and the coldfinger of the cryocooler were mounted in a vacuum chamber, with vacuum levels maintained by an ion pump. The entire experiment was mounted for operation in a stowage locker inside Mir, with the only heat dissipation capability provided by a cooling fan exhausting to the habitable compartment. The thermal environment on Mir can potentially vary over the range 5 to 40 C; this was the range used in testing, and this wide range adds to the difficulty in managing the power dissipated from the experiment's active components. Many issues in the thermal design are discussed, including: thermal isolation methods for the cryogenic samples; design for cooling to cryogenic temperatures; cryogenic epoxy bonds; management of ambient temperature components self-heating; and fan cooling of the enclosed locker. Results of the design are also considered, including the thermal gradients across the HTS samples and cryogenic thermal strap, electronics and thermal sensor cryogenic performance, and differences between ground and flight performance. Modeling was performed in both SINDA-85 and MSC/PATRAN (with direct geometry import from the CAD design tool Pro/Engineer). Advantages of both types of models are discussed

  8. Aircraft integrated design and analysis: A classroom experience

    NASA Technical Reports Server (NTRS)

    1988-01-01

    AAE 451 is the capstone course required of all senior undergraduates in the School of Aeronautics and Astronautics at Purdue University. During the past year the first steps of a long evolutionary process were taken to change the content and expectations of this course. These changes are the result of the availability of advanced computational capabilities and sophisticated electronic media availability at Purdue. This presentation will describe both the long range objectives and this year's experience using the High Speed Commercial Transport (HSCT) design, the AIAA Long Duration Aircraft design and a Remotely Piloted Vehicle (RPV) design proposal as project objectives. The central goal of these efforts was to provide a user-friendly, computer-software-based, environment to supplement traditional design course methodology. The Purdue University Computer Center (PUCC), the Engineering Computer Network (ECN), and stand-alone PC's were used for this development. This year's accomplishments centered primarily on aerodynamics software obtained from the NASA Langley Research Center and its integration into the classroom. Word processor capability for oral and written work and computer graphics were also blended into the course. A total of 10 HSCT designs were generated, ranging from twin-fuselage and forward-swept wing aircraft, to the more traditional delta and double-delta wing aircraft. Four Long Duration Aircraft designs were submitted, together with one RPV design tailored for photographic surveillance. Supporting these activities were three video satellite lectures beamed from NASA/Langley to Purdue. These lectures covered diverse areas such as an overview of HSCT design, supersonic-aircraft stability and control, and optimization of aircraft performance. Plans for next year's effort will be reviewed, including dedicated computer workstation utilization, remote satellite lectures, and university/industrial cooperative efforts.

  9. Creating meaningful learning experiences: Understanding students' perspectives of engineering design

    NASA Astrophysics Data System (ADS)

    Aleong, Richard James Chung Mun

    , relevance, and transfer. With this framework of student learning, engineering educators can enhance learning experiences by engaging all three levels of students' understanding. The curriculum studies orientation applied the three holistic elements of curriculum---subject matter, society, and the individual---to conceptualize design considerations for engineering curriculum and teaching practice. This research supports the characterization of students' learning experiences to help educators and students optimize their teaching and learning of design education.

  10. Design and modeling of small scale multiple fracturing experiments

    SciTech Connect

    Cuderman, J F

    1981-12-01

    Recent experiments at the Nevada Test Site (NTS) have demonstrated the existence of three distinct fracture regimes. Depending on the pressure rise time in a borehole, one can obtain hydraulic, multiple, or explosive fracturing behavior. The use of propellants rather than explosives in tamped boreholes permits tailoring of the pressure risetime over a wide range since propellants having a wide range of burn rates are available. This technique of using the combustion gases from a full bore propellant charge to produce controlled borehole pressurization is termed High Energy Gas Fracturing (HEGF). Several series of HEGF, in 0.15 m and 0.2 m diameter boreholes at 12 m depths, have been completed in a tunnel complex at NTS where mineback permitted direct observation of fracturing obtained. Because such large experiments are costly and time consuming, smaller scale experiments are desirable, provided results from small experiments can be used to predict fracture behavior in larger boreholes. In order to design small scale gas fracture experiments, the available data from previous HEGF experiments were carefully reviewed, analytical elastic wave modeling was initiated, and semi-empirical modeling was conducted which combined predictions for statically pressurized boreholes with experimental data. The results of these efforts include (1) the definition of what constitutes small scale experiments for emplacement in a tunnel complex at the Nevada Test Site, (2) prediction of average crack radius, in ash fall tuff, as a function of borehole size and energy input per unit length, (3) definition of multiple-hydraulic and multiple-explosive fracture boundaries as a function of boreholes size and surface wave velocity, (4) semi-empirical criteria for estimating stress and acceleration, and (5) a proposal that multiple fracture orientations may be governed by in situ stresses.

  11. Designing Experiments to Discriminate Families of Logic Models

    PubMed Central

    Videla, Santiago; Konokotina, Irina; Alexopoulos, Leonidas G.; Saez-Rodriguez, Julio; Schaub, Torsten; Siegel, Anne; Guziolowski, Carito

    2015-01-01

    Logic models of signaling pathways are a promising way of building effective in silico functional models of a cell, in particular of signaling pathways. The automated learning of Boolean logic models describing signaling pathways can be achieved by training to phosphoproteomics data, which is particularly useful if it is measured upon different combinations of perturbations in a high-throughput fashion. However, in practice, the number and type of allowed perturbations are not exhaustive. Moreover, experimental data are unavoidably subjected to noise. As a result, the learning process results in a family of feasible logical networks rather than in a single model. This family is composed of logic models implementing different internal wirings for the system and therefore the predictions of experiments from this family may present a significant level of variability, and hence uncertainty. In this paper, we introduce a method based on Answer Set Programming to propose an optimal experimental design that aims to narrow down the variability (in terms of input–output behaviors) within families of logical models learned from experimental data. We study how the fitness with respect to the data can be improved after an optimal selection of signaling perturbations and how we learn optimal logic models with minimal number of experiments. The methods are applied on signaling pathways in human liver cells and phosphoproteomics experimental data. Using 25% of the experiments, we obtained logical models with fitness scores (mean square error) 15% close to the ones obtained using all experiments, illustrating the impact that our approach can have on the design of experiments for efficient model calibration. PMID:26389116

  12. Tuning Parameters in Heuristics by Using Design of Experiments Methods

    NASA Technical Reports Server (NTRS)

    Arin, Arif; Rabadi, Ghaith; Unal, Resit

    2010-01-01

With the growing complexity of today's large scale problems, it has become more difficult to find optimal solutions by using exact mathematical methods. The need to find near-optimal solutions in an acceptable time frame requires heuristic approaches. In many cases, however, most heuristics have several parameters that need to be "tuned" before they can reach good results. The problem then turns into "finding best parameter setting" for the heuristics to solve the problems efficiently and timely. One-Factor-At-a-Time (OFAT) approach for parameter tuning neglects the interactions between parameters. Design of Experiments (DOE) tools can be instead employed to tune the parameters more effectively. In this paper, we seek the best parameter setting for a Genetic Algorithm (GA) to solve the single machine total weighted tardiness problem in which n jobs must be scheduled on a single machine without preemption, and the objective is to minimize the total weighted tardiness. Benchmark instances for the problem are available in the literature. To fine tune the GA parameters in the most efficient way, we compare multiple DOE models including 2-level (2^k) full factorial design, orthogonal array design, central composite design, D-optimal design and signal-to-noise (SIN) ratios. In each DOE method, a mathematical model is created using regression analysis, and solved to obtain the best parameter setting. After verification runs using the tuned parameter setting, the preliminary results for optimal solutions of multiple instances were found efficiently.
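A 2-level full factorial design for heuristic parameter tuning can be enumerated directly: every combination of the low and high level of each factor is one run. The GA parameter names and levels below are hypothetical illustrations, not values from the paper:

```python
import itertools

# Sketch of a 2-level full factorial (2^k) design for GA parameter tuning.
# Parameter names and levels are hypothetical, not from the paper.
factors = {
    "population_size": (50, 200),
    "crossover_rate": (0.6, 0.9),
    "mutation_rate": (0.01, 0.1),
}
names = list(factors)
design = [dict(zip(names, combo))
          for combo in itertools.product(*(factors[n] for n in names))]
# 2^3 = 8 runs; each run is one GA parameter setting to evaluate,
# and regression on the 8 results estimates main effects and interactions
```

Unlike one-factor-at-a-time tuning, the full factorial exposes interaction effects, e.g. whether the best mutation rate depends on the population size.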

  13. Trends in integrated circuit design for particle physics experiments

    NASA Astrophysics Data System (ADS)

    Atkin, E. V.

    2017-01-01

Integrated circuits are one of the key complex units available to designers of multichannel detector setups. A number of factors make Application Specific Integrated Circuits (ASICs) valuable for Particle Physics and Astrophysics experiments. Among them the most important ones are: integration scale, low power dissipation, and radiation tolerance. In order to make possible future experiments at the intensity, cosmic, and energy frontiers, ASICs should provide a new level of functionality under a new set of constraints and trade-offs, like low-noise high-dynamic-range amplification and pulse shaping, high-speed waveform sampling, low-power digitization, fast digital data processing, serialization and data transmission. All integrated circuits necessary for physical instrumentation should be radiation tolerant at a previously unreached level (hundreds of Mrad) of total ionizing dose and should allow minute, almost 3D assemblies. The paper is based on literary source analysis and presents an overview of the state of the art and trends in present-day chip design, drawing partially on the authors' own ASIC lab experience. This represents a next stage of using micro- and nanoelectronics in physical instrumentation.

  14. Object oriented design and programming for experiment online applications---Experiences with a prototype application

    SciTech Connect

    Oleynik, G.

    1991-03-01

The increase in the variety of computer platforms incorporated into online data acquisition systems at Fermilab compels consideration of how best to design and implement applications to be maintainable, reusable and portable. To this end we have evaluated the applicability of Object Oriented Design techniques and Object Oriented Programming languages to online applications. We report on this evaluation. We are designing a specific application which provides a framework for experimenters to access and display their raw data on UNIX workstations that form part of their distributed online data acquisition system. We have chosen to implement this using the C++ OOP language. We report on our experiences in object oriented design and lessons learned which we will apply to future software development. 14 refs.

  15. Engineering design of the National Spherical Torus Experiment

    SciTech Connect

C. Neumeyer; P. Heitzenroeder; J. Spitzer; J. Chrzanowski; et al.

    2000-05-11

NSTX is a proof-of-principle experiment aimed at exploring the physics of the "spherical torus" (ST) configuration, which is predicted to exhibit more efficient magnetic confinement than conventional large aspect ratio tokamaks, amongst other advantages. The low aspect ratio (R/a, typically 1.2--2 in ST designs compared to 4--5 in conventional tokamaks) decreases the available cross sectional area through the center of the torus for toroidal and poloidal field coil conductors, vacuum vessel wall, plasma facing components, etc., thus increasing the need to deploy all components within the so-called "center stack" in the most efficient manner possible. Several unique design features have been developed for the NSTX center stack, and careful engineering of this region of the machine, utilizing materials up to their engineering allowables, has been key to meeting the desired objectives. The design and construction of the machine has been accomplished in a rapid and cost effective manner thanks to the availability of extensive facilities, a strong experience base from the TFTR era, and good cooperation between institutions.

  16. Design, construction, alignment, and calibration of a compact velocimetry experiment

    SciTech Connect

    Kaufman, Morris I.; Malone, Robert M.; Frogget, Brent C.; Romero, Vincent T.; Esquibel, David L.; Iverson, Adam; Lare, Gregory A.; Briggs, Bart; DeVore, Douglas; Cata, Brian; McGillivray, Kevin; Palagi, Martin; et al.,

    2007-08-31

A velocimetry experiment has been designed to measure shock properties for small, cylindrical, metal targets (8 mm diameter × 2 mm thick). A target is accelerated by high explosives, caught, then retrieved for later inspection. The target is expected to move at a velocity of 0.1 to 3 km/sec. The complete experiment canister is ~105 mm in diameter and 380 mm long. Optical velocimetry diagnostics include the Velocity Interferometer System for Any Reflector (VISAR) and photon Doppler velocimetry (PDV). The packaging of the velocity diagnostics is not allowed to interfere with the foam catchment or an X-ray imaging diagnostic. Using commercial lenses, a single optical relay collects Doppler-shifted light for both VISAR and PDV. The use of fiber optics allows measurement of point velocities on the target surface for accelerations lasting for 3 mm of travel. Operating at 532 nm, the VISAR has separate illumination fibers requiring alignment. The PDV diagnostic operates at 1550 nm but is aligned and calibrated at 670 nm. VISAR and PDV diagnostics are complementary measurements that image spots in close proximity on the target surface. Because the optical relay uses commercial glass, optical fibers' axial positions are offset to compensate for chromatic aberrations. The optomechanical design requires careful attention to fiber management, mechanical assembly and disassembly, foam catchment design, and X-ray diagnostic field of view. Calibration and alignment data are archived at each assembly sequence stage. The photon budgets for the VISAR and PDV diagnostics are separately estimated.

  17. Vanguard/PLACE experiment system design and test plan

    NASA Technical Reports Server (NTRS)

    Taylor, R. E.

    1973-01-01

    A system design and test plan are described for operational evaluation of the NASA-Goddard position location and aircraft communications equipment (PLACE), at C band (4/6GHz), using NASA's ship, the USNS Vanguard, and the ATS 3 and ATS 5 synchronous satellites. The Sea Test phase, extending from March 29, 1973 to April 15, 1973 was successfully completed; the principal objectives of the experiment were achieved. Typical PLACE-computed, position-location data is shown for the Vanguard. Position location and voice-quality measurements were excellent; ship position was determined within 2 nmi; high-quality, 2-way voice transmissions resulted as determined from audience participation, intelligibility and articulation-index analysis. A C band/L band satellite trilateration experiment is discussed.

  18. The Light Microscope Module Designed for Space Biological Science Experiments

    NASA Astrophysics Data System (ADS)

    Zheng, Weibo; Zhang, Tao

Real-time observation and imaging techniques for space science experiments have become more and more important. This paper introduces the Light Microscope Module (LMM), an automatic-acquisition mini-microscope designed for space biological science research. Sample features, including shape and fluorescence images, can be observed. The LMM consists of an illumination system of super bright LEDs, an optical system, a colour CCD camera, and a moving machine. Illumination is provided in three colors (white, blue, and ultraviolet). The LMM searches for the samples, automatically focuses on them, and produces digital images. The LMM was used in the biological science experiments on the Shijian-8 Satellite, where it observed sample features including rat embryos and plant flowers, and studied the influence of the space environment on the samples.

  19. CELSS experiment model and design concept of gas recycle system

    NASA Technical Reports Server (NTRS)

    Nitta, K.; Oguchi, M.; Kanda, S.

    1986-01-01

    In order to prolong the duration of manned missions around the Earth and to expand the human presence from the Earth to other destinations, such as a lunar base or a manned Mars flight mission, the controlled ecological life support system (CELSS) becomes an essential factor of the future technology to be developed through utilization of the space station. Preliminary system engineering and integration efforts regarding CELSS have been carried out by the Japanese CELSS concept study group to clarify the feasibility of hardware development for space station experiments and to establish the time-phased mission sets after FY 1992. The results of these studies are briefly summarized, and the design and utilization methods of a gas recycle system for CELSS experiments are discussed.

  20. Propagation of Computational Uncertainty Using the Modern Design of Experiments

    NASA Technical Reports Server (NTRS)

    DeLoach, Richard

    2007-01-01

    This paper describes the use of formally designed experiments to aid in the error analysis of a computational experiment. A method is described by which the underlying code is approximated with relatively low-order polynomial graduating functions represented by truncated Taylor series approximations to the true underlying response function. A resource-minimal approach is outlined by which such graduating functions can be estimated from a minimum number of case runs of the underlying computational code. Certain practical considerations are discussed, including ways and means of coping with high-order response functions. The distributional properties of prediction residuals are presented and discussed. A practical method is presented for quantifying that component of the prediction uncertainty of a computational code that can be attributed to imperfect knowledge of independent variable levels. This method is illustrated with a recent assessment of uncertainty in computational estimates of Space Shuttle thermal and structural reentry loads attributable to ice and foam debris impact on ascent.
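The graduating-function idea in this record can be sketched as fitting a low-order polynomial surrogate to a small number of case runs of the underlying code, then predicting from the surrogate. The toy response function and run points below are assumptions, not the paper's Space Shuttle application.

```python
import numpy as np

# Sketch: fit a quadratic "graduating function" (truncated-Taylor-style
# surrogate) to a resource-minimal set of runs of an assumed expensive code.
def expensive_code(x):                    # hypothetical response function
    return 2.0 + 0.5 * x - 0.1 * x**2

x_runs = np.array([0.0, 1.0, 2.0, 3.0])  # a handful of case runs
y_runs = expensive_code(x_runs)

coeffs = np.polyfit(x_runs, y_runs, deg=2)  # least-squares quadratic fit
surrogate = np.poly1d(coeffs)

# The surrogate now stands in for the code at untried settings:
print(surrogate(1.5), expensive_code(1.5))
```

Because the toy response is itself quadratic, the fit is exact here; with a real code the residuals would carry the lack-of-fit information the paper analyzes.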

  1. Design and experience with large-size CFB boilers

    SciTech Connect

    Darling, S.L.

    1994-12-31

    CFB boilers have been in operation for many years in industrial steam and power generation applications, demonstrating the low SOx/NOx emissions and fuel flexibility of the technology. In the past few years, several large-size CFB boilers (over 100 MWe) have entered service and are operating successfully. On the basis of this experience, CFB boilers up to 400 MWe in size are now being offered with full commercial guarantees. Such large CFB boilers will be of interest to countries with strict emission regulations or the need to reduce emissions, and to countries with both a large need for additional power and low-grade indigenous solid fuel. This paper describes Ahlstrom Pyropower's scale-up of the AHLSTROM PYROFLOW CFB boiler, experience with large-size CFB boilers, and the design features of CFB boilers in the 400 MWe size range.

  2. Design of laboratory experiments to study radiation-driven implosions

    NASA Astrophysics Data System (ADS)

    Keiter, P. A.; Trantham, M.; Malamud, G.; Klein, S. R.; Davis, J.; VanDervort, R.; Shvarts, D.; Drake, R. P.; Stone, J. M.; Fraenkel, M.; Frank, Y.; Raicher, E.

    2017-03-01

    The interstellar medium is heterogeneous, with dense clouds amid an ambient medium. Radiation from young OB stars asymmetrically irradiates the dense clouds. Bertoldi (1989) developed analytic formulae to describe possible outcomes for these clouds when irradiated by hot, young stars. One of the critical parameters that determines the cloud's fate is the number of photon mean free paths in the cloud. For the extreme cases where the cloud size is either much greater than or much less than one mean free path, the radiation transport should be well understood. However, as one transitions between these limits, the radiation transport is much more complex and is a challenge to solve with many of the current radiation transport models implemented in codes. We present the design of laboratory experiments that use a thermal source of x-rays to asymmetrically irradiate a low-density plastic foam sphere. The experiment will vary the density, and hence the number of mean free paths, of the sphere to study the radiation transport in different regimes. We have developed dimensionless parameters to relate the laboratory experiment to the astrophysical system, and we show that we can perform the experiment in the same transport regime.
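The regime criterion in this record, the number of photon mean free paths across the object, can be illustrated with N = ρκL. The opacity and geometry values below are illustrative assumptions, not parameters of the actual experiment.

```python
# Sketch: number of photon mean free paths across a foam sphere, N = rho*kappa*L.
# kappa and the dimensions are assumed values for illustration only.
kappa = 100.0                      # cm^2/g, assumed x-ray opacity
diameter = 0.1                     # cm, assumed sphere diameter

for rho in (0.005, 0.05, 0.5):     # g/cm^3: varying density, as the design does
    n_mfp = rho * kappa * diameter
    regime = "thin" if n_mfp < 0.3 else ("thick" if n_mfp > 3 else "marginal")
    print(f"rho = {rho:g} g/cc -> N_mfp = {n_mfp:.2f} (optically {regime})")
```

Varying density by two orders of magnitude moves the sphere from the optically thin limit through the marginal regime (the hard case for transport models) to the optically thick limit.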

  3. Spacecraft and mission design for the SP-100 flight experiment

    NASA Technical Reports Server (NTRS)

    Deininger, William D.; Vondra, Robert J.

    1988-01-01

    The design and performance of a spacecraft employing arcjet nuclear electric propulsion, suitable for use in the SP-100 Space Reactor Power System (SRPS) Flight Experiment, are outlined. The vehicle design is based on a 93 kW(e) ammonia arcjet system operating at an experimentally measured specific impulse of 1031 s and an efficiency of 42.3 percent. The arcjet/gimbal assemblies, power conditioning subsystem, propellant feed system, propulsion system thermal control, spacecraft diagnostic instrumentation, and the telemetry requirements are described. A 100 kW(e) SRPS is assumed. The spacecraft mass is baselined at 5675 kg excluding the propellant and propellant feed system. Four mission scenarios are described which are capable of demonstrating the full capability of the SRPS. The missions considered include spacecraft deployment to possible surveillance platform orbits, a spacecraft storage mission, and an orbit raising round trip corresponding to possible orbit transfer vehicle (OTV) missions.
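The arcjet figures quoted in this record imply a thrust level via the standard electric-propulsion relation T = 2ηP/(Isp·g0). Power, Isp, and the 42.3% figure are from the abstract; treating that figure as the jet (thrust) efficiency is an assumption.

```python
# Sketch: ideal thrust implied by the quoted arcjet numbers.
g0 = 9.80665            # m/s^2, standard gravity
P = 93e3                # W, arcjet input power (from the abstract)
Isp = 1031.0            # s, measured specific impulse (from the abstract)
eta = 0.423             # efficiency (from the abstract; assumed to be jet efficiency)

v_e = Isp * g0                  # effective exhaust velocity, m/s
thrust = 2 * eta * P / v_e      # N
print(f"exhaust velocity ~ {v_e / 1000:.2f} km/s, thrust ~ {thrust:.1f} N")
```

The result, a few newtons from nearly 100 kW, is characteristic of high-Isp electric propulsion and of the slow orbit-raising missions the record describes.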

  4. The high temperature superconductivity space experiment (HTSSE-II) design

    SciTech Connect

    Kawecki, T.G.; Golba, G.A.; Price, G.E.; Rose, V.S.; Meyers, W.J.

    1996-07-01

    The high temperature superconductivity space experiment (HTSSE) program, initiated by the Naval Research Laboratory (NRL) in 1988, is described. The HTSSE program focuses high temperature superconductor (HTS) technology applications on space systems. The program phases, goals, and objectives are discussed. The devices developed for the HTSSE-II phase of the program and their suppliers are enumerated. Eight space-qualified components were integrated as a cryogenic experimental payload on DOD's ARGOS spacecraft. The payload was designed and built using a unique NRL/industry partnership and was integrated and space-qualified at NRL.

  5. The Design and Analysis of Transposon-Insertion Sequencing Experiments

    PubMed Central

    Chao, Michael C.; Abel, Sören; Davis, Brigid M.; Waldor, Matthew K.

    2016-01-01

    Transposon-insertion sequencing (TIS) is a powerful approach that can be widely applied to genome-wide definition of loci that are required for growth in diverse conditions. However, experimental design choices and stochastic biological processes can heavily influence the results of TIS experiments and affect downstream statistical analysis. Here, we discuss TIS experimental parameters and how these factors relate to the benefits and limitations of the various statistical frameworks that can be applied to computational analysis of TIS data. PMID:26775926

  6. Scaling and design of landslide and debris-flow experiments

    NASA Astrophysics Data System (ADS)

    Iverson, Richard M.

    2015-09-01

    Scaling plays a crucial role in designing experiments aimed at understanding the behavior of landslides, debris flows, and other geomorphic phenomena involving grain-fluid mixtures. Scaling can be addressed by using dimensional analysis or - more rigorously - by normalizing differential equations that describe the evolving dynamics of the system. Both of these approaches show that, relative to full-scale natural events, miniaturized landslides and debris flows exhibit disproportionately large effects of viscous shear resistance and cohesion as well as disproportionately small effects of excess pore-fluid pressure that is generated by debris dilation or contraction. This behavioral divergence grows in proportion to H^3, where H is the thickness of a moving mass. Therefore, to maximize geomorphological relevance, experiments with wet landslides and debris flows must be conducted at the largest feasible scales. Another important consideration is that, unlike stream flows, landslides and debris flows accelerate from statically balanced initial states. Thus, no characteristic macroscopic velocity exists to guide experiment scaling and design. On the other hand, macroscopic gravity-driven motion of landslides and debris flows evolves over a characteristic time scale (L/g)^1/2, where g is the magnitude of gravitational acceleration and L is the characteristic length of the moving mass. Grain-scale stress generation within the mass occurs on a shorter time scale, H/(gL)^1/2, which is inversely proportional to the depth-averaged material shear rate. A separation of these two time scales exists if the criterion H/L << 1 is satisfied, as is commonly the case. This time scale separation indicates that steady-state experiments can be used to study some details of landslide and debris-flow behavior but cannot be used to study macroscopic landslide or debris-flow dynamics.
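The two time scales in this abstract can be made concrete with a small numeric example; L and H below are illustrative values, not data from a real flow. Note that the ratio of the grain-scale to the macroscopic time scale reduces exactly to H/L, which is the separation criterion quoted in the abstract.

```python
import math

# Numeric illustration of the abstract's two time scales:
# macroscopic motion ~ (L/g)^(1/2), grain-scale stress generation ~ H/(g*L)^(1/2).
g = 9.81          # m/s^2
L = 100.0         # m, assumed characteristic length of the moving mass
H = 1.0           # m, assumed thickness, so H/L = 0.01 << 1

t_macro = math.sqrt(L / g)          # s
t_grain = H / math.sqrt(g * L)      # s
print(f"t_macro = {t_macro:.2f} s, t_grain = {t_grain:.3f} s")
print(f"t_grain / t_macro = {t_grain / t_macro:.3f}  (equals H/L)")
```

For these values the macroscopic and grain-scale times differ by two orders of magnitude, which is the separation that justifies steady-state study of grain-scale behavior.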

  7. Scaling and design of landslide and debris-flow experiments

    USGS Publications Warehouse

    Iverson, Richard M.

    2015-01-01

    Scaling plays a crucial role in designing experiments aimed at understanding the behavior of landslides, debris flows, and other geomorphic phenomena involving grain-fluid mixtures. Scaling can be addressed by using dimensional analysis or – more rigorously – by normalizing differential equations that describe the evolving dynamics of the system. Both of these approaches show that, relative to full-scale natural events, miniaturized landslides and debris flows exhibit disproportionately large effects of viscous shear resistance and cohesion as well as disproportionately small effects of excess pore-fluid pressure that is generated by debris dilation or contraction. This behavioral divergence grows in proportion to H^3, where H is the thickness of a moving mass. Therefore, to maximize geomorphological relevance, experiments with wet landslides and debris flows must be conducted at the largest feasible scales. Another important consideration is that, unlike stream flows, landslides and debris flows accelerate from statically balanced initial states. Thus, no characteristic macroscopic velocity exists to guide experiment scaling and design. On the other hand, macroscopic gravity-driven motion of landslides and debris flows evolves over a characteristic time scale (L/g)^1/2, where g is the magnitude of gravitational acceleration and L is the characteristic length of the moving mass. Grain-scale stress generation within the mass occurs on a shorter time scale, H/(gL)^1/2, which is inversely proportional to the depth-averaged material shear rate. A separation of these two time scales exists if the criterion H/L << 1 is satisfied, as is commonly the case. This time scale separation indicates that steady-state experiments can be used to study some details of landslide and debris-flow behavior but cannot be used to study macroscopic landslide or debris-flow dynamics.

  8. Designing Statistical Language Learners: Experiments on Noun Compounds

    NASA Astrophysics Data System (ADS)

    Lauer, Mark

    1996-09-01

    The goal of this thesis is to advance the exploration of the statistical language learning design space. In pursuit of that goal, the thesis makes two main theoretical contributions: (i) it identifies a new class of designs by specifying an architecture for natural language analysis in which probabilities are given to semantic forms rather than to more superficial linguistic elements; and (ii) it explores the development of a mathematical theory to predict the expected accuracy of statistical language learning systems in terms of the volume of data used to train them. The theoretical work is illustrated by applying statistical language learning designs to the analysis of noun compounds. Both syntactic and semantic analysis of noun compounds are attempted using the proposed architecture. Empirical comparisons demonstrate that the proposed syntactic model is significantly better than those previously suggested, approaching the performance of human judges on the same task, and that the proposed semantic model, the first statistical approach to this problem, exhibits significantly better accuracy than the baseline strategy. These results suggest that the new class of designs identified is a promising one. The experiments also serve to highlight the need for a widely applicable theory of data requirements.
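A much-simplified sketch of a statistical bracketer for three-word noun compounds, in the spirit of (but not identical to) the dependency-style analysis this thesis studies: choose the left bracketing [[w1 w2] w3] when w1 associates more strongly with w2 than with w3. The co-occurrence counts are toy data, and raw counts stand in for the thesis' probability estimates.

```python
# Toy dependency-style bracketer for three-word noun compounds.
# The counts are invented illustrative data, not corpus statistics.
toy_counts = {
    ("baseball", "league"): 10, ("baseball", "team"): 2,
    ("plastic", "water"): 1, ("plastic", "bottle"): 5,
}

def assoc(w1, w2):
    """Association strength of w1 modifying w2 (raw toy count)."""
    return toy_counts.get((w1, w2), 0)

def bracket(w1, w2, w3):
    if assoc(w1, w2) > assoc(w1, w3):
        return f"[[{w1} {w2}] {w3}]"      # left-branching
    return f"[{w1} [{w2} {w3}]]"          # right-branching

print(bracket("baseball", "league", "team"))
print(bracket("plastic", "water", "bottle"))
```

The first compound comes out left-branching and the second right-branching, illustrating how a single association comparison drives the syntactic decision.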

  9. Frac-and-pack stimulation: Application, design, and field experience

    SciTech Connect

    Roodhart, L.P.; Fokker, P.A.; Davies, D.R.; Shlyapobersky, J.; Wong, G.K.

    1994-03-01

    This paper discusses the criteria for selecting wells to be frac-and-packed. The authors show how systematic study of the inflow performance can be used to assess the potential of frac-and-packed wells, to identify the controlling factors, and to optimize design parameters. They also show that fracture conductivity is often the key to successful treatment. This conductivity depends largely on proppant size; formation permeability damage around the created fracture has less effect. Appropriate allowance needs to be made for flow restrictions caused by the presence of the perforations, partial penetration, and non-Darcy effects. They describe the application of the overpressure-calibrated hydraulic fracture model in frac-and-pack treatment design, and discuss some operational considerations with reference to field examples. The full potential of this promising new completion method can be achieved only if the design is tailored to the individual well. This demands high-quality input data, which can be obtained only from a calibration test. This paper presents their strategy for frac-and-pack design, drawing on examples from field experience. They also point out several areas that the industry needs to address, such as the sizing of proppant in soft formations and the interaction between fracturing fluids and resin in resin-coated proppant.

  10. Design and preliminary experiment of China imaging altimeter

    NASA Astrophysics Data System (ADS)

    Zhang, Yunhua; Jiang, Jingshan; Zhang, Xiangkun; Xu, Ke; Yan, Jingye; Jiang, Changhong; Lei, Liqing

    2003-04-01

    The design of the China Imaging ALTimeter (CIALT) and the flight experiment of its airborne model are presented in this paper. The system is aimed at providing observations for both oceanic applications and future continental topographic mapping. The motivation of this project is to develop a three-dimensional imager suited to small satellites, with small volume, mass, and power consumption. An experimental airborne model of the CIALT has been developed to verify the design concept. The CIALT integrates three techniques: the height-measurement and tracking technique of traditional radar altimeters used for ocean applications, the synthetic aperture technique, and the interferometric technique. A robust height tracker has been designed to meet the requirements of both oceanic and continental surfaces (including ice-covered surfaces). The synthetic aperture technique is used to achieve a higher azimuthal resolution along the cross-range direction than that of a traditional altimeter. The interferometric technique is used to retrieve the height information corresponding to each image pixel and to correct the boresight angle of the antennas, which is crucial for accurate height measurement. The CIALT differs from other proposed imaging altimeters, such as the SAR altimeter and the scanning altimeter, in which no height tracker is involved. Some key technologies in the development of the imaging altimeter are addressed, such as the antenna design, the transmitter, the receiver, and the robust tracking algorithm.

  11. Optimal experiment design for time-lapse traveltime tomography

    SciTech Connect

    Ajo-Franklin, J.B.

    2009-10-01

    Geophysical monitoring techniques offer the only noninvasive approach capable of assessing both the spatial and temporal dynamics of subsurface fluid processes. Increasingly, permanent sensor arrays in boreholes and on the ocean floor are being deployed to improve the repeatability and increase the temporal sampling of monitoring surveys. Because permanent arrays require a large up-front capital investment and are difficult (or impossible) to re-configure once installed, a premium is placed on selecting a geometry capable of imaging the desired target at minimum cost. We present a simple approach to optimizing downhole sensor configurations for monitoring experiments making use of differential seismic traveltimes. In our case, we use a design quality metric based on the accuracy of tomographic reconstructions for a suite of imaging targets. By not requiring an explicit singular value decomposition of the forward operator, evaluation of this objective function scales to problems with a large number of unknowns. We also restrict the design problem by recasting the array geometry into a low dimensional form more suitable for optimization at a reasonable computational cost. We test two search algorithms on the design problem: the Nelder-Mead downhill simplex method and the Multilevel Coordinate Search algorithm. The algorithm is tested for four crosswell acquisition scenarios relevant to continuous seismic monitoring, a two parameter array optimization, several scenarios involving four parameter length/offset optimizations, and a comparison of optimal multi-source designs. In the last case, we also examine trade-offs between source sparsity and the quality of tomographic reconstructions. One general observation is that asymmetric array lengths improve localized image quality in crosswell experiments with a small number of sources and a large number of receivers. 
Preliminary results also suggest that high-quality differential images can be generated using only a small
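The low-dimensional design search described in this record can be sketched as a coarse search over two array parameters against a design-quality objective. The quadratic "reconstruction error" below is a purely hypothetical stand-in for the paper's tomography-based metric, and the parameter ranges are assumed.

```python
# Toy version of the two-parameter array-design optimization: reduce the
# sensor geometry to (array length, top offset) and minimize a quality metric.
# The metric here is an invented quadratic, not the paper's tomographic one.
def reconstruction_error(length, offset):
    return (length - 40.0) ** 2 / 100.0 + (offset - 10.0) ** 2 / 25.0 + 1.0

best = min(
    (reconstruction_error(l, o), l, o)
    for l in range(10, 81, 5)       # candidate array lengths, m (assumed)
    for o in range(0, 31, 2)        # candidate top offsets, m (assumed)
)
print(f"best design: length = {best[1]} m, offset = {best[2]} m, err = {best[0]:.2f}")
```

In the paper this inner evaluation is expensive, which is why derivative-free searches such as Nelder-Mead or Multilevel Coordinate Search replace the exhaustive grid used in this toy.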

  12. On Becoming a Civic-Minded Instructional Designer: An Ethnographic Study of an Instructional Design Experience

    ERIC Educational Resources Information Center

    Yusop, Farrah Dina; Correia, Ana-Paula

    2014-01-01

    This ethnographic study took place in a graduate course at a large research university in the Midwestern United States. It presents an in-depth examination of the experiences and challenges of a group of four students learning to be Instructional Design and Technology professionals who are concerned with the well-being of all members of a society,…

  13. Safeguards by Design: Lessons Learned from DOE Experience Integrating Safety into Design

    SciTech Connect

    Hockert, John; Burbank, Roberta L.

    2010-04-13

    This paper identifies lessons to be learned for the institutionalization of Safeguards by Design (SBD) from the Department of Energy (DOE) experience developing and implementing DOE-STD-1189-2008, Integration of Safety into the Design Process. The experience is valuable because of the similarity of the challenges of integrating safety and safeguards into the design process. The paper reviews the content and development of DOE-STD-1189-2008 from its initial concept in January 2006 to its issuance in March 2008. Lessons learned are identified in the areas of the development and structure of requirements for the SBD process; the target audience for SBD requirements and guidance; the need for a graded approach to SBD; and a possible strategy for development and implementation of SBD within DOE.

  14. Accelerating Vaccine Formulation Development Using Design of Experiment Stability Studies.

    PubMed

    Ahl, Patrick L; Mensch, Christopher; Hu, Binghua; Pixley, Heidi; Zhang, Lan; Dieter, Lance; Russell, Ryann; Smith, William J; Przysiecki, Craig; Kosinski, Mike; Blue, Jeffrey T

    2016-10-01

    Vaccine drug product thermal stability often depends on formulation input factors and how they interact. Scientific understanding and professional experience typically allow vaccine formulators to accurately predict the thermal stability output based on formulation input factors such as pH, ionic strength, and excipients. Thermal stability predictions, however, are not enough for regulators. Stability claims must be supported by experimental data. The Quality by Design approach of Design of Experiment (DoE) is well suited to describe formulation outputs such as thermal stability in terms of formulation input factors. A DoE approach, particularly at elevated temperatures that induce accelerated degradation, can provide empirical understanding of how vaccine formulation input factors and their interactions affect vaccine stability output performance. This is possible even when clear scientific understanding of particular formulation stability mechanisms is lacking. A DoE approach was used in an accelerated 37 °C stability study of an aluminum-adjuvanted Neisseria meningitidis serogroup B vaccine. Formulation stability differences were identified only 15 days into the study. We believe this study demonstrates the power of combining DoE methodology with accelerated-stress stability studies to accelerate and improve vaccine formulation development programs, particularly during the preformulation stage.
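The DoE idea in this record can be sketched as a two-level full factorial on three coded formulation factors, with main effects estimated by contrasting the high and low settings of each factor. The response function below is toy data standing in for a measured stability output, not the study's results.

```python
from itertools import product

# Toy 2^3 full-factorial DoE on three coded formulation factors (-1/+1).
# The stability response is invented for illustration.
def stability(ph, ionic, excipient):
    """Hypothetical stability output (e.g., % potency retained at 37 C)."""
    return 80.0 + 5.0 * ph - 2.0 * ionic + 1.0 * ph * excipient

runs = list(product((-1, 1), repeat=3))     # 2^3 = 8 runs
y = [stability(*r) for r in runs]

effects = {}
for i, name in enumerate(("pH", "ionic strength", "excipient")):
    hi = sum(yy for r, yy in zip(runs, y) if r[i] == 1) / 4
    lo = sum(yy for r, yy in zip(runs, y) if r[i] == -1) / 4
    effects[name] = hi - lo                  # main effect of this factor
    print(f"main effect of {name}: {effects[name]:+.1f}")
```

The estimated main effects recover the coefficients built into the toy response (pH strong, ionic strength moderate, excipient alone null), which is exactly the input-to-output mapping a DoE stability study is meant to expose.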

  15. Parameter Screening and Optimisation for ILP Using Designed Experiments

    NASA Astrophysics Data System (ADS)

    Srinivasan, Ashwin; Ramakrishnan, Ganesh

    Reports of experiments conducted with an Inductive Logic Programming system rarely describe how specific values of parameters of the system are arrived at when constructing models. Usually, no attempt is made to identify sensitive parameters, and those that are used are often given "factory-supplied" default values, or values obtained from some non-systematic exploratory analysis. The immediate consequence of this is, of course, that it is not clear if better models could have been obtained if some form of parameter selection and optimisation had been performed. Questions follow inevitably on the experiments themselves: specifically, are all algorithms being treated fairly, and is the exploratory phase sufficiently well-defined to allow the experiments to be replicated? In this paper, we investigate the use of parameter selection and optimisation techniques grouped under the study of experimental design. Screening and "response surface" methods determine, in turn, sensitive parameters and good values for these parameters. This combined use of parameter selection and response surface-driven optimisation has a long history of application in industrial engineering, and its role in ILP is investigated using two well-known benchmarks. The results suggest that computational overheads from this preliminary phase are not substantial, and that much can be gained, both on improving system performance and on enabling controlled experimentation, by adopting well-established procedures such as the ones proposed here.

  16. Lower hybrid system design for the Tokamak physics experiment

    SciTech Connect

    Goranson, P.L.; Conner, D.L.; Swain, D.W.; Yugo, J.J.; Bernabei, S.; Greenough, N.

    1995-12-31

    The lower hybrid (LH) launcher configuration has been redesigned to integrate the functions of the vertical four-way power splitter and the front waveguide array (front array). This permits 256 waveguide channels to be fed by only 64 waveguides at the vacuum window interface. The resulting configuration is a more compact coupler, which incorporates the simplicity of a multijunction coupler while preserving the spectral flexibility of a conventional lower hybrid launcher. Other spin-offs of the redesign are reduction in thermal incompatibility between the front array and vacuum windows, improved maintainability, in situ vacuum window replacement, a reduced number of radio frequency (rf) connections, and a weight reduction of 7300 kg. There should be a significant cost reduction as well. Issues associated with the launcher design and fabrication have been addressed by a research and development program that includes brazing of the front array and testing of the power splitter configuration to confirm that phase errors due to reflections in the shorted splitter legs will not significantly impact the rf spectrum. The Conceptual Design Review requires that radiation levels at the torus radial port mounting flange and outer surface of the toroidal field coils should be sufficiently low to permit hands-on maintenance. Low activation materials and neutron shielding are incorporated in the launcher design to meet these requirements. The launcher is configured to couple 3 MW of steady state LH heating/LH current drive power at 3.7 GHz to the Tokamak Physics Experiment plasma.

  17. Nanosatellite optical downlink experiment: design, simulation, and prototyping

    NASA Astrophysics Data System (ADS)

    Clements, Emily; Aniceto, Raichelle; Barnes, Derek; Caplan, David; Clark, James; Portillo, Iñigo del; Haughwout, Christian; Khatsenko, Maxim; Kingsbury, Ryan; Lee, Myron; Morgan, Rachel; Twichell, Jonathan; Riesing, Kathleen; Yoon, Hyosang; Ziegler, Caleb; Cahoy, Kerri

    2016-11-01

    The nanosatellite optical downlink experiment (NODE) implements a free-space optical communications (lasercom) capability on a CubeSat platform that can support low earth orbit (LEO) to ground downlink rates >10 Mbps. A primary goal of NODE is to leverage commercially available technologies to provide a scalable and cost-effective alternative to radio-frequency-based communications. The NODE transmitter uses a 200-mW 1550-nm master-oscillator power-amplifier design using power-efficient M-ary pulse position modulation. To facilitate pointing the 0.12-deg downlink beam, NODE augments spacecraft body pointing with a microelectromechanical fast steering mirror (FSM) and uses an 850-nm uplink beacon to an onboard CCD camera. The 30-cm aperture ground telescope uses an infrared camera and FSM for tracking to an avalanche photodiode detector-based receiver. Here, we describe our approach to transition prototype transmitter and receiver designs to a full end-to-end CubeSat-scale system. This includes link budget refinement, drive electronics miniaturization, packaging reduction, improvements to pointing and attitude estimation, implementation of modulation, coding, and interleaving, and ground station receiver design. We capture trades and technology development needs and outline plans for integrated system ground testing.
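A hedged back-of-envelope for this lasercom record: converting an assumed received optical power into photons per bit at the stated 1550 nm wavelength and 10 Mbps rate. The received power is an illustrative assumption; a real link budget includes pointing, atmospheric, and detector losses, which is why the abstract calls for link budget refinement.

```python
# Simplified photon-budget arithmetic for an optical downlink.
# Wavelength and data rate come from the record; P_rx is assumed.
h = 6.62607015e-34       # J*s, Planck constant
c = 2.99792458e8         # m/s, speed of light
wavelength = 1550e-9     # m (from the abstract)
data_rate = 10e6         # bit/s (from the abstract)
P_rx = 1e-9              # W, assumed power reaching the detector

photon_energy = h * c / wavelength                 # J per photon
photons_per_bit = P_rx / photon_energy / data_rate
print(f"~{photons_per_bit:.0f} photons/bit at {P_rx * 1e9:.0f} nW received")
```

A few hundred photons per bit at a nanowatt shows why photon-efficient M-ary pulse position modulation and avalanche photodiode receivers are attractive at these power levels.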

  18. Target Station Design for the Mu2e Experiment

    SciTech Connect

    Pronskikh, Vitaly; Ambrosio, Giorgio; Campbell, Michael; Coleman, Richard; Ginther, George; Kashikhin, Vadim; Krempetz, Kurt; Lamm, Michael; Lee, Ang; Leveling, Anthony; Mokhov, Nikolai; Nagaslaev, Vladimir; Stefanik, Andrew; Striganov, Sergei; Werkema, Steven; Bartoszek, Larry; Densham, Chris; Loveridge, Peter; Lynch, Kevin; Popp, James

    2014-07-01

    The Mu2e experiment at Fermilab is devoted to the search for the conversion of a negative muon into an electron in the field of a nucleus without emission of neutrinos. One of the main parts of the Mu2e experimental setup is its Target Station, in which negative pions are generated in interactions of the 8-GeV primary proton beam with a tungsten target. A large-aperture 5-T superconducting production solenoid (PS) enhances pion collection, and an S-shaped transport solenoid (TS) delivers muons and pions to the Mu2e detector. The heat and radiation shield (HRS) protects the PS and the first TS coils, and a beam dump absorbs the spent beam. In order for the PS superconducting magnet to operate reliably, the sophisticated HRS was designed and optimized for performance and cost, and the beam dump was designed to absorb the spent beam while maintaining its temperature and the air activation in the hall at allowable levels. Comprehensive MARS15 simulations have been carried out to optimize all the parts while maximizing muon yield. Results of simulations of critical radiation quantities and their implications for the overall Target Station design and integration are reported.

  19. Designing a Field Experience Tracking System in the Area of Special Education

    ERIC Educational Resources Information Center

    He, Wu; Watson, Silvana

    2014-01-01

    Purpose: To improve the quality of field experience, support field experience cooperation and streamline field experience management, the purpose of this paper is to describe the experience in using Activity Theory to design and develop a web-based field experience tracking system for a special education program. Design/methodology/approach: The…

  20. Design of experiments (DoE) in pharmaceutical development.

    PubMed

    N Politis, Stavros; Colombo, Paolo; Colombo, Gaia; M Rekkas, Dimitrios

    2017-02-23

    At the beginning of the twentieth century, Sir Ronald Fisher introduced the concept of applying statistical analysis during the planning stages of research rather than at the end of experimentation. When statistical thinking is applied from the design phase, it enables building quality into the product, by adopting Deming's profound knowledge approach, comprising system thinking, variation understanding, theory of knowledge, and psychology. The pharmaceutical industry was late in adopting these paradigms compared to other sectors. It heavily focused on blockbuster drugs, while formulation development was mainly performed by One-Factor-At-a-Time (OFAT) studies rather than by implementing Quality by Design (QbD) and modern engineering-based manufacturing methodologies. Among various mathematical modeling approaches, Design of Experiments (DoE) is extensively used for the implementation of QbD in both research and industrial settings. In QbD, product and process understanding is the key enabler of assuring quality in the final product. Knowledge is achieved by establishing models correlating the inputs with the outputs of the process. The mathematical relationships of the Critical Process Parameters (CPPs) and Critical Material Attributes (CMAs) with the Critical Quality Attributes (CQAs) define the design space. Consequently, process understanding is well assured and rationally leads to a final product meeting the Quality Target Product Profile (QTPP). This review illustrates the principles of quality theory through the work of major contributors, the evolution of the QbD approach, and the statistical toolset for its implementation. As such, DoE is presented in detail, since it represents the first choice for rational pharmaceutical development.

  1. Computational design aspects of a NASP nozzle/afterbody experiment

    NASA Technical Reports Server (NTRS)

    Ruffin, Stephen M.; Venkatapathy, Ethiraj; Keener, Earl R.; Nagaraj, N.

    1989-01-01

    This paper highlights the influence of computational methods on design of a wind tunnel experiment which generically models the nozzle/afterbody flow field of the proposed National Aerospace Plane. The rectangular slot nozzle plume flow field is computed using a three-dimensional, upwind, implicit Navier-Stokes solver. Freestream Mach numbers of 5.3, 7.3, and 10 are investigated. Two-dimensional parametric studies of various Mach numbers, pressure ratios, and ramp angles are used to help determine model loads and afterbody ramp angle and length. It was found that the center of pressure on the ramp occurs at nearly the same location for all ramp angles and test conditions computed. Also, to prevent air liquefaction, it is suggested that a helium-air mixture be used as the jet gas for the highest Mach number test case.

  2. Gender Consideration in Experiment Design for Air Break in Prebreathe

    NASA Technical Reports Server (NTRS)

    Conkin, Johnny; Dervay, Joseph P.; Gernhardt, Michael L.

    2007-01-01

    If gender is a confounder of the decompression sickness (DCS) or venous gas emboli (VGE) outcomes of a proposed air break in oxygen prebreathe (PB) project, then decisions about the final experiment design must be made. We evaluated whether the incidence of DCS and VGE from tests in altitude chambers over 20 years differed between men and women after resting and exercise PB protocols. Nitrogen washout during PB is our primary risk mitigation strategy to prevent subsequent DCS and VGE in subjects. Bubbles in the pulmonary artery (venous blood) were detected from the precordial position using Doppler ultrasound bubble detectors. The subjects were monitored for VGE for four minutes at about 15-minute intervals for the duration of the altitude exposure, with the maximum bubble grade assigned as Spencer Grade IV.

  3. Design of experiments for thermal protection system process optimization

    NASA Astrophysics Data System (ADS)

    Longani, Hans R.

    2000-01-01

    Solid Rocket Booster (SRB) structures were protected from heating due to aeroshear, radiation, and plume impingement by a Thermal Protection System (TPS) known as Marshall Sprayable Ablative (MSA-2). MSA-2 contained Volatile Organic Compounds (VOCs) and, due to strict environmental legislation, had to be eliminated. MSA-2 was also classified as hazardous waste, which made its disposal very costly. Marshall Convergent Coating (MCC-1) replaced MSA-2 and eliminated the use of solvents by delivering the dry filler materials and the fluid resin system to a patented spray gun which utilizes the Convergent Spray Technologies spray process. The selection of the TPS material was based on risk assessment, performance comparisons, processing, application, and cost. The Design of Experiments technique was used to optimize the spraying parameters.

  4. A designed experiment in stitched/RTM composites

    NASA Technical Reports Server (NTRS)

    Dickinson, Larry C.

    1993-01-01

    The damage tolerance of composite laminates can be significantly improved by the addition of through-the-thickness fibrous reinforcement such as stitching. However, there are numerous stitching parameters which can be independently varied, and their separate and combined effects on mechanical properties need to be determined. A statistically designed experiment (a 2^(5-1) fractional factorial, also known as a Taguchi L16 test matrix) used to evaluate five important parameters is described. The effects and interactions of stitch thread material, stitch thread strength, stitch row spacing, and stitch pitch are examined for both thick (48 ply) and thin (16 ply) carbon/epoxy (AS4/E905L) composites. Tension, compression, and compression after impact tests are described. Preliminary results of completed tension testing are discussed. Larger threads decreased tensile strength. Panel thickness was found not to be an important stitching parameter for tensile properties. Tensile modulus was unaffected by stitching.
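    A hedged sketch of how a 16-run 2^(5-1) fractional factorial like the one above can be constructed: start from a full 2^4 design in four factors and generate the fifth column as the product of the first four (defining relation I = ABCDE, a resolution V half-fraction). The generic factor columns below stand in for the study's actual stitching parameters.

```python
# Construct a 2^(5-1) fractional factorial (16 runs) from a full 2^4 design
# using the generator E = ABCD, i.e. the defining relation I = ABCDE.
import itertools

base = list(itertools.product([-1, 1], repeat=4))            # full 2^4 design
design = [run + (run[0] * run[1] * run[2] * run[3],) for run in base]

# Every run in the half-fraction satisfies A*B*C*D*E = +1
for a, b, c, d, e in design:
    assert a * b * c * d * e == 1

print(f"{len(design)} runs, 5 factors at 2 levels each")
```

Resolution V means main effects are aliased only with four-factor interactions, which is why this design can cleanly screen five parameters in 16 runs.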

  5. Design of Experiments Results for the Feedthru Insulator

    SciTech Connect

    BENAVIDES,GILBERT L.; VAN ORNUM,DAVID J.; BACA,MAUREEN R.; APPEL,PATRICIA E.

    1999-12-01

    A design of experiments (DoE) was performed at Ceramtec to improve the yield of a cermet part known as the feedthru insulator. The factors chosen to be varied in this DoE were syringe orifice size, fill condition, solvent, and surfactant. These factors were chosen because of their anticipated effect on the cermet slurry and its consequences to the feedthru insulator in succeeding fabrication operations. Response variables to the DoE were chosen to be indirect indicators of production yield for the feedthru insulator. The solvent amount used to mix the cermet slurry had the greatest overall effect on the response variables. Based upon this DoE, there is the potential to improve the yield not only for the feedthru insulator but for other cermet parts as well. This report thoroughly documents the DoE and contains additional information regarding the feedthru insulator.

  6. PV-Diesel Hybrid SCADA Experiment Network Design

    NASA Technical Reports Server (NTRS)

    Kalu, Alex; Durand, S.; Emrich, Carol; Ventre, G.; Wilson, W.; Acosta, R.

    1999-01-01

    The essential features of an experimental network for renewable power system satellite-based supervisory control and data acquisition (SCADA) are communication links, controllers, diagnostic equipment, and a hybrid power system. Required components for implementing the network consist of two satellite ground stations, two satellite modems, two 486 PCs, two telephone receivers, two telephone modems, two analog telephone lines, one digital telephone line, a hybrid power system equipped with a controller, and a satellite spacecraft. In the technology verification experiment (TVE) conducted by Savannah State University and the Florida Solar Energy Center, the renewable energy hybrid system is the Apex-1000 Mini-Hybrid, which is equipped with the NGC3188 for user interface and remote control and the NGC2010 for monitoring and basic control tasks. This power system is connected to a satellite modem via a smart interface, RS232. Commands are sent to the power system control unit through a control PC, designated PC1. PC1 is thus connected to a satellite modem through RS232. A second PC, designated PC2, the diagnostic PC, is connected to both satellite modems via separate analog telephone lines for checking the modems' health. PC2 is also connected to PC1 via a telephone line. Due to the unavailability of a second ground station for the ACTS, one ground station is used to serve both the sending and receiving functions in this experiment. A signal is sent from the control PC to the hybrid system at a frequency f(sub 1), different from f(sub 2), the frequency of the signal from the hybrid system to the control PC. f(sub 1) and f(sub 2) are sufficiently separated to avoid interference.

  7. Design, Construction, Alignment, and Calibration of a Compact Velocimetry Experiment

    SciTech Connect

    Kaufman, Morris I.; Malone, Robert M.; Frogget, Brent C.; Esquibel, David L.; Romero, Vincent T.; Lare, Gregory A.; Briggs, Bart; Iverson, Adam J.; Frayer, Daniel K.; DeVore, Douglas; Cata, Brian

    2007-09-21

    A velocimetry experiment has been designed to measure shock properties for small cylindrical metal targets (8-mm-diameter by 2-mm thick). A target is accelerated by high explosives, caught, and retrieved for later inspection. The target is expected to move at a velocity of 0.1 to 3 km/sec. The complete experiment canister is approximately 105 mm in diameter and 380 mm long. Optical velocimetry diagnostics include the Velocity Interferometer System for Any Reflector (VISAR) and Photon Doppler Velocimetry (PDV). The packaging of the velocity diagnostics is not allowed to interfere with the catchment or an X-ray imaging diagnostic. A single optical relay, using commercial lenses, collects Doppler-shifted light for both VISAR and PDV. The use of fiber optics allows measurement of point velocities on the target surface during accelerations occurring over 15 mm of travel. The VISAR operates at 532 nm and has separate illumination fibers requiring alignment. The PDV diagnostic operates at 1550 nm, but is aligned and focused at 670 nm. The VISAR and PDV diagnostics are complementary measurements and they image spots in close proximity on the target surface. Because the optical relay uses commercial glass, the axial positions of the optical fibers for PDV and VISAR are offset to compensate for chromatic aberrations. The optomechanical design requires careful attention to fiber management, mechanical assembly and disassembly, positioning of the foam catchment, and X-ray diagnostic field-of-view. Calibration and alignment data are archived at each stage of the assembly sequence.

  8. Laser communication experiment. Volume 1: Design study report: Spacecraft transceiver. Part 3: LCE design specifications

    NASA Technical Reports Server (NTRS)

    1970-01-01

    The requirements for the design, fabrication, performance, and testing of a 10.6 micron optical heterodyne receiver subsystem for use in a laser communication system are presented. The receiver subsystem, as a part of the laser communication experiment operates in the ATS 6 satellite and in a transportable ground station establishing two-way laser communications between the spacecraft and the transportable ground station. The conditions under which environmental tests are conducted are reported.

  9. Design study for a diverging supernova explosion experiment on NIF

    NASA Astrophysics Data System (ADS)

    Flaig, Markus; Plewa, Tomasz; Keiter, Paul; Grosskopf, Michael; Kuranz, Carolyn; Drake, Paul; Park, Hye-Sook

    2013-10-01

    We report on design simulations of a spherically-diverging, multi-interface, supernova-relevant Rayleigh-Taylor experiment (DivSNRT) to be carried out at the National Ignition Facility (NIF). The simulations are performed in two and three dimensions using the block-adaptive, multi-group radiative diffusion hydrodynamics code CRASH and the FLASH-based MHD code Proteus. In the present study, we concentrate mainly on a planar variant of the experiment. We assess the sensitivity of the Rayleigh-Taylor instability growth on numerical discretization, variations in the laser drive energy and the manufacturing noise at the material interfaces. We find that a simple buoyancy-drag model accurately predicts the mixed-layer width obtained in the simulations. We use synthetic radiographs to optimize the diagnostic system and the experimental setup. Finally, we perform a series of exploratory MHD simulations and investigate the self-generation of magnetic fields and their role in the system evolution. Supported by the DOE grant DE-SC0008823.

  10. Design of a miniature explosive isentropic compression experiment

    SciTech Connect

    Tasker, Douglas G

    2010-01-01

    The purpose of this design study is to adapt the High Explosive Pulsed Power Isentropic Compression Experiment (HEPP-ICE) to milligram quantities of materials at stresses of ~100 GPa. For this miniature application we assume that a parallel plate stripline of ~2.5 mm width is needed to compress the samples. In any parallel plate load, the rising currents flow preferentially along the outside edges of the load, where the specific impedance is a minimum [1]. Therefore, the peak current must be between 1 and 2 MA to reach a stress of 100 GPa in the center of a 2.5 mm wide parallel plate load; these currents are small relative to typical HEPP-ICE currents. We show that a capacitor bank alone exceeds the requirements of this miniature ICE experiment and that a flux compression generator (FCG) is not necessary. The proposed circuit will comprise one half of the 2.4-MJ bank, i.e., the 6-mF, 20-kV, 1.2-MJ capacitor bank used in the original HEPP-ICE circuit. Explosive opening and closing switches will still be required because the rise time of the capacitor circuit would be of the order of 30 µs without them. For isentropic loading in these small samples, stress rise times of ~200 ns are required.
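    The quoted current range can be sanity-checked with the ideal parallel-plate relations B = μ0·I/w and P = B²/(2μ0), which assume a uniform current density; since the abstract notes that real currents crowd toward the edges, this idealization gives a lower bound on the required drive. The values below are assumptions chosen to match the abstract's numbers.

```python
# Back-of-the-envelope magnetic-pressure estimate for an ideal parallel-plate
# stripline (uniform current density; real edge crowding raises the requirement).
import math

mu0 = 4 * math.pi * 1e-7   # vacuum permeability, H/m
w = 2.5e-3                 # stripline width, m (assumed, per abstract)
I = 1.0e6                  # drive current, A (low end of the 1-2 MA range)

B = mu0 * I / w            # field between wide plates, ~500 T
P = B**2 / (2 * mu0)       # magnetic pressure on the load, Pa

print(f"B = {B:.0f} T, P = {P / 1e9:.0f} GPa")
```

At 1 MA this ideal estimate lands at roughly 100 GPa, consistent with the abstract's statement that somewhat more than 1 MA is needed once edge effects are included.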

  11. Fast ignition integrated experiments and high-gain point design

    SciTech Connect

    Shiraga, H.; Nagatomo, H.; Theobald, W.; Solodov, A. A.; Tabak, M.

    2014-04-17

    Here, integrated fast ignition experiments were performed at ILE, Osaka, and LLE, Rochester, in which a nanosecond driver laser implodes a deuterated plastic shell in front of the tip of a hollow metal cone and an intense ultrashort-pulse laser is injected through the cone to heat the compressed plasma. Based on the initial successful results of fast electron heating of cone-in-shell targets, large-energy short-pulse laser beam lines were constructed and became operational: OMEGA-EP at Rochester and LFEX at Osaka. Neutron enhancement due to heating with a ~kJ short-pulse laser has been demonstrated in the integrated experiments at Osaka and Rochester. The neutron yields are being analyzed by comparing the experimental results with simulations. Details of the fast electron beam transport and the electron energy deposition in the imploded fuel plasma are complicated and further studies are imperative. The hydrodynamics of the implosion was studied including the interaction of the imploded core plasma with the cone tip. Theory and simulation studies are presented on the hydrodynamics of a high-gain target for a fast ignition point design.

  12. Design, integration and preliminary results of the IXV Catalysis experiment

    NASA Astrophysics Data System (ADS)

    Viladegut, Alan; Panerai, F.; Chazot, O.; Pichon, T.; Bertrand, P.; Verdy, C.; Coddet, C.

    2016-08-01

    The CATalytic Experiment (CATE) is an in-flight demonstration of catalysis effects at the surface of thermal protection materials. A high-catalytic coating was applied over the baseline ceramic material on the windward side of the Intermediate eXperimental Vehicle (IXV). The temperature jump due to different catalytic activities was detected during re-entry through measurements made with near-surface thermocouples on the windward side of the vehicle. The experiment aimed at contributing to the development and validation of gas/surface interaction models for re-entry applications. The present paper summarizes the design of CATE and its integration on the windward side of the IXV. Results of a qualification campaign at the Plasmatron facility of the von Karman Institute for Fluid Dynamics are presented. They provided experimental evidence of the temperature jump at the low-to-high catalytic interface of the heat shield under aerothermal conditions relevant to the actual IXV flight. These tests also gave confidence that the high-catalytic patch would not endanger the integrity of the vehicle or the safety of the mission. A preliminary assessment of flight data from the thermocouple measurements shows consistency with the results of the qualification tests.

  13. Interim Service ISDN Satellite (ISIS) hardware experiment development for advanced ISDN satellite designs and experiments

    NASA Technical Reports Server (NTRS)

    Pepin, Gerard R.

    1992-01-01

    The Interim Service Integrated Service Digital Network (ISDN) Satellite (ISIS) Hardware Experiment Development for Advanced Satellite Designs describes the development of the ISDN Satellite Terminal Adapter (ISTA) capable of translating ISDN protocol traffic into Time Division Multiple Access (TDMA) signals for use by a communications satellite. The ISTA connects the Type 1 Network Termination (NT1) via the U-interface on the line termination side of the CPE to the RS-499 interface for satellite uplink. The same ISTA converts in the opposite direction the RS-499 to U-interface data with a simple switch setting.

  14. National Spherical Torus Experiment (NSTX) Torus Design, Fabrication and Assembly

    SciTech Connect

    C. Neumeyer; G. Barnes; J.H. Chrzanowski; P. Heitzenroeder; et al

    1999-11-01

    The National Spherical Torus Experiment (NSTX) is a low aspect ratio spherical torus (ST) located at Princeton Plasma Physics Laboratory (PPPL). Fabrication, assembly, and initial power tests were completed in February of 1999. The majority of the design and construction efforts were concentrated on the Torus system components. The Torus system includes the centerstack assembly, external Poloidal and Toroidal coil systems, vacuum vessel, torus support structure, and plasma facing components (PFCs). NSTX's low aspect ratio required that the centerstack be made with the smallest radius possible. This, together with the need to bake NSTX's carbon-carbon composite plasma facing components at 350 degrees C, was a major driver in the design of NSTX. The Centerstack Assembly consists of the inner legs of the Toroidal Field (TF) windings, the Ohmic Heating (OH) solenoid and its associated tension cylinder, three inner Poloidal Field (PF) coils, thermal insulation, diagnostics, and an Inconel casing which forms the inner wall of the vacuum vessel boundary. It took approximately nine months to complete the assembly of the Centerstack. The tight radial clearances and the extreme length of the major components added complexity to the assembly of the Centerstack components. The vacuum vessel was constructed of 304 stainless steel and required approximately seven months to complete and deliver to the Test Cell. Among the issues associated with the construction of the vacuum vessel were control of dimensional stability following welding and control of the permeability of the welds. A great deal of time and effort was devoted to defining the correct weld process and material selection to meet the design requirements. The PFCs will be baked out at 350 degrees C while the vessel is maintained at 150 degrees C. This required care in designing the supports so they can accommodate the high electromagnetic loads resulting from plasma disruptions as well as the resulting relative thermal expansions.

  15. Propagation-related AMT design aspects and supporting experiments

    NASA Technical Reports Server (NTRS)

    Dessouky, Khaled; Estabrook, Polly

    1991-01-01

    The ACTS Mobile Terminal (AMT) is presently being developed with the goal of significantly extending commercial satellite applications and their user base. A thorough knowledge of the Ka-band channel characteristics is essential to the proper design of a commercially viable system that efficiently utilizes these valuable resources. To date, only limited tests have been performed to characterize the Ka-band channel, and they have focused on the needs of fixed terminals. Part of the value of the AMT as a Ka-band test bed is its function as a vehicle through which tests specifically applicable to mobile satellite communications can be performed. The exact propagation environment, with the proper set of elevation angles, vehicle antenna gains and patterns, roadside shadowing, rain, and Doppler, is encountered. The ability to measure all of the above, as well as to correlate their effects with observed communication system performance, creates an invaluable opportunity to understand in depth Ka-band's potential for supporting mobile and personal communications. This paper discusses the propagation information required for system design, the setup with ACTS that will enable obtaining this information, and finally the types of experiments to be performed and data to be gathered by the AMT to meet this objective.

  16. Entombment Using Cementitious Materials: Design Considerations and International Experience

    SciTech Connect

    Seitz, R.R.

    2002-05-15

    Cementitious materials have physical and chemical properties that are well suited for the requirements of radioactive waste management. Namely, the materials have low permeability and durability that is consistent with the time frame required for short-lived radionuclides to decay. Furthermore, cementitious materials can provide a long-term chemical environment that substantially reduces the mobility of some long-lived radionuclides of concern for decommissioning (e.g., C-14, Ni-63, Ni-59). Because of these properties, cementitious materials are common in low-level radioactive waste disposal facilities throughout the world and are an attractive option for entombment of nuclear facilities. This paper describes design considerations for cementitious barriers in the context of performance over time frames of a few hundreds of years (directed toward short-lived radionuclides) and time frames of thousands of years (directed towards longer-lived radionuclides). The emphasis is on providing an overview of concepts for entombment that take advantage of the properties of cementitious materials and experience from the design of low-level radioactive waste disposal facilities. A few examples of the previous use of cementitious materials for entombment of decommissioned nuclear facilities and proposals for the use in future decommissioning of nuclear reactors in a few countries are also included to provide global perspective.

  17. Entombment Using Cementitious Materials: Design Considerations and International Experience

    SciTech Connect

    Seitz, Roger Ray

    2002-08-01

    Cementitious materials have physical and chemical properties that are well suited for the requirements of radioactive waste management. Namely, the materials have low permeability and durability that is consistent with the time frame required for short-lived radionuclides to decay. Furthermore, cementitious materials can provide a long-term chemical environment that substantially reduces the mobility of some long-lived radionuclides of concern for decommissioning (e.g., C-14, Ni-63, Ni-59). Because of these properties, cementitious materials are common in low-level radioactive waste disposal facilities throughout the world and are an attractive option for entombment of nuclear facilities. This paper describes design considerations for cementitious barriers in the context of performance over time frames of a few hundreds of years (directed toward short-lived radionuclides) and time frames of thousands of years (directed towards longer-lived radionuclides). The emphasis is on providing an overview of concepts for entombment that take advantage of the properties of cementitious materials and experience from the design of low-level radioactive waste disposal facilities. A few examples of the previous use of cementitious materials for entombment of decommissioned nuclear facilities and proposals for the use in future decommissioning of nuclear reactors in a few countries are also included to provide global perspective.

  18. Computational Design of Short Pulse Laser Driven Iron Opacity Experiments

    NASA Astrophysics Data System (ADS)

    Martin, Madison E.; London, Richard A.; Goluoglu, Sedat; Whitley, Heather D.

    2015-11-01

    Opacity is a critical parameter in the transport of radiation in systems such as inertial confinement fusion capsules and stars. The resolution of current disagreements between solar models and helioseismological observations would benefit from experimental validation of theoretical opacity models. Short pulse lasers can be used to heat targets to higher temperatures and densities than long pulse lasers and pulsed power machines, thus potentially enabling access to emission spectra at conditions relevant to solar models. In order to ensure that the relevant plasma conditions are accessible and that an emission measurement is practical, we use computational design of experiments to optimize the target characteristics and laser conditions. Radiation-hydrodynamic modeling, using HYDRA, is used to investigate the effects of modifying laser irradiance, target dimensions, and dopant dilution on the plasma conditions and emission of an iron opacity target. Several optimized designs reaching temperatures and densities relevant to the radiative zone of the sun will be discussed. This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344. Lawrence Livermore National Security, LLC.

  19. Design issues in toxicogenomics using DNA microarray experiment

    SciTech Connect

    Lee, Kyoung-Mu; Kim, Ju-Han; Kang, Daehee. E-mail: dhkang@snu.ac.kr

    2005-09-01

    The methods of toxicogenomics might be classified into omics studies (e.g., genomics, proteomics, and metabolomics) and population studies focusing on risk assessment and gene-environment interaction. In omics studies, the microarray is the most popular approach. Up to 20,000 genes falling into several categories (e.g., xenobiotics metabolism, cell cycle control, DNA repair, etc.) can be selected according to a priori hypotheses. The appropriate type of samples and species should be selected in advance. Multiple doses and varied exposure durations are suggested to identify those genes clearly linked to toxic response. Microarray experiments can be affected by numerous nuisance variables, including experimental designs, sample extraction, and type of scanners. The number of slides might be determined from the magnitude and variance of expression change, the false-positive rate, and the desired power. Pooling samples is an alternative. Online databases on chemicals with known exposure-disease outcomes and genetic information can aid the interpretation of the normalized results. Gene function can be inferred from microarray data analyzed by bioinformatics methods such as cluster analysis. The population study often adopts a hospital-based or nested case-control design. Biases in subject selection and exposure assessment should be minimized, and confounding bias should also be controlled for in stratified or multiple regression analysis. Optimal sample sizes depend on the statistical test for gene-to-environment or gene-to-gene interaction. The design issues addressed in this mini-review are crucial in conducting toxicogenomics studies. In addition, an integrative approach combining exposure assessment, epidemiology, and clinical trials is required.
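    The slide-number determination mentioned above can be sketched with a standard two-sample z-approximation relating effect size, variance, per-gene false-positive rate, and power. The formula and the illustrative inputs below are a generic textbook approximation, not the specific method of the review.

```python
# Replicates (slides) per group needed to detect a given log2 fold change,
# using a two-sided z-approximation. Inputs are illustrative placeholders.
from statistics import NormalDist
import math

def slides_per_group(delta, sigma, alpha=0.001, power=0.9):
    """n >= 2 * ((z_{1-alpha/2} + z_{power}) * sigma / delta)^2 per group."""
    z = NormalDist()
    z_a = z.inv_cdf(1 - alpha / 2)   # stringent per-gene false-positive rate
    z_b = z.inv_cdf(power)
    return math.ceil(2 * ((z_a + z_b) * sigma / delta) ** 2)

# Detect a 1.0 log2-fold change with an assumed gene-level SD of 0.7
print(slides_per_group(delta=1.0, sigma=0.7))
```

The stringent alpha stands in for multiple-testing control across thousands of genes; halving the detectable effect size roughly quadruples the slide count, which is why pooling is sometimes considered.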

  20. Conceptual Issues in Quantifying Unusualness and Conceiving Stochastic Experiments: Insights from Students' Experiences in Designing Sampling Simulations

    ERIC Educational Resources Information Center

    Saldanha, Luis

    2016-01-01

    This article reports on a classroom teaching experiment that engaged a group of high school students in designing sampling simulations within a computer microworld. The simulation-design activities aimed to foster students' abilities to conceive of contextual situations as stochastic experiments, and to engage them with the logic of hypothesis…

  1. Experiments and other methods for developing expertise with design of experiments in a classroom setting

    NASA Technical Reports Server (NTRS)

    Patterson, John W.

    1990-01-01

    The only way to gain genuine expertise in Statistical Process Control (SPC) and the design of experiments (DOX) is with repeated practice, but not on canned problems with dead data sets. Rather, one must negotiate a wide variety of problems, each with its own peculiarities and its own constantly changing data. The problems should not be of the type for which there is a single, well-defined answer that can be looked up in a fraternity file or in some text. The problems should match as closely as possible the open-ended types for which there is always an abundance of uncertainty. These are the only kinds that arise in real research, whether that be basic research in academe or engineering research in industry. To gain this kind of experience, either as a professional consultant or as an industrial employee, takes years. Vast amounts of money, not to mention careers, must be put at risk. The purpose here is to outline some realistic simulation-type lab exercises that are so simple and inexpensive to run that students can repeat them as often as desired at virtually no cost. Simulations also allow the instructor to design problems whose outcomes are as noisy as desired but still predictable within limits. Moreover, the instructor and the students can learn a great deal more from the postmortem conducted after the exercise is completed; one never knows for sure what the true data should have been when dealing only with real-life experiments. To add a bit more realism to the exercises, it is sometimes desirable to make the students pay for each experimental result from a make-believe budget allocation for the problem.
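    One possible shape for such an exercise, sketched under invented assumptions: a hidden "true" process model that returns noisy but statistically predictable observations and charges each run against a make-believe budget, so students must plan their experiments economically.

```python
# Sketch of a simulation-type lab exercise: students probe a hidden process
# one run at a time; each run costs against a fixed budget. All coefficients
# and costs are invented for illustration.
import random

class SimulatedProcess:
    def __init__(self, budget=1000, cost_per_run=25, seed=None):
        self.budget = budget
        self.cost = cost_per_run
        self.rng = random.Random(seed)
        # Hidden truth the students must uncover: y = 50 + 8*x1 - 3*x2 + noise
        self._coefs = (50.0, 8.0, -3.0)

    def run(self, x1, x2):
        """Charge the budget and return one noisy observation."""
        if self.budget < self.cost:
            raise RuntimeError("budget exhausted -- plan experiments more economically!")
        self.budget -= self.cost
        b0, b1, b2 = self._coefs
        return b0 + b1 * x1 + b2 * x2 + self.rng.gauss(0, 2.0)

proc = SimulatedProcess(budget=100, cost_per_run=25, seed=1)
for trial in range(4):
    print(f"run {trial}: y = {proc.run(x1=1, x2=-1):.2f}")
# A fifth run would raise: this allocation only covers four experiments.
```

Because the instructor knows the hidden coefficients and noise level, the postmortem can compare each student's estimated effects against the truth, which is impossible with real-life data.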

  2. Introduction to the Design and Optimization of Experiments Using Response Surface Methodology. A Gas Chromatography Experiment for the Instrumentation Laboratory

    ERIC Educational Resources Information Center

    Lang, Patricia L.; Miller, Benjamin I.; Nowak, Abigail Tuttle

    2006-01-01

    The study describes how to design and optimize an experiment with multiple factors and multiple responses. The experiment uses fractional factorial analysis as a screening experiment only to identify important instrumental factors and does not use response surface methodology to find the optimal set of conditions.

  3. The OECI certification/designation program: the Genoa experience.

    PubMed

    Orengo, Giovanni; Pronzato, Paolo; Ferrarini, Manlio

    2015-01-01

    Accreditation and designation procedures by the Organisation of European Cancer Institutes (OECI) have represented a considerable challenge for most of the Italian cancer centers. We summarize the experience of the San Martino-IST in Genoa, which, on the whole, was satisfactory, albeit demanding for the staff. The reorganization of most oncology/hematology operations within the disease management teams was probably the key point that allowed us to obtain approval as it brought about the possibility of bringing in uniform methods of diagnosis/treatment, increasing patient recruitment in clinical trials, and fostering translational research by promoting collaboration between clinicians and laboratory investigators. The creation of a more cohesive supportive and terminal care team facilitated both the OECI procedures as well as the operations within the institution. Finally, some considerations are added to the doctor and nurse management roles in Italian hospitals characterized by noticeable differences from northern Europe. These differences may represent an extra challenge for hospital management and evaluator teams more used to the northern European type of organization.

  4. Physics Design of the National Compact Stellarator Experiment

    SciTech Connect

    G.H. Neilson; M.C. Zarnstorff; J.F. Lyon; the NCSX Team

    2002-02-21

    Compact quasi-axisymmetric stellarators offer the possibility of combining the steady-state low-recirculating power, external control, and disruption resilience of previous stellarators with the low-aspect ratio, high beta-limit, and good confinement of advanced tokamaks. Quasi-axisymmetric equilibria have been developed for the proposed National Compact Stellarator Experiment (NCSX) with average aspect ratio approximately 4.4 and average elongation approximately 1.8. Even with bootstrap-current consistent profiles, they are passively stable to the ballooning, kink, vertical, Mercier, and neoclassical-tearing modes for β > 4%, without the need for external feedback or conducting walls. The bootstrap current generates only 1/4 of the magnetic rotational transform at β = 4% (the rest is from the coils). Transport simulations show adequate fast-ion confinement and thermal neoclassical transport similar to equivalent tokamaks. Modular coils have been designed which reproduce the physics properties, provide good flux surfaces, and allow flexible variation of the plasma shape to control the predicted MHD stability and transport properties.

  5. Conceptual design study for Infrared Limb Experiment (IRLE)

    NASA Technical Reports Server (NTRS)

    Baker, Doran J.; Ulwick, Jim; Esplin, Roy; Batty, J. C.; Ware, Gene; Tew, Craig

    1989-01-01

    The phase A engineering design study for the Infrared Limb Experiment (IRLE) instrument, the infrared portion of the Mesosphere-Lower Thermosphere Explorer (MELTER) satellite payload is given. The IRLE instrument is a satellite instrument, based on the heritage of the Limb Infrared Monitor of the Stratosphere (LIMS) program, that will make global measurements of O3, CO2, NO, NO2, H2O, and OH from earth limb emissions. These measurements will be used to provide improved understanding of the photochemistry, radiation, dynamics, energetics, and transport phenomena in the lower thermosphere, mesosphere, and stratosphere. The IRLE instrument is the infrared portion of the MELTER satellite payload. MELTER is being proposed to NASA Goddard by a consortium consisting of the University of Michigan, University of Colorado and NASA Langley. It is proposed that the Space Dynamics Laboratory at Utah State University (SDL/USU) build the IRLE instrument for NASA Langley. MELTER is scheduled for launch in November 1994 into a sun-synchronous, 650-km circular orbit with an inclination angle of 97.8 deg and an ascending node at 3:00 p.m. local time.

  6. The LHCb Simulation Application, Gauss: Design, Evolution and Experience

    NASA Astrophysics Data System (ADS)

    Clemencic, M.; Corti, G.; Easo, S.; Jones, C. R.; Miglioranzi, S.; Pappagallo, M.; Robbe, P.; LHCb Collaboration

    2011-12-01

    The LHCb simulation application, Gauss, is based on the Gaudi framework and on the experiment's basic components such as the Event Model and Detector Description. Gauss also depends on external libraries for the generation of the primary events (PYTHIA 6, EvtGen, etc.) and on GEANT4 for particle transport in the experimental setup. The application supports the production of different types of events, from minimum bias to B physics signals and particle guns. It is used for purely generator-level studies as well as full simulations. Gauss is used both directly by users and in massive central productions on the grid. The design and implementation of the application and its evolution due to evolving requirements will be described, as in the case of the recently adopted Python-based configuration or the possibility of taking detector conditions into account via a Simulation Conditions database. The challenge of supporting at the same time the flexibility needed for the different tasks for which it is used, from evaluation of physics reach to background modeling, together with the stability and reliability of the code, will also be described.

  7. Irradiation Experiment Conceptual Design Parameters for NBSR Fuel Conversion

    SciTech Connect

    Brown N. R.; Brown,N.R.; Baek,J.S; Hanson, A.L.; Cuadra,A.; Cheng,L.Y.; Diamond, D.J.

    2013-03-31

    It has been proposed to convert the National Institute of Standards and Technology (NIST) research reactor, known as the NBSR, from high-enriched uranium (HEU) fuel to low-enriched uranium (LEU) fuel. The motivation to convert the NBSR to LEU fuel is to reduce the risk of proliferation of special nuclear material. This report is a compilation of relevant information from recent studies related to the proposed conversion using a metal alloy of LEU with 10 w/o molybdenum. The objective is to inform the design of the mini-plate and full-size plate irradiation experiments that are being planned. This report provides relevant dimensions of the fuel elements, and the following parameters at steady state: average and maximum fission rate density and fission density, fuel temperature distribution for the plate with maximum local temperature, and two-dimensional heat flux profiles of fuel plates with high power densities. The latter profiles are given for plates in both the inner and outer core zones and for cores with both fresh and depleted shim arms (reactivity control devices). In addition, a summary of the methodology to obtain these results is presented.

  8. Irradiation Experiment Conceptual Design Parameters for NBSR Fuel Conversion

    SciTech Connect

    Brown, N. R.; Brown, N. R.; Baek, J. S; Hanson, A. L.; Cuadra, A.; Cheng, L. Y.; Diamond, D. J.

    2014-04-30

    It has been proposed to convert the National Institute of Standards and Technology (NIST) research reactor, known as the NBSR, from high-enriched uranium (HEU) fuel to low-enriched uranium (LEU) fuel. The motivation to convert the NBSR to LEU fuel is to reduce the risk of proliferation of special nuclear material. This report is a compilation of relevant information from recent studies related to the proposed conversion using a metal alloy of LEU with 10 w/o molybdenum. The objective is to inform the design of the mini-plate and full-size plate irradiation experiments that are being planned. This report provides relevant dimensions of the fuel elements, and the following parameters at steady state: average and maximum fission rate density and fission density, fuel temperature distribution for the plate with maximum local temperature, and two-dimensional heat flux profiles of fuel plates with high power densities. The latter profiles are given for plates in both the inner and outer core zones and for cores with both fresh and depleted shim arms (reactivity control devices). A summary of the methodology to obtain these results is presented. Fuel element tolerance assumptions and hot channel factors used in the safety analysis are also given.

  9. Magnetohydrodynamic Augmented Propulsion Experiment: I. Performance Analysis and Design

    NASA Technical Reports Server (NTRS)

    Litchford, R. J.; Cole, J. W.; Lineberry, J. T.; Chapman, J. N.; Schmidt, H. J.; Lineberry, C. W.

    2003-01-01

    The performance of conventional thermal propulsion systems is fundamentally constrained by the specific energy limitations associated with chemical fuels and the thermal limits of available materials. Electromagnetic thrust augmentation represents one intriguing possibility for improving the fuel composition of thermal propulsion systems, thereby increasing overall specific energy characteristics; however, realization of such a system requires an extremely high-energy-density electrical power source as well as an efficient plasma acceleration device. This Technical Publication describes the development of an experimental research facility for investigating the use of cross-field magnetohydrodynamic (MHD) accelerators as a possible thrust augmentation device for thermal propulsion systems. In this experiment, a 1.5-MW(sub e) Aerotherm arc heater is used to drive a 2-MW(sub e) MHD accelerator. The heatsink MHD accelerator is configured as an externally diagonalized, segmented channel, which is inserted into a large-bore, 2-T electromagnet. The performance analysis and engineering design of the flow path are described as well as the parameter measurements and flow diagnostics planned for the initial series of test runs.

  10. Design Criteria and Machine Integration of the Ignitor Experiment

    NASA Astrophysics Data System (ADS)

    Bianchi, A.; Coppi, B.

    2010-11-01

    High field, high density compact experiments are the only ones capable of producing, on the basis of available technology and knowledge of plasma physics, plasmas that can reach ignition conditions. The Ignitor machine (R0 ≈ 1.32 m, a×b ≈ 0.47×0.83 m^2, BT ≤ 13 T, Ip ≤ 11 MA) is characterized by a complete structural integration of its major components. A sophisticated Poloidal Field system provides the flexibility to produce the expected sequence of plasma equilibrium configurations during the plasma current and pressure rise. The structural concept of the machine is based on an optimized combination of ``bucking'' and ``wedging''. All components, with the exception of the vacuum vessel, are cooled before each plasma pulse by means of He gas, to an optimal temperature of 30 K, at which the ratio of the electrical resistivity to the specific heat of copper is minimum. The 3D and 2D design and integration of all the core machine components, including electro-fluidic and fluidic lines, has been produced using the Dassault CATIA-V software. A complete structural analysis has verified that the machine can withstand the forces produced for all the main operational scenarios.

  11. Gender Consideration in Experiment Design for Air Break in Prebreathe

    NASA Technical Reports Server (NTRS)

    Conkin, Johnny; Gernhardt, Michael I.; Dervay, Joseph P.

    2007-01-01

    If gender is a confounder of the decompression sickness (DCS) or venous gas emboli (VGE) outcomes of a proposed air break in oxygen prebreathe (PB) project, then decisions about the final experiment design must be made. We evaluated if the incidence of DCS and VGE from tests in altitude chambers over 20 years were different between men and women after resting and exercise prebreathe protocols. Nitrogen washout during PB is our primary risk mitigation strategy to prevent subsequent DCS and VGE in subjects. Bubbles in the pulmonary artery (venous blood) were detected from the precordial position using Doppler ultrasound bubble detectors. The subjects were monitored for VGE for four min at about 15 min intervals for the duration of the altitude exposure, with maximum bubble grade assigned a Spencer Grade of IV. There was no difference in DCS incidence between men and women in either PB protocol. The incidence of VGE and Grade IV VGE is statistically lower in women compared to men after resting PB. Even when 10 tests were compared with the Mantel-Haenszel χ2 where both men (n = 168) and women (n = 92) appeared, the p-value for VGE incidence was still significant at 0.03. The incidence of VGE and Grade IV VGE is not statistically lower in women compared to men after exercise PB. Even when six tests were compared with the Mantel-Haenszel χ2 where both men (n = 165) and women (n = 49) appeared, the p-value for VGE incidence was still not significant at 0.90. Our goal is to understand the risk of brief air breaks during PB without other confounding variables invalidating our conclusions. The cost to additionally account for the confounding role of gender on VGE outcome after resting PB is judged excessive. Our decision is to only evaluate air breaks in the exercise PB protocol. So there is no restriction to recruiting women as test subjects.

  12. Stillbirth Collaborative Research Network: design, methods and recruitment experience.

    PubMed

    Parker, Corette B; Hogue, Carol J R; Koch, Matthew A; Willinger, Marian; Reddy, Uma M; Thorsten, Vanessa R; Dudley, Donald J; Silver, Robert M; Coustan, Donald; Saade, George R; Conway, Deborah; Varner, Michael W; Stoll, Barbara; Pinar, Halit; Bukowski, Radek; Carpenter, Marshall; Goldenberg, Robert

    2011-09-01

    The Stillbirth Collaborative Research Network (SCRN) has conducted a multisite, population-based, case-control study, with prospective enrollment of stillbirths and livebirths at the time of delivery. This paper describes the general design, methods and recruitment experience. The SCRN attempted to enroll all stillbirths and a representative sample of livebirths occurring to residents of pre-defined geographical catchment areas delivering at 59 hospitals associated with five clinical sites. Livebirths <32 weeks gestation and women of African descent were oversampled. The recruitment hospitals were chosen to ensure access to at least 90% of all stillbirths and livebirths to residents of the catchment areas. Participants underwent a standardised protocol including maternal interview, medical record abstraction, placental pathology, biospecimen testing and, in stillbirths, post-mortem examination. Recruitment began in March 2006 and was completed in September 2008 with 663 women with a stillbirth and 1932 women with a livebirth enrolled, representing 69% and 63%, respectively, of the women identified. Additional surveillance for stillbirths continued until June 2009 and a follow-up of the case-control study participants was completed in December 2009. Among consenting women, there were high consent rates for the various study components. For the women with stillbirths, 95% agreed to a maternal interview, chart abstraction and a placental pathological examination; 91% of the women with a livebirth agreed to all of these components. Additionally, 84% of the women with stillbirths agreed to a fetal post-mortem examination. This comprehensive study is poised to systematically study a wide range of potential causes of, and risk factors for, stillbirths and to better understand the scope and incidence of the problem.

  13. Stillbirth Collaborative Research Network: Design, Methods and Recruitment Experience

    PubMed Central

    Parker, Corette B.; Hogue, Carol J. Rowland; Koch, Matthew A.; Willinger, Marian; Reddy, Uma; Thorsten, Vanessa R.; Dudley, Donald J.; Silver, Robert M.; Coustan, Donald; Saade, George R.; Conway, Deborah; Varner, Michael W.; Stoll, Barbara; Pinar, Halit; Bukowski, Radek; Carpenter, Marshall; Goldenberg, Robert

    2013-01-01

    The Stillbirth Collaborative Research Network (SCRN) has conducted a multisite, population-based, case-control study, with prospective enrollment of stillbirths and live births at the time of delivery. This paper describes the general design, methods, and recruitment experience. The SCRN attempted to enroll all stillbirths and a representative sample of live births occurring to residents of pre-defined geographic catchment areas delivering at 59 hospitals associated with five clinical sites. Live births <32 weeks gestation and women of African descent were oversampled. The recruitment hospitals were chosen to ensure access to at least 90% of all stillbirths and live births to residents of the catchment areas. Participants underwent a standardized protocol including maternal interview, medical record abstraction, placental pathology, biospecimen testing, and, in stillbirths, postmortem examination. Recruitment began in March 2006 and was completed in September 2008 with 663 women with a stillbirth and 1932 women with a live birth enrolled, representing 69% and 63%, respectively, of the women identified. Additional surveillance for stillbirth continued through June 2009 and a follow-up of the case-control study participants was completed in December 2009. Among consenting women, there were high consent rates for the various study components. For the women with stillbirth, 95% agreed to maternal interview, chart abstraction, and placental pathologic examination; 91% of the women with live birth agreed to all of these components. Additionally, 84% of the women with stillbirth agreed to a fetal postmortem examination. This comprehensive study is poised to systematically study a wide range of potential causes of, and risk factors for, stillbirth and to better understand the scope and incidence of the problem. PMID:21819424

  14. Skylab Earth Resource Experiment Package critical design review. [conference

    NASA Technical Reports Server (NTRS)

    1973-01-01

    An outline of the conference for reviewing the design of the EREP is presented. System designs reviewed include: tape recorder, support equipment, view finder/tracking, support hardware, and control and display panel.

  15. Viking dynamics experience with application to future payload design

    NASA Technical Reports Server (NTRS)

    Barrett, S.; Rader, W. P.; Payne, K. R.

    1978-01-01

    Analytical and test techniques are discussed. Areas in which hindsight indicated erroneous, redundant, or unnecessarily severe design and test specifications are identified. Recommendations are made for improvements in the dynamic design and criteria philosophy, aimed at reducing costs for payloads.

  16. Model-Driven Design: Systematically Building Integrated Blended Learning Experiences

    ERIC Educational Resources Information Center

    Laster, Stephen

    2010-01-01

    Developing and delivering curricula that are integrated and that use blended learning techniques requires a highly orchestrated design. While institutions have demonstrated the ability to design complex curricula on an ad-hoc basis, these projects are generally successful at a great human and capital cost. Model-driven design provides a…

  17. Predictive Model for the Design of Zwitterionic Polymer Brushes: A Statistical Design of Experiments Approach.

    PubMed

    Kumar, Ramya; Lahann, Joerg

    2016-07-06

    The performance of polymer interfaces in biology is governed by a wide spectrum of interfacial properties. With the ultimate goal of identifying design parameters for stem cell culture coatings, we developed a statistical model that describes the dependence of brush properties on surface-initiated polymerization (SIP) parameters. Employing a design of experiments (DOE) approach, we identified operating boundaries within which four gel architecture regimes can be realized, including a new regime of associated brushes in thin films. Our statistical model can accurately predict the brush thickness and the degree of intermolecular association of poly[{2-(methacryloyloxy) ethyl} dimethyl-(3-sulfopropyl) ammonium hydroxide] (PMEDSAH), a previously reported synthetic substrate for feeder-free and xeno-free culture of human embryonic stem cells. DOE-based multifunctional predictions offer a powerful quantitative framework for designing polymer interfaces. For example, model predictions can be used to decrease the critical thickness at which the wettability transition occurs by simply increasing the catalyst quantity from 1 to 3 mol %.
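    The DOE workflow described above can be illustrated with a minimal two-level factorial effect calculation. The factor names (catalyst loading, polymerization time) and response values below are hypothetical placeholders for illustration, not data from the study:

    ```python
    from itertools import product

    def main_effect(runs, responses, factor):
        """Average response at the factor's high level minus its low level."""
        hi = [y for x, y in zip(runs, responses) if x[factor] == +1]
        lo = [y for x, y in zip(runs, responses) if x[factor] == -1]
        return sum(hi) / len(hi) - sum(lo) / len(lo)

    # Full 2^2 design in coded units: factor 0 = catalyst mol %, factor 1 = time.
    runs = list(product([-1, +1], repeat=2))
    thickness = {(-1, -1): 10.0, (+1, -1): 14.0, (-1, +1): 12.0, (+1, +1): 20.0}  # nm, made up
    responses = [thickness[r] for r in runs]

    catalyst_effect = main_effect(runs, responses, 0)  # thickness change over the catalyst swing
    time_effect = main_effect(runs, responses, 1)
    ```

    With the made-up responses above, the catalyst main effect (6 nm) exceeds the time effect (4 nm); a fitted model of such effects is what supports predictions like the catalyst-driven shift in critical thickness mentioned in the abstract.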

  18. Design of experiments for zeroth and first-order reaction rates.

    PubMed

    Amo-Salas, Mariano; Martín-Martín, Raúl; Rodríguez-Aragón, Licesio J

    2014-09-01

    This work presents optimum designs for reaction rates experiments. In these experiments, time at which observations are to be made and temperatures at which reactions are to be run need to be designed. Observations are performed along time under isothermal conditions. Each experiment needs a fixed temperature and so the reaction can be measured at the designed times. For these observations under isothermal conditions over the same reaction a correlation structure has been considered. D-optimum designs are the aim of our work for zeroth and first-order reaction rates. Temperatures for the isothermal experiments and observation times, to obtain the most accurate estimates of the unknown parameters, are provided in these designs. D-optimum designs for a single observation in each isothermal experiment or for several correlated observations have been obtained. Robustness of the optimum designs for ranges of the correlation parameter and comparisons of the information gathered by different designs are also shown.
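    As a toy illustration of the D-optimality idea, stripped down to one parameter and a single uncorrelated observation (so not the correlated designs of the paper): for first-order decay y(t) = exp(-k t), the information about k from one observation is proportional to (dy/dk)^2 = t^2 exp(-2 k t), which is maximized at t = 1/k. A grid search recovers this:

    ```python
    import math

    def d_optimal_time(k_guess, t_grid):
        """Single-observation D-optimal sampling time for y(t) = exp(-k t).

        Maximizes the squared parameter sensitivity (dy/dk)^2 = t^2 exp(-2 k t),
        evaluated at a prior guess for the rate constant k.
        """
        info = lambda t: (t * math.exp(-k_guess * t)) ** 2
        return max(t_grid, key=info)

    k0 = 2.0                                    # prior guess for the rate constant (1/s)
    grid = [i * 0.001 for i in range(1, 5000)]  # candidate times, 0.001 s to ~5 s
    t_star = d_optimal_time(k0, grid)           # analytic optimum is 1/k0 = 0.5 s
    ```

    The dependence on a prior guess k0 is the usual locality caveat of D-optimal designs for nonlinear models, which is one reason robustness over parameter ranges is examined in the paper.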

  19. The Historical and Situated Nature of Design Experiments--Implications for Data Analysis

    ERIC Educational Resources Information Center

    Krange, I.; Ludvigsen, Sten

    2009-01-01

    This article is a methodological contribution to the use of design experiments in educational research. We will discuss the implications of a historical and situated interpretation of design experiments, the consequences this has for the analysis of the collected data, and empirically based suggestions to improve the designs of the computer-based…

  20. Taxonomic Organization Scaffolds Young Children's Learning from Storybooks: A Design Experiment

    ERIC Educational Resources Information Center

    Kaefer, Tanya; Pinkham, Ashley M.; Neuman, Susan B.

    2010-01-01

    The purpose of this design experiment was to research, test and iteratively design a set of taxonomically-organized storybooks that served to scaffold young children's word learning and concept development. Specifically, Phase 1 of the design experiment asked: (1) What are the effects of taxonomic organization on children's ability to acquire…

  1. NASA battery testbed: A designed experiment for the optimization of LEO battery operational parameters

    SciTech Connect

    Deligiannis, F.; Perrone, D.; Distefano, S.

    1996-02-01

    Simulation of spacecraft battery operation is implemented. A robust design experiment to obtain optimum battery operational parameters is performed. It is found that short-term tests using robust design of experiments can provide guidelines for optimum battery operation. It is decided to use the robust design approach to provide guidelines for battery operation on current spacecraft in orbit as batteries age (GRO, UARS, EUVE, TOPEX).

  2. NASA battery testbed: A designed experiment for the optimization of LEO battery operational parameters

    NASA Technical Reports Server (NTRS)

    Deligiannis, F.; Perrone, D.; DiStefano, S.

    1996-01-01

    Simulation of spacecraft battery operation is implemented. A robust design experiment to obtain optimum battery operational parameters is performed. It is found that short-term tests using robust design of experiments can provide guidelines for optimum battery operation. It is decided to use the robust design approach to provide guidelines for battery operation on current spacecraft in orbit as batteries age (GRO, UARS, EUVE, TOPEX).

  3. How Design Experiments Can Inform a Rethinking of Transfer and Vice Versa.

    ERIC Educational Resources Information Center

    Lobato, Joanne

    2003-01-01

    Proposes that the theoretical assumptions underlying a researcher's model of transfer affect how design decisions are made. Discusses limitations with the two most common approaches to transfer in design experiments, exploring how design experiments can challenge the basic assumptions underlying transfer models, and illustrating this point by…

  4. Validation of scaffold design optimization in bone tissue engineering: finite element modeling versus designed experiments.

    PubMed

    Uth, Nicholas; Mueller, Jens; Smucker, Byran; Yousefi, Azizeh-Mitra

    2017-02-21

    This study reports the development of biological/synthetic scaffolds for bone tissue engineering (TE) via 3D bioplotting. These scaffolds were composed of poly(L-lactic-co-glycolic acid) (PLGA), type I collagen, and nano-hydroxyapatite (nHA) in an attempt to mimic the extracellular matrix of bone. The solvent used for processing the scaffolds was 1,1,1,3,3,3-hexafluoro-2-propanol. The produced scaffolds were characterized by scanning electron microscopy, microcomputed tomography, thermogravimetric analysis, and unconfined compression test. This study also sought to validate the use of finite-element optimization in COMSOL Multiphysics for scaffold design. Scaffold topology was simplified to three factors: nHA content, strand diameter, and strand spacing. These factors affect the ability of the scaffold to bear mechanical loads and how porous the structure can be. Twenty four scaffolds were constructed according to an I-optimal, split-plot designed experiment (DE) in order to generate experimental models of the factor-response relationships. Within the design region, the DE and COMSOL models agreed in their recommended optimal nHA (30%) and strand diameter (460 μm). However, the two methods disagreed by more than 30% in strand spacing (908 μm for DE; 601 μm for COMSOL). Seven scaffolds were 3D-bioplotted to validate the predictions of DE and COMSOL models (4.5-9.9 MPa measured moduli). The predictions for these scaffolds showed relative agreement for scaffold porosity (mean absolute percentage error of 4% for DE and 13% for COMSOL), but were substantially poorer for scaffold modulus (51% for DE; 21% for COMSOL), partly due to some simplifying assumptions made by the models. Expanding the design region in future experiments (e.g., higher nHA content and strand diameter), developing an efficient solvent evaporation method, and exerting a greater control over layer overlap could allow developing PLGA-nHA-collagen scaffolds to meet the mechanical requirements for
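    The agreement metric quoted above, mean absolute percentage error (MAPE), is straightforward to compute. The moduli values below are invented for illustration, not the study's measurements:

    ```python
    def mape(observed, predicted):
        """Mean absolute percentage error, in percent."""
        return 100.0 * sum(abs(p - o) / abs(o)
                           for o, p in zip(observed, predicted)) / len(observed)

    # Hypothetical measured vs. model-predicted scaffold moduli (MPa)
    measured = [4.5, 6.2, 7.8, 9.9]
    predicted = [4.3, 6.8, 7.1, 10.4]
    err = mape(measured, predicted)
    ```

    Reporting MAPE separately for porosity and modulus, as the study does, makes clear which response the models capture well and which they do not.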

  5. The Oxford Conception Study design and recruitment experience.

    PubMed

    Pyper, Cecilia; Bromhall, Lise; Dummett, Sarah; Altman, Douglas G; Brownbill, Pat; Murphy, Michael

    2006-11-01

    The Oxford Conception Study is a randomised controlled trial that aims to determine whether or not information about potential fertility from a device that monitors urinary hormones will increase the conception rate in women wishing to conceive. Three modified versions of a fertility monitor have been developed for the study. The monitor measures the levels of urinary oestrone-3-glucuronide (E3G) and luteinising hormone (LH), and the display indicates high or low fertility. The monitor requests all women to test their urine from day 6 to day 25 of the menstrual cycle inclusive. One-third of women are randomised to receive information from the fertility monitor about the early fertile time (from the first rise in E3G until the LH surge is detected), one-third receive information about the late fertile time (the onset of the LH surge and the following 2 days), and a third do not receive any information (control group). All the women are followed up for 6 months or until they are pregnant. A total of 1453 women have been recruited into the study, reaching the study recruitment goal for 80% power to detect a 10% difference in three-cycle pregnancy rate between the Late Fertile Time group (50%) and the Control group (40%), allowing for a 15% non-pregnancy drop-out rate. Follow-up of the women is currently ongoing. The primary analysis will compare the cumulative three-cycle pregnancy rate between each of the study arms. Time-specific conception probabilities will be estimated from coitus information recorded in 12-h intervals. The data from this study will also allow many additional questions to be addressed, including changes in intercourse patterns with feedback about the fertile days and other questions in relation to menstrual cycle function, sexual intercourse, stress, exposures to tobacco products, alcohol, caffeine and medications, fertility and pregnancy outcomes. In addition to presenting the study design, we review the recruitment experience for the Oxford
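    The quoted recruitment goal rests on a standard two-proportion power calculation. A rough stdlib sketch using Cohen's arcsine effect size follows; this is an approximation for illustration, and the study's actual calculation (three arms, cycle-level outcomes) may differ in detail:

    ```python
    import math
    from statistics import NormalDist

    def n_per_group(p1, p2, alpha=0.05, power=0.80):
        """Approximate per-group sample size to detect p1 vs p2 with a
        two-sided test, via Cohen's arcsine effect size h."""
        h = abs(2 * math.asin(math.sqrt(p1)) - 2 * math.asin(math.sqrt(p2)))
        z_a = NormalDist().inv_cdf(1 - alpha / 2)   # critical value for alpha
        z_b = NormalDist().inv_cdf(power)           # quantile for desired power
        return math.ceil(((z_a + z_b) / h) ** 2)

    n = n_per_group(0.50, 0.40)        # ~194 women per comparison arm
    n_inflated = math.ceil(n / 0.85)   # inflate for the 15% drop-out allowance
    ```

    Scaling such per-arm figures across the study arms and drop-out allowance is how a recruitment target in the region of the reported 1453 women arises.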

  6. The BWR advanced fuel design experience using Studsvik CMS

    SciTech Connect

    DiGiovine, A.S.; Gibbon, S.H.; Wiksell, G.

    1996-12-31

    The current trend within the nuclear industry is to maximize generation by extending cycle lengths and taking outages as infrequently as possible. As a result, many utilities have begun to use fuel designed to meet these more demanding requirements. These fuel designs are significantly more heterogeneous in mechanical and neutronic detail than prior designs. The question arises as to how existing in-core fuel management codes, such as Studsvik CMS, perform in modeling cores containing these designs. While this issue pertains to both pressurized water reactors (PWRs) and boiling water reactors (BWRs), this summary focuses on BWR applications.

  7. Ignition experiment design based on γ-pumping gas lasers

    NASA Astrophysics Data System (ADS)

    Bonyushkin, E. K.; Il'kaev, R. I.; Morovov, A. P.; Pavlovskii, A. I.; Lazhintsev, B. V.; Basov, N.; Gus'kov, S. Yu.; Rosanov, V. B.; Zmitrenko, N. V.

    1996-05-01

    Comparative analysis of gas lasers pumped by γ-radiation for an ignition experiment is carried out. The possibilities of frequency-time pulse shaping are discussed for these kinds of laser drivers. A new type of ICF target (LIGHT-target), which is able to provide a uniform deposition of laser driver energy, is proposed as a target for the ignition experiment.

  8. Applying modeling Results in designing a global tropospheric experiment

    NASA Technical Reports Server (NTRS)

    1982-01-01

    A set of field experiments and advanced modeling studies which provide a strategy for a program of global tropospheric experiments was identified. An expanded effort to develop space applications for tropospheric air quality monitoring and studies was recommended. The tropospheric ozone, carbon, nitrogen, and sulfur cycles are addressed. Stratospheric-tropospheric exchange is discussed. Fast photochemical processes in the free troposphere are considered.

  9. On the design of experiments to study extreme field limits

    SciTech Connect

    Bulanov, S. S.; Chen, M.; Schroeder, C. B.; Esarey, E.; Leemans, W. P.; Bulanov, S. V.; Esirkepov, T. Zh.; Kando, M.; Koga, J. K.; Zhidkov, A. G.; Chen, P.; Mur, V. D.; Narozhny, N. B.; Popov, V. S.; Thomas, A. G. R.; Korn, G.

    2012-12-21

    We propose experiments on the collision of high intensity electromagnetic pulses with electron bunches and on the collision of multiple electromagnetic pulses for studying extreme field limits in the nonlinear interaction of electromagnetic waves. The effects of nonlinear QED will be revealed in these laser plasma experiments.
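    For scale, the "extreme field limit" in nonlinear QED usually refers to the critical (Schwinger) field, at which vacuum pair creation becomes significant. It follows directly from electron constants, as this short illustrative calculation shows:

    ```python
    # QED critical (Schwinger) field: E_s = m_e^2 c^3 / (e * hbar)
    m_e  = 9.1093837015e-31   # electron mass (kg)
    c    = 2.99792458e8       # speed of light (m/s)
    e    = 1.602176634e-19    # elementary charge (C)
    hbar = 1.054571817e-34    # reduced Planck constant (J s)

    E_s = m_e**2 * c**3 / (e * hbar)   # ~1.32e18 V/m
    ```

    Fields approaching this value are far beyond current laser technology, which is why the proposed experiments rely on colliding pulses with relativistic electron bunches and with counter-propagating pulses to probe the nonlinear regime indirectly.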

  10. Education Through the Dance Experience. Designed for Children Series.

    ERIC Educational Resources Information Center

    Docherty, David

    This text presents a creative, child-centered approach to the teaching of dance in the elementary school based on the theories and methods of Rudolf Laban and Joyce Boorman. The content area of dance is briefly described so that the practical experiences presented later in the text can be viewed in perspective. Dance experiences are presented that…

  11. Design and Construction of a Shock Tube Experiment for Multiphase Instability Experiments

    NASA Astrophysics Data System (ADS)

    Middlebrooks, John; Black, Wolfgang; Avgoustopoulos, Constantine; Allen, Roy; Kathakapa, Raj; Guo, Qiwen; McFarland, Jacob

    2016-11-01

    Hydrodynamic instabilities are important phenomena that have a wide range of practical applications in engineering and physics. One such instability, the shock driven multiphase instability (SDMI), arises when a shockwave accelerates an interface between two particle-gas mixtures with differing multiphase properties. The SDMI is present in high energy explosives, scramjets, and supernovae. A practical way of studying shock wave driven instabilities is through experimentation in a shock tube laboratory. This poster presentation will cover the design and data acquisition process of the University of Missouri's Fluid Mixing Shock Tube Laboratory. In the shock tube, a pressure generated shockwave is passed through a multiphase interface, producing the SDMI. This can be photographed for observation using high speed cameras, lasers, and advanced imaging techniques. Important experimental parameters such as internal pressure and temperature, and mass flow rates of gases can be set and recorded by remotely controlled devices. The experimental facility provides the University of Missouri's Fluid Mixing Shock Tube Laboratory with the ability to validate simulated experiments and to conduct further inquiry into the field of shock driven multiphase hydrodynamic instabilities.

  12. Kinetics experiments and bench-scale system: Background, design, and preliminary experiments

    SciTech Connect

    Rofer, C.K.

    1987-10-01

    The project, Supercritical Water Oxidation of Hazardous Chemical Waste, is a Hazardous Waste Remedial Actions Program (HAZWRAP) Research and Development task being carried out by the Los Alamos National Laboratory. Its objective is to obtain information for use in understanding the basic technology and for scaling up and applying oxidation in supercritical water as a viable process for treating a variety of DOE-DP waste streams. This report gives the background and rationale for kinetics experiments on oxidation in supercritical water being carried out as a part of this HAZWRAP Research and Development task. It discusses supercritical fluid properties and their relevance to applying this process to the destruction of hazardous wastes. An overview is given of the small emerging industry based on applications of supercritical water oxidation. Factors that could lead to additional applications are listed. Modeling studies are described as a basis for the experimental design. The report describes plug flow reactor and batch reactor systems, and presents preliminary results. 28 refs., 4 figs., 5 tabs.

  13. Statistical issues in the design and planning of proteomic profiling experiments.

    PubMed

    Cairns, David A

    2015-01-01

    The statistical design of a clinical proteomics experiment is a critical part of a well-conducted investigation. Standard concepts from experimental design such as randomization, replication and blocking should be applied in all experiments, and this is possible when the experimental conditions are well understood by the investigator. The large number of proteins simultaneously considered in proteomic discovery experiments means that determining the number of required replicates to perform a powerful experiment is more complicated than in simple experiments. However, by using information about the nature of an experiment and making simple assumptions, this is achievable for a variety of experiments useful for biomarker discovery and initial validation.
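    The interplay between replication and the number of proteins tested can be made concrete with a normal-approximation sample-size formula whose significance level is tightened for multiple testing. This is a rough sketch under simple assumptions (Bonferroni correction, equal-variance two-sample comparison); the numbers are illustrative:

    ```python
    import math
    from statistics import NormalDist

    def replicates_per_group(effect_sd, n_proteins, alpha=0.05, power=0.80):
        """Approximate replicates per group for a two-sample comparison,
        Bonferroni-corrected for testing n_proteins simultaneously.
        effect_sd: difference to detect, in units of the within-group SD."""
        a = alpha / n_proteins                  # Bonferroni-adjusted level
        z_a = NormalDist().inv_cdf(1 - a / 2)
        z_b = NormalDist().inv_cdf(power)
        return math.ceil(2 * ((z_a + z_b) / effect_sd) ** 2)

    n_single = replicates_per_group(1.0, 1)     # one protein: ~16 per group
    n_many = replicates_per_group(1.0, 1000)    # 1000 proteins: ~48 per group
    ```

    Tripling the replication requirement when moving from one test to a thousand is exactly the complication the abstract points to for discovery-scale profiling.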

  14. Laser communication experiment. Volume 1: Design study report: Spacecraft transceiver. Part 1: Transceiver design

    NASA Technical Reports Server (NTRS)

    1970-01-01

    The ATS-F Laser Communications Experiment (LCE) is the first significant step in the application of laser systems to space communications. The space-qualified laser communications system being developed in this experiment, and the data resulting from its successful deployment in space, will be applicable to the use of laser communications systems in a wide variety of manned as well as unmanned space missions, both near earth and in deep space. Particular future NASA missions which can benefit from this effort are the Tracking and Data Relay Satellite System and the Earth Resources Satellites. The LCE makes use of carbon dioxide lasers to establish simultaneous, two-way communication between the ATS-F synchronous satellite and a ground station. In addition, the LCE is designed to permit communication with a similar spacecraft transceiver proposed to be flown on ATS-G, nominally one year after the launch of ATS-F. This would be the first attempt to employ lasers for satellite-to-satellite communications.

  15. Experiences performing conceptual design optimization of transport aircraft

    NASA Technical Reports Server (NTRS)

    Arbuckle, P. D.; Sliwa, S. M.

    1984-01-01

    Optimum Preliminary Design of Transports (OPDOT) is a computer program developed at NASA Langley Research Center for evaluating the impact of new technologies upon transport aircraft. For example, it provides the capability to look at configurations which have been resized to take advantage of active controls, and to provide an indication of economic sensitivity to their use. Although this tool returns a conceptual design configuration as its output, it does not have the accuracy, in absolute terms, to yield satisfactory point designs for immediate use by aircraft manufacturers. However, the relative accuracy of comparing OPDOT-generated configurations while varying technological assumptions has been demonstrated to be highly reliable. Hence, OPDOT is a useful tool for ascertaining the synergistic benefits of active controls, composite structures, improved engine efficiencies and other advanced technological developments. The approach used by OPDOT is a direct numerical optimization of an economic performance index. A set of independent design variables is iterated, given a set of design constants and data. The design variables include wing geometry, tail geometry, fuselage size, and engine size. This iteration continues until the optimum performance index is found which satisfies all the constraint functions. The analyst interacts with OPDOT by varying the input parameters to either the constraint functions or the design constants. Note that the optimization of aircraft geometry parameters is equivalent to finding the ideal aircraft size, but with more degrees of freedom than classical design procedures allow.
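
    The loop the abstract describes (iterate design variables against an economic index until all constraint functions are satisfied) can be sketched as a toy. A coarse grid search stands in for OPDOT's optimizer, and the surrogate cost and constraint models below are invented for illustration, not OPDOT's actual equations.

```python
# Toy analogue of OPDOT's approach: search two design variables to minimize
# an economic performance index subject to constraint functions. All models
# here are invented placeholders.

def economic_index(wing_area, thrust):
    # Hypothetical direct-operating-cost surrogate: structural weight grows
    # with wing area, fuel burn with thrust, productivity with their product.
    return 0.04 * wing_area + 0.0008 * thrust + 5e5 / (wing_area * thrust)

def constraints_ok(wing_area, thrust):
    # Hypothetical constraint set, e.g. takeoff field length and fuel volume.
    takeoff_ok = thrust >= 900 + 2.0 * wing_area
    fuel_ok = wing_area >= 100
    return takeoff_ok and fuel_ok

best = None
for wing_area in range(100, 301, 5):        # placeholder units
    for thrust in range(900, 2001, 10):
        if not constraints_ok(wing_area, thrust):
            continue
        index = economic_index(wing_area, thrust)
        if best is None or index < best[0]:
            best = (index, wing_area, thrust)

cost, wing_area, thrust = best
print(f"optimum: wing_area={wing_area}, thrust={thrust}, index={cost:.2f}")
```

    The real program uses continuous optimization rather than a grid, but the structure (design variables in, performance index and constraint checks out) is the same.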

  16. Redesigning the Urban Design Studio: Two Learning Experiments

    ERIC Educational Resources Information Center

    Pak, Burak; Verbeke, Johan

    2013-01-01

    The main aim of this paper is to discuss how the combination of Web 2.0, social media and geographic technologies can provide opportunities for learning and new forms of participation in an urban design studio. This discussion is mainly based on our recent findings from two experimental urban design studio setups as well as former research and…

  17. The Future of Management as Design: A Thought Experiment

    ERIC Educational Resources Information Center

    Bouchard, Veronique; del Forno, Leon

    2012-01-01

    Purpose: Management practices and education are presently in a stage of reappraisal and a growing number of scholars and experts are suggesting that managers should be taught and adopt the approach and methodologies of designers. The purpose of this paper is to imagine the impact of this move and to try and foresee whether "management as design"…

  18. Space Shuttle Orbiter thermal protection system design and flight experience

    NASA Technical Reports Server (NTRS)

    Curry, Donald M.

    1993-01-01

    The Space Shuttle Orbiter Thermal Protection System materials, design approaches associated with each material, and the operational performance experienced during fifty-five successful flights are described. The flights to date indicate that the thermal and structural design requirements were met and that the overall performance was outstanding.

  19. Moving Without Wheels: Educational Experiments in Robot Design and Locomotion

    DTIC Science & Technology

    2008-01-01

    Novel locomotion: lecture material presented comparative anatomies, commenting on joint placement and limb lengths and the resulting effects on the... Figure 4: six-legged walking robot, designed for stability and speed. Figure 5: turtle-like robot, designed for power. Figure 6...

  20. Space Shuttle Orbiter thermal protection system design and flight experience

    NASA Astrophysics Data System (ADS)

    Curry, Donald M.

    1993-07-01

    The Space Shuttle Orbiter Thermal Protection System materials, design approaches associated with each material, and the operational performance experienced during fifty-five successful flights are described. The flights to date indicate that the thermal and structural design requirements were met and that the overall performance was outstanding.

  1. Developing Teachers' Competences for Designing Inclusive Learning Experiences

    ERIC Educational Resources Information Center

    Navarro, Silvia Baldiris; Zervas, Panagiotis; Gesa, Ramon Fabregat; Sampson, Demetrios G.

    2016-01-01

    Inclusive education, namely the process of providing all learners with equal educational opportunities, is a major challenge for many educational systems worldwide. In order to address this issue, a widely used framework has been developed, namely the Universal Design for Learning (UDL), which aims to provide specific educational design guidelines…

  2. Use of Taguchi design of experiments to optimize and increase robustness of preliminary designs

    NASA Technical Reports Server (NTRS)

    Carrasco, Hector R.

    1992-01-01

    The research performed this summer includes the completion of work begun last summer in support of the Air Launched Personnel Launch System parametric study, providing support on the development of the test matrices for the plume experiments in the Plume Model Investigation Team Project, and aiding in the conceptual design of a lunar habitat. After the conclusion of last year's Summer Program, the Systems Definition Branch continued the Air Launched Personnel Launch System (ALPLS) study by running three experiments defined by L27 Orthogonal Arrays. Although the data were evaluated during the academic year, the analysis of variance and the final project review were completed this summer. The Plume Model Investigation Team (PLUMMIT) was formed by the Engineering Directorate to develop a consensus position on plume impingement loads and to validate plume flowfield models. In order to obtain a large number of individual correlated data sets for model validation, a series of plume experiments was planned. A preliminary 'full factorial' test matrix indicated that 73,024 jet firings would be necessary to obtain all of the information requested. As this was approximately 100 times more firings than the scheduled use of Vacuum Chamber A would permit, considerable effort was needed to reduce the test matrix and optimize it with respect to the specific objectives of the program. Part of the First Lunar Outpost Project deals with the lunar habitat. Requirements for the habitat include radiation protection, a safe haven for occasional solar flare storms, and an airlock module, as well as consumables to support 34 extravehicular activities during a 45-day mission. The objective of the proposed work was to collaborate with the Habitat Team on the development and reusability of the Logistics Modules.
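
    The L27 orthogonal arrays mentioned above are standard 27-run, 13-column three-level designs. A minimal sketch of one textbook construction (columns as linear functionals over GF(3), one representative per scalar class) shows why such arrays are so much smaller than a full factorial, which for 13 three-level factors would require 3^13 = 1,594,323 runs:

```python
from itertools import product

# Build the L27 (3^13) orthogonal array: 27 runs index 13 three-level
# factors. Columns are the 13 nonzero linear functionals over GF(3) whose
# first nonzero coefficient is 1 (one representative per scalar class).
coeffs = [v for v in product(range(3), repeat=3)
          if any(v) and v[min(i for i, x in enumerate(v) if x)] == 1]
runs = [[(a * x + b * y + c * z) % 3 for (a, b, c) in coeffs]
        for (x, y, z) in product(range(3), repeat=3)]

print(len(runs), "runs,", len(coeffs), "columns")

# Balance check: every pair of columns shows each of the 9 level
# combinations exactly 3 times, so main effects separate cleanly.
for i in range(len(coeffs)):
    for j in range(i + 1, len(coeffs)):
        pairs = [(row[i], row[j]) for row in runs]
        assert all(pairs.count(p) == 3 for p in product(range(3), repeat=2))
```

    The 73,024-firing matrix in the abstract was reduced by the same logic: a carefully chosen fraction preserves the effects of interest at a small fraction of the runs.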

  3. Use of Taguchi design of experiments to optimize and increase robustness of preliminary designs

    NASA Astrophysics Data System (ADS)

    Carrasco, Hector R.

    1992-12-01

    The research performed this summer includes the completion of work begun last summer in support of the Air Launched Personnel Launch System parametric study, providing support on the development of the test matrices for the plume experiments in the Plume Model Investigation Team Project, and aiding in the conceptual design of a lunar habitat. After the conclusion of last year's Summer Program, the Systems Definition Branch continued the Air Launched Personnel Launch System (ALPLS) study by running three experiments defined by L27 Orthogonal Arrays. Although the data were evaluated during the academic year, the analysis of variance and the final project review were completed this summer. The Plume Model Investigation Team (PLUMMIT) was formed by the Engineering Directorate to develop a consensus position on plume impingement loads and to validate plume flowfield models. In order to obtain a large number of individual correlated data sets for model validation, a series of plume experiments was planned. A preliminary 'full factorial' test matrix indicated that 73,024 jet firings would be necessary to obtain all of the information requested. As this was approximately 100 times more firings than the scheduled use of Vacuum Chamber A would permit, considerable effort was needed to reduce the test matrix and optimize it with respect to the specific objectives of the program. Part of the First Lunar Outpost Project deals with the lunar habitat. Requirements for the habitat include radiation protection, a safe haven for occasional solar flare storms, and an airlock module, as well as consumables to support 34 extravehicular activities during a 45-day mission. The objective of the proposed work was to collaborate with the Habitat Team on the development and reusability of the Logistics Modules.

  4. Evidence-based recommendations for designing free-sorting experiments.

    PubMed

    Blanchard, Simon J; Banerji, Ishani

    2016-12-01

    The card-sorting task is a flexible research tool that is widely used across many of the subfields of psychology. Yet this same great flexibility requires researchers to make several (seemingly arbitrary) decisions in their designs, such as fixing a sufficient number of objects to sort, setting task requirements, and creating task instructions for participants. In the present research, we provide a systematic empirical investigation of the consequences of typical researcher design choices while administering sorting tasks. Specifically, we studied the effects of seven sorting task design factors by collecting data from over 1,000 online participants assigned to one of 36 sorting tasks, as part of a fractional factorial experimental design. Analyses show the effects of the various researcher decisions on the probability that participants would quit the task, the amount of time spent on the task, the number of piles made, and posttask measures such as satisfaction and depletion. Research design recommendations are provided.

  5. The value of well-designed experiments in studying diseases with special reference to amphibians.

    PubMed

    Blaustein, Andrew R; Alford, Ross A; Harris, Reid N

    2009-09-01

    Relatively few studies of amphibian diseases have employed standard ecological experimental designs. We discuss what constitutes a well-designed ecological experiment and encourage the use of such designs in disease studies. We illustrate how well-designed experiments can be used to determine the effects of pathogens on amphibians, and how ancillary information, including that collected using molecular tools, can be used to enhance the value of such experiments.

  6. Generalizing Over Conditions by Combining the Multitrait Multimethod Matrix and the Representative Design of Experiments,

    DTIC Science & Technology

    1986-01-01

    Generalizing Over Conditions by Combining the Multitrait Multimethod Matrix and the Representative Design of Experiments. Kenneth R. Hammond, Robert M. Hamm and Janet Grassia.

  7. Design reuse experience of space and hazardous operations robots

    NASA Technical Reports Server (NTRS)

    Oneil, P. Graham

    1994-01-01

    A comparison of design drivers for space and hazardous nuclear waste operating robots details similarities and differences in operations, performance and environmental parameters for these critical environments. The similarities are exploited to provide low risk system components based on reuse principles and design knowledge. Risk reduction techniques are used for bridging areas of significant differences. As an example, risk reduction of a new sensor design for nuclear environment operations is employed to provide upgradeable replacement units in a reusable architecture for significantly higher levels of radiation.

  8. Cooling water for SSC experiments: Supplemental Conceptual Design Report (SCDR)

    SciTech Connect

    Doyle, R.E.

    1989-10-20

    This paper discusses the following topics on cooling water design for the Superconducting Super Collider experiments: low conductivity water; industrial cooling water; chilled water systems; and radioactive water systems. (LSP)

  9. KiloPower Project - KRUSTY Experiment Nuclear Design

    SciTech Connect

    Poston, David Irvin; Godfroy, Thomas; Mcclure, Patrick Ray; Sanchez, Rene Gerardo

    2015-07-20

    This PowerPoint presentation covers the following topics: Reference Kilopower configuration; Reference KRUSTY configuration; KRUSTY design sensitivities; KRUSTY reactivity coefficients; KRUSTY criticality safety and control; KRUSTY core activation/dose; and KRUSTY shielding, room activation/dose.

  10. Geothermal FIT Design: International Experience and U.S. Considerations

    SciTech Connect

    Rickerson, W.; Gifford, J.; Grace, R.; Cory, K.

    2012-08-01

    Developing power plants, whether conventional or renewable, is a risky endeavor. Feed-in tariff (FIT) policies can be designed to address some of these risks, and their design can be tailored to geothermal electric plant development. Geothermal projects face risks similar to other generation project development, including finding buyers for power, ensuring adequate transmission capacity, competing to supply electricity and/or renewable energy certificates (RECs), securing reliable revenue streams, navigating the legal issues related to project development, and reacting to changes in existing regulations or incentives. Although FITs have not been created specifically for geothermal in the United States to date, a variety of FIT design options could reduce geothermal power plant development risks, and these options are explored. This analysis focuses on the design of FIT incentive policies for geothermal electric projects and how FITs can be used to reduce risks (excluding drilling unproductive exploratory wells).

  11. Review Committee report on the conceptual design of the Tokamak Physics Experiment

    SciTech Connect

    Not Available

    1993-04-01

    This report discusses the following topics on the conceptual design of the Tokamak Physics Experiment: Role and mission of TPX; overview of design; physics design assessment; engineering design assessment; evaluation of cost, schedule, and management plans; and, environment safety and health.

  12. Supervisory Control of Underwater Telemanipulators: Design and Experiment.

    DTIC Science & Technology

    1982-08-30

    3.6 Descriptions for process level control. Chapter 4: The design of a software man-machine interface... different types of human-oriented interfaces for achieving such control will be presented. The design of a new man-machine interface system intended... under the direction of a human operator [6]. Supervisory control fits on a continuum between manually controlled...

  13. Structural Optimization of a Force Balance Using a Computational Experiment Design

    NASA Technical Reports Server (NTRS)

    Parker, P. A.; DeLoach, R.

    2002-01-01

    This paper proposes a new approach to force balance structural optimization featuring a computational experiment design. Currently, this multi-dimensional design process requires the designer to perform a simplification by executing parameter studies on a small subset of design variables. This one-factor-at-a-time approach varies a single variable while holding all others at a constant level. Consequently, subtle interactions among the design variables, which can be exploited to achieve the design objectives, go undetected. The proposed method combines Modern Design of Experiments techniques, which direct the exploration of the multi-dimensional design space, with a finite element analysis code that generates the experimental data. To efficiently search for an optimum combination of design variables and minimize the computational resources, a sequential design strategy was employed. Experimental results from the optimization of a non-traditional force balance measurement section are presented. An approach to overcome the unique problems associated with the simultaneous optimization of multiple response criteria is described. A quantitative single-point design procedure that reflects the designer's subjective impression of the relative importance of various design objectives, and a graphical multi-response optimization procedure that provides further insights into available tradeoffs among competing design objectives, are illustrated. The proposed method enhances the intuition and experience of the designer by providing new perspectives on the relationships between the design variables and the competing design objectives, providing a systematic foundation for advancements in structural design.
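
    The abstract describes its quantitative single-point procedure only at a high level. One common way to realize such a weighted multi-response composite is a Derringer-Suich-style desirability function, sketched here with invented balance responses, limits, and weights rather than the paper's actual criteria:

```python
# Each response is scaled to a 0-1 desirability, and the scaled values are
# combined with a weighted geometric mean reflecting the designer's
# subjective priorities. All numbers below are illustrative.

def desirability(value, low, high):
    """Linear larger-is-better desirability: 0 at or below low, 1 at or above high."""
    return min(max((value - low) / (high - low), 0.0), 1.0)

def composite(responses, limits, weights):
    total = sum(weights)
    score = 1.0
    for r, (low, high), w in zip(responses, limits, weights):
        score *= desirability(r, low, high) ** (w / total)
    return score

# Hypothetical balance criteria: sensitivity, stiffness, fatigue margin.
limits = [(0.5, 2.0), (10.0, 40.0), (1.0, 3.0)]
weights = [2.0, 1.0, 1.0]   # this designer weights sensitivity most heavily

design_a = [1.8, 22.0, 1.5]
design_b = [1.0, 35.0, 2.8]
print(composite(design_a, limits, weights))
print(composite(design_b, limits, weights))
```

    Because the composite is a single number, standard optimizers can drive it directly, which is what makes a single-point procedure possible for competing objectives.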

  14. Human experience and product usability: principles to assist the design of user-product interactions.

    PubMed

    Chamorro-Koc, Marianella; Popovic, Vesna; Emmison, Michael

    2009-07-01

    This paper introduces research that investigates how human experience influences people's understandings of product usability. It describes an experiment that employs visual representations of concepts to elicit participants' ideas of a product's use. Results from the experiment lead to the identification of relationships between human experience, knowledge, and context-of-use; these relationships influence designers' and users' concepts of product usability. The relationships are translated into design principles that inform the design activity with respect to the aspects of experience that trigger people's understanding of a product's use. A design tool (ECEDT) is devised to aid designers in the application of these principles. This tool is then trialled in the context of a design task in order to verify the applicability of the findings.

  15. Designing Hospital for better Infection Control: an Experience.

    PubMed

    Rao, Skm

    2004-01-01

    The physical design of a hospital is an essential component of its infection control strategy, incorporating infection control issues to minimise the risk of infection transmission. Hospital design therefore needs to consider the separation of dirty and clean areas; adequate ventilation, lighting and storage facilities; and the design of patient accommodation areas, including an adequate number of wash hand basins and single-bed facilities. A 250-bed general hospital was planned keeping in view the structural and design elements necessary for the success of a good infection control programme. Various national and international standards, such as the BSI recommendations, JCAHO infection control standards, DHSS, ASHRAE, AIA and OSHA, were studied and compared with our planning parameters. The planning of the ward unit, ICU, operation theatre and isolation wards was especially reviewed in the light of recent knowledge in the field of hospital-acquired infection, and modifications were carried out. The need for effective identification of potential infection risks in the design of a hospital was stressed, and the engineering controls required to reduce the concentration of infectious droplet nuclei in the air and to prevent the transmission of disease were highlighted.

  16. An Introduction to Statistical Design of Experiments in Metallurgical Research,

    DTIC Science & Technology

    1963-01-01

    Figure and table titles recovered from the fragment: recovery versus conditioning time in a three-level experiment; tests in a factorial experiment to investigate the recovery of manganese by flotation, in which three levels of conditioning time and four... recoveries for the three factors, representing the response by means...

  17. The prototype design of the Stanford Relativity Gyro Experiment

    NASA Technical Reports Server (NTRS)

    Parkinson, Bradford W.; Everitt, C. W. Francis; Turneaure, John P.; Parmley, Richard T.

    1987-01-01

    The Stanford Relativity Gyroscope Experiment constitutes a fundamental test of Einstein's General Theory of Relativity, probing such heretofore untested aspects of the theory as those that relate to spin by means of drag-free satellite-borne gyroscopes. General Relativity's prediction of two orthogonal precessions (motional and geodetic) for a perfect Newtonian gyroscope in polar orbit has not yet been experimentally assessed, and will mark a significant advancement in experimental gravitation. The technology employed in the experiment has been under development for 25 years at NASA's Marshall Space Flight Center. Four fused quartz gyroscopes will be used.

  18. AGC-1 Experiment and Final Preliminary Design Report

    SciTech Connect

    Robert L. Bratton; Tim Burchell

    2006-08-01

    This report details the experimental plan and design as of the preliminary design review for the Advanced Test Reactor Graphite Creep-1 graphite compressive creep capsule. The capsule will contain five graphite grades that will be irradiated in the Advanced Test Reactor at the Idaho National Laboratory to determine the irradiation induced creep constants. Seven other grades of graphite will be irradiated to determine irradiated physical properties. The capsule will have an irradiation temperature of 900 C and a peak irradiation dose of 5.8 x 10{sup 21} n/cm{sup 2} [E > 0.1 MeV], or 4.2 displacements per atom.

  19. Conditional Optimal Design in Three- and Four-Level Experiments

    ERIC Educational Resources Information Center

    Hedges, Larry V.; Borenstein, Michael

    2014-01-01

    The precision of estimates of treatment effects in multilevel experiments depends on the sample sizes chosen at each level. It is often desirable to choose sample sizes at each level to obtain the smallest variance for a fixed total cost, that is, to obtain optimal sample allocation. This article extends previous results on optimal allocation to…
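
    For the simplest two-level (subjects-within-clusters) case that the article generalizes, the optimal allocation has a standard closed form: the variance-minimizing cluster size for a fixed budget is n* = sqrt((cluster cost / subject cost) x (1 - ICC) / ICC). The costs and intraclass correlation below are illustrative assumptions.

```python
from math import sqrt

def optimal_cluster_size(cost_per_cluster, cost_per_subject, icc):
    """Subjects per cluster minimizing the treatment-effect variance
    for a fixed budget in a two-level design (standard allocation result)."""
    return sqrt((cost_per_cluster / cost_per_subject) * (1 - icc) / icc)

def clusters_within_budget(budget, cost_per_cluster, cost_per_subject, n):
    """How many clusters of size n the budget can buy."""
    return budget // (cost_per_cluster + n * cost_per_subject)

# Illustrative numbers: $200 to recruit a cluster, $10 per subject, ICC 0.10.
n_opt = optimal_cluster_size(200, 10, 0.10)
print(round(n_opt))                                    # subjects per cluster
print(clusters_within_budget(10_000, 200, 10, round(n_opt)))  # clusters
```

    Three- and four-level designs add analogous cost and variance terms per level, which is the extension the article works out.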

  20. Managerial-Skills Development: An Experience in Program Design

    ERIC Educational Resources Information Center

    Thorne, Edward H.; Marshall, Jean L.

    1976-01-01

    The article is an overview of the design of a Managerial Skills Development Program Model in an industrial setting which was based on adult education principles. Discussed are: program objectives and philosophy, educative environment, group commitment, group-centered action, program evaluation and revision, manager/instructor teams, and…

  1. User-Centered Design in Practice: The Brown University Experience

    ERIC Educational Resources Information Center

    Bordac, Sarah; Rainwater, Jean

    2008-01-01

    This article presents a case study in user-centered design that explores the needs and preferences of undergraduate users. An analysis of LibQual+ and other user surveys, interviews with public service staff, and a formal American with Disabilities Act accessibility review served as the basis for planning a redesign of the Brown University…

  2. Unknown Gases: Student-Designed Experiments in the Introductory Laboratory.

    ERIC Educational Resources Information Center

    Hanson, John; Hoyt, Tim

    2002-01-01

    Introductory students design and carry out experimental procedures to determine the identity of three unknown gases from a list of eight possibilities: air, nitrogen, oxygen, argon, carbon dioxide, helium, methane, and hydrogen. Students are excited and motivated by the opportunity to come up with their own experimental approach to solving a…

  3. Proficiency-Based Curriculum Design: Principles Derived from Government Experience.

    ERIC Educational Resources Information Center

    Lowe, Pardee, Jr.

    1985-01-01

    Describes principles for designing a proficiency-based course to prepare students for the ACTFL/ETS Advanced Plus/Superior level according to Interagency Language Roundtable guidelines. Proposes ways to combine grammatical and "functional/notional" syllabuses with a proficiency approach. Examines the implications of these principles for…

  4. Experiment design for pilot identification in compensatory tracking tasks

    NASA Technical Reports Server (NTRS)

    Wells, W. R.

    1976-01-01

    A design criterion for input functions in laboratory tracking tasks resulting in efficient parameter estimation is formulated. The criterion is that the statistical correlations between pairs of parameters be reduced in order to minimize the problem of nonuniqueness in the extraction process. The effectiveness of the method is demonstrated for a lower order dynamic system.
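
    For a two-parameter model that is linear in its parameters, the correlation this criterion targets can be computed directly from the regressors: the estimate correlation is minus the normalized inner product of the two regressor signals. The input signals below are invented to show how a well-chosen second input drives that correlation toward zero:

```python
from math import sin, cos, sqrt

def estimate_correlation(u1, u2):
    """Correlation between the two parameter estimates of
    y = a*u1 + b*u2 + noise, from the regressor Gram matrix."""
    s11 = sum(x * x for x in u1)
    s22 = sum(y * y for y in u2)
    s12 = sum(x * y for x, y in zip(u1, u2))
    return -s12 / sqrt(s11 * s22)

t = [0.01 * k for k in range(1000)]
# Nearly collinear inputs: the two parameters are hard to separate.
collinear = estimate_correlation([sin(x) for x in t],
                                 [sin(1.05 * x) for x in t])
# Well-separated inputs: nearly uncorrelated estimates.
separated = estimate_correlation([sin(x) for x in t],
                                 [cos(3.0 * x) for x in t])
print(abs(collinear), abs(separated))
```

    Pilot-identification models are of course more elaborate, but the same Gram-matrix quantity is what a correlation-reducing input design manipulates.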

  5. How Design Experiments Can Inform Teaching and Learning: Teacher-Researchers as Collaborators in Educational Research

    ERIC Educational Resources Information Center

    Jitendra, Asha K.

    2005-01-01

    In this commentary, I summarize my own research with colleagues to affirm Dr. Gersten's call for considering design experiments prior to conducting intervention research. I describe how design experiments not only can inform teaching and the learning of innovative approaches, but also hold the promise of effectively bridging the…

  6. Staying True to the Core: Designing the Future Academic Library Experience

    ERIC Educational Resources Information Center

    Bell, Steven J.

    2014-01-01

    In 2014, the practice of user experience design in academic libraries continues to evolve. It is typically applied in the context of interactions with digital interfaces. Some academic librarians are applying user experience approaches more broadly to design both environments and services with human-centered strategies. As the competition for the…

  7. Design Analysis of a Space Based Chromotomographic Hyperspectral Imaging Experiment

    DTIC Science & Technology

    2010-03-01

    A table pairs mirror materials (aluminum, Zerodur glass, BK7 glass) and tilt-platform materials (aluminum, Invar, titanium, steel) with recommended S-340 platform models (S-340.Ax, S-340.ix, S-340.Tx). The instrument is composed of a telescope, two grating spectrometers, calibration lamps, and focal plane electronics and cooling system. The telescope is a three-mirror... the advanced hyperspectral imager for coastal bathymetry is that the experiment will closely mirror that of the proposed space-based chromotomographic hyperspectral imaging experiment.

  8. Skylab SO71/SO72 circadian periodicity experiment. [experimental design and checkout of hardware

    NASA Technical Reports Server (NTRS)

    Fairchild, M. K.; Hartmann, R. A.

    1973-01-01

    The circadian rhythm hardware activities from 1965 through 1973 are considered. A brief history of the programs leading to the development of the combined Skylab SO71/SO72 Circadian Periodicity Experiment (CPE) is given. SO71 is the Skylab experiment number designating the pocket mouse circadian experiment, and SO72 designates the vinegar gnat circadian experiment. Final design modifications and checkout of the CPE, integration testing with the Apollo service module CSM 117 and the launch preparation and support tasks at Kennedy Space Center are reported.

  9. Carbon Taxes: A Review of Experience and Policy Design Considerations

    SciTech Connect

    Sumner, J.; Bird, L.; Smith, H.

    2009-12-01

    State and local governments in the United States are evaluating a wide range of policies to reduce carbon emissions, including, in some instances, carbon taxes, which have existed internationally for nearly 20 years. This report reviews existing carbon tax policies both internationally and in the United States. It also analyzes carbon policy design and effectiveness. Design considerations include which sectors to tax, where to set the tax rate, how to use tax revenues, what the impact will be on consumers, and how to ensure emissions reduction goals are achieved. Emission reductions that are due to carbon taxes can be difficult to measure, though some jurisdictions have quantified reductions in overall emissions and other jurisdictions have examined impacts that are due to programs funded by carbon tax revenues.

  10. Carbon Taxes. A Review of Experience and Policy Design Considerations

    SciTech Connect

    Sumner, Jenny; Bird, Lori; Smith, Hillary

    2009-12-01

    State and local governments in the United States are evaluating a wide range of policies to reduce carbon emissions, including, in some instances, carbon taxes, which have existed internationally for nearly 20 years. This report reviews existing carbon tax policies both internationally and in the United States. It also analyzes carbon policy design and effectiveness. Design considerations include which sectors to tax, where to set the tax rate, how to use tax revenues, what the impact will be on consumers, and how to ensure emissions reduction goals are achieved. Emission reductions that are due to carbon taxes can be difficult to measure, though some jurisdictions have quantified reductions in overall emissions and other jurisdictions have examined impacts that are due to programs funded by carbon tax revenues.

  11. Designing Online Resources in Preparation for Authentic Laboratory Experiences

    PubMed Central

    Boulay, Rachel; Parisky, Alex; Leong, Peter

    2013-01-01

    Professional development for science teachers can benefit from active learning in science laboratories. However, how online training materials can complement traditional laboratory training is less well understood. This paper explores the design of online training modules to teach molecular biology, and user perception of those modules, which were part of an intensive molecular biology “boot camp” targeting high school biology teachers in the State of Hawaii. The John A. Burns School of Medicine at the University of Hawaii had an opportunity to design and develop professional development that prepares science teachers with an introduction to the skills, techniques, and applications needed for their students to conduct medical research in a laboratory setting. A group of 29 experienced teachers shared their opinions of the online materials and reported on how they used them in their learning process or teaching. PMID:24319698

  12. Study of Optimality Criteria in Design of Experiments.

    DTIC Science & Technology

    1980-06-03

    References recovered from the fragment: Introduction to Matrix Analysis, 2nd edition, McGraw-Hill (1970); Birkhoff, G. (1946), Tres observaciones sobre el algebra lineal, Univ. Nac. Tucuman... While Definition 2.1 is a response-function criterion, most criteria in design theory are directly related to parameter estimation; hence the information matrices... the class of nonincreasing functionals on the set of information matrices rather than the class of convex nondecreasing functionals on the set of covariance matrices...
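
    As a concrete instance of an optimality criterion defined on information matrices, D-optimality ranks designs by the determinant of X'X. The straight-line example below is illustrative and not from the report:

```python
# D-optimality sketch: for the model y = b0 + b1*x, compare det(X'X) for
# two candidate placements of four observation points on [-1, 1].

def info_matrix(design):
    """X'X for rows (1, x), i.e. the straight-line information matrix."""
    n = len(design)
    sx = sum(design)
    sxx = sum(x * x for x in design)
    return [[n, sx], [sx, sxx]]

def det2(m):
    return m[0][0] * m[1][1] - m[0][1] * m[1][0]

spread = [-1, -1, 1, 1]        # points pushed to the ends of the interval
bunched = [-0.5, 0, 0, 0.5]    # points clustered near the middle

print(det2(info_matrix(spread)), det2(info_matrix(bunched)))
# The spread design has the larger determinant, so it is D-better for
# jointly estimating slope and intercept.
```

    Maximizing det(X'X) is equivalent to minimizing det of the covariance matrix, which is why criteria can be stated on either set, as the fragment above notes.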

  13. Weighing Evidence: The Design and Comparison of Probability Thought Experiments.

    DTIC Science & Technology

    1983-06-01

    a belief-function design for the problem. Weighing Evidence 12 The Hominids of East Turkana In the August 1978 issue of Scientific American, Alan Walker and Richard E. Leakey discuss the hominid fossils that have recently been discovered in the region east of Lake Turkana in Kenya. These fossils... the effect that distinct hominid species cannot co-exist after one of them has acquired culture. (ii) Hypotheses 1 and 4 are doubtful because they

  14. Vanguard/PLACE experiment system design and test plan

    NASA Technical Reports Server (NTRS)

    Taylor, R. E.

    1973-01-01

    The design, development, and testing of the NASA-GSFC Position Location and Aircraft Communications Equipment (PLACE) at C-band are discussed. The equipment was installed on the USNS Vanguard. The tests involved a sea test to evaluate the position-location, 2-way voice, and 2-way data communications capability of PLACE and a trilateration test to position-fix the ATS-5 satellite using the PLACE system.

  15. Student-Designed Fluid Experiment for DIME Competition

    NASA Technical Reports Server (NTRS)

    2002-01-01

    Student-designed and -built apparatus for the second Dropping in a Microgravity Environment (DIME) competition held April 23-25, 2002, at NASA's Glenn Research Center. Competitors included two teams from Sycamore High School, Cincinnati, OH, and one each from Bay High School, Bay Village, OH, and COSI Academy, Columbus, OH. DIME is part of NASA's education and outreach activities. Details are on line at http://microgravity.grc.nasa.gov/DIME_2002.html.

  16. Aerospace Engineering Systems and the Advanced Design Technologies Testbed Experience

    NASA Technical Reports Server (NTRS)

    VanDalsem, William R.; Livingston, Mary E.; Melton, John E.; Torres, Francisco J.; Stremel, Paul M.

    1999-01-01

    Continuous improvement of aerospace product development processes is a driving requirement across much of the aerospace community. As up to 90% of the cost of an aerospace product is committed during the first 10% of the development cycle, there is a strong emphasis on capturing, creating, and communicating better information (both requirements and performance) early in the product development process. The community has responded by pursuing the development of computer-based systems designed to enhance the decision-making capabilities of product development individuals and teams. Recently, the historical foci on sharing the geometrical representation and on configuration management are being augmented: 1) Physics-based analysis tools for filling the design space database; 2) Distributed computational resources to reduce response time and cost; 3) Web-based technologies to relieve machine-dependence; and 4) Artificial intelligence technologies to accelerate processes and reduce process variability. The Advanced Design Technologies Testbed (ADTT) activity at NASA Ames Research Center was initiated to study the strengths and weaknesses of the technologies supporting each of these trends, as well as the overall impact of the combination of these trends on a product development event. Lessons learned and recommendations for future activities are reported.

  17. Design of experiments with multiple independent variables: a resource management perspective on complete and reduced factorial designs.

    PubMed

    Collins, Linda M; Dziak, John J; Li, Runze

    2009-09-01

    An investigator who plans to conduct an experiment with multiple independent variables must decide whether to use a complete or reduced factorial design. This article advocates a resource management perspective on making this decision, in which the investigator seeks a strategic balance between service to scientific objectives and economy. Considerations in making design decisions include whether research questions are framed as main effects or simple effects; whether and which effects are aliased (confounded) in a particular design; the number of experimental conditions that must be implemented in a particular design and the number of experimental subjects the design requires to maintain the desired level of statistical power; and the costs associated with implementing experimental conditions and obtaining experimental subjects. In this article 4 design options are compared: complete factorial, individual experiments, single factor, and fractional factorial. Complete and fractional factorial designs and single-factor designs are generally more economical than conducting individual experiments on each factor. Although relatively unfamiliar to behavioral scientists, fractional factorial designs merit serious consideration because of their economy and versatility.
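
    As a quantitative illustration of the economy argument (a sketch, not the authors' procedure), the following hypothetical Python functions generate a full two-level factorial in four factors and its half fraction, in which the fourth factor is aliased with the three-way interaction of the others (defining relation I = ABCD):

```python
from itertools import product

def full_factorial(k):
    """All 2**k combinations of k two-level factors, coded -1/+1."""
    return [list(run) for run in product((-1, 1), repeat=k)]

def half_fraction(k):
    """A 2**(k-1) fraction: the last factor is generated as the product
    of the others, i.e., the defining relation is I = AB...K."""
    runs = []
    for base in product((-1, 1), repeat=k - 1):
        last = 1
        for level in base:
            last *= level
        runs.append(list(base) + [last])
    return runs

full = full_factorial(4)     # 16 experimental conditions
frac = half_fraction(4)      # 8 conditions
print(len(full), len(frac))  # 16 8
```

    Halving the number of conditions is exactly the resource-management tradeoff the abstract describes: the price of the smaller design is that some effects become aliased and cannot be estimated separately.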

  18. Mini-columns for Conducting Breakthrough Experiments. Design and Construction

    SciTech Connect

    Dittrich, Timothy M.; Reimus, Paul William; Ware, Stuart Douglas

    2015-06-11

    Experiments with moderately and strongly sorbing radionuclides (i.e., U, Cs, Am) have shown that sorption between experimental solutions and traditional column materials must be accounted for to accurately determine stationary phase or porous media sorption properties (i.e., sorption site density, sorption site reaction rate coefficients, and partition coefficients or Kd values). This report details the materials and construction of mini-columns for use in breakthrough columns to allow for accurate measurement and modeling of sorption parameters. Material selection, construction techniques, wet packing of columns, tubing connections, and lessons learned are addressed.

  19. Lockheed design of a wind satellite (WINDSAT) experiment

    NASA Technical Reports Server (NTRS)

    Osmundson, John S.; Martin, Stephen C.

    1985-01-01

    WINDSAT is a proposed space-based global wind measuring system. A Shuttle-borne experiment is proposed as a proof-of-principle demonstration before development of a full operational system. WINDSAT goals are to measure wind speed and direction to within ±1 m/s and ±10° over the entire earth from 0 to 20 km altitude with 1 km altitude resolution. The wind measuring instrument is a coherent lidar incorporating a pulsed CO2 TEA laser transmitter and a continuously scanning 1.25 m diameter optical system. The wind speed is measured by heterodyne detection of the backscattered laser radiation and measurement of its Doppler frequency shift.
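
    The relationship between line-of-sight wind speed and the measured frequency shift follows from the standard Doppler relation for a monostatic lidar, Δf = 2v/λ. A minimal sketch (the 10.6 µm CO2 wavelength is the nominal value; the function name is illustrative):

```python
def doppler_shift_hz(v_los_m_s, wavelength_m=10.6e-6):
    """Frequency shift of the backscattered return from air moving at
    line-of-sight speed v; the factor of 2 reflects the round trip."""
    return 2.0 * v_los_m_s / wavelength_m

# The targeted 1 m/s accuracy corresponds to resolving ~189 kHz shifts.
print(round(doppler_shift_hz(1.0)))  # 188679
```

    The small size of this shift relative to the laser's optical frequency (about 28 THz) is why heterodyne detection against a local oscillator is used.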

  20. Characterizing Variability in Smestad and Gratzel's Nanocrystalline Solar Cells: A Collaborative Learning Experience in Experimental Design

    ERIC Educational Resources Information Center

    Lawson, John; Aggarwal, Pankaj; Leininger, Thomas; Fairchild, Kenneth

    2011-01-01

    This article describes a collaborative learning experience in experimental design that closely approximates what practicing statisticians and researchers in applied science experience during consulting. Statistics majors worked with a teaching assistant from the chemistry department to conduct a series of experiments characterizing the variation…

  1. Young Children's Learning of Novel Digital Interfaces: How Technology Experience, Age, and Design Come into Play

    ERIC Educational Resources Information Center

    Gilutz, Shuli

    2009-01-01

    This study looks at the relationship between age, technology experience, and design factors in determining young children's comprehension of novel digital interfaces. In Experiment 1, 35 preschoolers played three games that varied in complexity and familiarity. Parental questionnaires were used to assess children's previous technology experience.…

  2. Students' Design of Experiments: An Inquiry Module on the Conduction of Heat

    ERIC Educational Resources Information Center

    Hatzikraniotis, E.; Kallery, M.; Molohidis, A.; Psillos, D.

    2010-01-01

    This article examines secondary students' design of experiments after engagement in an innovative and inquiry-oriented module on heat transfer. The module consists of an integration of hands-on experiments, simulated experiments and microscopic model simulations, includes a structured series of guided investigative tasks and was implemented for a…

  3. Engineering design of the FRX-C experiment

    SciTech Connect

    Kewish, R.W. Jr.; Bartsch, R.R.; Siemon, R.E.

    1981-01-01

    Research on Compact Toroid (CT) configurations has been greatly accelerated in the last few years because of their potential for providing a practical and economical fusion system. Los Alamos research is being concentrated on two types of configurations: (1) magnetized-gun-produced Spheromaks (configurations that contain a mixture of toroidal and poloidal fields); and (2) field-reversed configurations (FRCs) that contain purely poloidal magnetic field. This paper describes the design of FRX-C, a field-reversed theta pinch used to form FRCs.

  4. Regulatory affairs in biotechnology: optimal statistical designs for biomedical experiments.

    PubMed

    Carriere, K C

    1998-01-01

    One of the major issues in all applications of biotechnology is how to regulate the process through which new technological information is produced. The end products of biotechnological applications are diverse (e.g., better drugs, better interventions, better fertilizers). Such applications should be properly regulated to obtain valid scientific findings in the most efficient way possible. Some statistically optimal designs are more popularly employed than others as regulatory tools in medical, pharmaceutical and clinical trials. The statistical and practical properties (strengths and weaknesses) of these designs are presented to give a better appreciation of their optimality. Recent developments on some related issues are also reviewed.

  5. Shuttle Orbiter Active Thermal Control Subsystem design and flight experience

    NASA Technical Reports Server (NTRS)

    Bond, Timothy A.; Metcalf, Jordan L.; Asuncion, Carmelo

    1991-01-01

    The paper examines the design of the Space Shuttle Orbiter Active Thermal Control Subsystem (ATCS), which provides vehicle and payload cooling during all phases of a mission and during ground turnaround operations. The operation of the Shuttle ATCS and some of the problems encountered during the first 39 flights of the Shuttle program are described, with special attention given to the major problems encountered with the degradation of the Freon flow rate on the Orbiter Columbia, the Flash Evaporator Subsystem mission anomalies which occurred on STS-26 and STS-34, and problems encountered with the Ammonia Boiler Subsystem. The causes and the resolutions of these problems are discussed.

  6. Student-Designed Fluid Experiment for DIME Competition

    NASA Technical Reports Server (NTRS)

    2002-01-01

    Test tubes to hold different types of fluids while in free-fall were among the student-designed items for the second Dropping in a Microgravity Environment (DIME) competition held April 23-25, 2002, at NASA's Glenn Research Center. Competitors included two teams from Sycamore High School, Cincinnati, OH, and one each from Bay High School, Bay Village, OH, and COSI Academy, Columbus, OH. DIME is part of NASA's education and outreach activities. Details are on line at http://microgravity.grc.nasa.gov/DIME_2002.html.

  7. Design and experience with utility-scale CFB boilers

    SciTech Connect

    Darling, S.L.; Hennenfent, M.

    1995-12-31

    Circulating fluidized bed (CFB) boilers have been in operation for many years in industrial steam and power generation applications, primarily in the 50-100 MWe range. In the past few years, however, several utility-scale CFB boilers have entered service. The scale-up of the Foster Wheeler Pyropower, Inc. CFB boilers has proceeded smoothly, and today FWPI CFB boilers up to 180 MWe are in operation, two 235 MWe boilers are now under construction, and other large units are in the design stage.

  8. Design of a new liquid cell for shock experiments

    SciTech Connect

    Reinhart, W.D.; Chhabildas, L.C.

    1999-11-22

    Controlled impact methodology has been used on a powder gun to obtain dynamic behavior properties of tributyl phosphate (TBP). A novel test methodology is used to provide extremely accurate equation-of-state data for the liquid. A thin aluminum plate used for confining the liquid also serves as a diagnostic to provide reshock states and subsequent release adiabats from the reshocked state. Polar-polymer polyvinylidene fluoride (PVDF) gauges and a velocity interferometer system for any reflector (VISAR) provided redundant, precise data with temporal resolution of five nanoseconds and shock velocity measurements accurate to better than 1%. The design and test methodologies are presented in this paper.

  9. Design and Analysis of Single-Cell Sequencing Experiments.

    PubMed

    Grün, Dominic; van Oudenaarden, Alexander

    2015-11-05

    Recent advances in single-cell sequencing hold great potential for exploring biological systems with unprecedented resolution. Sequencing the genome of individual cells can reveal somatic mutations and allows the investigation of clonal dynamics. Single-cell transcriptome sequencing can elucidate the cell type composition of a sample. However, single-cell sequencing comes with major technical challenges and yields complex data output. In this Primer, we provide an overview of available methods and discuss experimental design and single-cell data analysis. We hope that these guidelines will enable a growing number of researchers to leverage the power of single-cell sequencing.

  10. Design of Experiments for the Thermal Characterization of Metallic Foam

    NASA Technical Reports Server (NTRS)

    Crittenden, Paul E.; Cole, Kevin D.

    2003-01-01

    Metallic foams are being investigated for possible use in the thermal protection systems of reusable launch vehicles. As a result, the performance of these materials needs to be characterized over a wide range of temperatures and pressures. In this paper a radiation/conduction model is presented for heat transfer in metallic foams. Candidates for the optimal transient experiment to determine the intrinsic properties of the model are found by two methods. First, an optimality criterion is used to design a single heating event that determines all of the parameters at once. Second, a pair of heating events is used, in which one event is optimal for finding the parameters related to conduction and the other is optimal for finding the parameters associated with radiation. Simulated data containing random noise were analyzed to determine the parameters using both methods. In all cases the parameter estimates could be improved by analyzing a larger data record than suggested by the optimality criterion.
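
    The idea of selecting measurements by an optimality criterion can be sketched on a toy problem (not the paper's radiation/conduction model). For a two-parameter decay model y = p1·exp(-p2·t), a D-optimal pair of sample times maximizes the determinant of the information matrix XᵀX built from the parameter sensitivities; all names and values below are illustrative:

```python
import math
from itertools import combinations

def sensitivities(t, p1, p2):
    """Row of the Jacobian of y = p1*exp(-p2*t) w.r.t. (p1, p2)."""
    e = math.exp(-p2 * t)
    return (e, -p1 * t * e)

def d_criterion(times, p1=1.0, p2=0.5):
    """det(X^T X) for the two-parameter model at the given sample times."""
    a = b = c = 0.0  # entries of the symmetric 2x2 information matrix
    for t in times:
        s1, s2 = sensitivities(t, p1, p2)
        a += s1 * s1
        b += s1 * s2
        c += s2 * s2
    return a * c - b * b

candidates = [i * 0.5 for i in range(1, 21)]  # candidate times 0.5..10.0
best = max(combinations(candidates, 2), key=d_criterion)
print(best)  # (0.5, 2.5)
```

    As the abstract notes, recording more data than such a minimal optimal design can only add information, which is why the longer records improved the estimates.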

  11. How Instructional Design Experts Use Knowledge and Experience to Solve Ill-Structured Problems

    ERIC Educational Resources Information Center

    Ertmer, Peggy A.; Stepich, Donald A.; York, Cindy S.; Stickman, Ann; Wu, Xuemei (Lily); Zurek, Stacey; Goktas, Yuksel

    2008-01-01

    This study examined how instructional design (ID) experts used their prior knowledge and previous experiences to solve an ill-structured instructional design problem. Seven experienced designers used a think-aloud procedure to articulate their problem-solving processes while reading a case narrative. Results, presented in the form of four…

  12. Creative Minds Abroad: How Design Students Make Meaning of Their International Education Experiences

    ERIC Educational Resources Information Center

    Johnson, Rachel Sherman

    2016-01-01

    The purpose of this study is to explore the ways in which students majoring in a design discipline make meaning of their study abroad experiences in relation to their creativity and creative design work. Students and recent alumni from the College of Design (CDes) at the University of Minnesota-Twin Cities (UMTC) who had studied abroad formed the…

  13. How to Teach Engineering and Industrial Design: a U.K. Experience.

    ERIC Educational Resources Information Center

    Sheldon, D. F.

    1988-01-01

    Explored are the possibilities of teaching engineering through a project approach. Discussed are the introduction, clashing cultures of industrial and engineering design, skills required of a designer, teaching approach to the total design activity, CAD/CAM experiences, and conclusions. (Author/YP)

  14. Technologists Talk: Making the Links between Design, Problem-Solving and Experiences with Hard Materials

    ERIC Educational Resources Information Center

    Potter, Patricia

    2013-01-01

    Design and problem-solving is a key learning focus in technology education and remains a distinguishing factor that separates it from other subject areas. This research investigated how two expert designers considered experiences with hard materials contributed to their learning design and problem-solving with these materials. The research project…

  15. Perceptibility and the "Choice Experience": User Sensory Perceptions and Experiences Inform Vaginal Prevention Product Design.

    PubMed

    Guthrie, Kate Morrow; Dunsiger, Shira; Vargas, Sara E; Fava, Joseph L; Shaw, Julia G; Rosen, Rochelle K; Kiser, Patrick F; Kojic, E Milu; Friend, David R; Katz, David F

    The development of pericoital (on demand) vaginal HIV prevention technologies remains a global health priority. Clinical trials to date have been challenged by nonadherence, leading to an inability to demonstrate product efficacy. The work here provides new methodology and results to begin to address this limitation. We created validated scales that allow users to characterize sensory perceptions and experiences when using vaginal gel formulations. In this study, we sought to understand the user sensory perceptions and experiences (USPEs) that characterize the preferred product experience for each participant. Two hundred four women evaluated four semisolid vaginal formulations using the USPE scales at four randomly ordered formulation evaluation visits. Women were asked to select their preferred formulation experience for HIV prevention among the four formulations evaluated. The scale scores on the Sex-associated USPE scales (e.g., Initial Penetration and Leakage) for each participant's selected formulation were used in a latent class model analysis. Four classes of preferred formulation experiences were identified. Sociodemographic and sexual history variables did not predict class membership; however, four specific scales were significantly related to class: Initial Penetration, Perceived Wetness, Messiness, and Leakage. The range of preferred user experiences represented by the scale scores creates a potential target range for product development, such that products that elicit scale scores that fall within the preferred range may be more acceptable, or tolerable, to the population under study. It is recommended that similar analyses should be conducted with other semisolid vaginal formulations, and in other cultures, to determine product property and development targets.

  16. Workspace design for crane cabins applying a combined traditional approach and the Taguchi method for design of experiments.

    PubMed

    Spasojević Brkić, Vesna K; Veljković, Zorica A; Golubović, Tamara; Brkić, Aleksandar Dj; Kosić Šotić, Ivana

    2016-01-01

    Procedures in the development process of crane cabins are arbitrary and subjective. Since approximately 42% of incidents in the construction industry are linked to cranes, there is a need to collect fresh anthropometric data and provide additional recommendations for design. In this paper, dimensioning of the crane cabin interior space was carried out using a sample of 64 crane operators in the Republic of Serbia; the workspace was described by 10 parameters derived from nine anthropometric measurements taken from each operator. This paper applies experiments run via full factorial designs using a combined traditional and Taguchi approach. The experiments indicated which design parameters are influenced by which anthropometric measurements and to what degree. The results are expected to be of use for crane cabin designers and should assist them to design a cabin that may lead to less strenuous sitting postures and less fatigue for operators, thus improving safety and accident prevention.

  17. Inflatable Re-Entry Vehicle Experiment (IRVE) Design Overview

    NASA Technical Reports Server (NTRS)

    Hughes, Stephen J.; Dillman, Robert A.; Starr, Brett R.; Stephan, Ryan A.; Lindell, Michael C.; Player, Charles J.; Cheatwood, F. McNeil

    2005-01-01

    Inflatable aeroshells offer several advantages over traditional rigid aeroshells for atmospheric entry. Inflatables offer increased payload volume fraction of the launch vehicle shroud and the possibility to deliver more payload mass to the surface for equivalent trajectory constraints. An inflatable's diameter is not constrained by the launch vehicle shroud. The resultant larger drag area can provide deceleration equivalent to a rigid system at higher atmospheric altitudes, thus offering access to higher landing sites. When stowed for launch and cruise, inflatable aeroshells allow access to the payload after the vehicle is integrated for launch and offer direct access to vehicle structure for structural attachment with the launch vehicle. They also offer an opportunity to eliminate system duplication between the cruise stage and entry vehicle. There are however several potential technical challenges for inflatable aeroshells. First and foremost is the fact that they are flexible structures. That flexibility could lead to unpredictable drag performance or an aerostructural dynamic instability. In addition, durability of large inflatable structures may limit their application. They are susceptible to puncture, a potentially catastrophic insult, from many possible sources. Finally, aerothermal heating during planetary entry poses a significant challenge to a thin membrane. NASA Langley Research Center and NASA's Wallops Flight Facility are jointly developing inflatable aeroshell technology for use on future NASA missions. The technology will be demonstrated in the Inflatable Re-entry Vehicle Experiment (IRVE). This paper will detail the development of the initial IRVE inflatable system to be launched on a Terrier/Orion sounding rocket in the fourth quarter of CY2005. The experiment will demonstrate achievable packaging efficiency of the inflatable aeroshell for launch, inflation, leak performance of the inflatable system throughout the flight regime, structural

  18. Researching Design Practices and Design Cognition: Contexts, Experiences and Pedagogical Knowledge-in-Pieces

    ERIC Educational Resources Information Center

    Kali, Yael; Goodyear, Peter; Markauskaite, Lina

    2011-01-01

    If research and development in the field of learning design is to have a serious and sustained impact on education, then technological innovation needs to be accompanied--and probably guided--by good empirical studies of the design practices and design thinking of those who develop these innovations. This article synthesises two related lines of…

  19. Computational design and analysis of flatback airfoil wind tunnel experiment.

    SciTech Connect

    Mayda, Edward A.; van Dam, C.P.; Chao, David D.; Berg, Dale E.

    2008-03-01

    A computational fluid dynamics study of thick wind turbine section shapes in the test section of the UC Davis wind tunnel at a chord Reynolds number of one million is presented. The goals of this study are to validate standard wind tunnel wall corrections for high solid blockage conditions and to reaffirm the favorable effect of a blunt trailing edge, or flatback, on the performance characteristics of a representative thick airfoil shape prior to building the wind tunnel models and conducting the experiment. The numerical simulations prove the standard wind tunnel corrections to be largely valid for the proposed test of airfoils with a 40% maximum thickness-to-chord ratio at a solid blockage ratio of 10%. Comparison of the computed lift characteristics of a sharp trailing edge baseline airfoil and derived flatback airfoils reaffirms the earlier observed trend of reduced sensitivity to surface contamination with increasing trailing edge thickness.
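
    For orientation only, a first-order sketch of the kind of wall correction being validated; the 1/4 factor is a common rule of thumb for combined solid and wake blockage, not a value taken from this study, and the function name and numbers are illustrative:

```python
def corrected_velocity(v_measured, frontal_area, test_section_area):
    """First-order blockage correction: tunnel walls constrain the flow,
    so the effective freestream exceeds the indicated value.
    Assumes epsilon ~= 0.25 * (model frontal area / test section area)."""
    eps = 0.25 * frontal_area / test_section_area
    return v_measured * (1.0 + eps), eps

# A 10% blockage ratio implies roughly a 2.5% velocity correction.
v_c, eps = corrected_velocity(50.0, 0.10, 1.0)
print(round(eps, 3), round(v_c, 2))  # 0.025 51.25
```

    At the 10% blockage of the proposed test, such corrections are no longer small perturbations, which is why validating them numerically before the experiment was worthwhile.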

  20. An optics education program designed around experiments with small telescopes

    NASA Astrophysics Data System (ADS)

    Pompea, Stephen M.; Sparks, Robert T.; Walker, Constance E.; Dokter, Erin F. C.

    2010-08-01

    The National Optical Astronomy Observatory has led the development of a new telescope kit for kids as part of a strategic plan to interest young children in science. This telescope has been assembled by tens of thousands of children nationwide, who are now using this high-quality telescope to conduct optics experiments and to make astronomical observations. The Galileoscope telescope kit and its associated educational program are an outgrowth of the NSF sponsored "Hands-On Optics" (HOO) project, a collaboration of the SPIE, the Optical Society of America, and NOAO. This project developed optics kits and activities for upper elementary students and has reached over 20,000 middle school kids in afterschool programs. HOO is a highly flexible educational program and was featured as an exemplary informal science program by the National Science Teachers Association. Our new "Teaching with Telescopes" program builds on HOO, the Galileoscope and other successful optical education projects.

  1. Tokamak Physics Experiment (TPX) power supply design and development

    SciTech Connect

    Neumeyer, C.; Bronner, G.; Lu, E.; Ramakrishnan, S.

    1995-04-01

    The Tokamak Physics Experiment (TPX) is an advanced tokamak project aimed at the production of quasi-steady state plasmas with advanced shape, heating, and particle control. TPX is to be built at the Princeton Plasma Physics Laboratory (PPPL) using many of the facilities from the Tokamak Fusion Test Reactor (TFTR). TPX will be the first tokamak to utilize superconducting (SC) magnets in both the toroidal field (TF) and poloidal field (PF) systems. This new feature requires a departure from the traditional tokamak power supply schemes. This paper describes the plan for the adaptation of the PPPL/TFTR power system facilities to supply TPX. Five major areas are addressed, namely the AC power system, the TF, PF and Fast Plasma Position Control (FPPC) power supplies, and quench protection for the TF and PF systems. Special emphasis is placed on the development of new power supply and protection schemes.

  2. Conceptual design for spacelab two-phase flow experiments

    NASA Technical Reports Server (NTRS)

    Bradshaw, R. D.; King, C. D.

    1977-01-01

    KC-135 aircraft tests confirmed the gravity sensitivity of two-phase flow correlations. The prime component of the apparatus is a 1.5 cm diameter by 90 cm fused quartz tube test section selected for visual observation. The water/cabin-air system with water recycle was a clear choice for the flow-regime and pressure-drop test since it was used satisfactorily in the KC-135 tests. Freon-11, with either overboard dump or liquid recycle, will be used for the heat transfer test. The two experiments use common hardware. The experimental plan covers 120 data points in six hours, with mass velocities from 10 to 640 kg/(s·m²) and qualities from 0.01 to 0.64. The apparatus, with pump, separator, storage tank and controls, is mounted in a double Spacelab rack. Supporting hardware, procedures, measured variables and program costs are defined.

  3. Preliminary design for Arctic atmospheric radiative transfer experiments

    NASA Technical Reports Server (NTRS)

    Zak, B. D.; Church, H. W.; Stamnes, K.; Shaw, G.; Filyushkin, V.; Jin, Z.; Ellingson, R. G.; Tsay, S. C.

    1995-01-01

    If current plans are realized, within the next few years, an extraordinary set of coordinated research efforts focusing on energy flows in the Arctic will be implemented. All are motivated by the prospect of global climate change. SHEBA (Surface Energy Budget of the Arctic Ocean), led by the National Science Foundation (NSF) and the Office of Naval Research (ONR), involves instrumenting an ice camp in the perennial Arctic ice pack, and taking data for 12-18 months. The ARM (Atmospheric Radiation Measurement) North Slope of Alaska and Adjacent Arctic Ocean (NSA/AAO) Cloud and Radiation Testbed (CART) focuses on atmospheric radiative transport, especially in the presence of clouds. The NSA/AAO CART involves instrumenting a sizeable area on the North Slope of Alaska and adjacent waters in the vicinity of Barrow, and acquiring data over a period of about 10 years. FIRE (First ISCCP (International Satellite Cloud Climatology Program) Regional Experiment) Phase 3 is a program led by the National Aeronautics and Space Administration (NASA) which focuses on Arctic clouds, and which is coordinated with SHEBA and ARM. FIRE has historically emphasized data from airborne and satellite platforms. All three programs anticipate initiating Arctic data acquisition during spring 1997. In light of this historic opportunity, the authors discuss a strawman atmospheric radiative transfer experimental plan that identifies which features of the radiative transport models they think should be tested, what experimental data are required for each type of test, the platforms and instrumentation necessary to acquire those data, and in general terms, how the experiments could be conducted. Aspects of the plan are applicable to all three programs.

  4. Designing Critical Experiments in Support of Full Burnup Credit

    SciTech Connect

    Mueller, Don; Roberts, Jeremy A

    2008-01-01

    Burnup credit is the process of accounting for the negative reactivity due to fuel burnup and the generation of parasitic absorbers over a fuel assembly's lifetime. For years, the fresh fuel assumption was used as a simple bound in criticality work for used fuel storage and transportation. More recently, major actinides have been included [1]. However, even this yields a highly conservative estimate in criticality calculations. Because of the numerous economic benefits that including all available negative reactivity (i.e., full burnup credit) could provide [2], it is advantageous to work toward full burnup credit. Unfortunately, comparatively little work has been done to include non-major actinides and other fission products (FP) in burnup credit analyses, due in part to insufficient experimental data for validation of codes and nuclear data. The Burnup Credit Criticality Experiment (BUCCX) at Sandia National Laboratories was a set of experiments with ¹⁰³Rh that have relevance for burnup credit [3]. This work uses TSUNAMI-3D to investigate and adjust a BUCCX model to match isotope-specific, energy-dependent k_eff sensitivity profiles to those of a representative high-capacity cask model (GBC-32) [4] for each FP of interest. The isotopes considered are ¹⁴⁹Sm, ¹⁴³Nd, ¹⁰³Rh, ¹³³Cs, ¹⁵⁵Gd, ¹⁵²Sm, ⁹⁹Tc, ¹⁴⁵Nd, ¹⁵³Eu, ¹⁴⁷Sm, ¹⁰⁹Ag, ⁹⁵Mo, ¹⁵⁰Sm, ¹⁰¹Ru, and ¹⁵¹Eu. The goal is to understand the biases and bias uncertainties inherent in nuclear data and, ultimately, to apply these in support of full burnup credit.
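
    TSUNAMI quantifies the match between an experiment and an application with its own integral indices; as a simple stand-in, a cosine similarity between two energy-dependent sensitivity profiles conveys the idea (the coarse-group values below are invented for illustration):

```python
import math

def profile_similarity(s_exp, s_app):
    """Cosine similarity of two k-eff sensitivity profiles:
    1.0 means identically shaped energy-dependent responses."""
    dot = sum(a * b for a, b in zip(s_exp, s_app))
    norm_e = math.sqrt(sum(a * a for a in s_exp))
    norm_a = math.sqrt(sum(b * b for b in s_app))
    return dot / (norm_e * norm_a)

# Hypothetical coarse-group 103Rh sensitivities: experiment vs. cask model
bucc = [-0.002, -0.010, -0.035, -0.012]
cask = [-0.003, -0.012, -0.030, -0.010]
print(round(profile_similarity(bucc, cask), 3))  # 0.994
```

    The closer an adjusted experiment's profile tracks the cask model's, the more directly its measured biases transfer to the application.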

  5. Combining theory and experiment in electrocatalysis: Insights into materials design.

    PubMed

    Seh, Zhi Wei; Kibsgaard, Jakob; Dickens, Colin F; Chorkendorff, Ib; Nørskov, Jens K; Jaramillo, Thomas F

    2017-01-13

    Electrocatalysis plays a central role in clean energy conversion, enabling a number of sustainable processes for future technologies. This review discusses design strategies for state-of-the-art heterogeneous electrocatalysts and associated materials for several different electrochemical transformations involving water, hydrogen, and oxygen, using theory as a means to rationalize catalyst performance. By examining the common principles that govern catalysis for different electrochemical reactions, we describe a systematic framework that clarifies trends in catalyzing these reactions, serving as a guide to new catalyst development while highlighting key gaps that need to be addressed. We conclude by extending this framework to emerging clean energy reactions such as hydrogen peroxide production, carbon dioxide reduction, and nitrogen reduction, where the development of improved catalysts could allow for the sustainable production of a broad range of fuels and chemicals.

  6. Rational experiment design for sequencing-based RNA structure mapping.

    PubMed

    Aviran, Sharon; Pachter, Lior

    2014-12-01

    Structure mapping is a classic experimental approach for determining nucleic acid structure that has gained renewed interest in recent years following advances in chemistry, genomics, and informatics. The approach encompasses numerous techniques that use different means to introduce nucleotide-level modifications in a structure-dependent manner. Modifications are assayed via cDNA fragment analysis, using electrophoresis or next-generation sequencing (NGS). The recent advent of NGS has dramatically increased the throughput, multiplexing capacity, and scope of RNA structure mapping assays, thereby opening new possibilities for genome-scale, de novo, and in vivo studies. From an informatics standpoint, NGS is more informative than prior technologies by virtue of delivering direct molecular measurements in the form of digital sequence counts. Motivated by these new capabilities, we introduce a novel model-based in silico approach for quantitative design of large-scale multiplexed NGS structure mapping assays, which takes advantage of the direct and digital nature of NGS readouts. We use it to characterize the relationship between controllable experimental parameters and the precision of mapping measurements. Our results highlight the complexity of these dependencies and shed light on relevant tradeoffs and pitfalls, which can be difficult to discern by intuition alone. We demonstrate our approach by quantitatively assessing the robustness of SHAPE-Seq measurements, obtained by multiplexing SHAPE (selective 2'-hydroxyl acylation analyzed by primer extension) chemistry in conjunction with NGS. We then utilize it to elucidate design considerations in advanced genome-wide approaches for probing the transcriptome, which recently obtained in vivo information using dimethyl sulfate (DMS) chemistry.
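The count-statistics reasoning underlying such a model-based design approach can be illustrated with a deliberately simplified binomial sketch (this is a toy illustration, not the authors' model; the parameter names are hypothetical):

```python
import math

def stop_rate_rel_error(reads, stop_prob):
    """Approximate relative standard error of an estimated per-site
    modification (stop) rate when `reads` cDNA fragments cover a site and
    a fraction `stop_prob` of them terminate there.

    With binomial counting statistics, Var(p_hat) = p(1-p)/n, so the
    relative error of p_hat is sqrt((1-p)/(n*p))."""
    return math.sqrt((1.0 - stop_prob) / (reads * stop_prob))

# Quadrupling the read depth halves the relative error of the estimate:
for n in (1_000, 4_000, 16_000):
    print(n, round(stop_rate_rel_error(n, 0.02), 4))
```

Even this crude model shows how sequencing depth trades off against per-site precision, one of the dependencies the paper characterizes more rigorously.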

  7. Optimal experiment design for cardinal values estimation: guidelines for data collection.

    PubMed

    Bernaerts, K; Gysemans, K P M; Nhan Minh, T; Van Impe, J F

    2005-04-15

    Optimal experiment design for parameter estimation (OED/PE) is an interesting technique for modelling practice when aiming for maximum parameter estimation accuracy. Nowadays, experimental designs for secondary modelling within the field of predictive microbiology are mostly arbitrary or based on factorial design. The latter type of design is common practice in response surface modelling approaches. A number of levels of the factor(s) under study are selected and all possible treatment combinations are performed. It is, however, not always clear which levels and treatment combinations are most relevant. An answer to this question can be obtained from optimal experiment design for, in this particular case, parameter estimation. This technique is based on the extremisation of a scalar function of the Fisher information matrix. The type of scalar function determines the final focus of the optimised design. In this paper, optimal experiment designs are computed for the cardinal temperature model with inflection point (CTMI) and the cardinal pH model (CPM). A model output sensitivity analysis (depicting the sensitivity of the model output to a small change in the model parameters) yields a first indication of relevant temperature or pH treatments. The designs performed are: D-optimal design, aiming for maximum global parameter estimation accuracy (by maximising the determinant of the Fisher information matrix), and E-optimal design, improving the confidence in the most uncertain model parameter (by maximising the smallest eigenvalue of the Fisher information matrix). Although they lower the information content of a set of experiments, boundary values on the design region need to be imposed during optimisation to exclude unworkable experiments and partly account for incorrect nominal parameter values. 
As opposed to the frequently applied equidistant or arbitrary treatment placement, optimal design results show that typically four informative temperature or pH levels are selected and
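The D- and E-optimality criteria named in this abstract can be computed directly from a model's sensitivity (Jacobian) matrix. A minimal sketch with hypothetical sensitivity values, not the CTMI or CPM models themselves:

```python
import numpy as np

def fisher_information(jacobian, sigma=1.0):
    """Fisher information matrix for least-squares estimation with iid
    Gaussian measurement noise: F = (1/sigma^2) * J^T J, where
    J[i, j] = d(model output i) / d(parameter j)."""
    J = np.asarray(jacobian, dtype=float)
    return (J.T @ J) / sigma**2

def d_criterion(F):
    # D-optimality: maximise det(F), i.e. minimise the generalised
    # variance det(F^-1) of the parameter estimates.
    return float(np.linalg.det(F))

def e_criterion(F):
    # E-optimality: maximise the smallest eigenvalue of F, tightening the
    # confidence region along its worst-determined direction.
    return float(np.linalg.eigvalsh(F).min())

# Hypothetical sensitivities of a two-parameter model at two candidate
# four-level designs (illustrative numbers only):
J_equidistant = np.array([[1.0, 0.2], [0.9, 0.5], [0.7, 0.9], [0.3, 1.0]])
J_optimised   = np.array([[1.2, 0.1], [0.2, 1.1], [1.0, 0.9], [0.1, 0.3]])

for name, J in (("equidistant", J_equidistant), ("optimised", J_optimised)):
    F = fisher_information(J)
    print(name, "D:", round(d_criterion(F), 3), "E:", round(e_criterion(F), 3))
```

An OED/PE search would optimise the treatment levels (and hence the rows of J) against one of these scalar criteria, subject to the design-region bounds the abstract mentions.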

  8. Optimal input experiment design and parameter estimation in core-scale pressure oscillation experiments

    NASA Astrophysics Data System (ADS)

    Potters, M. G.; Mansoori, M.; Bombois, X.; Jansen, J. D.; Van den Hof, P. M. J.

    2016-03-01

    This paper considers Pressure Oscillation (PO) experiments for which we find the minimum experiment time that guarantees user-imposed parameter variance upper bounds and honours actuator limits. The parameters permeability and porosity are estimated with a classical least-squares estimation method, for which an expression for the covariance matrix of the estimates is calculated. This expression is used to tackle the optimization problem. We study the Dynamic Darcy Cell experiment set-up (Heller et al., 2002) and focus on data generation using square wave actuator signals, which, as we shall prove, deliver shorter experiment times than sinusoidal ones. Parameter identification is achieved using either inlet pressure/outlet pressure measurements (Heller et al., 2002) or actuator position/outlet pressure measurements, where the latter is a novel approach. The solution to the optimization problem reveals that for both measurement methods an optimal excitation frequency, an optimal inlet volume, and an optimal outlet volume exist. We find that, under the same parameter variance bounds and actuator constraints, actuator position/outlet pressure measurements result in required experiment times a factor of fourteen shorter than inlet pressure/outlet pressure measurements. This result is analysed in detail, and we find that the dominant effect driving the difference originates from an identifiability problem when using inlet-outlet pressure measurements for joint estimation of permeability and porosity. We illustrate our results with numerical simulations, and show excellent agreement with theoretical expectations.

  9. Preliminary design polymeric materials experiment. [for space shuttles and Spacelab missions

    NASA Technical Reports Server (NTRS)

    Mattingly, S. G.; Rude, E. T.; Marshner, R. L.

    1975-01-01

    A typical Advanced Technology Laboratory mission flight plan was developed and used as a guideline for the identification of a number of experiment considerations. The experiment logistics, beginning with sample preparation and ending with sample analysis, are then overlaid on the mission in order to have a complete picture of the design requirements. The results of this preliminary design study fall into two categories. First, specific preliminary designs of experiment hardware that are adaptable to a variety of mission requirements. Second, identification of those mission considerations which affect hardware design and will require further definition prior to final design. Finally, a program plan is presented which will provide the necessary experiment hardware in a realistic time period to match the planned shuttle flights. A bibliography of all material reviewed and consulted but not specifically referenced is provided.

  10. Multichannel readout ASIC design flow for high energy physics and cosmic rays experiments

    NASA Astrophysics Data System (ADS)

    Voronin, A.; Malankin, E.

    2016-02-01

    In large-scale high energy physics and astrophysics experiments, multi-channel readout application specific integrated circuits (ASICs) are widely used. The ASICs for such experiments are complicated systems, which usually include both analog and digital building blocks. The complexity and large number of channels in such ASICs require a proper methodological approach to their design. This paper presents a mixed-signal design flow for ASICs for high energy physics and cosmic ray experiments. The flow was successfully applied to the development of a readout ASIC prototype for the muon chambers of the CBM experiment, and was proven in the UMC 180 nm CMOS MMRF process. The design flow enables analysis of mixed-signal system operation at different levels: functional, behavioural, schematic, and post-layout including parasitic elements. The proposed design flow reduces the simulation period and eliminates functionality mismatches at a very early stage of the design.

  11. Best Practices for Operando Battery Experiments: Influences of X-ray Experiment Design on Observed Electrochemical Reactivity

    SciTech Connect

    Borkiewicz, O. J.; Wiaderek, Kamila M.; Chupas, Peter J.; Chapman, Karena W.

    2015-06-04

    Dynamic properties and multiscale complexities governing electrochemical energy storage in batteries are most ideally interrogated under simulated operating conditions within an electrochemical cell. We assess how electrochemical reactivity can be impacted by experiment design, including by the X-ray measurements themselves and by common features or adaptations of the electrochemical cells that enable X-ray measurements.

  12. W/V-Band RF Propagation Experiment Design

    NASA Technical Reports Server (NTRS)

    Acosta, Roberto J.; Nessel, James A.; Simons, Rainee N.; Zemba, Michael J.; Morse, Jacquelynne Rose; Budinger, James M.

    2012-01-01

    The utilization of frequency spectrum for space-to-ground communications applications has generally progressed from the lowest available bands capable of supporting transmission through the atmosphere to the higher bands, which have required research and technological advancement to implement. As communications needs increase and the available spectrum in the microwave frequency bands (3-30 GHz) becomes congested globally, future systems will move into the millimeter wave (mm-wave) range (30-300 GHz). While current systems operate in the Ka-band (20-30 GHz), systems planned for the coming decades will initiate operations in the Q-band (33-50 GHz), V-band (50-75 GHz), and W-band (75-110 GHz) of the spectrum. These bands offer extremely broadband capabilities (contiguous allocations of 500 MHz to 1 GHz or more) and an uncluttered spectrum for a wide range of applications. NASA, DoD, and commercial missions that can benefit from moving into the mm-wave bands include data relay and near-Earth data communications, unmanned aircraft communications, NASA science missions, and commercial broadcast/internet services, all able to be implemented via very small terminals. NASA Glenn Research Center has a long history of performing the inherently governmental function of opening new frequency spectrum by characterizing atmospheric effects on electromagnetic propagation and collaborating with the satellite communication industry to develop specific communications technologies for use by NASA and the nation. Along these lines, there are critical issues related to W/V-band propagation that need to be thoroughly understood before design of any operational system can commence. These issues arise primarily due to the limitations imposed on W/V-band signal propagation by the Earth's atmosphere, and to the fundamental lack of understanding of these effects with regard to proper system design and fade mitigation. In this paper, the GRC RF propagation team recommends measurements

  13. Computational design of short pulse laser driven iron opacity experiments

    DOE PAGES

    Martin, M. E.; London, R. A.; Goluoglu, S.; ...

    2017-02-23

    Here, the resolution of current disagreements between solar parameters calculated from models and observations would benefit from the experimental validation of theoretical opacity models. Iron's complex ionic structure and large contribution to the opacity in the radiative zone of the sun make iron a good candidate for validation. Short pulse lasers can be used to heat buried layer targets to plasma conditions comparable to the radiative zone of the sun, and the frequency dependent opacity can be inferred from the target's measured x-ray emission. Target and laser parameters must be optimized to reach specific plasma conditions and meet x-ray emission requirements. The HYDRA radiation hydrodynamics code is used to investigate the effects of modifying laser irradiance and target dimensions on the plasma conditions, x-ray emission, and inferred opacity of iron and iron-magnesium buried layer targets. It was determined that plasma conditions are dominantly controlled by the laser energy and the tamper thickness. The accuracy of the inferred opacity is sensitive to tamper emission and optical depth effects. Experiments at conditions relevant to the radiative zone of the sun would investigate the validity of opacity theories important to resolving disagreements between solar parameters calculated from models and observations.

  14. Designing Forest Adaptation Experiments through Manager-Scientist Partnerships

    NASA Astrophysics Data System (ADS)

    Nagel, L. M.; Swanston, C.; Janowiak, M.

    2014-12-01

    Three common forest adaptation options discussed in the context of an uncertain future climate are: creating resistance, promoting resilience, and enabling forests to respond to change. Though there is consensus on the broad management goals addressed by each of these options, translating these concepts into management plans specific to individual forest types that vary in structure, composition, and function remains a challenge. We will describe a decision-making framework that we employed within a manager-scientist partnership to develop a suite of adaptation treatments for two contrasting forest types as part of a long-term forest management experiment. The first, in northern Minnesota, is a red pine-dominated forest with components of white pine, aspen, paper birch, and northern red oak, with a hazel understory. The second, in southwest Colorado, is a warm-dry mixed conifer forest dominated by ponderosa pine, white fir, and Douglas-fir, with scattered aspen and an understory of Gambel oak. The current conditions at both sites are characterized by overstocking, moderate-to-high fuel loading, and vulnerability to numerous forest health threats, and are generally uncharacteristic of historic structure and composition. The desired future condition articulated by managers for each site included elements of historic structure and natural range of variability, but was greatly tempered by known vulnerabilities and projected changes to climate and disturbance patterns. The resultant treatments are distinct for each forest type and address a wide range of management objectives.

  15. Computer models for designing hypertension experiments and studying concepts.

    PubMed

    Guyton, A C; Montani, J P; Hall, J E; Manning, R D

    1988-04-01

    This paper demonstrates how computer models, along with animal experiments, have been used to work out the conceptual bases of hypertensive mechanisms, especially the following: (1) The renal-fluid volume pressure control mechanism has an infinite feedback gain for pressure control. Therefore, the chronic level to which the arterial pressure is controlled can be changed only by altering this pressure control mechanism. (2) An increase in total peripheral resistance is not sufficient by itself to cause hypertension. The only resistances in the circulatory system that, when increased, will cause hypertension are those along a restricted axis from the root of the aorta to Bowman's capsule in the kidneys. (3) Autoregulation in the peripheral vascular beds does not increase the arterial pressure in hypertension. However, autoregulation can convert high cardiac output hypertension into high peripheral resistance hypertension. (4) In a computer simulation that cannot yet be performed in animals, a simulated hypertension caused by a combination of increased renal afferent and efferent arteriolar resistances has characteristics that match almost exactly those of essential hypertension.
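Point (1), the infinite steady-state feedback gain of the renal-fluid volume mechanism, is the behavior of any integral controller, and can be illustrated with a toy simulation (hypothetical parameters and time scales, not Guyton's actual model):

```python
def simulate_pressure(setpoint, disturbance, k_renal=0.1, steps=20_000, dt=0.01):
    """Toy integral-feedback model of the renal-fluid volume mechanism:
    the kidney adjusts blood volume (and hence pressure) at a rate
    proportional to the pressure error, so any sustained disturbance is
    eventually cancelled completely -- an infinite steady-state gain."""
    pressure = setpoint
    volume_effect = 0.0
    for _ in range(steps):
        # Renal output integrates the pressure error over time.
        volume_effect += -k_renal * (pressure - setpoint) * dt
        pressure = setpoint + disturbance + volume_effect
    return pressure

# Despite a sustained +20 mmHg disturbance (e.g. a resistance increase
# outside the aorta-to-Bowman's-capsule axis), pressure returns to setpoint:
print(round(simulate_pressure(100.0, 20.0), 2))  # → 100.0
```

A proportional controller, by contrast, would leave a residual offset, which is why only changes to the renal mechanism itself can shift the chronic pressure level in this framework.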

  16. A Guide to Designing Future Ground-based CMB Experiments

    SciTech Connect

    Wu, W. L.K.; Errard, J.; Dvorkin, C.; Kuo, C. L.; Lee, A. T.; McDonald, P.; Slosar, A.; Zahn, O.

    2014-02-18

    In this follow-up work to the High Energy Physics Community Summer Study 2013 (HEP CSS 2013, a.k.a. Snowmass), we explore the scientific capabilities of a future Stage-IV Cosmic Microwave Background polarization experiment (CMB-S4) under various assumptions on detector count, resolution, and sky coverage. We use the Fisher matrix technique to calculate the expected uncertainties in cosmological parameters in νΛCDM that are especially relevant to the physics of fundamental interactions, including neutrino masses, effective number of relativistic species, dark-energy equation of state, dark-matter annihilation, and inflationary parameters. To further chart the landscape of future cosmology probes, we include forecasted results from the Baryon Acoustic Oscillation (BAO) signal as measured by DESI to constrain parameters that would benefit from low redshift information. We find the following best 1-σ constraints: σ(Mν) = 15 meV, σ(Neff) = 0.0156, Dark energy Figure of Merit = 303, σ(pann) = 0.00588 × 3 × 10{sup -26} cm{sup 3}/s/GeV, σ(ΩK) = 0.00074, σ(ns) = 0.00110, σ(αs) = 0.00145, and σ(r) = 0.00009. We also detail the dependences of the parameter constraints on detector count, resolution, and sky coverage.
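The Fisher-matrix forecasting machinery used here reduces to two simple operations: marginalised 1-σ uncertainties come from the inverse Fisher matrix, and independent probes (e.g. CMB plus BAO) combine by adding their Fisher matrices. A sketch with toy numbers (not the CMB-S4/DESI forecasts of this paper):

```python
import numpy as np

def marginalised_sigmas(F):
    """Marginalised 1-sigma parameter uncertainties from a Fisher matrix:
    sigma_i = sqrt((F^-1)_ii)."""
    return np.sqrt(np.diag(np.linalg.inv(F)))

# Toy two-parameter Fisher matrices for two independent probes
# (illustrative numbers only):
F_cmb = np.array([[4.0, 1.0], [1.0, 2.0]])
F_bao = np.array([[1.0, -0.5], [-0.5, 3.0]])

print("CMB alone:", marginalised_sigmas(F_cmb))
print("CMB + BAO:", marginalised_sigmas(F_cmb + F_bao))
```

Adding the BAO information can only shrink (or leave unchanged) each marginalised uncertainty, which is why low-redshift data helps break parameter degeneracies in the combined forecast.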

  17. Computational design of short pulse laser driven iron opacity experiments

    NASA Astrophysics Data System (ADS)

    Martin, M. E.; London, R. A.; Goluoglu, S.; Whitley, H. D.

    2017-02-01

    The resolution of current disagreements between solar parameters calculated from models and observations would benefit from the experimental validation of theoretical opacity models. Iron's complex ionic structure and large contribution to the opacity in the radiative zone of the sun make iron a good candidate for validation. Short pulse lasers can be used to heat buried layer targets to plasma conditions comparable to the radiative zone of the sun, and the frequency dependent opacity can be inferred from the target's measured x-ray emission. Target and laser parameters must be optimized to reach specific plasma conditions and meet x-ray emission requirements. The HYDRA radiation hydrodynamics code is used to investigate the effects of modifying laser irradiance and target dimensions on the plasma conditions, x-ray emission, and inferred opacity of iron and iron-magnesium buried layer targets. It was determined that plasma conditions are dominantly controlled by the laser energy and the tamper thickness. The accuracy of the inferred opacity is sensitive to tamper emission and optical depth effects. Experiments at conditions relevant to the radiative zone of the sun would investigate the validity of opacity theories important to resolving disagreements between solar parameters calculated from models and observations.

  18. Geothermal injection treatment: process chemistry, field experiences, and design options

    SciTech Connect

    Kindle, C.H.; Mercer, B.W.; Elmore, R.P.; Blair, S.C.; Myers, D.A.

    1984-09-01

    The successful development of geothermal reservoirs to generate electric power will require the injection disposal of approximately 700,000 gal/h (2.6 x 10{sup 6} l/h) of heat-depleted brine for every 50,000 kW of generating capacity. To maintain injectability, the spent brine must be compatible with the receiving formation. The factors that influence this brine/formation compatibility, and tests to quantify them, are discussed in this report. Some form of treatment will be necessary prior to injection in most situations; the process chemistry involved in avoiding and/or accelerating the formation of precipitate particles is also discussed. The treatment processes, either avoidance or controlled-precipitation approaches, are described in terms of their principles and demonstrated applications in the geothermal field and, where such experience is limited, in other industrial use. Monitoring techniques for tracking particulate growth and the effects of process parameters on corrosion and well injectability are presented. Examples of brine injection, preinjection treatment, and recovery from injectivity loss are examined and related to the aspects listed above.

  19. Design of experiments for enantiomeric separation in supercritical fluid chromatography.

    PubMed

    Landagaray, Elodie; Vaccher, Claude; Yous, Saïd; Lipka, Emmanuelle

    2016-02-20

    A new chiral melatoninergic ligand, a potential successor of Valdoxan(®) presenting an improved pharmacological profile relative to agomelatine, was chosen as a probe for a supercritical fluid chromatographic separation carried out on an amylose tris[(S)-1-α-methylbenzylcarbamate] based stationary phase. The goal of this work was to simultaneously optimize three factors identified as having a significant influence, in order to obtain the best resolution in the shortest analysis time (i.e., retention time of the second eluting enantiomer) for this chiral compound. For this purpose, a central circumscribed composite (CCC) design was developed with three factors (the flow-rate, the outlet pressure, and the percentage of ethanol) to optimize two responses: shortest analysis time and best resolution. The optimal conditions obtained via the optimizer mode of the software (using the Nelder-Mead method), i.e., CO2/EtOH 86:14 (v:v), 104 bar, 3.2 mL min(-1) at 35°C, led to a resolution of 3.27 in less than 6 min. These conditions were transposed to a preparative scale, where a concentrated methanolic solution of 40 mM was injected with a sample loop of 100 μL. This step allowed the separation of around 65 mg of the racemic melatoninergic ligand in only 3 h with impressive yield (97%) and enantiomeric excess (99.5%).
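The run list of a circumscribed central composite design like the one above can be enumerated mechanically in coded units; a sketch (the rotatable choice of axial distance α = (2^k)^(1/4) is one common convention, not necessarily the one used in this study):

```python
import itertools
import numpy as np

def central_composite_circumscribed(k):
    """Coded design points of a circumscribed central composite (CCC)
    design in k factors: 2^k factorial corners at +/-1, 2k axial ("star")
    points at +/-alpha with alpha = (2**k)**0.25 (rotatable), plus one
    centre point (centre points are usually replicated in practice)."""
    alpha = (2 ** k) ** 0.25
    corners = [list(p) for p in itertools.product([-1.0, 1.0], repeat=k)]
    axial = []
    for i in range(k):
        for s in (-alpha, alpha):
            pt = [0.0] * k
            pt[i] = s
            axial.append(pt)
    centre = [[0.0] * k]
    return np.array(corners + axial + centre)

# Three factors (e.g. flow-rate, outlet pressure, % ethanol):
# 8 corner runs + 6 axial runs + 1 centre run = 15 distinct coded points.
design = central_composite_circumscribed(3)
print(design.shape)  # → (15, 3)
```

Each coded point is then mapped onto the real factor ranges (here, flow-rate, outlet pressure, and ethanol percentage) before the runs are executed.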

  20. Designing an artificial pancreas architecture: the AP@home experience.

    PubMed

    Lanzola, Giordano; Toffanin, Chiara; Di Palma, Federico; Del Favero, Simone; Magni, Lalo; Bellazzi, Riccardo

    2015-12-01

    The latest achievements in sensor technologies for blood glucose level monitoring, pump miniaturization for insulin delivery, and the availability of portable computing devices are paving the way toward the artificial pancreas as a treatment for diabetes patients. This device encompasses a controller unit that oversees the administration of insulin micro-boluses and continuously drives the pump based on blood glucose readings acquired in real time. In order to foster research on the artificial pancreas and prepare for its adoption as a therapy, the European Union in 2010 funded the AP@home project, following a series of efforts already ongoing in the USA. This paper, authored by members of the AP@home consortium, reports on the technical issues concerning the design and implementation of an architecture supporting the exploitation of an artificial pancreas platform. First, a PC-based platform was developed by the authors to prove the effectiveness and reliability of the algorithms responsible for insulin administration. A mobile-based one was then adopted to improve patient comfort. Both platforms were tested on real patients, and a description of the goals, the achievements, and the major shortcomings that emerged during those trials is also reported in the paper.

  1. Woody Vegetation on Levees? - Research Experiences and Design Suggestions

    NASA Astrophysics Data System (ADS)

    Lammeranner, Walter

    2013-04-01

    Recent flood events in Austria have reawakened practical and scientific interest in the stability of levees. One focus has been the relationship between vegetation and levee stability, with special reference to the role of woody plants. The effects of woody plants are undoubtedly manifold. On the one hand, they can have a negative influence and endanger levees, which is why many guidelines ban woody vegetation to preserve stability, visual inspection, and unhindered flood-fight access. On the other hand, woody vegetation can have several positive impacts on soil stability; which effects prevail depends largely on the types and characteristics of the plants. This shows how controversial woody plants on levees can be and underscores the strong need for further research in this field. In order to obtain new insights into this controversial issue, a research project was launched by the Institute of Soil Bioengineering and Landscape Construction at the University of Natural Resources and Life Sciences, Vienna. The project deals with several aspects of the effects woody plants have on levees and focuses particularly on shrubby woody plants. The examined vegetation type is a dense stand of Purple-Willows (Salix purpurea L.), commonly used for stabilization of river embankments. This contribution discusses the results with reference to levee stability and existing levee vegetation guidelines, and gives design suggestions for compatible woody vegetation on levees.

  2. Optimal Experiment Design for Monoexponential Model Fitting: Application to Apparent Diffusion Coefficient Imaging.

    PubMed

    Alipoor, Mohammad; Maier, Stephan E; Gu, Irene Yu-Hua; Mehnert, Andrew; Kahl, Fredrik

    2015-01-01

    The monoexponential model is widely used in quantitative biomedical imaging. Notable applications include apparent diffusion coefficient (ADC) imaging and pharmacokinetics. The application of ADC imaging to the detection of malignant tissue has in turn prompted several studies concerning optimal experiment design for monoexponential model fitting. In this paper, we propose a new experiment design method that is based on minimizing the determinant of the covariance matrix of the estimated parameters (D-optimal design). In contrast to previous methods, D-optimal design is independent of the imaged quantities. Applying this method to ADC imaging, we demonstrate its steady performance for the whole range of input variables (imaged parameters, number of measurements, and range of b-values). Using Monte Carlo simulations we show that the D-optimal design outperforms existing experiment design methods in terms of accuracy and precision of the estimated parameters.
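For the monoexponential signal model S(b) = S0·exp(-b·ADC), the D-optimality criterion described above can be evaluated for any candidate set of b-values from the model's Jacobian. A minimal sketch (hypothetical tissue values and candidate designs, not the paper's method in full):

```python
import numpy as np

def d_optimality(b_values, s0, adc, sigma=1.0):
    """Determinant of the Fisher information matrix for the monoexponential
    model S(b) = S0 * exp(-b * ADC) under iid Gaussian noise. D-optimal
    design maximises this determinant, which is equivalent to minimising
    the determinant of the parameter covariance matrix."""
    b = np.asarray(b_values, dtype=float)
    s = s0 * np.exp(-b * adc)
    # Jacobian columns: dS/dS0 = exp(-b*ADC), dS/dADC = -b * S0 * exp(-b*ADC)
    J = np.column_stack([s / s0, -b * s])
    return float(np.linalg.det(J.T @ J / sigma**2))

# Hypothetical tissue (ADC = 1e-3 mm^2/s) and two candidate five-point
# b-value sets in s/mm^2:
uniform   = [0, 250, 500, 750, 1000]
two_point = [0, 0, 1000, 1000, 1000]  # repeated acquisitions at two b-values
for name, bs in (("uniform", uniform), ("two-point", two_point)):
    print(name, d_optimality(bs, s0=1.0, adc=1e-3))
```

Comparing such scores across candidate b-value sets is the basic move of the design search; the paper's contribution includes showing how the D-optimal choice behaves across the full range of imaged parameters and measurement counts.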

  3. Maximize, minimize or target - optimization for a fitted response from a designed experiment

    DOE PAGES

    Anderson-Cook, Christine Michaela; Cao, Yongtao; Lu, Lu

    2016-04-01

    One of the common goals of running and analyzing a designed experiment is to find a location in the design space that optimizes the response of interest. Depending on the goal of the experiment, we may seek to maximize or minimize the response, or set the process to hit a particular target value. After the designed experiment, a response model is fitted and the optimal settings of the input factors are obtained based on the estimated response model. Furthermore, the suggested optimal settings of the input factors are then used in the production environment.
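The maximize/target workflow described above, fit a response model to the designed-experiment data, then solve for the optimal factor settings, can be sketched for the simplest one-factor quadratic case (hypothetical data; a real study would use a multi-factor model and account for estimation uncertainty):

```python
import numpy as np

def fitted_optimum(x, y):
    """Fit a one-factor quadratic response model y = b0 + b1*x + b2*x^2 by
    least squares and return the stationary point x* = -b1 / (2*b2),
    a maximum of the fitted response when b2 < 0."""
    b2, b1, b0 = np.polyfit(x, y, 2)  # polyfit returns highest degree first
    return -b1 / (2.0 * b2)

# Hypothetical designed experiment: three coded levels of one factor,
# each run twice; the response peaks near the centre of the design region.
x = np.array([-1.0, -1.0, 0.0, 0.0, 1.0, 1.0])
y = np.array([8.1, 7.9, 10.0, 10.2, 8.0, 8.2])
print(fitted_optimum(x, y))
```

Because the suggested settings feed directly into production, quantifying the uncertainty of x* (not just the point estimate) is a central concern of this line of work.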

  4. Superheavies: Short-Term Experiments and Far-Reaching Designs

    NASA Astrophysics Data System (ADS)

    Zagrebaev, V. I.; Karpov, A. V.; Mishustin, I. N.; Greiner, Walter

    Low values of the fusion cross sections and the very short half-lives of nuclei with Z>120 put obstacles in the way of synthesizing new elements. However, fusion reactions of medium mass projectiles with different actinide targets can still be used for the production of not-yet-synthesized SH nuclei. The gap of unknown SH nuclei, located between the isotopes produced earlier in the cold and hot fusion reactions, could be filled in fusion reactions of ^{48}Ca with available lighter isotopes of Pu, Am, and Cm. Cross sections for the production of these nuclei are predicted to be rather large, and the corresponding experiments can easily be performed at existing facilities. The use of heavier actinide targets gives us a chance to produce more neutron-enriched SH isotopes. Moreover, for the first time, a narrow pathway is found to the middle of the island of stability, owing to possible β^+ decay of SH isotopes which can be formed in ordinary fusion reactions of stable nuclei. Multi-nucleon transfer processes in near-barrier collisions of heavy (and very heavy, U-like) ions seem to be a quite realistic reaction mechanism allowing us to produce new neutron-enriched heavy nuclei located in the unexplored upper part of the nuclear map. Neutron capture reactions can also be used for the production of long-lived neutron-rich SH nuclei. Strong neutron fluxes might be provided by pulsed nuclear reactors and nuclear explosions in laboratory conditions, and by supernova explosions in nature. All these possibilities are discussed in the chapter.

  5. Optical Design of the MOSES Sounding Rocket Experiment

    NASA Astrophysics Data System (ADS)

    Thomas, R. J.; Kankelborg, C. C.

    2001-12-01

    The Multi-Order Solar EUV Spectrograph (MOSES) is a sounding rocket payload now being developed by Montana State University in collaboration with the Goddard Space Flight Center, Lockheed Martin Advanced Technology Center, and Mullard Space Science Laboratory. The instrument utilizes a unique optical design to provide solar EUV measurements with true 2-pixel resolutions of 1.0 arcsec and 60 mÅ over a full two-dimensional field of view of 1056 x 528 arcsec, all at a time cadence of 10 s. This unprecedented capability is achieved by means of an objective spherical grating 100 mm in diameter, ruled at 833 gr/mm. The concave grating focuses spectrally dispersed solar radiation onto three separate detectors, simultaneously recording the zero-order as well as the plus and minus first-spectral-order images. Data analysis procedures, similar to those used in X-ray tomography reconstructions, can then disentangle the mixed spatial and spectral information recorded by the multiple detectors. A flat folding mirror permits an imaging focal length of 4.74 m to be packaged within the payload's physical length of 2.82 m. Both the objective grating and folding flat have specialized, closely matched, multilayer coatings that strongly enhance their EUV reflectance while also suppressing off-band radiation that would otherwise complicate data inversion. Although the spectral bandpass is rather narrow, several candidate wavelength intervals are available to carry out truly unique scientific studies of the outer solar atmosphere. Initial flights of MOSES, scheduled to begin in 2004, will observe a 10 Å band that covers very strong emission lines characteristic of both the sun's corona (Si XI 303 Å) and transition-region (He II 304 Å). The MOSES program is supported by a grant from NASA's Office of Space Science.

  6. Optical Design of the MOSES Sounding Rocket Experiment

    NASA Technical Reports Server (NTRS)

    Thomas, Roger J.; Kankelborg, Charles C.; Fisher, Richard R. (Technical Monitor)

    2001-01-01

    The Multi-Order Solar EUV Spectrograph (MOSES) is a sounding rocket payload now being developed by Montana State University in collaboration with the Goddard Space Flight Center, Lockheed Martin Advanced Technology Center, and Mullard Space Science Laboratory. The instrument utilizes a unique optical design to provide solar EUV measurements with true 2-pixel resolutions of 1.0 arcsec and 60 mA over a full two-dimensional field of view of 1056 x 528 arcsec, all at a time cadence of 10 s. This unprecedented capability is achieved by means of an objective spherical grating 100 mm in diameter, ruled at 833 gr/mm. The concave grating focuses spectrally dispersed solar radiation onto three separate detectors, simultaneously recording the zero-order as well as the plus and minus first-spectral-order images. Data analysis procedures, similar to those used in X-ray tomography reconstructions, can then disentangle the mixed spatial and spectral information recorded by the multiple detectors. A flat folding mirror permits an imaging focal length of 4.74 m to be packaged within the payload's physical length of 2.82 m. Both the objective grating and folding flat have specialized, closely matched, multilayer coatings that strongly enhance their EUV reflectance while also suppressing off-band radiation that would otherwise complicate data inversion. Although the spectral bandpass is rather narrow, several candidate wavelength intervals are available to carry out truly unique scientific studies of the outer solar atmosphere. Initial flights of MOSES, scheduled to begin in 2004, will observe a 10 Angstrom band that covers very strong emission lines characteristic of both the sun's corona (Si XI 303 Angstroms) and transition-region (He II 304 Angstroms). The MOSES program is supported by a grant from NASA's Office of Space Science.

  7. High Temperature Electrolysis Pressurized Experiment Design, Operation, and Results

    SciTech Connect

    J.E. O'Brien; X. Zhang; G.K. Housley; K. DeWall; L. Moore-McAteer

    2012-09-01

A new facility has been developed at the Idaho National Laboratory for pressurized testing of solid oxide electrolysis stacks. Pressurized operation is envisioned for large-scale hydrogen production plants, yielding higher overall efficiencies when the hydrogen product is to be delivered at elevated pressure for tank storage or pipelines. Pressurized operation also supports higher mass flow rates of the process gases with smaller components. The test stand can accommodate planar cells with dimensions up to 8.5 cm x 8.5 cm and stacks of up to 25 cells. It is also suitable for testing other cell and stack geometries including tubular cells. The pressure boundary for these tests is a water-cooled spool-piece pressure vessel designed for operation up to 5 MPa. Pressurized operation of a ten-cell internally manifolded solid oxide electrolysis stack has been successfully demonstrated up to 1.5 MPa. The stack operates in cross-flow with an inverted-U flow pattern. Feed-throughs for gas inlets/outlets, power, and instrumentation are all located in the bottom flange. The entire spool piece, with the exception of the bottom flange, can be lifted to allow access to the internal furnace and test fixture. Lifting is accomplished with a motorized threaded drive mechanism attached to a rigid structural frame. Stack mechanical compression is accomplished using springs that are located inside of the pressure boundary, but outside of the hot zone. Initial stack heatup and performance characterization occurs at ambient pressure followed by lowering and sealing of the pressure vessel and subsequent pressurization. Pressure equalization between the anode and cathode sides of the cells and the stack surroundings is ensured by combining all of the process gases downstream of the stack. Steady pressure is maintained by means of a backpressure regulator and a digital pressure controller. A full description of the pressurized test apparatus is provided in this report.

  8. Providing Novice Instructional Designers Real-World Experiences: The PacifiCorp Design and Development Competition

    ERIC Educational Resources Information Center

    Bishop, MJ; Schuch, Dan; Spector, J. Michael; Tracey, Monica W.

    2005-01-01

    According to the International Board of Standards for Training, Performance, and Instruction (ibstpi) new technologies and methods have made instructional design practice more complex and sophisticated today than it was in the early years of the field (Richey et al., 2001). The authors of the 2000 ibstpi instructional design standards claimed…

  9. An Automated Tool for Developing Experimental Designs: The Computer-Aided Design Reference for Experiments (CADRE)

    DTIC Science & Technology

    2009-01-01

survey procedures, and cognitive task analysis), system design methods (e.g., focus groups, design guidelines, specifications, and requirements), and...

  10. Electrical design of Space Shuttle payload G-534: The pool boiling experiment

    NASA Technical Reports Server (NTRS)

    Francisco, David R.

    1993-01-01

    Payload G-534, the Pool Boiling Experiment (PBE), is a Get Away Special (GAS) payload that flew on the Space Shuttle Spacelab Mission J (STS 47) on September 19-21, 1992. This paper will give a brief overall description of the experiment with the main discussion being the electrical design with a detailed description of the power system and interface to the GAS electronics. The batteries used and their interface to the experiment Power Control Unit (PCU) and GAS electronics will be examined. The design philosophy for the PCU will be discussed in detail. The criteria for selection of fuses, relays, power semiconductors, and other electrical components along with grounding and shielding policy for the entire experiment are presented. The intent of this paper is to discuss the use of military tested parts and basic design guidelines to build a quality experiment for minimal additional cost.

  11. Design of a Synthetic Aperture Array to Support Experiments in Active Control of Scattering

    DTIC Science & Technology

    1990-06-01

Design of a Synthetic Aperture Array to Support Experiments in Active Control of Scattering, by James P. Dullea, B.N.E., Georgia... Submitted in partial fulfillment of the requirements for the degrees of Naval Engineer and Master of Science in Mechanical Engineering (Ain Sonin, Chairman, Mechanical Engineering Departmental Graduate Committee). Abstract: A synthetic aperture...

  12. An Initial Model for Generative Design Research: Bringing Together Generative Focus Group (GFG) and Experience Reflection Modelling (ERM)

    ERIC Educational Resources Information Center

    Bakirlioglu, Yekta; Ogur, Dilruba; Dogan, Cagla; Turhan, Senem

    2016-01-01

    Understanding people's experiences and the context of use of a product at the earliest stages of the design process has in the last decade become an important aspect of both the design profession and design education. Generative design research helps designers understand user experiences, while also throwing light on their current needs,…

  13. IsoDesign: a software for optimizing the design of 13C-metabolic flux analysis experiments.

    PubMed

    Millard, Pierre; Sokol, Serguei; Letisse, Fabien; Portais, Jean-Charles

    2014-01-01

The growing demand for 13C-metabolic flux analysis (13C-MFA) in the field of metabolic engineering and systems biology is driving the need to rationalize expensive and time-consuming 13C-labeling experiments. Experimental design is a key step in improving both the number of fluxes that can be calculated from a set of isotopic data and the precision of flux values. We present IsoDesign, a software that enables these parameters to be maximized by optimizing the isotopic composition of the label input. It can be applied to 13C-MFA investigations using a broad panel of analytical tools (MS, MS/MS, 1H NMR, 13C NMR, etc.) individually or in combination. It includes a visualization module to intuitively select the optimal label input depending on the biological question to be addressed. Applications of IsoDesign are described, with an example of the entire 13C-MFA workflow from the experimental design to the flux map including important practical considerations. IsoDesign makes the experimental design of 13C-MFA experiments more accessible to a wider biological community. IsoDesign is distributed under an open source license at http://metasys.insa-toulouse.fr/software/isodes/
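The core idea of label-input optimization can be illustrated with a deliberately simplified sketch. This is not the IsoDesign algorithm: the two-pathway model, its `enrichment` function, and the prior flux value are all invented for illustration. The sketch scans candidate label fractions and keeps the one for which the measurement is most sensitive to the flux parameter, which is what makes that flux estimable with the best precision.

```python
# Toy illustration of label-input optimization (NOT the IsoDesign algorithm:
# the two-pathway model and its enrichment function are invented for clarity).

def enrichment(f, x):
    """Hypothetical measured enrichment for flux split f and label fraction x:
    pathway 1 transfers one labelled atom (prob. x), pathway 2 needs two
    labelled precursors (prob. x**2)."""
    return f * x + (1 - f) * x ** 2

def flux_sensitivity(f, x, eps=1e-6):
    """|d(enrichment)/df|: the larger it is, the tighter the flux estimate."""
    return abs(enrichment(f + eps, x) - enrichment(f - eps, x)) / (2 * eps)

f_expected = 0.6                      # prior guess of the flux split ratio
grid = [i / 100 for i in range(101)]  # candidate label input fractions
best_x = max(grid, key=lambda x: flux_sensitivity(f_expected, x))
print(f"best label input fraction: {best_x:.2f}")  # → best label input fraction: 0.50
```

In this toy model the sensitivity is |x - x²|, so a 50:50 mix of labelled and unlabelled substrate is optimal; real 13C-MFA designs optimize a much richer isotopomer model over the available measurements.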

  14. Bayesian cross-entropy methodology for optimal design of validation experiments

    NASA Astrophysics Data System (ADS)

    Jiang, X.; Mahadevan, S.

    2006-07-01

    An important concern in the design of validation experiments is how to incorporate the mathematical model in the design in order to allow conclusive comparisons of model prediction with experimental output in model assessment. The classical experimental design methods are more suitable for phenomena discovery and may result in a subjective, expensive, time-consuming and ineffective design that may adversely impact these comparisons. In this paper, an integrated Bayesian cross-entropy methodology is proposed to perform the optimal design of validation experiments incorporating the computational model. The expected cross entropy, an information-theoretic distance between the distributions of model prediction and experimental observation, is defined as a utility function to measure the similarity of two distributions. A simulated annealing algorithm is used to find optimal values of input variables through minimizing or maximizing the expected cross entropy. The measured data after testing with the optimum input values are used to update the distribution of the experimental output using Bayes theorem. The procedure is repeated to adaptively design the required number of experiments for model assessment, each time ensuring that the experiment provides effective comparison for validation. The methodology is illustrated for the optimal design of validation experiments for a three-leg bolted joint structure and a composite helicopter rotor hub component.
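A toy one-dimensional version of this idea can be sketched as follows. Everything here is an assumption for illustration: Gaussian forms for both distributions, invented model and observation functions, and a simple linear cooling schedule. Following the abstract, the annealer here maximizes the expected cross entropy to find the input at which model prediction and anticipated observation differ most, i.e. the most discriminating validation experiment.

```python
import math
import random

random.seed(0)

def cross_entropy_gaussian(mu_p, s_p, mu_q, s_q):
    """Cross entropy H(p, q) between two normal distributions."""
    return 0.5 * math.log(2 * math.pi * s_q**2) + (s_p**2 + (mu_p - mu_q)**2) / (2 * s_q**2)

# Toy stand-ins (invented): model prediction and expected observation at input x.
def model_prediction(x):
    return math.sin(x), 0.3          # mean and spread of the model output
def observation_dist(x):
    return math.sin(x) + 0.5 * x, 0.4  # mean and spread of the measurement

def objective(x):
    mp, sp = model_prediction(x)
    mo, so = observation_dist(x)
    return cross_entropy_gaussian(mp, sp, mo, so)

def simulated_annealing(f, lo, hi, steps=2000, t0=1.0):
    """Maximize f over [lo, hi] to seek the most discriminating experiment."""
    x = random.uniform(lo, hi)
    best_x, best_f = x, f(x)
    for k in range(steps):
        t = t0 * (1 - k / steps) + 1e-9               # linear cooling schedule
        cand = min(hi, max(lo, x + random.gauss(0, 0.2)))
        delta = f(cand) - f(x)
        if delta > 0 or random.random() < math.exp(delta / t):
            x = cand
            if f(x) > best_f:
                best_x, best_f = x, f(x)
    return best_x, best_f

x_opt, h_opt = simulated_annealing(objective, 0.0, 3.0)
print(f"optimal input x = {x_opt:.3f}, expected cross entropy = {h_opt:.3f}")
```

In the paper's adaptive loop, the measurement taken at `x_opt` would then update the observation distribution via Bayes' theorem before the next design iteration.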

  15. Using the experience-based design approach to improve orthodontic care.

    PubMed

    Ellis, Pamela E; Silverton, Sarah

    2014-12-01

    The experience-based design (ebd) approach is a method of measuring patient experience, which deliberately draws out subjective, emotional and personal feelings of the patients using a service. We describe how the experience-based design approach has been used to measure the experiences of teenage patients at orthodontic consultation appointments in a district general hospital. This has allowed us to identify the points in the patient's journey where they experience most anxiety and nervousness and to target service improvements in these areas. We found the ebd approach effective in measuring patient experience in a teenage patient population. We demonstrate how the service improvements implemented have reduced negative feelings during new patient consultations.

  16. Building International Experiences into an Engineering Curriculum--A Design Project-Based Approach

    ERIC Educational Resources Information Center

    Maldonado, Victor; Castillo, Luciano; Carbajal, Gerardo; Hajela, Prabhat

    2014-01-01

    This paper is a descriptive account of how short-term international and multicultural experiences can be integrated into early design experiences in an aerospace engineering curriculum. Such approaches are considered as important not only in fostering a student's interest in the engineering curriculum, but also exposing them to a multicultural…

  17. Designing an Acoustic Suspension Speaker System in the General Physics Laboratory: A Divergent experiment

    ERIC Educational Resources Information Center

    Horton, Philip B.

    1969-01-01

Describes a student laboratory project involving the design of an "acoustic suspension" speaker system. The characteristics of the loudspeaker used are measured as an extension of the inertia-balance experiment. The experiment may be extended to a study of Helmholtz resonators, coupled oscillators, electromagnetic forces, thermodynamics and…

  18. Experiences of Design-and-Make Interventions with Indian Middle School Students

    ERIC Educational Resources Information Center

    Khunyakari, Ritesh P.

    2015-01-01

    Enabling learning through meaningful classroom experiences has always been a challenge for teachers. Bringing about a balance of the "conceptual" and the "hands-on", along with contextual embeddedness in problem-solving situations, broadly characterises the experience of development and trials of three Design and Technology…

  19. SUMS preliminary design and data analysis development. [shuttle upper atmosphere mass spectrometer experiment

    NASA Technical Reports Server (NTRS)

    Hinson, E. W.

    1981-01-01

    The preliminary analysis and data analysis system development for the shuttle upper atmosphere mass spectrometer (SUMS) experiment are discussed. The SUMS experiment is designed to provide free stream atmospheric density, pressure, temperature, and mean molecular weight for the high altitude, high Mach number region.

  20. Optimal experiment design: Link between the concentration and the accuracy of estimation of aggregation parameters

    NASA Astrophysics Data System (ADS)

    Evstigneev, Vladislav P.; Pashkova, Irina S.; Kostjukov, Viktor V.; Hernandez Santiago, Adrian A.; Evstigneev, Maxim P.

    2016-11-01

The principal condition of optimal experiment design required to obtain a reasonable error in the determination of the equilibrium aggregation constant, K, is derived. This condition states that the concentration range selected for the titration experiment should be inversely proportional to the expected value of K. As a consequence, the choice of physico-chemical methods for the determination of aggregation parameters must obey this condition.
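The stated condition can be checked numerically with a minimal dimerization model (2M ⇌ D). The value of K and the concentration grid below are purely illustrative; the point is that the concentration at which the observable responds most strongly to K satisfies K·c of order unity, i.e. the informative concentration scales as 1/K.

```python
import math

def dimer_fraction(K, c0):
    """Fraction of molecules bound in dimers for 2M <-> D with constant K."""
    m = (math.sqrt(1 + 8 * K * c0) - 1) / (4 * K)   # free monomer concentration
    return 1 - m / c0

def sensitivity(K, c0, eps=1e-4):
    """Numerical d(fraction)/d(ln K): how strongly the observable responds to K."""
    return (dimer_fraction(K * (1 + eps), c0) - dimer_fraction(K * (1 - eps), c0)) / (2 * eps)

K = 1.0e4  # M^-1, an illustrative aggregation constant
# Scan titration concentrations over several decades; pick the most informative one.
concs = [10 ** (e / 10) for e in range(-70, 10)]    # ~1e-7 M to ~8 M
best_c = max(concs, key=lambda c: sensitivity(K, c))
print(f"most informative concentration ~ {best_c:.2e} M, K*c = {K * best_c:.2f}")
```

Rerunning with a different K shifts `best_c` by the inverse factor, which is exactly the proportionality the abstract asserts.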

  1. The Role of Flow Experience and CAD Tools in Facilitating Creative Behaviours for Architecture Design Students

    ERIC Educational Resources Information Center

    Dawoud, Husameddin M.; Al-Samarraie, Hosam; Zaqout, Fahed

    2015-01-01

    This study examined the role of flow experience in intellectual activity with an emphasis on the relationship between flow experience and creative behaviour in design using CAD. The study used confluence and psychometric approaches because of their unique abilities to depict a clear image of creative behaviour. A cross-sectional study…

  2. Best Bang for the Buck: Part 1 – The Size of Experiments Relative to Design Performance

    DOE PAGES

    Anderson-Cook, Christine Michaela; Lu, Lu

    2016-10-01

There are many choices to make, when designing an experiment for a study, such as: what design factors to consider, which levels of the factors to use and which model to focus on. One aspect of design, however, is often left unquestioned: the size of the experiment. When learning about design of experiments, problems are often posed as "select a design for a particular objective with N runs." It’s tempting to consider the design size as a given constraint in the design-selection process. If you think of learning through designed experiments as a sequential process, however, strategically planning for the use of resources at different stages of data collection can be beneficial: Saving experimental runs for later is advantageous if you can efficiently learn with less in the early stages. Alternatively, if you’re too frugal in the early stages, you might not learn enough to proceed confidently with the next stages. Therefore, choosing the right-sized experiment is important—not too large or too small, but with a thoughtful balance to maximize the knowledge gained given the available resources. It can be a great advantage to think about the design size as flexible and include it as an aspect for comparisons. Sometimes you’re asked to provide a small design that is too ambitious for the goals of the study. Finally, if you can show quantitatively how the suggested design size might be inadequate or lead to problems during analysis—and also offer a formal comparison to some alternatives of different (likely larger) sizes—you may have a better chance to ask for additional resources to deliver statistically sound and satisfying results.
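The resource trade-off described above is easy to make quantitative. In an orthogonal two-level design with ±1 coding, the standard error of an estimated coefficient falls only as 1/√N, so halving the error costs four times the runs; a small sketch (the noise level σ is an assumed, illustrative value) shows the diminishing returns that make "right-sizing" worthwhile:

```python
import math

sigma = 2.0   # assumed run-to-run noise standard deviation (illustrative)

def coef_std_error(n_runs, sigma):
    """Std. error of a main-effect coefficient in a +/-1-coded orthogonal
    two-level design: se = sigma / sqrt(N)."""
    return sigma / math.sqrt(n_runs)

# Doubling the design size shrinks the error by only a factor of sqrt(2).
for n in (4, 8, 16, 32, 64):
    print(f"N={n:3d}  se={coef_std_error(n, sigma):.3f}")
```

A comparison like this, run for the candidate design sizes, is the kind of quantitative evidence the authors suggest bringing to a discussion about additional resources.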

  3. Best Bang for the Buck: Part 1 – The Size of Experiments Relative to Design Performance

    SciTech Connect

    Anderson-Cook, Christine Michaela; Lu, Lu

    2016-10-01

    There are many choices to make, when designing an experiment for a study, such as: what design factors to consider, which levels of the factors to use and which model to focus on. One aspect of design, however, is often left unquestioned: the size of the experiment. When learning about design of experiments, problems are often posed as "select a design for a particular objective with N runs." It’s tempting to consider the design size as a given constraint in the design-selection process. If you think of learning through designed experiments as a sequential process, however, strategically planning for the use of resources at different stages of data collection can be beneficial: Saving experimental runs for later is advantageous if you can efficiently learn with less in the early stages. Alternatively, if you’re too frugal in the early stages, you might not learn enough to proceed confidently with the next stages. Therefore, choosing the right-sized experiment is important—not too large or too small, but with a thoughtful balance to maximize the knowledge gained given the available resources. It can be a great advantage to think about the design size as flexible and include it as an aspect for comparisons. Sometimes you’re asked to provide a small design that is too ambitious for the goals of the study. Finally, if you can show quantitatively how the suggested design size might be inadequate or lead to problems during analysis—and also offer a formal comparison to some alternatives of different (likely larger) sizes—you may have a better chance to ask for additional resources to deliver statistically sound and satisfying results

  4. An application of the IMC software to controller design for the JPL LSCL Experiment Facility

    NASA Technical Reports Server (NTRS)

    Zhu, Guoming; Skelton, Robert E.

    1993-01-01

    A software package which Integrates Model reduction and Controller design (The IMC software) is applied to design controllers for the JPL Large Spacecraft Control Laboratory Experiment Facility. Modal Cost Analysis is used for the model reduction, and various Output Covariance Constraints are guaranteed by the controller design. The main motivation is to find the controller with the 'best' performance with respect to output variances. Indeed it is shown that by iterating on the reduced order design model, the controller designed does have better performance than that obtained with the first model reduction.

  5. Thermal design, analysis, and testing of the CETA Space Shuttle Flight Experiment

    NASA Technical Reports Server (NTRS)

    Witsil, Amy K.; Foss, Richard A.

    1990-01-01

    Attention is given to the Crew and Equipment Translation Aid (CETA) Space Shuttle flight experiment designed to demonstrate techniques and equipment for propelling and restraining crew during EVA. Emphasis is placed on the thermal analysis of the CETA hardware, including thermal design trade-offs, modeling assumptions, temperature predictions, and testing activities.

  6. Individual Differences and the Conundrums of User-Centered Design: Two Experiments.

    ERIC Educational Resources Information Center

    Allen, Bryce

    2000-01-01

    Discusses individual differences between users of information systems that can influence search performance, and describes two experiments that addressed user-centered design of information systems. Highlights include interaction between cognitive abilities and design features; compensation and capitalization perspectives; recall and precision;…

  7. Interim Service ISDN Satellite (ISIS) network model for advanced satellite designs and experiments

    NASA Technical Reports Server (NTRS)

    Pepin, Gerard R.; Hager, E. Paul

    1991-01-01

The Interim Service Integrated Services Digital Network (ISDN) Satellite (ISIS) Network Model for Advanced Satellite Designs and Experiments describes a model suitable for discrete event simulations. A top-down model design uses the Advanced Communications Technology Satellite (ACTS) as its basis. The ISDN modeling abstractions are added to permit the determination of network performance for the NASA Satellite Communications Research (SCAR) Program.
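A network model "suitable for discrete event simulations" ultimately reduces to an event queue processed in timestamp order. The skeleton below is a generic sketch of that mechanism, not the ISIS model itself; the call-traffic timings and event names are invented placeholders.

```python
import heapq

# Minimal discrete-event simulation skeleton: events (e.g. ISDN call setups)
# are processed strictly in timestamp order; each event may schedule more.
def run(events, horizon):
    """events: list of (time, name, action); action(t) returns new events."""
    queue = list(events)
    heapq.heapify(queue)
    log = []
    while queue:
        t, name, action = heapq.heappop(queue)
        if t > horizon:
            break
        log.append((t, name))
        for new in action(t):
            heapq.heappush(queue, new)
    return log

# Hypothetical traffic source: a call arrives every 2 s and holds for 3 s.
def arrival(t):
    return [(t + 2.0, "call_setup", arrival), (t + 3.0, "call_teardown", noop)]
def noop(t):
    return []

trace = run([(0.0, "call_setup", arrival)], horizon=6.0)
print(trace)
```

A real satellite-network model would replace `arrival`/`noop` with handlers that track channel occupancy, propagation delay, and protocol state, but the event-queue core is the same.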

  8. Lifting off the Ground to Return Anew: Mediated Praxis, Transformative Learning, and Social Design Experiments

    ERIC Educational Resources Information Center

    Gutierrez, Kris D.; Vossoughi, Shirin

    2010-01-01

    This article examines a praxis model of teacher education and advances a new method for engaging novice teachers in reflective practice and robust teacher learning. Social design experiments--cultural historical formations designed to promote transformative learning for adults and children--are organized around expansive notions of learning and…

  9. Evaluating the Effectiveness of Developmental Mathematics by Embedding a Randomized Experiment within a Regression Discontinuity Design

    ERIC Educational Resources Information Center

    Moss, Brian G.; Yeaton, William H.; Lloyd, Jane E.

    2014-01-01

    Using a novel design approach, a randomized experiment (RE) was embedded within a regression discontinuity (RD) design (R-RE-D) to evaluate the impact of developmental mathematics at a large midwestern college ("n" = 2,122). Within a region of uncertainty near the cut-score, estimates of benefit from a prospective RE were closely…

  10. Classroom Experiences in an Engineering Design Graphics Course with a CAD/CAM Extension.

    ERIC Educational Resources Information Center

    Barr, Ronald E.; Juricic, Davor

    1997-01-01

    Reports on the development of a new CAD/CAM laboratory experience for an Engineering Design Graphics (EDG) course. The EDG curriculum included freehand sketching, introduction to Computer-Aided Design and Drafting (CADD), and emphasized 3-D solid modeling. Reviews the project and reports on the testing of the new laboratory components which were…

  11. Design Your Own Workup: A Guided-Inquiry Experiment for Introductory Organic Laboratory Courses

    ERIC Educational Resources Information Center

    Mistry, Nimesh; Fitzpatrick, Christopher; Gorman, Stephen

    2016-01-01

    A guided-inquiry experiment was designed and implemented in an introductory organic chemistry laboratory course. Students were given a mixture of compounds and had to isolate two of the components by designing a viable workup procedure using liquid-liquid separation methods. Students were given the opportunity to apply their knowledge of chemical…

  12. Paragogy and Flipped Assessment: Experience of Designing and Running a MOOC on Research Methods

    ERIC Educational Resources Information Center

    Lee, Yenn; Rofe, J. Simon

    2016-01-01

    This study draws on the authors' first-hand experience of designing, developing and delivering (3Ds) a massive open online course (MOOC) entitled "Understanding Research Methods" since 2014, largely but not exclusively for learners in the humanities and social sciences. The greatest challenge facing us was to design an assessment…

  13. Development of Metacognitive Skills: Designing Problem-Based Experiment with Prospective Science Teachers in Biology Laboratory

    ERIC Educational Resources Information Center

    Denis Çeliker, Huriye

    2015-01-01

    The purpose of this study is to investigate the effect of designing problem-based experiments (DPBE) on the level of metacognitive skills of prospective science teachers. For this purpose, pre test-post test design, without control group, was used in the research. The research group of the study comprised 113 second-grade prospective science…

  14. Experience with Teaching Design: Do We Blend the Old with the New?

    ERIC Educational Resources Information Center

    Flach, Lawrance

    1999-01-01

    Addresses some of the issues associated with teaching chemical engineering design, specifically the capstone design sequence developed at the University of Dayton and the experience gained developing and teaching these courses. Discusses the pros and cons of chemical-process flowsheet-simulator use. (Contains 11 references.) (WRM)

  15. The UCLA Design Diversity Experiment (DEDIX) system: A distributed testbed for multiple-version software

    NASA Technical Reports Server (NTRS)

    Avizienis, A.; Gunningberg, P.; Kelly, J. P. J.; Strigini, L.; Traverse, P. J.; Tso, K. S.; Voges, U.

    1986-01-01

    To establish a long-term research facility for experimental investigations of design diversity as a means of achieving fault-tolerant systems, a distributed testbed for multiple-version software was designed. It is part of a local network, which utilizes the Locus distributed operating system to operate a set of 20 VAX 11/750 computers. It is used in experiments to measure the efficacy of design diversity and to investigate reliability increases under large-scale, controlled experimental conditions.
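Fault tolerance through design diversity rests on adjudicating the outputs of independently developed program versions. A minimal majority-vote decision function, sketched below, conveys the idea; DEDIX's actual adjudication is more elaborate (it must also handle timing and partial failures), so this is only an illustration.

```python
from collections import Counter

def majority_vote(outputs):
    """Decision function for N-version execution: accept the value produced by
    a strict majority of independently designed versions, else flag disagreement."""
    value, count = Counter(outputs).most_common(1)[0]
    return value if count > len(outputs) / 2 else None

print(majority_vote([4, 4, 5]))   # → 4     (one faulty version is outvoted)
print(majority_vote([1, 2, 3]))   # → None  (no majority: disagreement flagged)
```

Experiments like those run on DEDIX measure how often independently designed versions fail on the same input, which is precisely when a voter like this cannot help.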

  16. Design of a creep experiment for SiC/SiC composites in HFIR

    SciTech Connect

    Hecht, S.L.; Hamilton, M.L.; Jones, R.H.

    1997-08-01

A new specimen was designed for performing in-reactor creep tests on composite materials, specifically on SiC/SiC composites. The design was tailored for irradiation at 800°C in a HFIR RB position. The specimen comprises a composite cylinder loaded by a pressurized internal bladder that is made of Nb-1Zr. The experiment was designed for approximately a one-year irradiation.

  17. Physical barriers formed from gelling liquids: 1. numerical design of laboratory and field experiments

    SciTech Connect

    Finsterle, S.; Moridis, G.J.; Pruess, K.; Persoff, P.

    1994-01-01

The emplacement of liquids under controlled viscosity conditions is investigated by means of numerical simulations. Design calculations are performed for a laboratory experiment on a decimeter scale, and a field experiment on a meter scale. The purpose of the laboratory experiment is to study the behavior of multiple grout plumes when injected into a porous medium. The calculations for the field trial aim at designing a grout injection test from a vertical well in order to create a grout plume of significant extent in the subsurface.

  18. Design Principles for High School Engineering Design Challenges: Experiences from High School Science Classrooms

    ERIC Educational Resources Information Center

    Schunn, Christian

    2011-01-01

    At the University of Pittsburgh, the author and his colleagues have been exploring a range of approaches to design challenges for implementation in high school science classrooms. In general, their approach has always involved students working during class time over the course of many weeks. So, their understanding of what works must be…

  19. The role of integral experiments and nuclear cross section evaluations in space nuclear reactor design

    NASA Astrophysics Data System (ADS)

    Moses, David L.; McKnight, Richard D.

The importance of the nuclear and neutronic properties of candidate space reactor materials to the design process has been acknowledged, as has been the use of benchmark reactor physics experiments to verify and qualify analytical tools used in design, safety, and performance evaluation. Since June 1966, the Cross Section Evaluation Working Group (CSEWG) has acted as an interagency forum for the assessment and evaluation of nuclear reaction data used in the nuclear design process. CSEWG data testing has involved the specification and calculation of benchmark experiments which are used widely for commercial reactor design and safety analysis. These benchmark experiments preceded the issuance of the industry standards for acceptance, but the benchmarks exceed the minimum acceptance criteria for such data. Thus, a starting place has been provided in assuring the accuracy and uncertainty of nuclear data important to space reactor applications.

  20. Design of experiments based variation mode and effect analysis of a conceptual air launched SLV

    NASA Astrophysics Data System (ADS)

    Rafique, Amer Farhan; Zeeshan, Qasim; Kamran, Ali

    2014-12-01

At the conceptual design stage, knowledge about variation in the system is still quite vague, so herein we analyze and compare several probable design concepts for an air-launched Satellite Launch Vehicle (SLV) by the use of basic variation mode and effect analysis. In this paper we present a methodology for Variation Mode and Effect Analysis using Latin Hypercube Sampling based Design of Experiments for a conceptual air-launched SLV. Variations are induced in the control variables based on knowledge and experience. The methodology is used to quantify the effect of noise factors on the performance of the conceptual air-launched SLV; the insertion altitude is the key performance indicator. Preliminary results of the performance analysis for the simulated experiments are presented here. The performance of the proposed procedure has been tested and validated on the air-launched SLV design problem. The Design of Experiments based Variation Mode and Effect Analysis approach is intended for initial conceptual design purposes, thus providing immediate insight into the performance of the system in general, and quantification of the sensitivity of the key performance indicator subject to variations in noise factors in particular, prior to the detailed design phase.
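Latin Hypercube Sampling, the space-filling scheme named above, can be sketched in a few lines: each variable's range is split into N strata, each stratum is sampled exactly once, and the strata are randomly paired across variables. The three control variables and their bounds below are invented placeholders, not values from the SLV study.

```python
import random

random.seed(42)

def latin_hypercube(n_samples, n_vars):
    """One LHS draw on [0,1): each variable's range is cut into n_samples
    strata, each stratum sampled once, and strata shuffled across samples."""
    columns = []
    for _ in range(n_vars):
        strata = list(range(n_samples))
        random.shuffle(strata)
        columns.append([(s + random.random()) / n_samples for s in strata])
    return [list(row) for row in zip(*columns)]   # n_samples x n_vars points

# e.g. 10 design points over 3 hypothetical control variables
points = latin_hypercube(10, 3)
lo, hi = [5.0, 0.1, 200.0], [15.0, 0.9, 400.0]    # illustrative variable bounds
scaled = [[l + u * (h - l) for u, l, h in zip(p, lo, hi)] for p in points]
print(scaled[0])
```

Each simulated experiment in the study corresponds to running the vehicle model at one such design point and recording the key performance indicator.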

  1. Simulation study to determine the impact of different design features on design efficiency in discrete choice experiments

    PubMed Central

    Vanniyasingam, Thuva; Cunningham, Charles E; Foster, Gary; Thabane, Lehana

    2016-01-01

    Objectives Discrete choice experiments (DCEs) are routinely used to elicit patient preferences to improve health outcomes and healthcare services. While many fractional factorial designs can be created, some are more statistically optimal than others. The objective of this simulation study was to investigate how varying the number of (1) attributes, (2) levels within attributes, (3) alternatives and (4) choice tasks per survey will improve or compromise the statistical efficiency of an experimental design. Design and methods A total of 3204 DCE designs were created to assess how relative design efficiency (d-efficiency) is influenced by varying the number of choice tasks (2–20), alternatives (2–5), attributes (2–20) and attribute levels (2–5) of a design. Choice tasks were created by randomly allocating attribute and attribute level combinations into alternatives. Outcome Relative d-efficiency was used to measure the optimality of each DCE design. Results DCE design complexity influenced statistical efficiency. Across all designs, relative d-efficiency decreased as the number of attributes and attribute levels increased. It increased for designs with more alternatives. Lastly, relative d-efficiency converges as the number of choice tasks increases, where convergence may not be at 100% statistical optimality. Conclusions Achieving 100% d-efficiency is heavily dependent on the number of attributes, attribute levels, choice tasks and alternatives. Further exploration of overlaps and block sizes are needed. This study's results are widely applicable for researchers interested in creating optimal DCE designs to elicit individual preferences on health services, programmes, policies and products. PMID:27436671
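Relative D-efficiency (the d-efficiency used as the outcome above) compares |X'X|^(1/p)/N against the orthogonal ideal, for which it equals 100%. The dependency-free sketch below computes it for two-level attributes; the random 4-run subset is just an arbitrary comparison design, not one from the study.

```python
import itertools
import random

random.seed(1)

def det(M):
    """Determinant via Gaussian elimination with partial pivoting (pure Python)."""
    M = [row[:] for row in M]
    n, d = len(M), 1.0
    for i in range(n):
        piv = max(range(i, n), key=lambda r: abs(M[r][i]))
        if abs(M[piv][i]) < 1e-12:
            return 0.0
        if piv != i:
            M[i], M[piv] = M[piv], M[i]
            d = -d
        d *= M[i][i]
        for r in range(i + 1, n):
            f = M[r][i] / M[i][i]
            for c in range(i, n):
                M[r][c] -= f * M[i][c]
    return d

def d_efficiency(X):
    """Relative D-efficiency (%) of model matrix X: 100 * |X'X|^(1/p) / N,
    which equals 100 for an orthogonal +/-1-coded design."""
    N, p = len(X), len(X[0])
    XtX = [[sum(row[i] * row[j] for row in X) for j in range(p)] for i in range(p)]
    return 100 * max(det(XtX), 0.0) ** (1 / p) / N

# Three two-level attributes, main-effects model with intercept column.
full = [[1, a, b, c] for a, b, c in itertools.product([-1, 1], repeat=3)]
frac = random.sample(full, 4)   # an arbitrary 4-run subset for comparison

print(f"full 2^3 factorial: {d_efficiency(full):.1f}%")   # → 100.0%
print(f"random 4-run subset: {d_efficiency(frac):.1f}%")
```

The study's observation that d-efficiency degrades with more attributes and levels corresponds to |X'X| shrinking relative to the orthogonal bound as the design matrix grows harder to balance.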

  2. Pliocene Model Intercomparison Project (PlioMIP): Experimental Design and Boundary Conditions (Experiment 2)

    NASA Technical Reports Server (NTRS)

    Haywood, A. M.; Dowsett, H. J.; Robinson, M. M.; Stoll, D. K.; Dolan, A. M.; Lunt, D. J.; Otto-Bliesner, B.; Chandler, M. A.

    2011-01-01

    The Palaeoclimate Modelling Intercomparison Project has expanded to include a model intercomparison for the mid-Pliocene warm period (3.29 to 2.97 million yr ago). This project is referred to as PlioMIP (the Pliocene Model Intercomparison Project). Two experiments have been agreed upon and together compose the initial phase of PlioMIP. The first (Experiment 1) is being performed with atmosphere only climate models. The second (Experiment 2) utilizes fully coupled ocean-atmosphere climate models. Following on from the publication of the experimental design and boundary conditions for Experiment 1 in Geoscientific Model Development, this paper provides the necessary description of differences and/or additions to the experimental design for Experiment 2.

  3. Pliocene Model Intercomparison Project (PlioMIP): experimental design and boundary conditions (Experiment 2)

    USGS Publications Warehouse

    Haywood, A.M.; Dowsett, H.J.; Robinson, M.M.; Stoll, D.K.; Dolan, A.M.; Lunt, D.J.; Otto-Bliesner, B.; Chandler, M.A.

    2011-01-01

    The Palaeoclimate Modelling Intercomparison Project has expanded to include a model intercomparison for the mid-Pliocene warm period (3.29 to 2.97 million yr ago). This project is referred to as PlioMIP (the Pliocene Model Intercomparison Project). Two experiments have been agreed upon and together compose the initial phase of PlioMIP. The first (Experiment 1) is being performed with atmosphere-only climate models. The second (Experiment 2) utilises fully coupled ocean-atmosphere climate models. Following on from the publication of the experimental design and boundary conditions for Experiment 1 in Geoscientific Model Development, this paper provides the necessary description of differences and/or additions to the experimental design for Experiment 2.

  4. A global parallel model based design of experiments method to minimize model output uncertainty.

    PubMed

    Bazil, Jason N; Buzzard, Gregory T; Rundell, Ann E

    2012-03-01

    Model-based experiment design specifies the data to be collected that will most effectively characterize the biological system under study. Existing model-based design of experiment algorithms have primarily relied on Fisher Information Matrix-based methods to choose the best experiment in a sequential manner. However, these are largely local methods that require an initial estimate of the parameter values, which are often highly uncertain, particularly when data is limited. In this paper, we provide an approach to specify an informative sequence of multiple design points (parallel design) that will constrain the dynamical uncertainty of the biological system responses to within experimentally detectable limits as specified by the estimated experimental noise. The method is based upon computationally efficient sparse grids and requires only a bounded uncertain parameter space; it does not rely upon initial parameter estimates. The design sequence emerges through the use of scenario trees with experimental design points chosen to minimize the uncertainty in the predicted dynamics of the measurable responses of the system. The algorithm was illustrated herein using a T cell activation model for three problems that ranged in dimension from 2D to 19D. The results demonstrate that it is possible to extract useful information from a mathematical model where traditional model-based design of experiments approaches most certainly fail. The experiments designed via this method fully constrain the model output dynamics to within experimentally resolvable limits. The method is effective for highly uncertain biological systems characterized by deterministic mathematical models with limited data sets. Also, it is highly modular and can be modified to include a variety of methodologies such as input design and model discrimination.
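
    The sparse-grid machinery itself is involved, but the core idea described above (choose design points that pin the model's predicted dynamics down to within measurement noise, starting from only a bounded parameter space) can be sketched with a toy model. The sketch below substitutes plain Monte Carlo sampling for the paper's sparse grids; the decay model, parameter bounds, noise level, and "true" parameter are all hypothetical:

```python
import numpy as np

# Toy model: exponential decay y(t) = exp(-k * t), with the rate k
# uncertain only within the bounded interval [0.1, 1.0] -- no initial
# point estimate of k is required, matching the paper's premise.
rng = np.random.default_rng(0)
k_samples = rng.uniform(0.1, 1.0, size=500)   # Monte Carlo stands in for
t_grid = np.linspace(0.1, 10.0, 100)          # the paper's sparse grids

def pick_design_points(n_points, noise_sd=0.05):
    """Greedily choose measurement times where the ensemble of simulated
    responses is most spread out, then prune parameters inconsistent with
    a (simulated) observation at each chosen time."""
    k_alive = k_samples.copy()
    chosen = []
    k_true = 0.4                              # hypothetical ground truth
    for _ in range(n_points):
        responses = np.exp(-np.outer(k_alive, t_grid))  # rows: param draws
        spread = responses.std(axis=0)        # predictive uncertainty vs. t
        t_star = t_grid[np.argmax(spread)]
        chosen.append(t_star)
        # Simulated measurement at t_star; keep parameters whose prediction
        # lies within ~2 noise standard deviations of the observation.
        y_obs = np.exp(-k_true * t_star) + rng.normal(0, noise_sd)
        keep = np.abs(np.exp(-k_alive * t_star) - y_obs) < 2 * noise_sd
        if keep.sum() > 10:
            k_alive = k_alive[keep]
    return chosen, k_alive

times, k_final = pick_design_points(3)
print("design times:", np.round(times, 2))
print("parameter range narrowed to:",
      k_final.min().round(2), "-", k_final.max().round(2))
```

    Each chosen time is where the surviving parameter ensemble disagrees most about the output, so a measurement there is maximally informative; the paper's method makes this rigorous with sparse grids and scenario trees.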

  5. Statistical Design of MOS VLSI (Very Large Scale Integrated) Circuits with Designed Experiments

    DTIC Science & Technology

    1990-03-01

    prohibitively large number of experimental runs, however. The Taguchi method collapses data from many (circuit simulator) runs into his so-called "signal-to...designable parameters. We give a circuit example where the Taguchi objectives are met with about two-thirds fewer runs than [9]. The Taguchi method is...Chapter 5 we applied the circuit performance modeling method to achieve off-line quality control. The Taguchi method for off-line is reviewed. Taguchi’s

  6. Single and Multiresponse Adaptive Design of Experiments with Application to Design Optimization of Novel Heat Exchangers

    DTIC Science & Technology

    2009-01-01

    performance evaluation method for air-cooled heat exchangers in which conventional 3D Computational Fluid Dynamics (CFD) simulation is replaced with a 2D...parts to this research thrust. First, is a new multi-level performance evaluation method for air-cooled heat exchangers in which conventional 3D...performance of a novel air-cooled heat exchanger such as tube-fin or microchannels. The novel aspect generally refers to a new fin design or a tube

  7. The climateprediction.net BBC climate change experiment: design of the coupled model ensemble.

    PubMed

    Frame, D J; Aina, T; Christensen, C M; Faull, N E; Knight, S H E; Piani, C; Rosier, S M; Yamazaki, K; Yamazaki, Y; Allen, M R

    2009-03-13

    Perturbed physics experiments are among the most comprehensive ways to address uncertainty in climate change forecasts. In these experiments, parameters and parametrizations in atmosphere-ocean general circulation models are perturbed across ranges of uncertainty, and results are compared with observations. In this paper, we describe the largest perturbed physics climate experiment conducted to date, the British Broadcasting Corporation (BBC) climate change experiment, in which the physics of the atmosphere and ocean are changed, and run in conjunction with a forcing ensemble designed to represent uncertainty in past and future forcings, under the A1B Special Report on Emissions Scenarios (SRES) climate change scenario.

  8. Cryogenic design of the liquid helium experiment "critical dynamics in microgravity"

    SciTech Connect

    Moeur, W.A.; Adriaans, M.J.; Boyd, S.T.; Strayer, D.M.; Duncan, R.V.

    1995-10-01

    Although many well controlled experiments have been conducted to measure the static properties of systems near criticality, few experiments have explored the transport properties in systems driven far away from equilibrium as a phase transition occurs. The cryogenic design of an experiment to study the dynamic aspect of critical phenomena is reported here. Measurements of the thermal gradient across the superfluid (He II)/normal fluid (He I) interface in helium under microgravity conditions will be performed as a heat flux holds the system away from equilibrium. New technologies are under development for this experiment, which is in the definition phase for a space shuttle flight.

  9. Optimal Design of Passive Flow Control for a Boundary-Layer-Ingesting Offset Inlet Using Design-of-Experiments

    NASA Technical Reports Server (NTRS)

    Allan, Brian G.; Owens, Lewis R., Jr.; Lin, John C.

    2006-01-01

    This research investigates the use of Design-of-Experiments (DOE) in the development of an optimal passive flow control vane design for a boundary-layer-ingesting (BLI) offset inlet in transonic flow. This inlet flow control is designed to minimize the engine fan-face distortion levels and first five Fourier harmonic half amplitudes while maximizing the inlet pressure recovery. Numerical simulations of the BLI inlet are computed using the Reynolds-averaged Navier-Stokes (RANS) flow solver, OVERFLOW, developed at NASA. These simulations are used to generate the numerical experiments for the DOE response surface model. In this investigation, two DOE optimizations were performed using a D-Optimal Response Surface model. The first DOE optimization used four design factors: vane height and angle-of-attack for two groups of vanes. One group of vanes was placed at the bottom of the inlet and a second group symmetrically on the sides. The DOE design was performed for a BLI inlet with a free-stream Mach number of 0.85 and a Reynolds number of 2 million, based on the length of the fan-face diameter, matching an experimental wind tunnel BLI inlet test. The first DOE optimization required a fifth-order model having 173 numerical simulation experiments and was able to reduce the DC60 baseline distortion from 64% down to 4.4%, while holding the pressure recovery constant. A second DOE optimization was performed holding the vane heights at the constant values from the first DOE optimization, with the two vane angles-of-attack as design factors. This DOE required only a second-order model fit with 15 numerical simulation experiments and reduced DC60 to 3.5%, with small decreases in the fourth and fifth harmonic amplitudes. The second optimal vane design was tested at the NASA Langley 0.3-Meter Transonic Cryogenic Tunnel in a BLI inlet experiment. The experimental results showed an 80% reduction of DPCPavg, the circumferential distortion level at the engine
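
    D-optimal response-surface designs like the one described are typically constructed by exchange algorithms that maximize det(X'X) over a candidate set of runs. A minimal greedy sketch for a hypothetical two-factor quadratic model follows; the factor names, coded ranges, and run count are illustrative, not the paper's actual design:

```python
import numpy as np
from itertools import product

# Candidate grid for two hypothetical factors (say, vane height and
# angle of attack, coded to [-1, 1]); model terms for a quadratic RSM.
levels = np.linspace(-1, 1, 5)
candidates = np.array(list(product(levels, levels)))

def model_matrix(pts):
    """Quadratic response-surface terms: 1, x1, x2, x1*x2, x1^2, x2^2."""
    x1, x2 = pts[:, 0], pts[:, 1]
    return np.column_stack([np.ones(len(pts)), x1, x2, x1 * x2, x1**2, x2**2])

def greedy_d_optimal(n_runs):
    """Greedily add the candidate point that most increases det(X'X)."""
    chosen = []
    for _ in range(n_runs):
        best_gain, best_idx = -np.inf, None
        for i, c in enumerate(candidates):
            trial = model_matrix(np.array(chosen + [c]))
            # Small ridge keeps early (rank-deficient) steps comparable.
            sign, logdet = np.linalg.slogdet(trial.T @ trial + 1e-9 * np.eye(6))
            if logdet > best_gain:
                best_gain, best_idx = logdet, i
        chosen.append(candidates[best_idx])
    return np.array(chosen)

design = greedy_d_optimal(10)
X = model_matrix(design)
print("10-run design, det(X'X) =", round(np.linalg.det(X.T @ X), 2))
```

    A production exchange algorithm (e.g. Fedorov exchange) would also swap points out of the design, but the determinant criterion shown is the heart of D-optimality: it minimizes the volume of the joint confidence region of the regression coefficients.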

  10. Optimal Design of Passive Flow Control for a Boundary-Layer-Ingesting Offset Inlet Using Design-of-Experiments

    NASA Technical Reports Server (NTRS)

    Allan, Brian G.; Owens, Lewis R.; Lin, John C.

    2006-01-01

    This research investigates the use of Design-of-Experiments (DOE) in the development of an optimal passive flow control vane design for a boundary-layer-ingesting (BLI) offset inlet in transonic flow. This inlet flow control is designed to minimize the engine fan-face distortion levels and first five Fourier harmonic half amplitudes while maximizing the inlet pressure recovery. Numerical simulations of the BLI inlet are computed using the Reynolds-averaged Navier-Stokes (RANS) flow solver, OVERFLOW, developed at NASA. These simulations are used to generate the numerical experiments for the DOE response surface model. In this investigation, two DOE optimizations were performed using a D-Optimal Response Surface model. The first DOE optimization used four design factors: vane height and angle-of-attack for two groups of vanes. One group of vanes was placed at the bottom of the inlet and a second group symmetrically on the sides. The DOE design was performed for a BLI inlet with a free-stream Mach number of 0.85 and a Reynolds number of 2 million, based on the length of the fan-face diameter, matching an experimental wind tunnel BLI inlet test. The first DOE optimization required a fifth-order model having 173 numerical simulation experiments and was able to reduce the DC60 baseline distortion from 64% down to 4.4%, while holding the pressure recovery constant. A second DOE optimization was performed holding the vane heights at the constant values from the first DOE optimization, with the two vane angles-of-attack as design factors. This DOE required only a second-order model fit with 15 numerical simulation experiments and reduced DC60 to 3.5%, with small decreases in the fourth and fifth harmonic amplitudes. The second optimal vane design was tested at the NASA Langley 0.3-Meter Transonic Cryogenic Tunnel in a BLI inlet experiment. The experimental results showed an 80% reduction of DPCPavg, the circumferential distortion level at the

  11. Design and Performance of an Automated Bioreactor for Cell Culture Experiments in a Microgravity Environment

    NASA Astrophysics Data System (ADS)

    Kim, Youn-Kyu; Park, Seul-Hyun; Lee, Joo-Hee; Choi, Gi-Hyuk

    2015-03-01

    In this paper, we describe the development of a bioreactor for a cell-culture experiment on the International Space Station (ISS). The bioreactor is an experimental device for culturing mouse muscle cells in a microgravity environment. The purpose of the experiment was to assess the impact of microgravity on the muscles to address the possibility of long-term human residence in space. After investigation of previously developed bioreactors, and analysis of the requirements for microgravity cell culture experiments, a bioreactor design is herein proposed that is able to automatically culture 32 samples simultaneously. This reactor design is capable of automatic control of temperature, humidity, and culture-medium injection rate, and satisfies the interface requirements of the ISS. Since bioreactors are vulnerable to cell contamination, the medium-circulation modules were designed to be completely replaceable, so that the bioreactor can be reused after each experiment. The bioreactor control system is designed to circulate culture media to 32 culture chambers at a maximum speed of 1 ml/min, to maintain the temperature of the reactor at 36°C, and to keep the relative humidity of the reactor above 70%. Because bubbles in the culture media negatively affect cell culture, a de-bubbler unit was provided to eliminate such bubbles. A working model of the reactor was built according to the new design to verify its performance, and was used to perform a cell culture experiment that confirmed the feasibility of this device.

  12. Scaling of Thermal-Hydraulic Experiments for a Space Rankine Cycle and Selection of a Preconceptual Scaled Experiment Design

    SciTech Connect

    Sulfredge, CD

    2006-01-27

    To assist with the development of a space-based Rankine cycle power system using liquid potassium as the working fluid, a study has been conducted on possible scaled experiments with simulant fluids. This report will consider several possible working fluids and describe a scaling methodology to achieve thermal-hydraulic similarity between an actual potassium system and scaled representations of the Rankine cycle boiler or condenser. The most practical scaling approach examined is based on the selection of perfluorohexane (FC-72) as the simulant. Using the scaling methodology, a series of possible solutions have been calculated for the FC-72 boiler and condenser. The possible scaled systems will then be compared and preconceptual specifications and drawings given for the most promising design. The preconceptual design concept will also include integrating the scaled boiler and scaled condenser into a single experimental loop. All the preconceptual system specifications appear practical from a fabrication and experimental standpoint, but further work will be needed to arrive at a final experiment design.

  13. All voices matter in experience design: A commitment to action in engaging patient and family voice.

    PubMed

    Wolf, Jason A

    2016-09-01

    This article frames the broader concept of experience design and the engagement of patient and family voice, reinforcing how closely aligned healthcare professionals are, not only on the value of this work but also in understanding its benefits. When addressing the idea of design, it is important to take the broadest possible construct and consider the engagement of patient and family voices in healthcare operational efforts, not as passive advisors but as active participants in data gathering, providing input, and actual decision-making. The article argues that engagement is not just part of process, facility, or experience design but must be part of the decisions made in how healthcare organizations today are built, led, and sustained, fundamentally reinforcing that the opportunity in healthcare is to focus on overall experience with purpose and intention. This commitment is what will lead to the outcomes all ultimately hope to achieve.

  14. Opto-mechanical design of vacuum laser resonator for the OSQAR experiment

    NASA Astrophysics Data System (ADS)

    Hošek, Jan; Macúchová, Karolina; Nemcová, Šárka; Kunc, Štěpán.; Šulc, Miroslav

    2015-01-01

    This paper gives a short overview of the laser-based OSQAR experiment at CERN, which is focused on the search for axions and axion-like particles. The OSQAR experiment uses two experimental methods for the axion search: measurement of ultra-fine vacuum magnetic birefringence, and a method based on the "light shining through the wall" experiment. Because both experimental methods have reached their attainable limits of sensitivity, we have focused on designing a vacuum laser resonator. The resonator will increase the number of convertible photons and their dwell time within the magnetic field. This paper presents an opto-mechanical design of a two-component transportable vacuum laser resonator. The mechanical design allows the optical resonator to be used as a 0.8-meter-long prototype for laboratory testing; after transportation and replacement of the mirrors, it can be mounted on the LHC magnet at CERN to form a 20-meter-long vacuum laser resonator.

  15. Designing and Developing Game-Like Learning Experience in Virtual Worlds: Challenges and Design Decisions of Novice Instructional Designers

    ERIC Educational Resources Information Center

    Yilmaz, Turkan Karakus; Cagiltay, Kursat

    2016-01-01

    Many virtual worlds have been adopted for implementation within educational settings because they are potentially useful for building effective learning environments. Since the flexibility of virtual worlds makes it challenging to obtain effective and efficient educational outcomes, the design of such platforms needs more attention. In the present study, the…

  16. Effects of Spatial Experiences & Cognitive Styles in the Solution Process of Space-Based Design Problems in the First Year of Architectural Design Education

    ERIC Educational Resources Information Center

    Erkan Yazici, Yasemin

    2013-01-01

    There are many factors that influence designers in the architectural design process. Cognitive style, which varies according to the cognitive structure of persons, and spatial experience, which is built up from spatial data acquired during life, are two of these factors. Designers usually refer to their spatial experiences in order to find solutions…

  17. The NASA Langley Laminar-Flow-Control (LFC) experiment on a swept, supercritical airfoil: Design overview

    NASA Technical Reports Server (NTRS)

    Harris, Charles D.; Harvey, William D.; Brooks, Cuyler W., Jr.

    1988-01-01

    A large-chord, swept, supercritical, laminar-flow-control (LFC) airfoil was designed and constructed and is currently undergoing tests in the Langley 8 ft Transonic Pressure Tunnel. The experiment was directed toward evaluating the compatibility of LFC and supercritical airfoils, validating prediction techniques, and generating a data base for future transport airfoil design as part of NASA's ongoing research program to significantly reduce drag and increase aircraft efficiency. Unique features of the airfoil included a high design Mach number with shock free flow and boundary layer control by suction. Special requirements for the experiment included modifications to the wind tunnel to achieve the necessary flow quality and contouring of the test section walls to simulate free air flow about a swept model at transonic speeds. Design of the airfoil with a slotted suction surface, the suction system, and modifications to the tunnel to meet test requirements are discussed.

  18. Participatory design of computer-supported organizational learning in health care: methods and experiences.

    PubMed

    Timpka, T; Sjöberg, C; Hallberg, N; Eriksson, H; Lindblom, P; Hedblom, P; Svensson, B; Marmolin, H

    1995-01-01

    This paper outlines a Computer-Supported Co-operative Work (CSCW) system for primary care and presents the time consumption, costs, and experiences from its participatory design process. The system integrates a hypermedia environment, a computerized patient record, and an electronic message system. It was developed to coordinate organizational learning in primary care from micro to macro levels by connecting strategic planning to the monitoring of patient routines. Summing up the design experiences, critical issues for making CSCW systems support the cost-effectiveness of health care are discussed.

  19. Design of high-Reynolds-number flat-plate experiments in the NTF

    NASA Technical Reports Server (NTRS)

    Saric, William S.

    1988-01-01

    The design of an experiment to measure skin friction and turbulent boundary layer characteristics at Reynolds numbers exceeding 1 × 10^9 is described. The experiment will be conducted in a zero-pressure-gradient flow on a flat plate in the National Transonic Facility (NTF). The development of computational codes to analyze the aerodynamic loads and the blockage is documented. Novel instrumentation techniques and models, designed to operate in cryogenic environments, are presented. Special problems associated with aerodynamic loads, surface finish, and hot-wire anemometers are discussed.

  20. Participatory design of computer-supported organizational learning in health care: methods and experiences.

    PubMed Central

    Timpka, T.; Sjöberg, C.; Hallberg, N.; Eriksson, H.; Lindblom, P.; Hedblom, P.; Svensson, B.; Marmolin, H.

    1995-01-01

    This paper outlines a Computer-Supported Co-operative Work (CSCW) system for primary care and presents the time consumption, costs, and experiences from its participatory design process. The system integrates a hypermedia environment, a computerized patient record, and an electronic message system. It was developed to coordinate organizational learning in primary care from micro to macro levels by connecting strategic planning to the monitoring of patient routines. Summing up the design experiences, critical issues for making CSCW systems support the cost-effectiveness of health care are discussed. PMID:8563401

  1. Optimizing ELISAs for precision and robustness using laboratory automation and statistical design of experiments.

    PubMed

    Joelsson, Daniel; Moravec, Phil; Troutman, Matthew; Pigeon, Joseph; DePhillips, Pete

    2008-08-20

    Transferring manual ELISAs to automated platforms requires optimizing the assays for each particular robotic platform. These optimization experiments are often time consuming and difficult to perform using a traditional one-factor-at-a-time strategy. In this manuscript we describe the development of an automated process using statistical design of experiments (DOE) to quickly optimize immunoassays for precision and robustness on the Tecan EVO liquid handler. By using fractional factorials and a split-plot design, five incubation time variables and four reagent concentration variables can be optimized in a short period of time.

  2. Radiometric and photometric design for an Acoustic Containerless Experiment System. [for space processing

    NASA Technical Reports Server (NTRS)

    Glavich, T. A.

    1981-01-01

    The design of an optical system for a high temperature Acoustic Containerless Experiment System is examined. The optical system provides two-axis video, cine and infrared images of an acoustically positioned sample over a temperature range of 20 to 1200 C. Emphasis is placed on the radiometric and photometric characterization of the elements in the optical system and the oven to assist image data determination. Sample visibility due to wall radiance is investigated along with visibility due to strobe radiance. The optical system is designed for operation in Spacelab, and is used for a variety of materials processing experiments.

  3. Experiment Design for Complex VTOL Aircraft with Distributed Propulsion and Tilt Wing

    NASA Technical Reports Server (NTRS)

    Murphy, Patrick C.; Landman, Drew

    2015-01-01

    Selected experimental results from a wind tunnel study of a subscale VTOL concept with distributed propulsion and tilt lifting surfaces are presented. The vehicle complexity and automated test facility were ideal for use with a randomized designed experiment. Design of Experiments and Response Surface Methods were invoked to produce run-efficient, statistically rigorous regression models with minimized prediction error. Static tests were conducted at the NASA Langley 12-Foot Low-Speed Tunnel to model all six aerodynamic coefficients over a large flight envelope. This work supports investigations at NASA Langley in developing advanced configurations, simulations, and advanced control systems.

  4. Discussion of “Bayesian design of experiments for industrial and scientific applications via gaussian processes”

    SciTech Connect

    Anderson-Cook, Christine M.; Burke, Sarah E.

    2016-10-18

    First, we would like to commend Dr. Woods on his thought-provoking paper and insightful presentation at the 4th Annual Stu Hunter conference. We think that the material presented highlights some important needs in the area of design of experiments for generalized linear models (GLMs). In addition, we agree with Dr. Woods that design of experiments for GLMs does implicitly require expert judgement about model parameters, and hence using a Bayesian approach to capture this knowledge is a natural strategy to summarize what is known, with the opportunity to incorporate associated uncertainty about that information.
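
    The Bayesian strategy endorsed above can be made concrete with a pseudo-Bayesian D-criterion: average the log-determinant of the GLM's Fisher information over draws from a prior on the parameters. The logistic model, prior, and candidate designs below are illustrative assumptions for the sketch, not material from the discussed paper:

```python
import numpy as np

rng = np.random.default_rng(1)

# Logistic-regression GLM with a single factor x and linear predictor
# eta = b0 + b1 * x.  Expert knowledge about (b0, b1) enters via prior draws.
prior = rng.normal(loc=[0.0, 1.0], scale=[0.5, 0.5], size=(200, 2))

def expected_log_det(design_x):
    """Pseudo-Bayesian D-criterion: E_prior[ log det X'WX ] for a logistic
    model, where W = diag(p(1-p)) depends on the unknown parameters."""
    X = np.column_stack([np.ones(len(design_x)), design_x])
    total = 0.0
    for b0, b1 in prior:
        p = 1.0 / (1.0 + np.exp(-(b0 + b1 * np.asarray(design_x))))
        W = np.diag(p * (1 - p))
        sign, logdet = np.linalg.slogdet(X.T @ W @ X)
        total += logdet
    return total / len(prior)

# Compare two candidate 4-run designs on x in [-3, 3].
spread_out = [-3.0, -1.0, 1.0, 3.0]
clustered = [-0.1, 0.0, 0.0, 0.1]
print("spread-out design:", round(expected_log_det(spread_out), 3))
print("clustered design :", round(expected_log_det(clustered), 3))
```

    Because the weight matrix W depends on the unknown parameters, a design that looks adequate for one parameter guess can be poor on average; taking the expectation over the prior is exactly how the Bayesian approach incorporates that uncertainty.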

  5. Design of a CO2 laser power control system for a Spacelab microgravity experiment

    NASA Technical Reports Server (NTRS)

    Wenzler, Carl J.; Eichenberg, Dennis J.

    1990-01-01

    The surface tension driven convection experiment (STDCE) is a Space Transportation System flight experiment manifested to fly aboard the USML-1 Spacelab mission. A CO2 laser is used to heat a spot on the surface of silicone oil contained inside a test chamber. Several CO2 laser control systems were evaluated and the selected system will be interfaced with the balance of the experimental hardware to constitute a working engineering model. Descriptions and a discussion of these various design approaches are presented.

  6. Building international experiences into an engineering curriculum - a design project-based approach

    NASA Astrophysics Data System (ADS)

    Maldonado, Victor; Castillo, Luciano; Carbajal, Gerardo; Hajela, Prabhat

    2014-07-01

    This paper is a descriptive account of how short-term international and multicultural experiences can be integrated into early design experiences in an aerospace engineering curriculum. Such approaches are considered as important not only in fostering a student's interest in the engineering curriculum, but also exposing them to a multicultural setting that they are likely to encounter in their professional careers. In the broader sense, this programme is described as a model that can be duplicated in other engineering disciplines as a first-year experience. In this study, undergraduate students from Rensselaer Polytechnic Institute (RPI) and Universidad del Turabo (UT) in Puerto Rico collaborated on a substantial design project consisting of designing, fabricating, and flight-testing radio-controlled model aircraft as a capstone experience in a semester-long course on Fundamentals of Flight. The two-week long experience in Puerto Rico was organised into academic and cultural components designed with the following objectives: (i) to integrate students in a multicultural team-based academic and social environment, (ii) to practise team-building skills and develop students' critical thinking and analytical skills, and finally (iii) to excite students about their engineering major through practical applications of aeronautics and help them decide if it is a right fit for them.

  7. The usefulness of systematic reviews of animal experiments for the design of preclinical and clinical studies.

    PubMed

    de Vries, Rob B M; Wever, Kimberley E; Avey, Marc T; Stephens, Martin L; Sena, Emily S; Leenaars, Marlies

    2014-01-01

    The question of how animal studies should be designed, conducted, and analyzed remains underexposed in societal debates on animal experimentation. This is not only a scientific but also a moral question. After all, if animal experiments are not appropriately designed, conducted, and analyzed, the results produced are unlikely to be reliable and the animals have in effect been wasted. In this article, we focus on one particular method to address this moral question, namely systematic reviews of previously performed animal experiments. We discuss how the design, conduct, and analysis of future (animal and human) experiments may be optimized through such systematic reviews. In particular, we illustrate how these reviews can help improve the methodological quality of animal experiments, make the choice of an animal model and the translation of animal data to the clinic more evidence-based, and implement the 3Rs. Moreover, we discuss which measures are being taken and which need to be taken in the future to ensure that systematic reviews will actually contribute to optimizing experimental design and thereby to meeting a necessary condition for making the use of animals in these experiments justified.

  8. Improving Statistical Rigor in Defense Test and Evaluation: Use of Tolerance Intervals in Designed Experiments

    DTIC Science & Technology

    2014-10-01

    first introduced in the seminal paper by Wallis (1951). Wallis extended the previous work of Wald and Wolfowitz (1946) for a normally distributed...Statistical tolerance regions: Theory, applications, and computation (Vol. 744). Hoboken, NJ: John Wiley & Sons. 22Defense ARJ, October 2014...D. C. (2001). Design and analysis of experiments (5th ed.). Hoboken, NJ: John Wiley & Sons. Montgomery, D. C. (2005). Design and analysis of

  9. Muon-catalyzed fusion experiment target and detector system. Preliminary design report

    SciTech Connect

    Jones, S.E.; Watts, K.D.; Caffrey, A.J.; Walter, J.B.

    1982-03-01

    We present detailed plans for the target and particle detector systems for the muon-catalyzed fusion experiment. Requirements imposed on the target vessel by experimental conditions and safety considerations are delineated. Preliminary designs for the target vessel capsule and secondary containment vessel have been developed which meet these requirements. In addition, the particle detection system is outlined, including associated fast electronics and on-line data acquisition. Computer programs developed to study the target and detector system designs are described.

  10. Active structural control design and experiment for the Mini-Mast

    NASA Technical Reports Server (NTRS)

    Wie, Bong; Horta, Lucas; Sulla, Jeff

    1990-01-01

    Control system design and closed-loop test results for the Mini-Mast truss structure located at the NASA Langley Research Center are presented. The simplicity and effectiveness of a classical control approach to the active structural control design are demonstrated by ground experiments. The concepts of robust nonminimum phase compensation and periodic disturbance rejection are also experimentally validated. The practicality of a sensor output decoupling approach is demonstrated for the inherent, multivariable control problem of the Mini-Mast.

  11. Application of Modern Design of Experiments to CARS Thermometry in a Model Scramjet Engine

    NASA Technical Reports Server (NTRS)

    Danehy, P. M.; DeLoach, R.; Cutler, A. D.

    2002-01-01

    We have applied formal experiment design and analysis to optimize the measurement of temperature in a supersonic combustor at NASA Langley Research Center. We used the coherent anti-Stokes Raman spectroscopy (CARS) technique to map the temperature distribution in the flowfield downstream of an 1160 K, Mach 2 freestream into which supersonic hydrogen fuel is injected at an angle of 30 degrees. CARS thermometry is inherently a single-point measurement technique; it was used to map the flow by translating the measurement volume through the flowfield. The method known as "Modern Design of Experiments" (MDOE) was used to estimate the data volume required, design the test matrix, perform the experiment, and analyze the resulting data. MDOE allowed us to match the volume of data acquired to the precision requirements of the customer. Furthermore, one aspect of MDOE, known as response surface methodology, allowed us to develop precise maps of the flowfield temperature, allowing interpolation between measurement points. An analytic function in two spatial variables was fit to the data from a single measurement plane. Fitting with a cosine-series bivariate function allowed the mean temperature to be mapped with 95% confidence interval half-widths of ±30 K, comfortably meeting the ±50 K specified prior to performing the experiments. We estimate that applying MDOE to the present experiment saved a factor of 5 in data volume acquired, compared to experiments executed in the traditional manner. Furthermore, the precision requirements could have been met with less than half the data acquired.
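
    The response-surface step described above (fit an analytic cosine-series function of two spatial variables by least squares, then interpolate between measurement points) is straightforward to sketch. The synthetic temperature field, noise level, and series order below are illustrative assumptions standing in for the CARS data:

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic "temperature map" standing in for the CARS point measurements:
# a smooth field on a measurement plane, sampled at scattered (y, z) points.
n = 120
y = rng.uniform(0, 1, n)
z = rng.uniform(0, 1, n)
T_true = 1200 + 300 * np.cos(np.pi * y) * np.cos(np.pi * z)
T_meas = T_true + rng.normal(0, 30, n)        # ~30 K single-shot noise

def cosine_basis(y, z, order=3):
    """Bivariate cosine series: one column per term cos(i*pi*y)*cos(j*pi*z).
    The i = j = 0 term is the constant (mean-temperature) term."""
    cols = [np.cos(i * np.pi * y) * np.cos(j * np.pi * z)
            for i in range(order) for j in range(order)]
    return np.column_stack(cols)

# Ordinary least-squares fit of the series coefficients.
A = cosine_basis(y, z)
coef, *_ = np.linalg.lstsq(A, T_meas, rcond=None)

# Interpolate the fitted surface at a point that was never measured.
yq, zq = 0.25, 0.5
T_fit = cosine_basis(np.array([yq]), np.array([zq])) @ coef
print("fitted T at (0.25, 0.5):", round(float(T_fit[0]), 1), "K")
```

    Averaging over many noisy points is what drives the confidence-interval half-width of the fitted mean surface well below the single-shot noise, which is the effect the abstract reports.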

  12. Development of display design and command usage guidelines for Spacelab experiment computer applications

    NASA Technical Reports Server (NTRS)

    Dodson, D. W.; Shields, N. L., Jr.

    1979-01-01

    Individual Spacelab experiments are responsible for developing their CRT display formats and interactive command scenarios for payload crew monitoring and control of experiment operations via the Spacelab Data Display System (DDS). In order to enhance crew training and flight operations, it was important to establish some standardization of the crew/experiment interface among different experiments by providing standard methods and techniques for data presentation and experiment commanding via the DDS. In order to establish optimum usage guidelines for the Spacelab DDS, the capabilities and limitations of the hardware and Experiment Computer Operating System design had to be considered. Since the operating system software and hardware design had already been established, the Display and Command Usage Guidelines were constrained to the capabilities of the existing system design. Empirical evaluations were conducted on a DDS simulator to determine optimum operator/system interface utilization of the system capabilities. Display parameters such as information location, display density, data organization, status presentation and dynamic update effects were evaluated in terms of response times and error rates.

  13. Design of Experiments Study of Hydroxyapatite Synthesis for Orthopaedic Application Using Fractional Factorial Design

    NASA Astrophysics Data System (ADS)

    Kehoe, S.; Ardhaoui, M.; Stokes, J.

    2011-11-01

    Hydroxyapatite (HAp), Ca10(PO4)6(OH)2, is a naturally occurring mineral found in the inorganic component of enamel and human bone; the present research therefore focuses on its ability to promote bone growth onto femoral implants when the HAp powder is deposited by plasma thermal spraying. Because the sprayed deposit must meet certain mechanical and biological performance requirements, and these properties derive from the characteristics of the starting HAp powder, HAp powders were synthesized via a wet chemical precipitation technique using a fractional factorial, Resolution IV, two-level experimental design to evaluate the critical process parameters (reagent addition rate, reaction temperature, stirring speed, ripening time, initial calcium concentration, and the presence of an inert atmosphere) and their main and interaction effects on the final HAp powder characteristics, such as phase composition, purity, crystallinity, crystallite size, lattice parameters, particle size, and particle size distribution. All six selected variables showed an influence (of minor or major significance) on one or more of the responses, either as a main or an interaction effect. However, both the ripening time and the stirring speed were found to significantly affect the majority of the five responses, with the reaction temperature also having a significant effect on the final phase composition, lattice parameters, and particle size.
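
    A two-level Resolution IV fractional factorial for six factors can be generated in a few lines. The generator choice below (E = ABC, F = BCD) is a standard 2^(6-2) construction used here for illustration; the paper's actual aliasing structure is not stated.

```python
from itertools import product

import numpy as np

# Full 2^4 factorial in the four base factors, coded -1/+1
base = np.array(list(product([-1, 1], repeat=4)))
A, B, C, D = base.T

# Generators E = ABC and F = BCD give a 16-run 2^(6-2) design of
# Resolution IV: main effects are aliased only with three-factor
# (or higher-order) interactions, so six process parameters can be
# screened in 16 runs instead of the 64 a full factorial would need.
E = A * B * C
F = B * C * D
design = np.column_stack([A, B, C, D, E, F])
```

    Each row is one synthesis run; each column is one coded factor (e.g. reagent addition rate low/high). The columns are mutually orthogonal, so the six main effects are estimated independently of one another.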

  14. Teaching examples for the design of experiments: geographical sensitivity and the self-fulfilling prophecy.

    PubMed

    Lendrem, Dennis W; Lendrem, B Clare; Rowland-Jones, Ruth; D'Agostino, Fabio; Linsley, Matt; Owen, Martin R; Isaacs, John D

    2016-01-01

    Many scientists believe that small experiments, guided by scientific intuition, are simpler and more efficient than design of experiments. This belief is strong and persists even in the face of data demonstrating that it is clearly wrong. In this paper, we present two powerful teaching examples illustrating the dangers of small experiments guided by scientific intuition. We describe two simple, two-dimensional spaces. These spaces give rise to, and at the same time appear to generate supporting data for, scientific intuitions that are deeply flawed or wholly incorrect. We find these spaces useful in unfreezing scientific thinking and challenging misplaced confidence in scientific intuition.

  15. Analytical and experimental investigation of liquid double drop dynamics: Preliminary design for space shuttle experiments

    NASA Technical Reports Server (NTRS)

    1981-01-01

    The preliminary grant assessed the use of laboratory experiments for simulating low-g liquid drop experiments in the space shuttle environment. Investigations were begun of appropriate immiscible liquid systems, design of experimental apparatus, and analyses. The current grant continued these topics, completed construction and preliminary testing of the experimental apparatus, and performed experiments on single and compound liquid drops. A continuing assessment of laboratory capabilities and the interests of project personnel and available collaborators led, after consultations with NASA personnel, to a research emphasis on compound drops consisting of hollow plastic or elastic spheroids filled with liquids.

  16. Optimization of detection sensitivity for a Fiber Optic Intrusion Detection System (FOIDS) using design of experiments.

    SciTech Connect

    Miller, Larry D.; Mack, Thomas Kimball; Mitchiner, Kim W.; Varoz, Carmella A.

    2010-06-01

    The Fiber Optic Intrusion Detection System (FOIDS) is a physical security sensor deployed on fence lines to detect climb or cut intrusions by adversaries. Calibration of detection sensitivity can be time consuming because, for example, the FiberSenSys FD-332 has 32 settings that can be adjusted independently to provide a balance between a high probability of detection and a low nuisance alarm rate. Therefore, an efficient method of calibrating the FOIDS in the field, other than by trial and error, was needed. This study was conducted to: identify the most significant settings for controlling detection; develop a way of predicting detection sensitivity for given settings; and develop a set of optimal settings for validation. The Design of Experiments (DoE) methodology was used to generate small, planned test matrices, which could be statistically analyzed to yield more information from the test data. Design of Experiments is a statistical methodology for quickly optimizing performance of systems with measurable input and output variables. DoE was used to design custom screening experiments based on the 11 FOIDS settings believed to have the most effect on detection. Two types of fence perimeter intrusions were evaluated: simulated cut intrusions and actual climb intrusions. Two slightly different two-level randomized fractional factorial designed experiment matrices, each consisting of 16 unique experiments, were performed in the field for each type of intrusion. Three repetitions were conducted for every cut test; two repetitions were conducted for every climb test. The total number of cut tests analyzed was 51; the total number of climb tests was 38. This paper discusses the results and benefits of using Design of Experiments (DoE) to calibrate and optimize the settings for a FOIDS sensor.

  17. Mathematical models as aids for design and development of experiments: the case of transgenic mosquitoes.

    PubMed

    Robert, Michael A; Legros, Mathieu; Facchinelli, Luca; Valerio, Laura; Ramsey, Janine M; Scott, Thomas W; Gould, Fred; Lloyd, Alun L

    2012-11-01

    We demonstrate the utility of models as aids in the design and development of experiments aimed at measuring the effects of proposed vector population control strategies. We describe the exploration of a stochastic, age-structured model that simulates field cage experiments that test the ability of a female-killing strain of the mosquito Aedes aegypti (L.) to suppress a wild-type population. Model output predicts that choices of release ratio and population size can impact mean extinction time and variability in extinction time among experiments. We find that unless fitness costs are >80% they will not be detectable in experiments with high release ratios. At lower release ratios, the predicted length of the experiment increases significantly for fitness costs >20%. Experiments with small populations may more accurately reflect field conditions, but extinction can occur even in the absence of a functional female-killing construct because of stochastic effects. We illustrate how the model can be used to explore experimental designs that aim to study the impact of density dependence and immigration; predictions indicate that cage population eradication may not always be obtainable in an operationally realistic time frame. We propose a method to predict the extinction time of a cage population based on the rate of population reduction with the goal of shortening the duration of the experiment. We discuss the model as a tool for exploring and assessing the utility of a wider range of scenarios than would be feasible to test experimentally because of financial and temporal restraints.

  18. Specifications for and preliminary design of a plant growth chamber for orbital experiments

    NASA Technical Reports Server (NTRS)

    Sweet, H. C.; Simmonds, R. C.

    1976-01-01

    It was proposed that plant experiments be performed on board the space shuttle. To permit the proper execution of most tests, the craft must contain a plant growth chamber designed to control those environmental factors which can induce changes in a plant's physiology and morphology. The various needs of, and environmental factors affecting, plants are identified. The permissible design, construction, and performance limits for a plant growth chamber are set, and tentative designs are presented for units which are compatible with both the botanical requirements and the constraints imposed by the space shuttle.

  19. Optimizing an experimental design for a CSEM experiment: methodology and synthetic tests

    NASA Astrophysics Data System (ADS)

    Roux, E.; Garcia, X.

    2014-04-01

    Optimizing an experimental design is a compromise between maximizing the information we obtain about the target and limiting the cost of the experiment, subject to a wide range of constraints. We present a statistical algorithm for experiment design that combines linearized inverse theory and a stochastic optimization technique. Linearized inverse theory is used to quantify the quality of a given experiment design, while a genetic algorithm (GA) enables us to examine a wide range of possible surveys. The particularity of our algorithm is the use of the multi-objective GA NSGA-II, which searches for designs that fit several objective functions (OFs) simultaneously. This ability of NSGA-II helps us define an experiment design that focuses on a specified target area. We present a test of our algorithm using a 1-D electrical subsurface structure. The model we use represents a simple but realistic scenario in the context of CO2 sequestration that motivates this study. Our first synthetic test using a single OF shows that a limited number of well-distributed observations from a chosen design have the potential to resolve the given model. This synthetic test also points out the importance of a well-chosen OF, depending on our target. In order to improve these results, we show how the combination of two OFs using a multi-objective GA enables us to determine an experimental design that maximizes information about the reservoir layer. Finally, we present several tests of our statistical algorithm in more challenging environments by exploring the influence of noise, specific site characteristics, and its potential for reservoir monitoring.
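
    The selection step at the core of a multi-objective GA such as NSGA-II is non-dominated (Pareto) sorting of the candidate designs under the several OFs. A minimal sketch, with two made-up objective values per candidate survey standing in for the paper's model-resolution metrics:

```python
import numpy as np

def pareto_front(costs):
    """Return indices of non-dominated rows (minimization on all objectives).

    A row is dominated if some other row is no worse on every objective
    and strictly better on at least one.
    """
    front = []
    for i in range(costs.shape[0]):
        dominated = np.any(np.all(costs <= costs[i], axis=1) &
                           np.any(costs < costs[i], axis=1))
        if not dominated:
            front.append(i)
    return front

# Five candidate survey designs scored on two objectives (lower is better)
costs = np.array([[1.0, 4.0],
                  [2.0, 2.0],
                  [4.0, 1.0],
                  [3.0, 3.0],
                  [4.0, 4.0]])
front = pareto_front(costs)  # designs 0, 1, 2 are Pareto-optimal
```

    NSGA-II additionally ranks successive fronts and applies a crowding-distance criterion to keep the front well spread, but this dominance test is the heart of the selection.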

  20. Evaluation for the design of experience in virtual environments: modeling breakdown of interaction and illusion.

    PubMed

    Marsh, T; Wright, P; Smith, S

    2001-04-01

    New and emerging media technologies have the potential to induce a variety of experiences in users. In this paper, it is argued that the inducement of experience presupposes that users are absorbed in the illusion created by these media. Looking to another successful visual medium, film, this paper borrows from the techniques used in "shaping experience" to hold spectators' attention in the illusion of film, and identifies what breaks the illusion/experience for spectators. This paper focuses on one medium, virtual reality (VR), and advocates a transparent or "invisible style" of interaction. We argue that transparency keeps users in the "flow" of their activities and consequently enhances experience in users. Breakdown in activities breaks the experience and subsequently provides opportunities to identify and analyze potential causes of usability problems. Adopting activity theory, we devise a model of interaction with VR--through consciousness and activity--and introduce the concept of breakdown in illusion. From this, a model of effective interaction with VR is devised and the occurrence of breakdown in interaction and illusion is identified along a continuum of engagement. Evaluation guidelines for the design of experience are proposed and applied to usability problems detected in an empirical study of a head-mounted display (HMD) VR system. This study shows that the guidelines are effective in the evaluation of VR. Finally, we look at the potential experiences that may be induced in users and propose a way to evaluate user experience in virtual environments (VEs) and other new and emerging media.

  1. Fluxes at experiment facilities in HEU and LEU designs for the FRM-II.

    SciTech Connect

    Hanan, N. A.

    1998-01-16

    An Alternative LEU Design for the FRM-II proposed by the RERTR Program at Argonne National Laboratory (ANL) has a compact core consisting of a single fuel element that uses LEU silicide fuel with a uranium density of 4.5 g/cm{sup 3} and has a power level of 32 MW. Both the HEU design by the Technical University of Munich (TUM) and the alternative LEU design by ANL have the same fuel lifetime (50 days) and the same neutron flux performance (8 x 10{sup 14} n/cm{sup 2}-s in the reflector). LEU silicide fuel with 4.5 g/cm{sup 3} has been thoroughly tested and is fully-qualified, licensable, and available now for use in a high flux reactor such as the FRM-II. Several issues that were raised by TUM have been addressed in Refs. 1-3. The conclusions of these analyses are summarized below. This paper addresses four additional issues that have been raised in several forums, including Ref. 4: heat generation in the cold neutron source (CNS); the gamma and fast neutron fluxes, which are components of the reactor noise in neutron scattering experiments in the experiment hall of the reactor; a fuel cycle length difference; and the reactivity worth of the beam tubes and other experiment facilities. The results show that: (a) for the same thermal neutron flux, the neutron and gamma heating in the CNS is smaller in the LEU design than in the HEU design, and cold neutron fluxes as good as or better than those of the HEU design can be obtained with the LEU design; (b) the gamma and fast neutron components of the reactor noise in the experiment hall are about the same in both designs; (c) the fuel cycle length is 50 days for both designs; and (d) the absolute value of the reactivity worth of the beam tubes and other experiment facilities is smaller in the LEU design, allowing its fuel cycle length to be increased to 53 or 54 days. Based on the excellent results for the Alternative LEU Design that were obtained in all analyses, the RERTR Program reiterates its conclusion that there are no

  2. Real-time PCR probe optimization using design of experiments approach.

    PubMed

    Wadle, S; Lehnert, M; Rubenwolf, S; Zengerle, R; von Stetten, F

    2016-03-01

    Primer and probe sequence designs are among the most critical input factors in real-time polymerase chain reaction (PCR) assay optimization. In this study, we present the use of statistical design of experiments (DOE) approach as a general guideline for probe optimization and more specifically focus on design optimization of label-free hydrolysis probes that are designated as mediator probes (MPs), which are used in reverse transcription MP PCR (RT-MP PCR). The effect of three input factors on assay performance was investigated: distance between primer and mediator probe cleavage site; dimer stability of MP and target sequence (influenza B virus); and dimer stability of the mediator and universal reporter (UR). The results indicated that the latter dimer stability had the greatest influence on assay performance, with RT-MP PCR efficiency increased by up to 10% with changes to this input factor. With an optimal design configuration, a detection limit of 3-14 target copies/10 μl reaction could be achieved. This improved detection limit was confirmed for another UR design and for a second target sequence, human metapneumovirus, with 7-11 copies/10 μl reaction detected in an optimum case. The DOE approach for improving oligonucleotide designs for real-time PCR not only produces excellent results but may also reduce the number of experiments that need to be performed, thus reducing costs and experimental times.
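
    With three two-level factors, a 2^3 full factorial supports a simple main-effects analysis. The factor coding and response values below are invented for illustration (the paper reports detection limits and efficiency changes, not these numbers), with the third column playing the role of the mediator/UR dimer stability.

```python
from itertools import product

import numpy as np

# Coded 2^3 full factorial: columns are (primer-to-cleavage-site distance,
# MP/target dimer stability, mediator/UR dimer stability), levels -1/+1
X = np.array(list(product([-1, 1], repeat=3)))

# Hypothetical PCR efficiency (%) for each of the eight runs
y = np.array([80.0, 88.0, 81.0, 89.0, 82.0, 90.0, 83.0, 91.0])

# Main effect of factor k = mean(y at +1) - mean(y at -1)
main_effects = (X * y[:, None]).sum(axis=0) / (len(y) / 2)
# In this made-up data the mediator/UR stability (third factor) dominates,
# mirroring the paper's finding that it had the greatest influence.
```

    The same contrast arithmetic extends to interaction effects by multiplying factor columns together, which is why a DOE layout extracts more information per run than one-factor-at-a-time testing.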

  3. Mobile App Design for Teaching and Learning: Educators' Experiences in an Online Graduate Course

    ERIC Educational Resources Information Center

    Hsu, Yu-Chang; Ching, Yu-Hui

    2013-01-01

    This research explored how educators with limited programming experiences learned to design mobile apps through peer support and instructor guidance. Educators were positive about the sense of community in this online course. They also considered App Inventor a great web-based visual programming tool for developing useful and fully functioning…

  4. Developing Vocabulary and Conceptual Knowledge for Low-Income Preschoolers: A Design Experiment

    ERIC Educational Resources Information Center

    Neuman, Susan B.; Dwyer, Julie

    2011-01-01

    The purpose of this design experiment was to research, test, and iteratively derive principles of word learning and word organization that could help to theoretically advance our understanding of vocabulary development for low-income preschoolers. Six Head Start teachers in morning and afternoon programs and their children (N = 89) were selected…

  5. Hydrogel design of experiments methodology to optimize hydrogel for iPSC-NPC culture.

    PubMed

    Lam, Jonathan; Carmichael, S Thomas; Lowry, William E; Segura, Tatiana

    2015-03-11

    Bioactive signals can be incorporated into hydrogels to direct encapsulated cell behavior. Design of experiments methodology varies these signals systematically to determine the individual and combinatorial effects of each factor on cell activity. Using this approach enables the optimization of three ligand concentrations (RGD, YIGSR, IKVAV) for the survival and differentiation of neural progenitor cells.

  6. Enhancing Research and Practice in Early Childhood through Formative and Design Experiments

    ERIC Educational Resources Information Center

    Bradley, Barbara A.; Reinking, David

    2011-01-01

    This article describes formative and design experiments and how they can advance research and instructional practices in early childhood education. We argue that this relatively new approach to education research closes the gap between research and practice, and it addresses limitations that have been identified in early childhood research. We…

  7. Kinetic resolution of oxazinones: rational exploration of chemical space through the design of experiments.

    PubMed

    Renzi, Polyssena; Kronig, Christel; Carlone, Armando; Eröksüz, Serap; Berkessel, Albrecht; Bella, Marco

    2014-09-08

    The organocatalytic kinetic resolution of 4-substituted oxazinones has been optimised (selectivity factor S up to 98, chiral oxazinone ee values up to 99.6 % (1 a-g) and product ee values up to 90 % (3 a-g)) in a rational way by applying the Design of Experiments (DoE) approach.

  8. Exploring a Comprehensive Model for Early Childhood Vocabulary Instruction: A Design Experiment

    ERIC Educational Resources Information Center

    Wang, X. Christine; Christ, Tanya; Chiu, Ming Ming

    2014-01-01

    Addressing a critical need for effective vocabulary practices in early childhood classrooms, we conducted a design experiment to achieve three goals: (1) developing a comprehensive model for early childhood vocabulary instruction, (2) examining the effectiveness of this model, and (3) discerning the contextual conditions that hinder or facilitate…

  9. Conducting Design Experiments to Support Teachers' Learning: A Reflection from the Field

    ERIC Educational Resources Information Center

    Cobb, Paul; Zhao, Qing; Dean, Chrystal

    2009-01-01

    This article focuses on 3 conceptual challenges that we sought to address while conducting a design experiment in which we supported the learning of a group of middle school mathematics teachers. These challenges involved (a) situating teachers' activity in the institutional setting of the schools and district in which they worked, (b) developing…

  10. A Continuous Community Pharmacy Practice Experience: Design and Evaluation of Instructional Materials.

    ERIC Educational Resources Information Center

    Thomas, Selby Greer; And Others

    1996-01-01

    A two-year community pharmacy clinical experience using self-directed learning modules is described and evaluated. The modules were designed to stimulate interest in community pharmacy, motivate learning by demonstrating applicability of didactic work to contemporary practice, develop communication and psychosocial skills, and promote…

  11. Small-Scale Design Experiments as Working Space for Larger Mobile Communication Challenges

    ERIC Educational Resources Information Center

    Lowe, Sarah; Stuedahl, Dagny

    2014-01-01

    In this paper, a design experiment using Instagram as a cultural probe is submitted as a method for analyzing the challenges that arise when considering the implementation of social media within a distributed communication space. It outlines how small, iterative investigations can reveal deeper research questions relevant to the education of…

  12. Sharing Designer and User Perspectives of Web Site Evaluation: A Cross-Campus Collaborative Learning Experience.

    ERIC Educational Resources Information Center

    Collings, Penny; Pearce, Jon

    2002-01-01

    Presents an online collaborative process that facilitates usability evaluation of Web sites. Describes how the system was designed and used by students and staff at two Australian universities and shows that the process provides feedback on Web site usability and the experience of usability evaluation from the perspectives of a user and a…

  13. Board Games and Board Game Design as Learning Tools for Complex Scientific Concepts: Some Experiences

    ERIC Educational Resources Information Center

    Chiarello, Fabio; Castellano, Maria Gabriella

    2016-01-01

    In this paper the authors report different experiences in the use of board games as learning tools for complex and abstract scientific concepts such as Quantum Mechanics, Relativity or nano-biotechnologies. In particular we describe "Quantum Race," designed for the introduction of Quantum Mechanical principles, "Lab on a chip,"…

  14. Students' Sense of Community Based on Experiences with Residence Hall Design

    ERIC Educational Resources Information Center

    Heasley, Christopher L.

    2013-01-01

    This study seeks to determine students' sense of community outcomes based on experiences with different residence hall architectural designs. Sense of community is a "feeling that members have of belonging, a feeling that members matter to one another and to the group, and a shared faith that members' needs will be met through their…

  15. Designing Experiments on Thermal Interactions by Secondary-School Students in a Simulated Laboratory Environment

    ERIC Educational Resources Information Center

    Lefkos, Ioannis; Psillos, Dimitris; Hatzikraniotis, Euripides

    2011-01-01

    Background and purpose: The aim of this study was to explore the effect of investigative activities with manipulations in a virtual laboratory on students' ability to design experiments. Sample: Fourteen students in a lower secondary school in Greece attended a teaching sequence on thermal phenomena based on the use of information and…

  16. A Discovery-Based Friedel-Crafts Acylation Experiment: Student-Designed Experimental Procedure

    ERIC Educational Resources Information Center

    Reeve, Anne McElwee

    2004-01-01

    The unique experience of designing and conducting a reaction is afforded to the students, in a way similar to what one would do in a research lab. The discovery-based procedure encouraged them to think about each procedural step and its purpose.

  17. Design a Contract: A Simple Principal-Agent Problem as a Classroom Experiment

    ERIC Educational Resources Information Center

    Gachter, Simon; Konigstein, Manfred

    2009-01-01

    The authors present a simple classroom experiment that can be used as a teaching device to introduce important concepts of organizational economics and incentive contracting. First, students take the role of a principal and design a contract that consists of a fixed payment and an incentive component. Second, students take the role of agents and…

  18. Design of experiments (DOE) - history, concepts, and relevance to in vitro culture

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Design of experiments (DOE) is a large and well-developed field for understanding and improving the performance of complex systems. Because in vitro culture systems are complex, but easily manipulated in controlled conditions, they are particularly well-suited for the application of DOE principle...

  19. The Scope and Design of Structured Group Learning Experiences at Community Colleges

    ERIC Educational Resources Information Center

    Hatch, Deryl K.; Bohlig, E. Michael

    2015-01-01

    This study explores through descriptive analysis the similarities of structured group learning experiences such as first-year seminars, learning communities, orientation, success courses, and accelerated developmental education programs, in terms of their design features and implementation at community colleges. The study takes as its conceptual…

  20. Improved Hyperspectral Image Testing Using Synthetic Imagery and Factorial Designed Experiments

    DTIC Science & Technology

    2007-03-01

    Wackerly, Mendenhall, and Scheaffer fully detail the designing of experiments in their respective texts. For purposes of this thesis, a simplified explanation of...Air Force Institute of Technology (AU), Wright-Patterson AFB OH, March 2007. Wackerly, Dennis D., William Mendenhall, and Richard L. Scheaffer

  1. Participating with Experience--A Case Study of Students as Co-Producers of Course Design

    ERIC Educational Resources Information Center

    Reneland-Forsman, Linda

    2016-01-01

    Higher Education (HE) needs to handle a diverse student population. The role of student expectations and previous experience is a key to fully participate. This study investigates student meaning making and interaction in a course designed to stimulate student as co-creators of course content and aims. Results revealed that rich communication…

  2. Recommended practices in elevated temperature design: A compendium of breeder reactor experiences (1970-1986): An overview

    SciTech Connect

    Wei, B.C.; Cooper, W.L. Jr.; Dhalla, A.K.

    1987-09-01

    Significant experience has been accumulated in the establishment of design methods and criteria applicable to the design of Liquid Metal Fast Breeder Reactor (LMFBR) components. The Subcommittee on Elevated Temperature Design under the Pressure Vessel Research Council (PVRC) has undertaken to collect, on an international basis, the design experience gained and the lessons learned, to provide guidelines for next-generation advanced reactor designs. This paper presents an overview and describes the highlights of the work.

  3. Lattice design of the integrable optics test accelerator and optical stochastic cooling experiment at Fermilab

    SciTech Connect

    Kafka, Gene

    2015-05-01

    The Integrable Optics Test Accelerator (IOTA) storage ring at Fermilab will serve as the backbone for a broad spectrum of Advanced Accelerator R&D (AARD) experiments, and as such, must be designed with significant flexibility in mind, but without compromising cost efficiency. The nonlinear experiments at IOTA will include: achievement of a large nonlinear tune shift/spread without degradation of dynamic aperture; suppression of strong lattice resonances; study of stability of nonlinear systems to perturbations; and studies of different variants of nonlinear magnet design. The ring optics control has challenging requirements that reach or exceed the present state of the art. The development of a complete self-consistent design of the IOTA ring optics, meeting the demands of all planned AARD experiments, is presented. Of particular interest are the precise control for nonlinear integrable optics experiments and the transverse-to-longitudinal coupling and phase stability for the Optical Stochastic Cooling Experiment (OSC). Since the beam time-of-flight must be tightly controlled in the OSC section, studies of second order corrections in this section are presented.

  4. Aspects of experimental design for plant metabolomics experiments and guidelines for growth of plant material.

    PubMed

    Gibon, Yves; Rolin, Dominique

    2012-01-01

    Experiments involve the deliberate variation of one or more factors in order to provoke responses, the identification of which then provides the first step towards functional knowledge. Because environmental, biological, and/or technical noise is unavoidable, biological experiments usually need to be designed. Thus, once the major sources of experimental noise have been identified, individual samples can be grouped, randomised, and/or pooled. Like other 'omics approaches, metabolomics is characterised by the number of analytes largely exceeding the number of samples. While this singularity, unprecedented in biology, dramatically increases the risk of false discovery, experimental error can nevertheless be decreased in plant metabolomics experiments. To this end, each step from plant cultivation to data acquisition needs to be evaluated in order to identify the major sources of error, so that an appropriate design can be produced, as with any other experimental approach. The choice of technology, the time at which tissues are harvested, and the way metabolism is quenched also need to be taken into consideration, as they determine which metabolites can be studied. A further recommendation is to document data and metadata in a machine-readable way; the metadata should describe every aspect of the experiment. This should provide valuable hints for future experimental design and ultimately give metabolomic data a second life. To facilitate the identification of critical steps, a list of items to be considered before embarking on time-consuming and costly metabolomic experiments is proposed.
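
    The grouping and randomisation advice can be made concrete with a randomized complete block design for harvesting: each block contains one replicate of every genotype, and harvest order is shuffled within blocks, so harvest time is not confounded with genotype. The genotype names and replicate counts here are purely illustrative.

```python
import random

genotypes = ["WT", "mutA", "mutB", "mutC"]   # hypothetical plant lines
n_blocks = 6                                 # one biological replicate per block

rng = random.Random(42)                      # fixed seed: a reproducible plan
blocks = []
for b in range(n_blocks):
    block = [(g, b) for g in genotypes]      # one sample of each genotype
    rng.shuffle(block)                       # randomize harvest order in-block
    blocks.append(block)

# Flatten into the order in which samples are actually harvested and quenched
harvest_order = [sample for block in blocks for sample in block]
```

    Writing the plan as data like this also makes it trivial to store alongside the metadata, so the randomisation itself is documented in a machine-readable way.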

  5. Edesign: Primer and Enhanced Internal Probe Design Tool for Quantitative PCR Experiments and Genotyping Assays.

    PubMed

    Kimura, Yasumasa; Soma, Takahiro; Kasahara, Naoko; Delobel, Diane; Hanami, Takeshi; Tanaka, Yuki; de Hoon, Michiel J L; Hayashizaki, Yoshihide; Usui, Kengo; Harbers, Matthias

    2016-01-01

    Analytical PCR experiments preferably use internal probes for monitoring the amplification reaction and specific detection of the amplicon. Such internal probes have to be designed in close context with the amplification primers, and may require additional considerations for the detection of genetic variations. Here we describe Edesign, a new online and stand-alone tool for designing sets of PCR primers together with an internal probe for conducting quantitative real-time PCR (qPCR) and genotypic experiments. Edesign can be used for selecting standard DNA oligonucleotides like for instance TaqMan probes, but has been further extended with new functions and enhanced design features for Eprobes. Eprobes, with their single thiazole orange-labelled nucleotide, allow for highly sensitive genotypic assays because of their higher DNA binding affinity as compared to standard DNA oligonucleotides. Using new thermodynamic parameters, Edesign considers unique features of Eprobes during primer and probe design for establishing qPCR experiments and genotyping by melting curve analysis. Additional functions in Edesign allow probe design for effective discrimination between wild-type sequences and genetic variations either using standard DNA oligonucleotides or Eprobes. Edesign can be freely accessed online at http://www.dnaform.com/edesign2/, and the source code is available for download.

  6. Design Optimization of PZT-Based Piezoelectric Cantilever Beam by Using Computational Experiments

    NASA Astrophysics Data System (ADS)

    Kim, Jihoon; Park, Sanghyun; Lim, Woochul; Jang, Junyong; Lee, Tae Hee; Hong, Seong Kwang; Song, Yewon; Sung, Tae Hyun

    2016-08-01

    Piezoelectric energy harvesting is gaining huge research interest since it provides high power density and has real-life applicability. However, investigating the mechanical-electrical coupling phenomenon remains challenging. Many researchers depend on physical experiments to choose devices with the best performance that meet design objectives through case analysis; this involves high design costs. This study aims to develop a practical model using computer simulations and to propose an optimized design for a lead zirconate titanate (PZT)-based piezoelectric cantilever beam which is widely used in energy harvesting. In this study, commercial finite element (FE) software is used to predict the voltage generated from vibrations of the PZT-based piezoelectric cantilever beam. Because the initial FE model differs from physical experiments, the model is calibrated by multi-objective optimization to increase the accuracy of the predictions. We collect data from physical experiments using the cantilever beam and use these experimental results in the calibration process. Since dynamic analysis in the FE analysis of the piezoelectric cantilever beam with a dense step size is considerably time-consuming, a surrogate model is employed for efficient optimization. Through the design optimization of the PZT-based piezoelectric cantilever beam, a high-performance piezoelectric device was developed. The sensitivity of the variables at the optimum design is analyzed to suggest a further improved device.
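
    The sample-then-surrogate workflow described above can be sketched in miniature: evaluate a few design points, fit a cheap quadratic surrogate, and optimize the surrogate instead of the expensive model. The "simulation" below is a made-up stand-in for a finite-element run, not the paper's calibrated model:

```python
import numpy as np

# Toy "expensive simulation": peak harvester voltage as a function of
# beam length (a hypothetical response standing in for an FE run).
def simulate_voltage(length_mm):
    return -0.02 * (length_mm - 45.0) ** 2 + 12.0

# Sample a handful of design points (the DOE), fit a quadratic
# surrogate, then optimize the cheap surrogate instead of the model.
lengths = np.linspace(30.0, 60.0, 7)
voltages = np.array([simulate_voltage(L) for L in lengths])
coeffs = np.polyfit(lengths, voltages, deg=2)   # [a, b, c]
surrogate = np.poly1d(coeffs)

# Vertex of the fitted quadratic = predicted optimal beam length.
best_length = -coeffs[1] / (2.0 * coeffs[0])
```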

  7. Design of interferometer system on Versatile Experiment Spherical Torus (VEST) at Seoul National University

    NASA Astrophysics Data System (ADS)

    Choi, D. H.; An, Y. H.; Chung, K. J.; Hwang, Y. S.

    2012-01-01

    A 94 GHz heterodyne interferometer system was designed to measure the plasma density of VEST (Versatile Experiment Spherical Torus), which was recently built at Seoul National University. Two 94 GHz Gunn oscillators with a frequency difference of 40 MHz were used in the microwave electronics part of the heterodyne interferometer system. A compact beam focusing system utilizing a pair of plano-convex lenses and a concave mirror was designed to maximize the effective beam reception and spatial resolution. Beam path analysis based on Gaussian optics was used in the design of the beam focusing system. The beam focusing system design and the beam path analysis were verified with experiments conducted at the real dimensions of the vacuum vessel. Optimum distances between the optical components and the beam radii along the beam path obtained from the experiments were in good agreement with the beam path analysis using Gaussian optics. Both experimentation and numerical calculations confirmed that the designed beam focusing system maximized the spatial resolution of the measurement and that the beam waist was located at the center of the plasma, where it generates a phase shift most effectively. The interferometer system presented in this paper is expected to be used in the measurements of line-integrated plasma densities during the start-up phase of VEST.
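
    The Gaussian-optics relation behind this kind of beam-path analysis is compact enough to sketch: the beam radius a distance z from a waist of radius w0 is w(z) = w0 * sqrt(1 + (z / z_R)^2) with Rayleigh range z_R = pi * w0^2 / lambda. The waist size below is illustrative, not the published VEST design value:

```python
import math

C = 299_792_458.0   # speed of light, m/s

def beam_radius(z, w0, freq_hz):
    """Gaussian beam radius w(z) a distance z (m) from a waist w0 (m)."""
    lam = C / freq_hz
    z_r = math.pi * w0 ** 2 / lam        # Rayleigh range
    return w0 * math.sqrt(1.0 + (z / z_r) ** 2)

# Example: a 94 GHz beam (wavelength ~3.19 mm) focused to an assumed
# 20 mm waist, evaluated half a metre downstream.
w_at_half_meter = beam_radius(0.5, 0.020, 94e9)
```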

  8. Design and development status of ETS-7, an RVD and space robot experiment satellite

    NASA Technical Reports Server (NTRS)

    Oda, M.; Inagaki, T.; Nishida, M.; Kibe, K.; Yamagata, F.

    1994-01-01

    ETS-7 (Engineering Test Satellite #7) is an experimental satellite for the in-orbit experiment of the Rendezvous Docking (RVD) and the space robot (RBT) technologies. ETS-7 is a set of two satellites, a chaser satellite and a target satellite. Both satellites will be launched together by NASDA's H-2 rocket into a low earth orbit. Development of ETS-7 started in 1990. Basic design and EM (Engineering Model) development are in progress now in 1994. The satellite will be launched in mid 1997 and the above in-orbit experiments will be conducted for 1.5 years. Design of ETS-7 RBT experiment system and development status are described in this paper.

  9. Efficient Testing Combining Design of Experiment and Learn-to-Fly Strategies

    NASA Technical Reports Server (NTRS)

    Murphy, Patrick C.; Brandon, Jay M.

    2017-01-01

    Rapid modeling and efficient testing methods are important in a number of aerospace applications. In this study, efficient testing strategies were evaluated in a wind tunnel test environment and combined to suggest a promising approach for both ground-based and flight-based experiments. Benefits of using Design of Experiment techniques, well established in scientific, military, and manufacturing applications, are evaluated in combination with newly developing methods for global nonlinear modeling. The nonlinear modeling methods, referred to as Learn-to-Fly methods, utilize fuzzy logic and multivariate orthogonal function techniques that have been successfully demonstrated in flight test. The blended approach presented has a focus on experiment design and identifies a sequential testing process with clearly defined completion metrics that produce increased testing efficiency.
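
    The structured test matrices that Design of Experiment methods substitute for one-factor-at-a-time sweeps can be generated mechanically. A minimal two-level full factorial sketch (the factor names and ranges are illustrative, not the study's actual test conditions):

```python
from itertools import product

def full_factorial(levels_per_factor):
    """Two-level full factorial design.

    levels_per_factor: dict mapping factor name -> (low, high).
    Returns one dict per run covering every low/high combination.
    """
    names = list(levels_per_factor)
    runs = product(*(levels_per_factor[n] for n in names))
    return [dict(zip(names, run)) for run in runs]

# Hypothetical wind tunnel factors; 2^3 = 8 runs cover the corners
# of the test envelope.
design = full_factorial({
    "alpha_deg": (-4.0, 12.0),   # angle of attack
    "beta_deg": (-2.0, 2.0),     # sideslip
    "mach": (0.3, 0.6),
})
```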

  10. Design and development status of ETS-7, an RVD and space robot experiment satellite

    NASA Astrophysics Data System (ADS)

    Oda, M.; Inagaki, T.; Nishida, M.; Kibe, K.; Yamagata, F.

    1994-10-01

    ETS-7 (Engineering Test Satellite #7) is an experimental satellite for the in-orbit experiment of the Rendezvous Docking (RVD) and the space robot (RBT) technologies. ETS-7 is a set of two satellites, a chaser satellite and a target satellite. Both satellites will be launched together by NASDA's H-2 rocket into a low earth orbit. Development of ETS-7 started in 1990. Basic design and EM (Engineering Model) development are in progress now in 1994. The satellite will be launched in mid 1997 and the above in-orbit experiments will be conducted for 1.5 years. Design of ETS-7 RBT experiment system and development status are described in this paper.

  11. Development of a Plastic Melt Waste Compactor for Space Missions: Experiments and Prototype Design

    NASA Technical Reports Server (NTRS)

    Pace, Gregory; Wignarajah, Kanapathipillai; Pisharody, Suresh; Fisher, John

    2004-01-01

    This paper describes development at NASA Ames Research Center of a heat melt compactor that can be used on both near-term and far-term missions. Experiments have been performed to characterize the behavior of composite wastes that are representative of the types of wastes produced on current and previous space missions such as the International Space Station, Space Shuttle, Mir, and Skylab. Experiments were conducted to characterize the volume reduction, bonding, encapsulation, and biological stability of the waste composite and also to investigate other key design issues such as plastic extrusion, noxious off-gassing, and removal of the plastic waste product from the processor. The experiments provided the data needed to design a prototype plastic melt waste processor, a description of which is included in the paper.

  12. Sodium Handling Technology and Engineering Design of the Madison Dynamo Experiment.

    NASA Astrophysics Data System (ADS)

    Kendrick, R.; Forest, C. B.; O'Connell, R.; Wright, A.; Robinson, K.

    1998-11-01

    A new liquid metal MHD experiment is being constructed at the University of Wisconsin to test several key predictions of dynamo theory: magnetic instabilities driven by sheared flow, the effects of turbulence on current generation, and the back-reaction of the self-generated magnetic field on the fluid motion which brings saturation. This presentation describes the engineering design of the experiment, which is a 0.5 m radius spherical vessel filled with liquid sodium at 150 degrees Celsius. The experiment is designed to achieve a magnetic Reynolds number in excess of 100, which requires approximately 80 hp of mechanical drive, producing flow velocities in sodium of 15 m/s through impellers. Handling liquid sodium offers a number of technical challenges, but routine techniques have been developed over the past several decades for safely handling large quantities for the fast breeder reactor. The handling strategy is discussed, technical details concerning seals and pressurization are presented, and safety elements are highlighted.
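
    The headline figure of merit can be checked from its definition, Rm = mu0 * sigma * v * L. A quick sketch using the abstract's quoted flow speed and vessel radius; the sodium conductivity is a rough textbook value near 150 C, not a measured figure from the experiment:

```python
import math

MU0 = 4.0e-7 * math.pi   # vacuum permeability, H/m
SIGMA_NA = 1.0e7         # electrical conductivity of liquid Na, S/m (approx.)

def magnetic_reynolds(velocity, length):
    """Magnetic Reynolds number Rm = mu0 * sigma * v * L."""
    return MU0 * SIGMA_NA * velocity * length

# 15 m/s flow with the 0.5 m sphere radius as the length scale gives
# Rm on the order of 100, consistent with the design target.
rm = magnetic_reynolds(15.0, 0.5)
```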

  13. Thermal systems design and analysis for a 10 K Sorption Cryocooler flight experiment

    NASA Technical Reports Server (NTRS)

    Bhandari, Pradeep; Bard, Steven

    1993-01-01

    The design, analysis, and predicted performance of the Brilliant Eyes Ten-Kelvin Sorption Cryocooler Experiment (BETSCE) are described from a thermal perspective. BETSCE is a shuttle side-wall mounted cryogenic technology demonstration experiment planned for launch in November 1994. BETSCE uses a significant amount of power (about 500 W peak), and the resultant heat must be rejected passively with radiators, as BETSCE has no access to the active cooling capability of the shuttle. It was a major challenge to design and configure the individual hardware assemblies, with their relatively large radiators, to enable them to reject their heat while satisfying numerous severe shuttle-imposed constraints. This paper is a useful case study of a small shuttle payload that needs to reject relatively high heat loads passively in a highly constrained thermal environment. The design approach described is consistent with today's era of 'faster, better, cheaper' small-scale space missions.
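
    The passive heat-rejection constraint can be sized roughly from the Stefan-Boltzmann law, P = eps * sigma * A * (T_rad^4 - T_sink^4). The sketch below uses the abstract's 500 W figure, but the emissivity, radiator temperature, and effective sink temperature are generic illustrative assumptions, not BETSCE design values:

```python
SIGMA = 5.670374419e-8   # Stefan-Boltzmann constant, W/(m^2 K^4)

def radiator_area(power_w, emissivity, t_rad_k, t_sink_k):
    """Radiator area needed to passively reject power_w by radiation."""
    flux = emissivity * SIGMA * (t_rad_k ** 4 - t_sink_k ** 4)
    return power_w / flux

# 500 W rejected from a 300 K radiator with emissivity 0.85 toward an
# assumed 200 K effective sink: roughly 1.6 m^2 of radiator.
area = radiator_area(500.0, 0.85, 300.0, 200.0)
```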

  14. Spaceflight Holography Investigation in a Virtual Apparatus (SHIVA): Ground Experiments and Concepts for Flight Design

    NASA Technical Reports Server (NTRS)

    Miernik, Janie H.; Trolinger, James D.; Lackey, Jeffrey D.; Milton, Martha E.; Waggoner, Jason; Pope, Regina D.

    2002-01-01

    This paper discusses the development and design of an experimental test cell for ground-based testing to provide requirements for the Spaceflight Holography Investigation in a Virtual Apparatus (SHIVA) experiment. Ground-based testing of a hardware breadboard set-up is being conducted at Marshall Space Flight Center in Huntsville, Alabama. SHIVA objectives are to test and validate new solutions of the general equation of motion of a particle in a fluid, including particle-particle interaction, wall effects, motion at higher Reynolds number, and the motion and dissolution of a crystal moving in a fluid. These objectives will be achieved by recording a large number of holograms of particle motion in the International Space Station (ISS) glove box under controlled conditions, extracting the precise three-dimensional position of all the particles as a function of time, and examining the effects of all parameters on the motion of the particles. This paper describes the mechanistic approach to enabling the SHIVA experiment to be performed in an ISS glove box in microgravity. Because the particles are very small, surface tension becomes a major consideration in designing the mechanical method to meet the experiment's objectives in microgravity. Keeping a particle or particles in the center of the test cell long enough to perform and record the experiment, without contributing to particle motion, requires avoiding any initial velocity during particle placement. A Particle Injection Mechanism (PIM) designed for microgravity has been devised and tested to enable SHIVA imaging. A test cell capture mechanism that secures the test cell during vibration on a specially designed shaker table for the SHIVA experiment is also described. Concepts for flight design are presented as well.

  15. Comparison of Resource Requirements for a Wind Tunnel Test Designed with Conventional vs. Modern Design of Experiments Methods

    NASA Technical Reports Server (NTRS)

    DeLoach, Richard; Micol, John R.

    2011-01-01

    The factors that determine data volume requirements in a typical wind tunnel test are identified. It is suggested that productivity in wind tunnel testing can be enhanced by managing the inference error risk associated with evaluating residuals in a response surface modeling experiment. The relationship between minimum data volume requirements and the factors upon which they depend is described, and certain simplifications to this relationship are realized when specific model adequacy criteria are adopted. The question of response model residual evaluation is treated and certain practical aspects of response surface modeling are considered, including inference subspace truncation. A wind tunnel test plan developed by using the Modern Design of Experiments illustrates the advantages of an early estimate of data volume requirements. Comparisons are made with a representative One Factor At a Time (OFAT) wind tunnel test matrix developed to evaluate a surface-to-air missile.
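
    One concrete piece of the data-volume argument is simply counting regression coefficients: a full quadratic response surface in k factors has 1 intercept, k linear terms, k pure quadratic terms, and k(k-1)/2 two-factor interactions, i.e. (k+1)(k+2)/2 terms, which lower-bounds the number of runs before any replicates or lack-of-fit points are added. A minimal sketch:

```python
def quadratic_model_terms(k):
    """Number of coefficients in a full quadratic response surface
    in k factors: (k+1)(k+2)/2. Each coefficient needs at least one
    data point, so this is a floor on the test matrix size."""
    return (k + 1) * (k + 2) // 2

# A 3-factor response (e.g. alpha, beta, Mach) needs at least 10
# points; a 6-factor model already needs 28, before replication.
floor_3_factors = quadratic_model_terms(3)
floor_6_factors = quadratic_model_terms(6)
```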

  16. Iterative experiment design guides the characterization of a light-inducible gene expression circuit.

    PubMed

    Ruess, Jakob; Parise, Francesca; Milias-Argeitis, Andreas; Khammash, Mustafa; Lygeros, John

    2015-06-30

    Systems biology rests on the idea that biological complexity can be better unraveled through the interplay of modeling and experimentation. However, the success of this approach depends critically on the informativeness of the chosen experiments, which is usually unknown a priori. Here, we propose a systematic scheme based on iterations of optimal experiment design, flow cytometry experiments, and Bayesian parameter inference to guide the discovery process in the case of stochastic biochemical reaction networks. To illustrate the benefit of our methodology, we apply it to the characterization of an engineered light-inducible gene expression circuit in yeast and compare the performance of the resulting model with models identified from nonoptimal experiments. In particular, we compare the parameter posterior distributions and the precision to which the outcome of future experiments can be predicted. Moreover, we illustrate how the identified stochastic model can be used to determine light induction patterns that make either the average amount of protein or the variability in a population of cells follow a desired profile. Our results show that optimal experiment design allows one to derive models that are accurate enough to precisely predict and regulate the protein expression in heterogeneous cell populations over extended periods of time.
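
    The iterate-design-measure-infer loop can be illustrated on a far simpler model than the paper's stochastic gene expression circuit. The sketch below picks the next sampling time for a toy exponential-decay model by maximizing the Fisher information about the decay rate; all model choices and numbers are illustrative:

```python
import math

# Toy model y(t) = exp(-k * t). The information a measurement at time
# t carries about k (with unit noise) is the squared sensitivity.
def fisher_info(t, k):
    dy_dk = -t * math.exp(-k * t)   # sensitivity of y to k
    return dy_dk ** 2

def best_sampling_time(k_estimate, candidates):
    """Pick the candidate time that is most informative about k,
    given the current parameter estimate."""
    return max(candidates, key=lambda t: fisher_info(t, k_estimate))

# With a current estimate k = 0.5, scan times 0.1 ... 5.0.
# Analytically the optimum for this model is t = 1/k = 2.0.
candidates = [0.1 * i for i in range(1, 51)]
t_next = best_sampling_time(0.5, candidates)
```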

  17. Design and test of a mechanically pumped two-phase thermal control flight experiment

    NASA Technical Reports Server (NTRS)

    Grote, M. G.; Stark, J. A.; Butler, C. D.; Mcintosh, R.

    1987-01-01

    A flight experiment of a mechanically pumped two-phase ammonia thermal control system, incorporating a number of new component designs, has been assembled and tested in a 1-g environment. Additional microgravity tests are planned on the Space Shuttle when Shuttle flights are resumed. The primary purpose of this experiment is to evaluate the operation of a mechanically pumped two-phase ammonia system, with emphasis on determining the performance of an evaporative Two-Phase Mounting Plate. The experiment also evaluates the performance of other specially designed components, such as the two-phase reservoir for temperature control, condensing radiator/heat sink, spiral tube boiler, and pressure drop experiment. The 1-g tests have shown that start-up of the two-phase experiment is easily accomplished with only a partial fill of ammonia. The experiment maintained a constant mounting plate temperature without flow rate controls over a very wide range of heat loads, flow rates, inlet flow conditions and exit qualities. The tests also showed the successful operation of the mounting plate in the heat sharing condensing mode.

  18. Designing microarray and RNA-Seq experiments for greater systems biology discovery in modern plant genomics.

    PubMed

    Yang, Chuanping; Wei, Hairong

    2015-02-01

    Microarray and RNA-seq experiments have become an important part of modern genomics and systems biology. Obtaining meaningful biological data from these experiments is an arduous task that demands close attention to many details. Negligence at any step can lead to gene expression data containing inadequate or composite information that is recalcitrant to pattern extraction. Therefore, it is imperative to carefully consider experimental design before launching a time-consuming and costly experiment. Currently, most genomics experiments have two objectives: (1) to generate two or more groups of comparable data for identifying differentially expressed genes, gene families, biological processes, or metabolic pathways under experimental conditions; and (2) to build local gene regulatory networks and identify hierarchically important regulators governing biological processes and pathways of interest. Since the first objective aims to identify the active molecular identities and the second provides a basis for understanding the underlying molecular mechanisms through inferring causality relationships mediated by treatment, an optimal experiment is one that produces biologically relevant and extractable data to meet both objectives without substantially increasing the cost. This review discusses the major issues that researchers commonly face when embarking on microarray or RNA-seq experiments and summarizes important aspects of experimental design, which aim to help researchers deliberate on how to generate gene expression profiles with low background noise but with more interaction to facilitate novel biological discoveries in modern plant genomics.

  19. Design and Development of a CPCI-Based Electronics Package for Space Station Experiments

    NASA Technical Reports Server (NTRS)

    Kolacz, John S.; Clapper, Randy S.; Wade, Raymond P.

    2006-01-01

    The NASA John H. Glenn Research Center is developing a Compact-PCI (CPCI) based electronics package for controlling space experiment hardware on the International Space Station. Goals of this effort include an easily modified, modular design that allows for changes in experiment requirements. Unique aspects of the experiment package include a flexible circuit used for internal interconnections and a separate enclosure (box in a box) for controlling 1 kW of power for experiment fuel heating requirements. This electronics package was developed as part of the FEANICS (Flow Enclosure Accommodating Novel Investigations in Combustion of Solids) mini-facility, which is part of the Fluids and Combustion Facility's Combustion Integrated Rack (CIR). The CIR will be the platform for future microgravity combustion experiments and will reside on the Destiny Module of the International Space Station (ISS). The FEANICS mini-facility will be the primary means for conducting solid fuel combustion experiments in the CIR on ISS. The main focus of many of these solid combustion experiments will be to conduct applied scientific investigations in fire safety to support NASA's future space missions. A description of the electronics package and the results of functional testing are the subjects of this report. The report concludes that the use of innovative packaging methods combined with readily available COTS hardware can provide a modular electronics package which is easily modified for changing experiment requirements.

  20. Combined Cycle Engine Large-Scale Inlet for Mode Transition Experiments: System Identification Rack Hardware Design

    NASA Technical Reports Server (NTRS)

    Thomas, Randy; Stueber, Thomas J.

    2013-01-01

    The System Identification (SysID) Rack is a real-time hardware-in-the-loop data acquisition (DAQ) and control instrument rack that was designed and built to support inlet testing in the NASA Glenn Research Center 10- by 10-Foot Supersonic Wind Tunnel. This instrument rack is used to support experiments on the Combined-Cycle Engine Large-Scale Inlet for Mode Transition Experiment (CCE-LIMX). The CCE-LIMX is a testbed for an integrated dual flow-path inlet configuration with the two flow paths in an over-and-under arrangement such that the high-speed flow path is located below the low-speed flow path. The CCE-LIMX includes multiple actuators that are designed to redirect airflow from one flow path to the other; this action is referred to as "inlet mode transition." Multiple phases of experiments have been planned to support research that investigates inlet mode transition: inlet characterization (Phase-1) and system identification (Phase-2). The SysID Rack hardware design met the following requirements to support Phase-1 and Phase-2 experiments: safely and effectively move multiple actuators individually or synchronously; sample and save effector control and position sensor feedback signals; automate control of actuator positioning based on a mode transition schedule; sample and save pressure sensor signals; and perform DAQ and control processes operating at 2.5 kHz. This document describes the hardware components used to build the SysID Rack, including their function, specifications, and system interface. Furthermore, this document provides a SysID Rack effector signal list (signal flow); the system identification experiment setup; illustrations of a typical SysID Rack experiment; and a SysID Rack performance overview for Phase-1 and Phase-2 experiments. The SysID Rack described in this document was a useful tool to meet the project objectives.

  1. Application of Design of Experiments and Surrogate Modeling within the NASA Advanced Concepts Office, Earth-to-Orbit Design Process

    NASA Technical Reports Server (NTRS)

    Zwack, Mathew R.; Dees, Patrick D.; Holt, James B.

    2016-01-01

    Decisions made during early conceptual design have a large impact upon the expected life-cycle cost (LCC) of a new program. It is widely accepted that up to 80% of such cost is committed during these early design phases [1]. Therefore, to help minimize LCC, decisions made during conceptual design must be based upon as much information as possible. To aid in the decision making for new launch vehicle programs, the Advanced Concepts Office (ACO) at NASA Marshall Space Flight Center (MSFC) provides rapid turnaround pre-phase A and phase A concept definition studies. The ACO team utilizes a proven set of tools to provide customers with a full vehicle mass breakdown to tertiary subsystems, preliminary structural sizing based upon worst-case flight loads, and trajectory optimization to quantify integrated vehicle performance for a given mission [2]. Although the team provides rapid turnaround for single vehicle concepts, the scope of the trade space can be limited due to analyst availability and the manpower requirements for manual execution of the analysis tools. In order to enable exploration of a broader design space, the ACO team has implemented an advanced design methods (ADM) based approach. This approach applies the concepts of design of experiments (DOE) and surrogate modeling to more exhaustively explore the trade space and provide the customer with additional design information to inform decision making. This paper will first discuss the automation of the ACO tool set, which represents a majority of the development effort. In order to fit a surrogate model within tolerable error bounds, a number of DOE cases are needed. This number will scale with the number of variable parameters desired and the complexity of the system's response to those variables. For all but the smallest design spaces, the number of cases required cannot be produced within an acceptable timeframe using a manual process. 
Therefore, automation of the tools was a key enabler for the successful
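
    The scaling issue described in this abstract is why space-filling designs are typically used to generate the DOE cases a surrogate is fit to. A minimal sketch of one common choice, Latin hypercube sampling (a generic illustration, not the ACO team's actual tooling):

```python
import random

def latin_hypercube(n, k, seed=0):
    """n cases for k variables, each variable's [0, 1) range cut into
    n strata with exactly one sample per stratum."""
    rng = random.Random(seed)
    columns = []
    for _ in range(k):
        strata = [(i + rng.random()) / n for i in range(n)]
        rng.shuffle(strata)          # decouple the variables
        columns.append(strata)
    # Transpose: one row per case, one column per variable.
    return [list(row) for row in zip(*columns)]

# 20 cases over 4 normalized design variables; each variable's range
# would then be rescaled to its physical bounds before analysis.
cases = latin_hypercube(n=20, k=4)
```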

  2. The Role of Formal Experiment Design in Hypersonic Flight System Technology Development

    NASA Technical Reports Server (NTRS)

    McClinton, Charles R.; Ferlemann, Shelly M.; Rock, Ken E.; Ferlemann, Paul G.

    2002-01-01

    Hypersonic airbreathing engine (scramjet) powered vehicles are being considered to replace conventional rocket-powered launch systems. Effective utilization of scramjet engines requires careful integration with the air vehicle. This integration synergistically combines aerodynamic forces with propulsive cycle functions of the engine. Due to the highly integrated nature of the hypersonic vehicle design problem, the large flight envelope, and the large number of design variables, a statistical approach to design is effective. Modern Design-of-Experiments (MDOE) has been used throughout the Hyper-X program, for both systems analysis and experimental testing. Applications of MDOE fall into four categories: (1) experimental testing; (2) studies of unit phenomena; (3) refining engine design; and (4) full vehicle system optimization. The MDOE process also provides analytical models, which are used to document lessons learned, supplement low-level design tools, and accelerate future studies. This paper will discuss the design considerations for scramjet-powered vehicles, specifics of MDOE utilized for Hyper-X, and present highlights from the use of these MDOE methods within the Hyper-X Program.

  3. How scientific experiments are designed: Problem solving in a knowledge-rich, error-rich environment

    NASA Astrophysics Data System (ADS)

    Baker, Lisa M.

    While theory formation and the relation between theory and data has been investigated in many studies of scientific reasoning, researchers have focused less attention on reasoning about experimental design, even though the experimental design process makes up a large part of real-world scientists' reasoning. The goal of this thesis was to provide a cognitive account of the scientific experimental design process by analyzing experimental design as problem-solving behavior (Newell & Simon, 1972). Three specific issues were addressed: the effect of potential error on experimental design strategies, the role of prior knowledge in experimental design, and the effect of characteristics of the space of alternate hypotheses on alternate hypothesis testing. A two-pronged in vivo/in vitro research methodology was employed, in which transcripts of real-world scientific laboratory meetings were analyzed as well as undergraduate science and non-science majors' design of biology experiments in the psychology laboratory. It was found that scientists use a specific strategy to deal with the possibility of error in experimental findings: they include "known" control conditions in their experimental designs both to determine whether error is occurring and to identify sources of error. The known controls strategy had not been reported in earlier studies with science-like tasks, in which participants' responses to error had consisted of replicating experiments and discounting results. With respect to prior knowledge: scientists and undergraduate students drew on several types of knowledge when designing experiments, including theoretical knowledge, domain-specific knowledge of experimental techniques, and domain-general knowledge of experimental design strategies. Finally, undergraduate science students generated and tested alternates to their favored hypotheses when the space of alternate hypotheses was constrained and searchable. 
This result may help explain findings of confirmation

  4. Do We Need to Design Course-Based Undergraduate Research Experiences for Authenticity?

    PubMed

    Rowland, Susan; Pedwell, Rhianna; Lawrie, Gwen; Lovie-Toon, Joseph; Hung, Yu

    2016-01-01

    The recent push for more authentic teaching and learning in science, technology, engineering, and mathematics indicates a shared agreement that undergraduates require greater exposure to professional practices. There is considerable variation, however, in how "authentic" science education is defined. In this paper we present our definition of authenticity as it applies to an "authentic" large-scale undergraduate research experience (ALURE); we also look to the literature and the student voice for alternate perceptions around this concept. A metareview of science education literature confirmed the inconsistency in definitions and application of the notion of authentic science education. An exploration of how authenticity was explained in 604 reflections from ALURE and traditional laboratory students revealed contrasting and surprising notions and experiences of authenticity. We consider the student experience in terms of alignment with 1) the intent of our designed curriculum and 2) the literature definitions of authentic science education. These findings contribute to the conversation surrounding authenticity in science education. They suggest two things: 1) educational experiences can have significant authenticity for the participants, even when there is no purposeful design for authentic practice, and 2) the continuing discussion of and design for authenticity in UREs may be redundant.

  5. Do We Need to Design Course-Based Undergraduate Research Experiences for Authenticity?

    PubMed Central

    Rowland, Susan; Pedwell, Rhianna; Lawrie, Gwen; Lovie-Toon, Joseph; Hung, Yu

    2016-01-01

    The recent push for more authentic teaching and learning in science, technology, engineering, and mathematics indicates a shared agreement that undergraduates require greater exposure to professional practices. There is considerable variation, however, in how “authentic” science education is defined. In this paper we present our definition of authenticity as it applies to an “authentic” large-scale undergraduate research experience (ALURE); we also look to the literature and the student voice for alternate perceptions around this concept. A metareview of science education literature confirmed the inconsistency in definitions and application of the notion of authentic science education. An exploration of how authenticity was explained in 604 reflections from ALURE and traditional laboratory students revealed contrasting and surprising notions and experiences of authenticity. We consider the student experience in terms of alignment with 1) the intent of our designed curriculum and 2) the literature definitions of authentic science education. These findings contribute to the conversation surrounding authenticity in science education. They suggest two things: 1) educational experiences can have significant authenticity for the participants, even when there is no purposeful design for authentic practice, and 2) the continuing discussion of and design for authenticity in UREs may be redundant. PMID:27909029

  6. Statistical Models for the Analysis and Design of Digital Polymerase Chain Reaction (dPCR) Experiments.

    PubMed

    Dorazio, Robert M; Hunter, Margaret E

    2015-11-03

    Statistical methods for the analysis and design of experiments using digital PCR (dPCR) have received only limited attention and have been misused in many instances. To address this issue and to provide a more general approach to the analysis of dPCR data, we describe a class of statistical models for the analysis and design of experiments that require quantification of nucleic acids. These models are mathematically equivalent to generalized linear models of binomial responses that include a complementary log-log link function and an offset that is dependent on the dPCR partition volume. These models are both versatile and easy to fit using conventional statistical software. Covariates can be used to specify different sources of variation in nucleic acid concentration, and a model's parameters can be used to quantify the effects of these covariates. For purposes of illustration, we analyzed dPCR data from different types of experiments, including serial dilution, evaluation of copy number variation, and quantification of gene expression. We also showed how these models can be used to help design dPCR experiments, such as selecting the sample sizes needed to achieve desired levels of precision in estimates of nucleic acid concentration or to detect differences in concentration among treatments with prescribed levels of statistical power.
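
    The model class described here has a convenient closed form when there are no covariates: a partition of volume v is negative with probability exp(-lambda * v), so the concentration estimate from k positive partitions out of n is lambda_hat = -ln(1 - k/n) / v, which is exactly the complementary log-log binomial GLM with offset log(v). A minimal sketch (the partition counts and volume are illustrative, not data from the paper):

```python
import math

def dpcr_concentration(k_positive, n_partitions, volume_nl):
    """Mean target copies per nanolitre from dPCR partition counts.

    Inverts the Poisson relation P(negative) = exp(-lambda * v):
    lambda_hat = -ln(1 - k/n) / v.
    """
    p_positive = k_positive / n_partitions
    return -math.log(1.0 - p_positive) / volume_nl

# Example: 12,000 of 20,000 partitions positive at 0.85 nL each
# gives roughly 1.08 copies per nanolitre.
lam = dpcr_concentration(12_000, 20_000, 0.85)
```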

  7. Conceptual design of a Moving Belt Radiator (MBR) shuttle-attached experiment

    NASA Technical Reports Server (NTRS)

    Aguilar, Jerry L.

    1990-01-01

    The conceptual design of a shuttle-attached Moving Belt Radiator (MBR) experiment is presented. The MBR is an advanced radiator concept in which a rotating belt is used to radiate thermal energy to space. The experiment is developed with the primary focus being the verification of the dynamic characteristics of a rotating belt, with a secondary objective of proving the thermal and sealing aspects in a reduced-gravity, vacuum environment. The mechanical design, selection of the belt material and working fluid, a preliminary test plan, and a program plan are presented. The strategy used for selecting the basic sizes and materials of the components is discussed. Shuttle and crew member requirements are presented, with some options for increasing or decreasing the demands on the STS. An STS carrier and the criteria used in the selection process are presented. The proposed carrier for the Moving Belt Radiator experiment is the Hitchhiker-M. Safety issues are also listed with possible results. This experiment is designed so that a belt can be deployed, run at steady-state conditions, run with dynamic perturbations imposed, used to verify the operation of the interface heat exchanger and seals, and finally retracted into a stowed position for transport back to Earth.

  8. A differentiable reformulation for E-optimal design of experiments in nonlinear dynamic biosystems.

    PubMed

    Telen, Dries; Van Riet, Nick; Logist, Flip; Van Impe, Jan

    2015-06-01

    Informative experiments are highly valuable for estimating parameters in nonlinear dynamic bioprocesses. Techniques for optimal experiment design ensure the systematic design of such informative experiments. The E-criterion, which can be used as the objective function in optimal experiment design, requires the maximization of the smallest eigenvalue of the Fisher information matrix. However, one problem with the minimal eigenvalue function is that it can be nondifferentiable. In addition, no closed-form expression exists for the computation of eigenvalues of a matrix larger than 4 by 4. As eigenvalues are normally computed with iterative methods, state-of-the-art optimal control solvers are not able to exploit automatic differentiation to compute the derivatives with respect to the decision variables. In the current paper a reformulation strategy from the field of convex optimization is suggested to circumvent these difficulties. This reformulation requires the inclusion of a matrix inequality constraint involving positive semidefiniteness. In this paper, this positive semidefiniteness constraint is imposed via Sylvester's criterion. As a result the maximization of the minimum eigenvalue function can be formulated in standard optimal control solvers through the addition of nonlinear constraints. The presented methodology is successfully illustrated with a case study from the field of predictive microbiology.
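The reformulation can be checked numerically on a toy matrix: λ_min(F) is the largest t for which F − tI stays positive definite, and Sylvester's criterion expresses that condition through leading principal minors. The 3 × 3 matrix below is illustrative, not taken from the paper's case study:

```python
import numpy as np

# Toy stand-in for a Fisher information matrix (symmetric positive definite).
F = np.array([[4.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])

def leading_minors(M):
    """Leading principal minors, the quantities Sylvester's criterion tests."""
    return [np.linalg.det(M[:k, :k]) for k in range(1, M.shape[0] + 1)]

lam_min = np.linalg.eigvalsh(F)[0]   # eigvalsh returns ascending eigenvalues

# Just below lambda_min, F - t*I is positive definite: all minors positive.
ok = all(m > 0 for m in leading_minors(F - (lam_min - 1e-6) * np.eye(3)))
# Just above lambda_min, the criterion fails (the determinant goes negative).
bad = all(m > 0 for m in leading_minors(F - (lam_min + 1e-6) * np.eye(3)))
```

In the optimal control setting, the minor positivity conditions become smooth nonlinear constraints in the decision variables, which is what restores differentiability.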

  9. Statistical models for the analysis and design of digital polymerase chain reaction (dPCR) experiments

    USGS Publications Warehouse

    Dorazio, Robert; Hunter, Margaret

    2015-01-01

    Statistical methods for the analysis and design of experiments using digital PCR (dPCR) have received only limited attention and have been misused in many instances. To address this issue and to provide a more general approach to the analysis of dPCR data, we describe a class of statistical models for the analysis and design of experiments that require quantification of nucleic acids. These models are mathematically equivalent to generalized linear models of binomial responses that include a complementary log–log link function and an offset that is dependent on the dPCR partition volume. These models are both versatile and easy to fit using conventional statistical software. Covariates can be used to specify different sources of variation in nucleic acid concentration, and a model’s parameters can be used to quantify the effects of these covariates. For purposes of illustration, we analyzed dPCR data from different types of experiments, including serial dilution, evaluation of copy number variation, and quantification of gene expression. We also showed how these models can be used to help design dPCR experiments, as in selection of sample sizes needed to achieve desired levels of precision in estimates of nucleic acid concentration or to detect differences in concentration among treatments with prescribed levels of statistical power.

  10. Designing a mixture experiment when the components are subject to a nonlinear multiple-component constraint

    SciTech Connect

    Piepel, Greg F.; Cooley, Scott K.; Vienna, John D.; Crum, Jarrod V.

    2015-12-14

    This article presents a case study of developing an experimental design for a constrained mixture experiment when the experimental region is defined by single-component constraints (SCCs), linear multiple-component constraints (MCCs), and a nonlinear MCC. Traditional methods and software for designing constrained mixture experiments with SCCs and linear MCCs are not directly applicable because of the nonlinear MCC. A modification of existing methodology to account for the nonlinear MCC was developed and is described in this article. The case study involves a 15-component nuclear waste glass example in which SO3 is one of the components. SO3 has a solubility limit in glass that depends on the composition of the balance of the glass. A goal was to design the experiment so that SO3 would not exceed its predicted solubility limit for any of the experimental glasses. The SO3 solubility limit had previously been modeled by a partial quadratic mixture (PQM) model expressed in the relative proportions of the 14 other components. The PQM model was used to construct a nonlinear MCC in terms of all 15 components. In addition, there were SCCs and linear MCCs. This article discusses the waste glass example and how a layered design was generated to (i) account for the SCCs, linear MCCs, and nonlinear MCC and (ii) meet the goals of the study.
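A candidate-point filter conveys the flavor of designing inside such a region. Everything below is illustrative: a 4-component analogue with made-up bounds and a made-up nonlinear constraint standing in for the fitted PQM solubility model of the 15-component study:

```python
import numpy as np

# Random candidate mixtures on the simplex (proportions sum to 1).
rng = np.random.default_rng(1)
cand = rng.dirichlet(np.ones(4), size=5000)

# Hypothetical single-component constraints (SCCs): lower/upper bounds.
lo = np.array([0.05, 0.00, 0.10, 0.00])
hi = np.array([0.80, 0.50, 0.80, 0.15])
scc_ok = np.all((cand >= lo) & (cand <= hi), axis=1)

# Hypothetical nonlinear MCC playing the role of the SO3 solubility limit:
# component 3 must stay below a limit that depends on the rest of the mix.
limit = 0.05 + 0.2 * cand[:, 0] * cand[:, 2]
mcc_ok = cand[:, 3] <= limit

pool = cand[scc_ok & mcc_ok]   # feasible candidates for design selection
```

A layered design would then pick points from `pool` (vertices, edge centroids, interior points) rather than keeping the whole filtered cloud.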

  11. The design and development of a release mechanism for space shuttle life-science experiments

    NASA Technical Reports Server (NTRS)

    Jones, H. M.; Daniell, R. G.

    1984-01-01

    The design, development, and testing of a release mechanism for use in two life science experiments on the Spacelab 1, 4, and D1 missions is described. The mechanism is a self-latching ball lock device actuated by a linear solenoid. An unusual feature is the tapering of the ball lock plunger to give it a near-constant breakout force for release under a wide range of loads. The selection of the design, based on the design requirements, is discussed. A number of problems occurred during development and test, including problems caused by human factors that became apparent after initial delivery for crew training sessions. These problems and their solutions are described to assist in the design and testing of similar mechanisms.

  12. Proposed ATLAS liner design fabricated for hydrodynamics experiments on Shiva Star

    SciTech Connect

    Anderson, W. E.; Adams, C. D.; Armijo, E. V.; Bartos, J. J.; Cameron, B. J.; Garcia, F.; Henneke, B.; Randolph, B.; Salazar, M. A.; Steckle, W. P. , Jr.; Turchi, Peter J.; Gale, D.

    2001-01-01

    An entirely new cylindrical liner system has been designed and fabricated for use on the Shiva Star capacitor bank. The design incorporates features expected to be applicable to a future power flow channel of the Atlas capacitor bank, with the intention of keeping any required liner design modifications to a minimum when the power flow channel at Atlas is available. Four shots were successfully conducted at Shiva Star that continued a series of hydrodynamics physics experiments started on the Los Alamos Pegasus capacitor bank. Departures from the diagnostic suite that had previously been used at Pegasus required new techniques in the fabrication of the experiment insert package. We describe new fabrication procedures that were developed by the Polymers and Coatings Group (MST-7) of the Los Alamos Materials Science Division to fabricate the Shiva Star experiment loads. Continuing MST-7 development of interference fit processes for liner experiment applications, current joints at the glide planes were assembled by thermal shrink fit using liquid nitrogen as a coolant. The liner material was low-strength, high-conductance 1100 series aluminum. The liner glide plane electrodes were machined from full hard copper rod with a 10° ramp to maintain liner to glide plane contact as the liner was imploded. The parts were fabricated with a 0.015 mm radial interference fit between the liner inside diameter (ID) and the glide plane outside diameter (OD) to form the static liner current joints. The liner was assembled with some axial clearance at each end to allow slippage if any axial force was generated as the liner assembly cassette was bolted into Shiva Star, a precaution to guard against buckling the liner during installation of the load cassette. Other unique or unusual processes were developed and are described. Minor adaptations of the liner design are now being fabricated for the first Atlas experiments.

  13. Analysis of helicopter downwash/frigate airwake interaction using statistically designed experiments

    NASA Astrophysics Data System (ADS)

    Nacakli, Yavuz

    A research program to investigate helicopter downwash/frigate airwake interaction has been initiated using a statistically robust experimental program featuring Design of Experiments. Engineering analysis of the helicopter/frigate interface is complicated by the fact that the two flowfields become inherently coupled as separation distance decreases. The final objective of this work is to develop experimental methods to determine when computer simulations need to include the effects of a coupled flowfield versus using a simplified representation by superposing the velocity fields of the individual flowfields. The work presented was performed in the Old Dominion University Low Speed Wind Tunnel using a simplified 1/50 scale frigate waterline model and a traverse-mounted powered rotor with thrust measurement. Particle Image Velocimetry (PIV) velocity surveys were used with rotor thrust coefficient measurements at locations of identified interaction to help understand the underlying flow physics. Initially, PIV surveys of the frigate model landing deck in isolation and the rotor in isolation were performed to provide a baseline flow understanding. Next a designed experiment was devised yielding a response model for thrust coefficient as a function of vertical and longitudinal distance from the hangar door (base of the step), both with and without the rotor. This first experiment showed that thrust coefficient could be measured with enough precision to identify changes due to location using an advance ratio of 0.075 (V∞ = 5.14 m/s and Ω = 5000 rpm). A second designed experiment determined the practical spatial resolution for mapping the thrust coefficient response along the frigate's longitudinal center plane. Finally, a third designed experiment directly compared rotor thrust measurements between airwake and no-airwake cases and successfully identified regions that differed with statistical significance. Lastly, a qualitative comparison study was performed to

  14. Population Fisher information matrix and optimal design of discrete data responses in population pharmacodynamic experiments.

    PubMed

    Ogungbenro, Kayode; Aarons, Leon

    2011-08-01

    In recent years, interest in the application of experimental design theory to population pharmacokinetic (PK) and pharmacodynamic (PD) experiments has increased. The aim is to improve the efficiency and the precision with which parameters are estimated during data analysis and sometimes to increase the power and reduce the sample size required for hypothesis testing. The population Fisher information matrix (PFIM) has been described for uniresponse and multiresponse population PK experiments for design evaluation and optimisation. Despite these developments and the availability of tools for optimal design of population PK and PD experiments, much of the effort has been focused on repeated continuous variable measurements, with less work being done on repeated discrete type measurements. Discrete data arise mainly in PDs, e.g. ordinal, nominal, dichotomous or count measurements. This paper implements expressions for the PFIM for repeated ordinal, dichotomous and count measurements based on analysis by a mixed-effects modelling technique. Three simulation studies were used to investigate the performance of the expressions. Example 1 is based on repeated dichotomous measurements, Example 2 is based on repeated count measurements and Example 3 is based on repeated ordinal measurements. Data simulated in MATLAB were analysed using NONMEM (Laplace method) and the glmmML package in R (Laplace and adaptive Gauss-Hermite quadrature methods). The results obtained for Examples 1 and 2 showed good agreement between the relative standard errors obtained using the PFIM and simulations. The results obtained for Example 3 showed the importance of sampling at the most informative time points. Implementation of these expressions will provide the opportunity for efficient design of population PD experiments that involve discrete type data through design evaluation and optimisation.
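The "most informative time points" finding can be illustrated with a fixed-effects sketch (the paper derives the full population PFIM with random effects, which this deliberately omits; parameter values are hypothetical):

```python
import numpy as np

# Expected Fisher information for independent dichotomous responses with
# logit p(t) = b0 + b1*t. Each observation contributes x x^T p (1 - p),
# which is largest where the response curve is steep (p near 0.5).
def fim_logistic(beta, times):
    F = np.zeros((2, 2))
    for t in times:
        x = np.array([1.0, t])
        p = 1.0 / (1.0 + np.exp(-x @ beta))
        F += np.outer(x, x) * p * (1.0 - p)
    return F

beta = np.array([-2.0, 1.0])          # curve is steepest near t = 2
informative = np.linalg.det(fim_logistic(beta, [1.0, 2.0, 3.0]))
uninformative = np.linalg.det(fim_logistic(beta, [8.0, 9.0, 10.0]))
```

The determinant (a D-criterion surrogate) collapses for the late sampling times, where every response is almost surely 1 and each observation carries almost no information.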

  15. Optimized Design and Analysis of Sparse-Sampling fMRI Experiments

    PubMed Central

    Perrachione, Tyler K.; Ghosh, Satrajit S.

    2013-01-01

    Sparse-sampling is an important methodological advance in functional magnetic resonance imaging (fMRI), in which silent delays are introduced between MR volume acquisitions, allowing for the presentation of auditory stimuli without contamination by acoustic scanner noise and for overt vocal responses without motion-induced artifacts in the functional time series. As such, the sparse-sampling technique has become a mainstay of principled fMRI research into the cognitive and systems neuroscience of speech, language, hearing, and music. Despite being in use for over a decade, there has been little systematic investigation of the acquisition parameters, experimental design considerations, and statistical analysis approaches that bear on the results and interpretation of sparse-sampling fMRI experiments. In this report, we examined how design and analysis choices related to the duration of repetition time (TR) delay (an acquisition parameter), stimulation rate (an experimental design parameter), and model basis function (an analysis parameter) act independently and interactively to affect the neural activation profiles observed in fMRI. First, we conducted a series of computational simulations to explore the parameter space of sparse design and analysis with respect to these variables; second, we validated the results of these simulations in a series of sparse-sampling fMRI experiments. Overall, these experiments suggest the employment of three methodological approaches that can, in many situations, substantially improve the detection of neurophysiological response in sparse fMRI: (1) Sparse analyses should utilize a physiologically informed model that incorporates hemodynamic response convolution to reduce model error. (2) The design of sparse fMRI experiments should maintain a high rate of stimulus presentation to maximize effect size. 
(3) TR delays of short to intermediate length can be used between acquisitions of sparse-sampled functional image volumes to increase
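Recommendation (1), a physiologically informed model, amounts to convolving the stimulus train with a hemodynamic response function before sampling at the sparse acquisition times. The sketch below uses a canonical double-gamma HRF with illustrative parameters, not the paper's simulation code:

```python
import numpy as np

dt = 0.1
t = np.arange(0.0, 32.0, dt)
# Canonical double-gamma HRF: peak near 5 s, undershoot near 15 s.
# (gamma(6) = 120, gamma(16) ~ 1.3077e12 normalize the two terms.)
hrf = t**5 * np.exp(-t) / 120.0 - 0.1 * t**15 * np.exp(-t) / 1.3077e12

stim = np.zeros(600)                        # 60 s run at dt resolution
stim[::40] = 1.0                            # one stimulus event every 4 s
bold = np.convolve(stim, hrf)[:len(stim)]   # predicted BOLD time course

tr = 8.0                                    # silent-delay TR of 8 s
sample_idx = np.arange(0, len(stim), int(tr / dt))
sparse_samples = bold[sample_idx]           # what the scanner actually sees
```

Regressing the sparse samples against this convolved predictor, rather than a boxcar, is what reduces model error in the sparse analysis.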

  16. Motivation for proposed experimentation in the realm of accelerated E. M. systems: A preliminary design for an experiment

    NASA Technical Reports Server (NTRS)

    Post, E. J.

    1970-01-01

    An experiment designed to determine the difference between the fields (magnetic and electric) surrounding a uniformly moving charge and those surrounding an accelerated charge is presented. A thought experiment is presented to illustrate the process.

  17. Design of the exhaust device for light vehicle engine pedestal experiment

    NASA Astrophysics Data System (ADS)

    Sun, Shuguang

    2017-01-01

    In view of the shortcomings of the existing exhaust device for light vehicle engine pedestal experiments, an improved design is proposed: a multi-type exhaust device with flexible clearance and freedom of movement in six degrees (translation and rotation along the x, y, and z axes). This resolves interference problems during installation, reduces research, development, and test costs, shortens the development cycle, and allows the device to serve multiple engine types.

  18. Lost in space: design of experiments and scientific exploration in a Hogarth Universe.

    PubMed

    Lendrem, Dennis W; Lendrem, B Clare; Woods, David; Rowland-Jones, Ruth; Burke, Matthew; Chatfield, Marion; Isaacs, John D; Owen, Martin R

    2015-11-01

    A Hogarth, or 'wicked', universe is an irregular environment generating data to support erroneous beliefs. Here, we argue that development scientists often work in such a universe. We demonstrate that exploring these multidimensional spaces using small experiments guided by scientific intuition alone gives rise to an illusion of validity and a misplaced confidence in that scientific intuition. By contrast, design of experiments (DOE) permits the efficient mapping of such complex, multidimensional spaces. We describe simulation tools that enable research scientists to explore these spaces in relative safety.
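The efficiency argument can be made concrete with the smallest possible example (a simulation sketch, not the authors' tools; factor effects and noise are made up): a 2³ full factorial maps a three-factor space in 8 runs, and its orthogonality lets all main effects be estimated simultaneously, free of the biases of one-factor-at-a-time probing.

```python
import itertools
import numpy as np

# Coded-unit 2^3 full factorial design: 8 runs covering 3 factors.
design = np.array(list(itertools.product([-1.0, 1.0], repeat=3)))

rng = np.random.default_rng(0)
true_effects = np.array([2.0, -1.0, 0.5])   # simulated "ground truth"
response = design @ true_effects + rng.normal(0.0, 0.1, len(design))

# The design matrix is orthogonal, so least squares recovers the
# main effects cleanly from only 8 noisy runs.
effects, *_ = np.linalg.lstsq(design, response, rcond=None)
```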

  19. Integral experiments for fusion-reactor shield design. Summary of progress

    SciTech Connect

    Santoro, R.T.; Alsmiller, R.G. Jr.; Barnes, J.M.; Chapman, G.T.

    1983-01-01

    Neutron and gamma-ray energy spectra from the reactions of approx. 14-MeV neutrons in blanket and shield materials and from the streaming of these neutrons through a cylindrical duct (L/D approx. 2) have been measured and calculated. These data are being obtained in a series of integral experiments to verify the radiation transport methods and nuclear data that are being used in nuclear design calculations for fusion reactors. The experimental procedures and analytical methods used to obtain the calculated data are reviewed. Comparisons between measured and calculated data for the experiments that have been performed to date are summarized.

  20. The Design of Useful Mix Characterization Experiments for the LLNL Reshock Platform

    NASA Astrophysics Data System (ADS)

    Islam, Tanim

    2015-11-01

    The NIF Re-shock platform has been extensively engineered to minimize boundary effects and polluting shocks. It is capable of comprehensively and reproducibly exploring a large parameter space important in mix experiments: the strength and timing of shocks and reshocks; the amplitude and wavelength of Richtmyer-Meshkov-unstable interfaces; the Atwood number of these mixing layers; and, using a technique developed with experiments at the Omega laser, the simultaneous visualization of spike and bubble fronts. In this work, I explore multimodal and roughened-surface designs, and combinations of light and heavy materials, that may illuminate our understanding of mix in plasmas.

  1. Advanced Test Reactor In-Canal Ultrasonic Scanner: Experiment Design and Initial Results on Irradiated Plates

    SciTech Connect

    D. M. Wachs; J. M. Wight; D. T. Clark; J. M. Williams; S. C. Taylor; D. J. Utterbeck; G. L. Hawkes; G. S. Chang; R. G. Ambrosek; N. C. Craft

    2008-09-01

    An irradiation test device has been developed to support testing of prototypic-scale plate-type fuels in the Advanced Test Reactor. The experiment hardware and operating conditions were optimized to provide the irradiation conditions necessary to conduct performance and qualification tests on research reactor-type fuels for the RERTR program. The device was designed to allow disassembly and reassembly in the ATR spent fuel canal so that interim inspections could be performed on the fuel plates. An ultrasonic scanner was developed to perform dimensional and transmission inspections during these interim investigations. Example results from the AFIP-2 experiment are presented.

  2. Design and Preparation of a Particle Dynamics Space Flight Experiment, SHIVA

    NASA Technical Reports Server (NTRS)

    Trolinger, James; L'Esperance, Drew; Rangel, Roger; Coimbra, Carlos; Wiltherow, William

    2003-01-01

    This paper describes the flight experiment, supporting ground science, and the design rationale for project SHIVA (Spaceflight Holography Investigation in a Virtual Apparatus). SHIVA is a fundamental study of particle dynamics in fluids in microgravity. Gravity often dominates the equations of motion of a particle in a fluid, so microgravity provides an ideal environment to study the other forces, such as pressure, viscous drag, and especially the Basset history force. We have developed diagnostic recording methods using holography to save all of the particle field optical characteristics, essentially allowing the experiment to be transferred from space back to Earth in what we call the "virtual apparatus" for on-Earth microgravity experimentation. We can quantify precisely the three-dimensional motion of sets of particles, allowing us to test and apply new analytical solutions developed by members of the team, as reported at the 2001 Conference (Banff, Canada). In addition to employing microgravity to augment the fundamental study of these forces, the resulting data will allow us to quantify and understand the ISS environment with great accuracy. This paper shows how we used both experiment and theory to identify and resolve critical issues and produce an optimal study. We examined the response of particles of specific gravity from 0.1 to 20, with radii from 0.2 to 2 mm, to fluid oscillation at frequencies up to 80 Hz with amplitudes up to 200 microns. To observe some of the interesting effects predicted by the new solutions requires precise location of the position of a particle in three dimensions. To this end we have developed digital holography algorithms that enable particle position location to a small fraction of a pixel in a CCD array. The spaceflight system will record holograms both on film and electronically. The electronic holograms can be downlinked, providing real-time data, essentially acting like a remote window into the ISS.
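A back-of-the-envelope model shows why the stated parameter range is interesting. This sketch keeps only Stokes drag; the Basset history force central to SHIVA is deliberately omitted, and all material values are illustrative:

```python
import numpy as np

def amplitude_ratio(radius_m, rho_p_kg_m3, mu_pa_s, freq_hz):
    """|particle velocity| / |fluid velocity| for a dense particle in a
    fluid oscillating at freq_hz, under Stokes drag alone."""
    tau = 2.0 * rho_p_kg_m3 * radius_m**2 / (9.0 * mu_pa_s)  # response time
    w = 2.0 * np.pi * freq_hz
    return 1.0 / np.sqrt(1.0 + (w * tau) ** 2)

# Within the stated range (radii 0.2-2 mm, oscillation up to 80 Hz),
# larger particles track the fluid oscillation much less faithfully,
# which is exactly where history-force corrections become measurable.
small = amplitude_ratio(0.2e-3, 2000.0, 1.0e-3, 80.0)
large = amplitude_ratio(2.0e-3, 2000.0, 1.0e-3, 80.0)
```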

  3. Media milling process optimization for manufacture of drug nanoparticles using design of experiments (DOE).

    PubMed

    Nekkanti, Vijaykumar; Marwah, Ashwani; Pillai, Raviraj

    2015-01-01

    Design of experiments (DOE), a component of Quality by Design (QbD), is the systematic and simultaneous evaluation of process variables to develop a product with predetermined quality attributes. This article presents a case study to understand the effects of process variables in a bead milling process used for the manufacture of drug nanoparticles. Experiments were designed and results were computed according to a 3-factor, 3-level face-centered central composite design (CCD). The factors investigated were motor speed, pump speed and bead volume. Responses analyzed for evaluating these effects and interactions were milling time, particle size and process yield. Process validation batches were executed using the optimum process conditions obtained from the software Design-Expert® to evaluate both the repeatability and reproducibility of the bead milling technique. Milling time was optimized to <5 h to obtain the desired particle size (d90 < 400 nm). The desirability function was used to optimize the response variables, and the model-predicted responses were in agreement with experimental values. These results demonstrated the reliability of the selected model for manufacture of drug nanoparticles with predictable quality attributes. The optimization of bead milling process variables by applying DOE resulted in a considerable decrease in milling time to achieve the desired particle size. The study indicates the applicability of the DOE approach to optimize critical process parameters in the manufacture of drug nanoparticles.
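The run layout of a 3-factor face-centered CCD can be generated in a few lines (coded units; the number of center-point replicates below is illustrative, not taken from the study):

```python
import itertools
import numpy as np

# Face-centered CCD (alpha = 1): factorial corners + face-center axial
# points + center replicates, so each factor takes levels -1, 0, +1.
corners = np.array(list(itertools.product([-1.0, 1.0], repeat=3)))   # 8 runs
axial = np.vstack([s * np.eye(3)[i]
                   for i in range(3) for s in (-1.0, 1.0)])          # 6 runs
center = np.zeros((3, 3))                                            # 3 runs
design = np.vstack([corners, axial, center])                         # 17 runs
```

The quadratic terms this design supports are what allow a desirability function to locate an interior optimum rather than just a best corner.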

  4. Design and implementation of the protective cap/biobarrier experiment at the Idaho National Engineering Laboratory

    SciTech Connect

    Limbach, W.E.; Ratzlaff, T.D.; Anderson, J.E.; Reynolds, T.D.; Laundre, J.W.

    1994-12-31

    The Protective Cap/Biobarrier Experiment (PCBE), initiated in 1993 at the Idaho National Engineering Laboratory (INEL), is a strip-split plot experiment with three replications designed to rigorously test a 2.0-m loessal soil cap against a cap recommended by the US Environmental Protection Agency and two caps with biological intrusion barriers. Past research at INEL indicates that it should be possible to exclude water from buried wastes using natural materials and natural processes in arid environments rather than expensive materials (geotextiles) and highly engineered caps. The PCBE will also test the effects of two vegetal covers and three irrigation levels on cap performance. Drainage pans, located at the bottom of each plot, will monitor cap failure. Soil water profiles will be monitored biweekly by neutron probe and continuously by time domain reflectometry. The performance of each cap design will be monitored under a variety of conditions through 1998. From 1994 to 1996, the authors will assess plant establishment, rooting depths, patterns of moisture extraction and their interactions among caps, vegetal covers, and irrigation levels. In 1996, they will introduce ants and burrowing mammals to test the structural integrity of each cap design. In 1998, the authors will apply sufficient water to determine the failure limit for each cap design. The PCBE should provide reliable knowledge of the performances of the four cap designs under a variety of conditions and aid in making hazardous-waste management decisions at INEL and at disposal sites in similar environments.

  5. Experiment Design and First Season Observations with the Degree Angular Scale Interferometer

    NASA Astrophysics Data System (ADS)

    Leitch, E. M.; Pryke, C.; Halverson, N. W.; Kovac, J.; Davidson, G.; LaRoque, S.; Schartman, E.; Yamasaki, J.; Carlstrom, J. E.; Holzapfel, W. L.; Dragovan, M.; Cartwright, J. K.; Mason, B. S.; Padin, S.; Pearson, T. J.; Readhead, A. C. S.; Shepherd, M. C.

    2002-03-01

    We describe the instrumentation, experiment design, and data reduction for the first season of observations with the Degree Angular Scale Interferometer (DASI), a compact microwave interferometer designed to measure anisotropy of the cosmic microwave background (CMB) on degree and subdegree scales (l ≈ 100-900). The telescope was deployed at the Amundsen-Scott South Pole Research Station during the 1999-2000 austral summer, and we conducted observations of the CMB throughout the following austral winter. In its first season of observations, DASI has mapped CMB fluctuations in 32 fields, each 3.4° across, with high sensitivity.

  6. Rotational fluid flow experiment: WPI/MITRE advanced space design GASCAN 2

    NASA Technical Reports Server (NTRS)

    Daly, Walter F.; Harr, Lee; Paduano, Rocco; Yee, Tony; Eubbani, Eddy; Delprado, Jaime; Khanna, Ajay

    1991-01-01

    The design and implementation of an electro-mechanical system for studying vortex behavior in a microgravity environment are examined. Most of the existing equipment was revised and redesigned as necessary. Emphasis was placed on the documentation and integration of the mechanical and electrical subsystems. Project results include the reconfiguration and thorough testing of all the hardware subsystems, the implementation of an infrared gas entrainment detector, new signal processing circuitry for the ultrasonic fluid circulation device, improved prototype interface circuits, and software for overall control of experiment operation.

  7. Galileo Optical Experiment (GOPEX) optical train: Design and validation at the Table Mountain Facility

    NASA Technical Reports Server (NTRS)

    Yu, J.; Shao, M.

    1993-01-01

    The Galileo Optical Experiment (GOPEX) has demonstrated the first laser communications uplink to a deep space vehicle. The optical design and validation tests performed at the Table Mountain Facility (TMF) transmitter site are described. The system used a 0.6-m telescope and an optical system at coude focus to produce the uplink beam. The optical system used a pulsed neodymium:yttrium-aluminum-garnet (Nd:YAG) laser and beam diverger optics to produce the required optical output. In order to validate the optical design, a number of uplinks were performed to Earth-orbiting satellites (e.g., Lageos 1 and 2).

  8. Preliminary design and definition of field experiments for welded tuff rock mechanics program

    NASA Astrophysics Data System (ADS)

    Zimmerman, R. M.

    1982-06-01

    G-tunnel on the Nevada Test Site intersects the Grouse Canyon Member of the Belted Range Tuff, which has thermal and mechanical properties similar to those of welded tuff strata in nearby Yucca Mountain, a site being considered for a nuclear waste repository. A rock mechanics testing program is developed, and the field data to be used for site characterization and repository conceptual design are investigated. This preliminary design is prepared to provide a control document for the definition, implementation, operation, and evaluation activities for five rock mechanics experiments that can be placed in this geologic unit.

  9. Design and implementation of an experiment scheduling system for the ACTS satellite

    NASA Technical Reports Server (NTRS)

    Ringer, Mark J.

    1994-01-01

    The Advanced Communication Technology Satellite (ACTS) was launched on the 12th of September 1993 aboard STS-51. All events since that time have proceeded as planned with user operations commencing on December 6th, 1993. ACTS is a geosynchronous satellite designed to extend the state of the art in communication satellite design and is available to experimenters on a 'time/bandwidth available' basis. The ACTS satellite requires the advance scheduling of experimental activities based upon a complex set of resource, state, and activity constraints in order to ensure smooth operations. This paper describes the software system developed to schedule experiments for ACTS.
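One classic building block for this kind of advance scheduling is interval selection on a shared resource. The sketch below is illustrative only, not the ACTS scheduling software; it ignores the satellite's state and bandwidth constraints and keeps only the time dimension:

```python
# Greedy earliest-end-time-first selection: picks a maximum-size set of
# non-overlapping experiment slots on a single shared resource.
def schedule(requests):
    """requests: iterable of (start, end) half-open time intervals."""
    chosen, free_at = [], float("-inf")
    for start, end in sorted(requests, key=lambda r: r[1]):
        if start >= free_at:            # no overlap with the last accepted slot
            chosen.append((start, end))
            free_at = end
    return chosen
```

For example, `schedule([(0, 3), (2, 5), (4, 7), (1, 8)])` accepts the slots `(0, 3)` and `(4, 7)` and rejects the two that would overlap them.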

  10. Design-of-experiments to Reduce Life-cycle Costs in Combat Aircraft Inlets

    NASA Technical Reports Server (NTRS)

    Anderson, Bernhard H.; Baust, Henry D.; Agrell, Johan

    2003-01-01

    It is the purpose of this study to demonstrate the viability and economy of Design-of-Experiments (DOE) to arrive at micro-secondary flow control installation designs that achieve optimal inlet performance for different mission strategies. These statistical design concepts were used to investigate the properties of "low unit strength" micro-effector installations. "Low unit strength" micro-effectors are micro-vanes, set at a very low angle of incidence, with very long chord lengths. They are designed to influence the near-wall inlet flow over an extended streamwise distance. In this study, however, the long chord lengths were replicated by a series of short chord length effectors arranged in series over multiple bands of effectors. In order to properly evaluate the performance differences between the single-band extended chord length installation designs and the segmented multiband short chord length designs, both sets of installations must be optimal. Critical to achieving optimal micro-secondary flow control installation designs is an understanding of the factor interactions that occur between the multiple bands of micro-scale vane effectors. These factor interactions are best understood and brought together in an optimal manner through a structured DOE process, or more specifically Response Surface Methods (RSM).

  11. Design/build/mockup of the Waste Isolation Pilot Plant gas generation experiment glovebox

    SciTech Connect

    Rosenberg, K.E.; Benjamin, W.W.; Knight, C.J.; Michelbacher, J.A.

    1996-10-01

    A glovebox was designed, fabricated, and mocked-up for the WIPP Gas Generation Experiments (GGE) being conducted at ANL-W. GGE will determine the gas generation rates from materials in contact handled transuranic waste at likely long term repository temperature and pressure conditions. Since the customer's schedule did not permit time for performing R&D of the support systems, designing the glovebox, and fabricating the glovebox in a serial fashion, a parallel approach was undertaken. As R&D of the sampling system and other support systems was initiated, a specification was written concurrently for contracting a manufacturer to design and build the glovebox and support equipment. The contractor understood that the R&D being performed at ANL-W would add additional functional requirements to the glovebox design. Initially, the contractor had sufficient information to design the glovebox shell. Once the shell design was approved, ANL-W built a full scale mockup of the shell out of plywood and metal framing; support systems were mocked up and resultant information was forwarded to the glovebox contractor to incorporate into the design. This approach resulted in a glovebox being delivered to ANL-W on schedule and within budget.

  12. Status and Design Concepts for the Hydrogen On-Orbit Storage and Supply Experiment

    NASA Technical Reports Server (NTRS)

    Chato, David J.; VanDyke, Melissa; Batty, J. Clair; Schick, Scott

    1998-01-01

    This paper studies concepts for the Hydrogen On-Orbit Storage and Supply Experiment (HOSS). HOSS is a space flight experiment whose objectives are: Show stable gas supply for storage and direct gain solar-thermal thruster designs; and evaluate and compare low-g performance of active and passive pressure control via a thermodynamic vent system (TVS) suitable for solar-thermal upper stages. This paper shows that the necessary experimental equipment for HOSS can be accommodated in a small hydrogen Dewar of 36 to 80 liters. Thermal designs for these Dewars that meet the on-orbit storage requirements can be achieved. Furthermore, ground-hold insulation and shielding concepts are achieved that enable storing initially subcooled liquid hydrogen in these small Dewars for more than 144 hours without venting.

  13. A design of experiment study of plasma sprayed alumina-titania coatings

    SciTech Connect

    Steeper, T.J.; Varacalle, D.J. Jr.; Wilson, G.C.; Riggs, W.L. II; Rotolico, A.J.; Nerz, J.E.

    1992-08-01

    An experimental study of the plasma spraying of alumina-titania powder is presented in this paper. This powder system is being used to fabricate heater tubes that emulate nuclear fuel tubes for use in thermal-hydraulic testing. Coating experiments were conducted using a Taguchi fractional-factorial design parametric study. Operating parameters were varied around the typical spray parameters in a systematic design of experiments in order to display the range of plasma processing conditions and their effect on the resultant coating. The coatings were characterized by hardness and electrical tests, image analysis, and optical metallography. Coating qualities are discussed with respect to dielectric strength, hardness, porosity, surface roughness, deposition efficiency, and microstructure. The attributes of the coatings are correlated with the changes in operating parameters.
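    The Taguchi fractional-factorial approach the abstract mentions can be illustrated with the standard L9 orthogonal array, which screens four factors at three levels in only nine runs. The factor names and hardness values below are assumptions for illustration, not data from the paper:

```python
import numpy as np

# Standard L9(3^4) orthogonal array: 4 factors at 3 levels, 9 runs
# (levels coded 0, 1, 2); each level appears exactly 3 times per column
L9 = np.array([
    [0, 0, 0, 0],
    [0, 1, 1, 1],
    [0, 2, 2, 2],
    [1, 0, 1, 2],
    [1, 1, 2, 0],
    [1, 2, 0, 1],
    [2, 0, 2, 1],
    [2, 1, 0, 2],
    [2, 2, 1, 0],
])

# Synthetic coating hardness measured for each of the 9 runs (assumption)
y = np.array([62.0, 68.0, 71.0, 65.0, 70.0, 66.0, 69.0, 64.0, 67.0])

# Main effect of each factor: mean response at each of its 3 levels;
# the spread across levels indicates that factor's influence on the coating
for f, name in enumerate(["arc current", "gas flow", "spray distance", "feed rate"]):
    means = [y[L9[:, f] == lvl].mean() for lvl in range(3)]
    print(name, np.round(means, 2))
```

    The balance of the orthogonal array is what lets a nine-run study separate the main effects of four parameters, which is why such designs are economical for mapping plasma-spray operating windows.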

  14. The opto-mechanical design of the sub-orbital local interstellar cloud experiment (SLICE)

    NASA Astrophysics Data System (ADS)

    Kane, Robert; Nell, Nicholas; Schultz, Ted; France, Kevin; Beasley, Matthew; Burgh, Eric; Bushinsky, Rachel; Hoadley, Keri

    2013-09-01

    We present the fabrication and testing of the Sub-orbital Local Interstellar Cloud Experiment (SLICE), a rocket-borne payload for ultraviolet astrophysics in the 1020 to 1070 Å bandpass. The SLICE optical system is composed of an ultraviolet-optimized telescope feeding a Rowland Circle spectrograph. The telescope is an 8-inch Classical Cassegrain operating at F/7, with Al optics overcoated with LiF for enhanced far-ultraviolet reflectivity. The holographically-ruled grating focuses light at an open-faced microchannel plate detector employing an opaque RbBr photocathode. In this proceeding, we describe the design trades and calibration issues confronted during the build-up of this payload. We place particular emphasis on the technical details of the design, modifications, construction, and alignment procedures for SLICE in order to provide a roadmap for the optimization of future ruggedized experiments for ultraviolet imaging and spectroscopy.

  15. Design Concepts Studied for the Hydrogen On-Orbit Storage and Supply Experiment

    NASA Technical Reports Server (NTRS)

    Chato, David J.

    1998-01-01

    The NASA Lewis Research Center, in conjunction with the Utah State University Space Dynamics Laboratory, studied concepts for the Hydrogen On-Orbit Storage and Supply Experiment (HOSS). HOSS is a space flight experiment whose objectives are (1) to show stable gas supply for solar-thermal thruster designs by using both storage and direct-gain approaches and (2) to evaluate and compare the low-gravity performance of active and passive pressure control via a thermodynamic vent system (TVS) suitable for solar-thermal upper stages. This study showed that the necessary experimental equipment for HOSS can be accommodated in a small hydrogen Dewar (36 to 80 liter). Thermal designs can be achieved that meet the on-orbit storage requirements for these Dewars. Furthermore, ground hold insulation concepts are easily achieved that can store liquid hydrogen in these small Dewars for more than 144 hr without venting.

  16. Precision Pointing Control System (PPCS) system design and analysis. [for gimbaled experiment platforms

    NASA Technical Reports Server (NTRS)

    Frew, A. M.; Eisenhut, D. F.; Farrenkopf, R. L.; Gates, R. F.; Iwens, R. P.; Kirby, D. K.; Mann, R. J.; Spencer, D. J.; Tsou, H. S.; Zaremba, J. G.

    1972-01-01

    The precision pointing control system (PPCS) is an integrated system for precision attitude determination and orientation of gimbaled experiment platforms. The PPCS concept configures the system to perform orientation of up to six independent gimbaled experiment platforms to design goal accuracy of 0.001 degrees, and to operate in conjunction with a three-axis stabilized earth-oriented spacecraft in orbits ranging from low altitude (200-2500 n.m., sun synchronous) to 24 hour geosynchronous, with a design goal life of 3 to 5 years. The system comprises two complementary functions: (1) attitude determination where the attitude of a defined set of body-fixed reference axes is determined relative to a known set of reference axes fixed in inertial space; and (2) pointing control where gimbal orientation is controlled, open-loop (without use of payload error/feedback) with respect to a defined set of body-fixed reference axes to produce pointing to a desired target.

  17. Design, installation and operating experience of 20 photovoltaic medical refrigerator systems on four continents

    NASA Technical Reports Server (NTRS)

    Hein, G. F.

    1982-01-01

    The NASA Lewis Research Center, in cooperation with the World Health Organization, U.S.A.I.D., the Pan American Health Organization, and national government agencies in several developing countries, sponsored the installation of twenty photovoltaic-powered medical vaccine storage refrigerator-freezer (R/F) systems. The Solar Power Corporation was selected as the contractor to perform the design, development, and installation of these twenty units. Solar Power's experiences are described herein.

  18. Conceptual design of an orbital propellant transfer experiment. Volume 2: Study results

    NASA Technical Reports Server (NTRS)

    Drake, G. L.; Bassett, C. E.; Merino, F.; Siden, L. E.; Bradley, R. E.; Carr, E. J.; Parker, R. E.

    1980-01-01

    The OTV configurations, operations, and requirements planned for the period from the 1980s to the 1990s were reviewed, and a propellant transfer experiment was designed that would support the needs of these advanced OTV operational concepts. An overall integrated propellant management technology plan for all NASA centers was developed. The preliminary cost estimate (for planning purposes only) is $56.7M, of which approximately $31.8M is for shuttle user costs.

  19. Design and Development Tools for the Systems Engineering Experience Accelerator. Volume 1

    DTIC Science & Technology

    2015-04-20

    [Fragmentary DTIC extract] The report cites the (INCOSE) 2012 International Symposium/European Conference on Systems Engineering (EUSEC), Rome, Italy, July 9-12 (Bodner, D., Wade, J., Squires, A., et al.), and Wade, J. P., "Design and Development Tools for the Systems Engineering Experience Accelerator," Systems Engineering Research. It also references a custom map built in the Warcraft III editor, named Defense of the Ancients (DotA), which has attracted millions of players worldwide.

  20. Directed Design of Experiments (DOE) for Determining Probability of Detection (POD) Capability of NDE Systems (DOEPOD)

    NASA Technical Reports Server (NTRS)

    Generazio, Ed

    2007-01-01

    This viewgraph presentation reviews some of the issues that specialists in nondestructive evaluation (NDE) face in determining the statistics of the probability of detection. There is discussion of the use of the binomial distribution and the probability of hit. The presentation then reviews the concepts of Directed Design of Experiments for Validating Probability of Detection of Inspection Systems (DOEPOD). Several cases are reviewed and discussed. The concept of false calls is also reviewed.
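    The binomial reasoning behind POD demonstrations can be sketched for the all-hits case. This is the standard textbook closed form, not a result taken from the presentation: if all n inspections detect the flaw, the one-sided lower confidence bound p on detection probability satisfies p**n = alpha.

```python
def pod_lower_bound(successes: int, trials: int, confidence: float = 0.95) -> float:
    """One-sided lower confidence bound on POD when every trial is a hit.

    Closed form for the all-hits binomial case: solve p**trials = 1 - confidence.
    """
    if successes != trials:
        raise ValueError("this closed form only covers the all-hits case")
    alpha = 1.0 - confidence
    return alpha ** (1.0 / trials)

# Classic result: 29 hits out of 29 demonstrates 90% POD at 95% confidence
print(round(pod_lower_bound(29, 29), 3))  # ~0.902
```

    This is the origin of the widely quoted "29 of 29" demonstration requirement; handling misses requires the full binomial (e.g., Clopper-Pearson) bound rather than this closed form.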