ERIC Educational Resources Information Center
Cooke, Jason; Lightbody, Owen C.
2011-01-01
Experiments are described for the preparation of imidazolium chloride precursors to "N"-heterocyclic carbenes and their cyclopentadienyl nickel chloride derivatives. The syntheses have been optimized for second- and third-year undergraduate laboratories that have a maximum programmed length of three hours per week. The experiments are flexible and…
ERIC Educational Resources Information Center
Willbur, Jaime F.; Vail, Justin D.; Mitchell, Lindsey N.; Jakeman, David L.; Timmons, Shannon C.
2016-01-01
The development and implementation of research-inspired, discovery-based experiences into science laboratory curricula is a proven strategy for increasing student engagement and ownership of experiments. In the novel laboratory module described herein, students learn to express, purify, and characterize a carbohydrate-active enzyme using modern…
A Model for Designing Adaptive Laboratory Evolution Experiments.
LaCroix, Ryan A; Palsson, Bernhard O; Feist, Adam M
2017-04-15
The occurrence of mutations is a cornerstone of the evolutionary theory of adaptation, capitalizing on the rare chance that a mutation confers a fitness benefit. Natural selection is increasingly being leveraged in laboratory settings for industrial and basic science applications. Despite increasing deployment, there are no standardized procedures available for designing and performing adaptive laboratory evolution (ALE) experiments. Thus, there is a need to optimize the experimental design, specifically for determining when to consider an experiment complete and for balancing outcomes with available resources (i.e., laboratory supplies, personnel, and time). To design and to better understand ALE experiments, a simulator, ALEsim, was developed, validated, and applied to the optimization of ALE experiments. The effects of various passage sizes were experimentally determined and subsequently evaluated with ALEsim, to explain differences in experimental outcomes. Furthermore, a beneficial mutation rate of 10^-6.9 to 10^-8.4 mutations per cell division was derived. A retrospective analysis of ALE experiments revealed that passage sizes typically employed in serial passage batch culture ALE experiments led to inefficient production and fixation of beneficial mutations. ALEsim and the results described here will aid in the design of ALE experiments to fit the exact needs of a project while taking into account the resources required and will lower the barriers to entry for this experimental technique. IMPORTANCE ALE is a widely used scientific technique to increase scientific understanding, as well as to create industrially relevant organisms. The manner in which ALE experiments are conducted is highly manual and uniform, with little optimization for efficiency. Such inefficiencies result in suboptimal experiments that can take multiple months to complete. With the availability of automation and computer simulations, we can now perform these experiments in an optimized fashion and can design experiments to generate greater fitness in an accelerated time frame, thereby pushing the limits of what adaptive laboratory evolution can achieve. Copyright © 2017 American Society for Microbiology.
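The passage-size effect described above can be illustrated with a minimal serial-passage Monte Carlo sketch (not ALEsim itself; the fitness advantage, batch size, and dilution schedule below are placeholder assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)

MU = 10 ** -7.5     # beneficial mutations per cell division (midpoint of the reported range)
S = 0.10            # assumed relative fitness advantage of a beneficial mutant
N_FINAL = 1e10      # cells per flask at the end of each batch (placeholder)

def serial_passage(passage_size, n_passages=150):
    """Mutant fraction after repeated batch growth and dilution at a fixed passage size."""
    wt, mut = float(passage_size), 0.0
    gens = np.log2(N_FINAL / passage_size)               # doublings per batch, set by the dilution
    for _ in range(n_passages):
        growth_wt = 2.0 ** gens
        growth_mut = 2.0 ** (gens * (1.0 + S))           # mutants grow (1+S)-fold faster per doubling
        new_mut = rng.poisson(MU * wt * (growth_wt - 1.0))   # mutations arise during WT divisions
        wt, mut = wt * growth_wt - new_mut, mut * growth_mut + new_mut
        frac = mut / (wt + mut)
        mut = float(rng.binomial(int(passage_size), frac))   # bottleneck: sample the next inoculum
        wt = passage_size - mut
    return mut / (wt + mut)

for size in (1e6, 1e7, 1e8):
    runs = [serial_passage(size) for _ in range(10)]
    print(f"passage size {size:.0e}: mean final mutant fraction {np.mean(runs):.3f}")
```

Small passage sizes discard most newly arisen mutants at each bottleneck, which is the inefficiency in fixation that the retrospective analysis points to.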
Predictive modelling of flow in a two-dimensional intermediate-scale, heterogeneous porous media
Barth, Gilbert R.; Hill, M.C.; Illangasekare, T.H.; Rajaram, H.
2000-01-01
To better understand the role of sedimentary structures in flow through porous media, and to determine how small-scale laboratory-measured values of hydraulic conductivity relate to in situ values, this work deterministically examines flow through simple, artificial structures constructed for a series of intermediate-scale (10 m long), two-dimensional, heterogeneous, laboratory experiments. Nonlinear regression was used to determine optimal values of in situ hydraulic conductivity, which were compared to laboratory-measured values. Despite explicit numerical representation of the heterogeneity, the optimized values were generally greater than the laboratory-measured values. Discrepancies between measured and optimal values varied depending on the sand sieve size, but their contribution to error in the predicted flow was fairly consistent for all sands. Results indicate that, even under these controlled circumstances, laboratory-measured values of hydraulic conductivity need to be applied to models cautiously.
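A minimal sketch of the nonlinear-regression step described above, assuming a simplified 1-D Darcy model of sand blocks in series rather than the authors' two-dimensional heterogeneous tank (all values are placeholders):

```python
import numpy as np
from scipy.optimize import least_squares

# Hypothetical 1-D column of sand blocks in series (lengths in m), constant specific discharge q (m/s).
lengths = np.array([2.0, 3.0, 2.5, 2.5])        # four sand types along the flow path
q = 5.0e-6                                      # assumed known flux
K_true = np.array([3e-4, 8e-5, 1.5e-4, 5e-5])   # "in situ" conductivities to recover

def heads(log10_K):
    """Head at the downstream face of each block for a 1.0 m upstream head (Darcy's law in series)."""
    K = 10.0 ** log10_K
    dh = q * lengths / K                        # head loss across each block
    return 1.0 - np.cumsum(dh)

h_obs = heads(np.log10(K_true)) + np.random.default_rng(1).normal(0, 1e-3, lengths.size)

# Start from laboratory-measured (permeameter) values and let the regression adjust them.
K_lab = np.array([2e-4, 6e-5, 1.0e-4, 4e-5])
fit = least_squares(lambda p: heads(p) - h_obs, x0=np.log10(K_lab))
print("optimized K (m/s):", 10.0 ** fit.x)
```

The log10 parameterization keeps the fitted conductivities positive; comparing `fit.x` against the laboratory values mirrors the measured-versus-optimized comparison in the abstract.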
Optimizing Chromatographic Separation: An Experiment Using an HPLC Simulator
ERIC Educational Resources Information Center
Shalliker, R. A.; Kayillo, S.; Dennis, G. R.
2008-01-01
Optimization of a chromatographic separation within the time constraints of a laboratory session is practically impossible. However, by employing a HPLC simulator, experiments can be designed that allow students to develop an appreciation of the complexities involved in optimization procedures. In the present exercise, a HPLC simulator from "JCE…
Joelsson, Daniel; Moravec, Phil; Troutman, Matthew; Pigeon, Joseph; DePhillips, Pete
2008-08-20
Transferring manual ELISAs to automated platforms requires optimizing the assays for each particular robotic platform. These optimization experiments are often time consuming and difficult to perform using a traditional one-factor-at-a-time strategy. In this manuscript we describe the development of an automated process using statistical design of experiments (DOE) to quickly optimize immunoassays for precision and robustness on the Tecan EVO liquid handler. By using fractional factorials and a split-plot design, five incubation time variables and four reagent concentration variables can be optimized in a short period of time.
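A sketch of the kind of reduced design the abstract describes, here a 2^(5-1) fractional factorial for the five incubation-time factors in pure Python (factor names and levels are placeholders; the split-plot structure used for the reagent-concentration factors is not reproduced):

```python
from itertools import product

# Five incubation-time factors at low/high coded levels (minutes); values are placeholders.
factors = ["coat", "block", "sample", "detect", "substrate"]
levels = {"coat": (30, 120), "block": (30, 60), "sample": (30, 90),
          "detect": (30, 60), "substrate": (5, 15)}

# 2^(5-1) fractional factorial: run the full factorial in the first four factors
# and set the fifth by the generator E = ABCD (product of the coded levels).
design = []
for a, b, c, d in product((-1, 1), repeat=4):
    e = a * b * c * d
    coded = dict(zip(factors, (a, b, c, d, e)))
    design.append({f: levels[f][0] if coded[f] < 0 else levels[f][1] for f in factors})

for i, run in enumerate(design, 1):
    print(i, run)   # 16 runs instead of the 32 needed for a full factorial
```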
Ultrafiltration of Protein Solutions: A Laboratory Experiment
ERIC Educational Resources Information Center
Pansare, Vikram J.; Tien, Daniel; Prud'homme, Robert K.
2015-01-01
Biology is playing an increasingly important role in the chemical engineering curriculum. We describe a set of experiments we have implemented in our Undergraduate Laboratory course giving students practical insights into membrane separation processes for protein processing. The goal of the lab is to optimize the purification and concentration of…
Remediation and recycling of WBP-treated lumber for use as flakeboard
Ronald Sabo; Jerrold E. Winandy; Carol A. Clausen; Altaf Basta
2008-01-01
Laboratory-scale experiments were conducted in which preservative metals (As, Cr, & Cu) were thermochemically extracted from CCA-treated spruce (Picea engelmannii) using oxalic acid and sodium hydroxide. The effects of extraction time, temperature, and pH were examined and laboratory scale optimization was achieved. Two series of experiments were carried out. In...
ERIC Educational Resources Information Center
Erskine, Steven R.; And Others
1986-01-01
Describes a laboratory experiment that is designed to aid in the understanding of the fundamental process involved in gas chromatographic separations. Introduces the Kovats retention index system for use by chemistry students to establish criteria for the optimal selection of gas chromatographic stationary phases. (TW)
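The Kovats retention index mentioned above has a standard isothermal form, I = 100[n + (log t'_x − log t'_n)/(log t'_{n+1} − log t'_n)]; a minimal helper with illustrative retention times:

```python
import math

def kovats_index(t_unknown, t_n, t_n1, n, t_dead=0.0):
    """Isothermal Kovats retention index.

    t_unknown, t_n, t_n1: retention times of the analyte and of the n- and (n+1)-carbon
    n-alkanes bracketing it; t_dead subtracts the column hold-up time to give adjusted times.
    """
    tp_x, tp_n, tp_n1 = (t - t_dead for t in (t_unknown, t_n, t_n1))
    return 100.0 * (n + (math.log10(tp_x) - math.log10(tp_n)) /
                        (math.log10(tp_n1) - math.log10(tp_n)))

# Example: an analyte eluting between octane (C8) and nonane (C9); times in minutes are illustrative.
print(round(kovats_index(t_unknown=7.4, t_n=6.2, t_n1=9.1, n=8, t_dead=1.0)))   # ~847
```

Comparing an analyte's index on different stationary phases is one way students can rank phases for a given separation.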
Recent Experiences in Multidisciplinary Analysis and Optimization, part 2
NASA Technical Reports Server (NTRS)
Sobieski, J. (Compiler)
1984-01-01
The papers presented at the NASA Symposium on Recent Experiences in Multidisciplinary Analysis and Optimization held at NASA Langley Research Center, Hampton, Virginia, April 24 to 26, 1984 are given. The purposes of the symposium were to exchange information about the status of the application of optimization and the associated analyses in industry or research laboratories to real life problems and to examine the directions of future developments.
Grass shrimp are abundant, ecologically important inhabitants of estuarine ecosystems; adults and embryos have been used extensively in laboratory experiments, including studies of the impacts of environmental toxicants. However, optimal laboratory feeding conditions for grass sh...
Optimizing a reconfigurable material via evolutionary computation
NASA Astrophysics Data System (ADS)
Wilken, Sam; Miskin, Marc Z.; Jaeger, Heinrich M.
2015-08-01
Rapid prototyping by combining evolutionary computation with simulations is becoming a powerful tool for solving complex design problems in materials science. This method of optimization operates in a virtual design space that simulates potential material behaviors and after completion needs to be validated by experiment. However, in principle an evolutionary optimizer can also operate on an actual physical structure or laboratory experiment directly, provided the relevant material parameters can be accessed by the optimizer and information about the material's performance can be updated by direct measurements. Here we provide a proof of concept of such direct, physical optimization by showing how a reconfigurable, highly nonlinear material can be tuned to respond to impact. We report on an entirely computer controlled laboratory experiment in which a 6 × 6 grid of electromagnets creates a magnetic field pattern that tunes the local rigidity of a concentrated suspension of ferrofluid and iron filings. A genetic algorithm is implemented and tasked to find field patterns that minimize the force transmitted through the suspension. Searching within a space of roughly 10^10 possible configurations, after testing only 1500 independent trials the algorithm identifies an optimized configuration of layered rigid and compliant regions.
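A minimal sketch of the genetic-algorithm loop described above, with the physical force measurement replaced by a stand-in objective (in the actual experiment each evaluation is an impact test on the suspension; the population size, operators, and the 20 x 75 = 1500 evaluation budget are assumptions chosen to mirror the reported trial count):

```python
import numpy as np

rng = np.random.default_rng(0)
N_COILS = 36                      # 6 x 6 grid, each electromagnet treated as simply on/off here

def measure_force(pattern):
    """Stand-in for the physical measurement: lower is better.
    In the real experiment this is the force transmitted through the suspension."""
    target = (np.arange(N_COILS) % 7 < 3).astype(int)   # arbitrary hidden optimum
    return np.sum(pattern != target) + rng.normal(0, 0.1)

def evolve(pop_size=20, generations=75, p_mut=1.0 / N_COILS):
    pop = rng.integers(0, 2, size=(pop_size, N_COILS))
    for _ in range(generations):
        fitness = np.array([measure_force(ind) for ind in pop])
        parents = pop[np.argsort(fitness)[: pop_size // 2]]   # truncation selection
        children = []
        for _ in range(pop_size - len(parents)):
            a, b = parents[rng.integers(len(parents), size=2)]
            cut = rng.integers(1, N_COILS)                    # one-point crossover
            child = np.concatenate([a[:cut], b[cut:]])
            flip = rng.random(N_COILS) < p_mut                # bit-flip mutation
            child[flip] ^= 1
            children.append(child)
        pop = np.vstack([parents] + children)
    return min(pop, key=measure_force)

print(evolve())
```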
Experiment definition phase shuttle laboratory LDRL 10.6 experiment
NASA Technical Reports Server (NTRS)
1974-01-01
System optimization is reported along with mission and parameter requirements. Link establishment and maintenance requirements are discussed providing an acquisition and tracking scheme. The shuttle terminal configurations are considered and are included in the experiment definition.
Planning a School Physics Experiment.
ERIC Educational Resources Information Center
Blasiak, Wladyslaw
1986-01-01
Presents a model for planning the measurement of physical quantities. Provides two examples of optimizing the conditions of indirect measurement for laboratory experiments which involve measurements of acceleration due to gravity and of viscosity by means of Stokes' formula. (ML)
ERIC Educational Resources Information Center
Mitchell, Eugene E., Ed.
A do-it-yourself laboratory course in automated systems designed at the University of Florida is described. Using a working model of a warehouse interfaced with a minicomputer as a working laboratory, the student gains hands-on experience in operations programming and applications of scheduling, materials handling, and heuristic optimization. (BT)
Optimism and the experience of pain: benefits of seeing the glass as half full
Goodin, Burel R.; Bulls, Hailey W.
2014-01-01
There is a strong body of literature that lends support to the health-promoting effects of an optimistic personality disposition, observed across various physical and psychological dimensions. In accordance with this evidence base, it has been suggested that optimism may positively influence the course and experience of pain. Although the associations among optimism and pain outcomes have only recently begun to be adequately studied, emerging experimental and clinical research links optimism to lower pain sensitivity and better adjustment to chronic pain. This review highlights recent studies that have examined the effects of optimism on the pain experience using samples of individuals with clinically painful conditions as well as healthy samples in laboratory settings. Furthermore, factors such as catastrophizing, hope, acceptance and coping strategies, which are thought to play a role in how optimism exerts its beneficial effects on pain, are also addressed. PMID:23519832
NASA Astrophysics Data System (ADS)
Fromm, Steven
2017-09-01
In an effort to study and improve the optical trapping efficiency of the 225Ra Electric Dipole Moment experiment, a fully parallelized Monte Carlo simulation of the laser cooling and trapping apparatus was created at Argonne National Laboratory and is now maintained and upgraded at Michigan State University. The simulation allows us to study optimizations and upgrades without having to use limited quantities of 225Ra (15 day half-life) in the experiment's apparatus. It predicts a trapping efficiency that differs from the observed value in the experiment by approximately a factor of thirty. The effects of varying oven geometry, background gas interactions, laboratory magnetic fields, MOT laser beam configurations and laser frequency noise were studied and ruled out as causes of the discrepancy between measured and predicted values of the overall trapping efficiency. Presently, the simulation is being used to help optimize a planned blue slower laser upgrade in the experiment's apparatus, which will increase the overall trapping efficiency by up to two orders of magnitude. This work is supported by Michigan State University, the Director's Research Scholars Program at the National Superconducting Cyclotron Laboratory, and the U.S. DOE, Office of Science, Office of Nuclear Physics, under Contract DE-AC02-06CH11357.
Experience of maintaining laboratory educational website's sustainability
Dimenstein, Izak B.
2016-01-01
Laboratory methodology websites are specialized niche websites. The visibility of a niche website transforms it into an authority site on a particular “niche of knowledge.” This article presents some ways in which a laboratory methodology website can maintain its sustainability. The optimal composition of the website includes a basic content, a blog, and an ancillary part. This article discusses experimenting with the search engine optimization query results page. Strategic placement of keywords and even phrases, as well as fragmentation of the post's material, can improve the website's visibility to search engines. Hyperlinks open a chain reaction of additional links and draw attention to the previous posts. Publications in printed periodicals are a substantial part of a niche website presence on the Internet. Although this article explores a laboratory website on the basis of our hands-on expertise maintaining “Grossing Technology in Surgical Pathology” (www.grossing-technology.com) website with a high volume of traffic for more than a decade, the recommendations presented here for developing an authority website can be applied to other professional specialized websites. The authority websites visibility and sustainability are preconditions for aggregating them in a specialized educational laboratory portal. PMID:27688928
Experience of maintaining laboratory educational website's sustainability.
Dimenstein, Izak B
2016-01-01
Laboratory methodology websites are specialized niche websites. The visibility of a niche website transforms it into an authority site on a particular "niche of knowledge." This article presents some ways in which a laboratory methodology website can maintain its sustainability. The optimal composition of the website includes a basic content, a blog, and an ancillary part. This article discusses experimenting with the search engine optimization query results page. Strategic placement of keywords and even phrases, as well as fragmentation of the post's material, can improve the website's visibility to search engines. Hyperlinks open a chain reaction of additional links and draw attention to the previous posts. Publications in printed periodicals are a substantial part of a niche website presence on the Internet. Although this article explores a laboratory website on the basis of our hands-on expertise maintaining "Grossing Technology in Surgical Pathology" (www.grossing-technology.com) website with a high volume of traffic for more than a decade, the recommendations presented here for developing an authority website can be applied to other professional specialized websites. The authority websites visibility and sustainability are preconditions for aggregating them in a specialized educational laboratory portal.
Optimizing soft X-ray NEXAFS spectroscopy in the laboratory
NASA Astrophysics Data System (ADS)
Mantouvalou, I.; Jonas, A.; Witte, K.; Jung, R.; Stiel, H.; Kanngießer, B.
2017-05-01
Near edge X-ray absorption fine structure (NEXAFS) spectroscopy in the soft X-ray range is feasible in the laboratory using laser-produced plasma sources. We present a study using seven different target materials for optimized data analysis. The emission spectra of the materials, with atomic numbers ranging from Z = 6 to Z = 79, show distinct differences, making it feasible to select a suitable target material for specialized experiments. For NEXAFS spectroscopy a 112.5 nm thick polyimide film is investigated as a reference, exemplifying the superiority of quasi-continuum-like emission spectra.
Optimization of 15 parameters influencing the long-term survival of bacteria in aquatic systems
NASA Technical Reports Server (NTRS)
Obenhuber, D. C.
1993-01-01
NASA is presently engaged in the design and development of a water reclamation system for the future space station. A major concern in processing water is the control of microbial contamination. As a means of developing an optimal microbial control strategy, studies were undertaken to determine the type and amount of contamination which could be expected in these systems under a variety of changing environmental conditions. A laboratory-based Taguchi optimization experiment was conducted to determine the ideal settings for 15 parameters that influence the survival of six bacterial species in aquatic systems. The experiment demonstrated that the bacterial survival period could be decreased significantly by optimizing environmental conditions.
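A sketch of the kind of two-level Taguchi design the abstract refers to: an L16(2^15) orthogonal array for 15 parameters plus a main-effects pick of the levels that minimize survival (the actual parameters, levels, and responses from the study are not reproduced):

```python
import numpy as np

def l16_array():
    """L16(2^15) orthogonal array: each column is the parity of a non-empty
    subset of 4 basis bits, evaluated over the 16 runs."""
    runs = np.arange(16)
    basis = np.array([(runs >> b) & 1 for b in range(4)])        # 4 x 16
    cols = []
    for mask in range(1, 16):                                    # 15 non-empty subsets
        subset = [b for b in range(4) if (mask >> b) & 1]
        cols.append(np.bitwise_xor.reduce(basis[subset], axis=0))
    return np.array(cols).T                                      # 16 runs x 15 factors

design = l16_array()
# Hypothetical response: bacterial survival period measured for each of the 16 runs.
response = np.random.default_rng(3).normal(10, 2, size=16)

# Main-effects analysis: for each factor, compare the mean response at level 0 vs level 1
# and keep the level that minimizes survival.
best_levels = [int(response[design[:, f] == 1].mean() < response[design[:, f] == 0].mean())
               for f in range(15)]
print(best_levels)
```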
The ideal laboratory information system.
Sepulveda, Jorge L; Young, Donald S
2013-08-01
Laboratory information systems (LIS) are critical components of the operation of clinical laboratories. However, the functionalities of LIS have lagged significantly behind the capacities of current hardware and software technologies, while the complexity of the information produced by clinical laboratories has been increasing over time and will soon undergo rapid expansion with the use of new, high-throughput and high-dimensionality laboratory tests. In the broadest sense, LIS are essential to manage the flow of information between health care providers, patients, and laboratories and should be designed to optimize not only laboratory operations but also personalized clinical care. The objective of this review is to list suggestions for designing LIS with the goal of optimizing the operation of clinical laboratories while improving clinical care by intelligent management of laboratory information; it draws on a literature review, interviews with laboratory users, and personal experience and opinion. Laboratory information systems can improve laboratory operations and improve patient care. Specific suggestions for improving the function of LIS are listed under the following sections: (1) Information Security, (2) Test Ordering, (3) Specimen Collection, Accessioning, and Processing, (4) Analytic Phase, (5) Result Entry and Validation, (6) Result Reporting, (7) Notification Management, (8) Data Mining and Cross-sectional Reports, (9) Method Validation, (10) Quality Management, (11) Administrative and Financial Issues, and (12) Other Operational Issues.
Optimal Experiment Design for Thermal Characterization of Functionally Graded Materials
NASA Technical Reports Server (NTRS)
Cole, Kevin D.
2003-01-01
The purpose of the project was to investigate methods to accurately verify that designed materials meet thermal specifications. The project involved heat transfer calculations and optimization studies, and no laboratory experiments were performed. One part of the research involved study of materials in which conduction heat transfer predominates. Results include techniques to choose among several experimental designs, and protocols for determining the optimum experimental conditions for determination of thermal properties. Metal foam materials were also studied in which both conduction and radiation heat transfer are present. Results of this work include procedures to optimize the design of experiments to accurately measure both conductive and radiative thermal properties. Detailed results in the form of three journal papers have been appended to this report.
Applications of Chemiluminescence in the Teaching of Experimental Design
ERIC Educational Resources Information Center
Krawczyk, Tomasz; Slupska, Roksana; Baj, Stefan
2015-01-01
This work describes a single-session laboratory experiment devoted to teaching the principles of factorial experimental design. Students undertook the rational optimization of a luminol oxidation reaction, using a two-level experiment that aimed to create a long-lasting bright emission. During the session students used only simple glassware and…
High Peak Power Ka-Band Gyrotron Oscillator Experiments with Slotted and Unslotted Cavities.
1987-11-10
cylindrical graphite cathode by explosive plasma formation. (In order to optimize the compression ratio for these experiments, a graphite cathode was employed…)
ERIC Educational Resources Information Center
Taipa, M. Ângela; Azevedo, Ana M.; Grilo, António L.; Couto, Pedro T.; Ferreira, Filipe A. G.; Fortuna, Ana R. M.; Pinto, Inês F.; Santos, Rafael M.; Santos, Susana B.
2015-01-01
An integrative laboratory study addressing fundamentals of enzyme catalysis and their application to reactors operation and modeling is presented. Invertase, a β-fructofuranosidase that catalyses the hydrolysis of sucrose, is used as the model enzyme at optimal conditions (pH 4.5 and 45 °C). The experimental work involves 3 h of laboratory time…
ERIC Educational Resources Information Center
Sims, Paul A.; Branscum, Katie M.; Kao, Lydia; Keaveny, Virginia R.
2010-01-01
A method to purify genomic DNA from "Escherichia coli" is presented. The method is an amalgam of published methods but has been modified and optimized for use in the undergraduate biochemistry laboratory. Specifically, the method uses Tide Free 2x Ultra laundry detergent, which contains unspecified proteases and lipases, "n"-butanol, 2-propanol,…
Attitude control system testing on SCOLE
NASA Technical Reports Server (NTRS)
Shenhar, J.; Sparks, D., Jr.; Williams, J. P.; Montgomery, R. C.
1988-01-01
This paper presents implementation of two control policies on SCOLE (Spacecraft Control Laboratory Experiment), a laboratory apparatus representing an offset-feed antenna attached to the Space Shuttle by a flexible mast. In the first case, the flexible mast was restrained by cables, permitting modeling of SCOLE as a rigid body. Starting from an arbitrary state, SCOLE was maneuvered to a specified terminal state using a rigid-body minimum-time control law. In the second case, the so-called single-step optimal control (SSOC) theory is applied to suppress vibrations of the flexible mast mounted as a cantilever beam. Based on the SSOC theory, two parameter optimization algorithms were developed.
Optimization experiments with a double Gauss lens
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brixner, B.; Klein, M.M.
1988-05-01
This paper describes how a lens can be generated by starting from plane surfaces. Three different experiments, using the Los Alamos National Laboratory optimization procedure, all converged on the same stable prescriptions in the optimum minimum region. The starts were made first from an already optimized lens appearing in the literature, then from a powerless plane-surfaces configuration, and finally from a crude Super Angulon configuration. In each case the result was a double Gauss lens, which suggests that this type of lens may be the best compact six-glass solution for one imaging problem: an f/2 aperture and a moderate field of view. The procedures and results are discussed in detail.
Optimization Experiments With A Double Gauss Lens
NASA Astrophysics Data System (ADS)
Brixner, Berlyn; Klein, Morris M.
1988-05-01
This paper describes how a lens can be generated by starting from plane surfaces. Three different experiments, using the Los Alamos National Laboratory optimization procedure, all converged on the same stable prescriptions in the optimum minimum region. The starts were made first from an already optimized lens appearing in the literature, then from a powerless plane-surfaces configuration, and finally from a crude Super Angulon configuration. In each case the result was a double Gauss lens, which suggests that this type of lens may be the best compact six-glass solution for one imaging problem: an f/2 aperture and a moderate field of view. The procedures and results are discussed in detail.
Recent Experiences in Multidisciplinary Analysis and Optimization, part 1
NASA Technical Reports Server (NTRS)
Sobieski, J. (Compiler)
1984-01-01
Papers presented at the NASA Symposium on Recent Experiences in Multidisciplinary Analysis and Optimization held at NASA Langley Research Center, Hampton, Virginia April 24 to 26, 1984 are given. The purposes of the symposium were to exchange information about the status of the application of optimization and associated analyses in industry or research laboratories to real life problems and to examine the directions of future developments. Information exchange has encompassed the following: (1) examples of successful applications; (2) attempt and failure examples; (3) identification of potential applications and benefits; (4) synergistic effects of optimized interaction and trade-offs occurring among two or more engineering disciplines and/or subsystems in a system; and (5) traditional organization of a design process as a vehicle for or an impediment to the progress in the design methodology.
Hallworth, Mike J; Epner, Paul L; Ebert, Christoph; Fantz, Corinne R; Faye, Sherry A; Higgins, Trefor N; Kilpatrick, Eric S; Li, Wenzhe; Rana, S V; Vanstapel, Florent
2015-04-01
Systematic evidence of the contribution made by laboratory medicine to patient outcomes and the overall process of healthcare is difficult to find. An understanding of the value of laboratory medicine, how it can be determined, and the various factors that influence it is vital to ensuring that the service is provided and used optimally. This review summarizes existing evidence supporting the impact of laboratory medicine in healthcare and indicates the gaps in our understanding. It also identifies deficiencies in current utilization, suggests potential solutions, and offers a vision of a future in which laboratory medicine is used optimally to support patient care. To maximize the value of laboratory medicine, work is required in 5 areas: (a) improved utilization of existing and new tests; (b) definition of new roles for laboratory professionals that are focused on optimizing patient outcomes by adding value at all points of the diagnostic brain-to-brain cycle; (c) development of standardized protocols for prospective patient-centered studies of biomarker clinical effectiveness or extraanalytical process effectiveness; (d) benchmarking of existing and new tests in specified situations with commonly accepted measures of effectiveness; (e) agreed definition and validation of effectiveness measures and use of checklists for articles submitted for publication. Progress in these areas is essential if we are to demonstrate and enhance the value of laboratory medicine and prevent valuable information being lost in meaningless data. This requires effective collaboration with clinicians, and a determination to accept patient outcome and patient experience as the primary measure of laboratory effectiveness. © 2014 American Association for Clinical Chemistry.
A Safe and Effective Modification of Thomson's Jumping Ring Experiment
ERIC Educational Resources Information Center
Waschke, Felix; Strunz, Andreas; Meyn, Jan-Peter
2012-01-01
The electrical circuit of the jumping ring experiment based on discharging a capacitor is optimized. The setup is scoop proof at 46 V and yet the ring jumps more than 9 m high. The setup is suitable for both lectures and student laboratory work in higher education. (Contains 1 table, 8 figures and 3 footnotes.)
Li, Zhongwei; Xin, Yuezhen; Wang, Xun; Sun, Beibei; Xia, Shengyu; Li, Hui
2016-01-01
Phellinus is a fungus known as an elemental component of drugs used to prevent cancer. With the purpose of finding optimized culture conditions for Phellinus production in the laboratory, numerous single-factor experiments were carried out and a large amount of experimental data was generated. In this work, we use the data collected from these experiments for regression analysis to obtain a mathematical model for predicting Phellinus production. Subsequently, a gene-set based genetic algorithm is developed to optimize the values of the parameters involved in the culture conditions, including inoculum size, pH value, initial liquid volume, temperature, seed age, fermentation time, and rotation speed. The optimized parameter values are in accordance with biological experimental results, which indicates that our method has good predictive power for culture condition optimization. PMID:27610365
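A minimal sketch of the regression-then-optimize idea described above, using a quadratic response surface and a bounded numerical optimizer in place of the authors' gene-set based genetic algorithm (the data, parameter ranges, and model form are placeholders):

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(7)
names = ["inoculum", "pH", "volume", "temp", "seed_age", "ferm_time", "rpm"]
bounds = [(2, 10), (4, 8), (50, 150), (24, 32), (3, 9), (5, 15), (100, 200)]  # placeholder ranges

# Synthetic data standing in for the single-factor culture experiments.
X = np.array([[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(60)])
centers = np.array([lo + 0.6 * (hi - lo) for lo, hi in bounds])
y = 5.0 - (((X - centers) / np.ptp(bounds, axis=1)) ** 2).sum(axis=1) + rng.normal(0, 0.1, 60)

def features(x):
    x = np.atleast_2d(x)
    return np.hstack([np.ones((len(x), 1)), x, x ** 2])    # intercept + linear + quadratic terms

beta, *_ = np.linalg.lstsq(features(X), y, rcond=None)     # regression model of production

# Maximize predicted production within the experimental bounds.
res = minimize(lambda x: -(features(x) @ beta)[0], x0=[np.mean(b) for b in bounds],
               bounds=bounds, method="L-BFGS-B")
print(dict(zip(names, np.round(res.x, 2))))
```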
A new helium gas bearing turboexpander
NASA Astrophysics Data System (ADS)
Xiong, L. Y.; Chen, C. Z.; Liu, L. Q.; Hou, Y.; Wang, J.; Lin, M. F.
2002-05-01
A new helium gas bearing turboexpander of a helium refrigeration system used for space environment simulation experiments is described in this paper. The main design parameters and construction type of some key parts are presented. An improved calculation of thermodynamic efficiency and instability speed of this turboexpander has been obtained by a multiple-objective optimization program. Experiments examining mechanical and thermodynamic performance have been conducted repeatedly in the laboratory using air at ambient temperature and liquid nitrogen temperature, respectively. In order to predict the helium turboexpander performance, a similarity-principles study has been developed. According to the laboratory and on-site experiments, the mechanical and thermodynamic performances of this helium turboexpander are excellent.
Weinberg, Michael; Besser, Avi; Zeigler-Hill, Virgil; Neria, Yuval
2015-01-01
Although previous studies have rarely examined predictors of acute emotional responses to war trauma, this "natural laboratory" study aimed to examine the role that individual differences in dispositional optimism and self-esteem play in the development of acute symptoms of generalized anxiety disorder (GAD) and dissociative experiences. A sample of 140 female adults exposed to missile and rocket fire during an eruption of violence in the Middle East in November 2012 was assessed during real-time exposure. The results demonstrate inverse associations between dispositional optimism and acute symptoms of GAD and dissociation. The associations were accounted for by individual differences in self-esteem. In addition, individuals with low levels of dispositional optimism demonstrated a higher risk for acute GAD and dissociative experiences, in part because of their low levels of self-esteem. Theoretical and clinical implications of the findings are discussed. (c) 2015 APA, all rights reserved.
NASA Astrophysics Data System (ADS)
Mendoza, Sergio; Rothenberger, Michael; Hake, Alison; Fathy, Hosam
2016-03-01
This article presents a framework for optimizing the thermal cycle to estimate a battery cell's entropy coefficient at 20% state of charge (SOC). Our goal is to maximize Fisher identifiability: a measure of the accuracy with which a parameter can be estimated. Existing protocols in the literature for estimating entropy coefficients demand excessive laboratory time. Identifiability optimization makes it possible to achieve comparable accuracy levels in a fraction of the time. This article demonstrates this result for a set of lithium iron phosphate (LFP) cells. We conduct a 24-h experiment to obtain benchmark measurements of their entropy coefficients. We optimize a thermal cycle to maximize parameter identifiability for these cells. This optimization proceeds with respect to the coefficients of a Fourier discretization of this thermal cycle. Finally, we compare the estimated parameters using (i) the benchmark test, (ii) the optimized protocol, and (iii) a 15-h test from the literature (by Forgez et al.). The results are encouraging for two reasons. First, they confirm the simulation-based prediction that the optimized experiment can produce accurate parameter estimates in 2 h, compared to 15-24 h. Second, the optimized experiment also estimates a thermal time constant representing the effects of thermal capacitance and convection heat transfer.
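A sketch of how a candidate cycle could be scored by the Fisher information of the entropy coefficient, assuming a simple lumped thermal model and a Fourier-parameterized current profile (the model structure, parameter values, and noise level are assumptions, not the authors'):

```python
import numpy as np

# Lumped thermal model of a cell: C*dT/dt = I^2*R + I*T*dUdT - h*(T - T_amb).
# The I*T*dUdT term is the reversible (entropic) heat; dUdT is the parameter of interest.
C, R, h, T_amb = 60.0, 0.02, 0.15, 298.0          # J/K, ohm, W/K, K  (assumed values)
dt, t_end = 1.0, 7200.0                           # 2-hour experiment, 1 s steps
t = np.arange(0.0, t_end, dt)

def current_profile(coeffs):
    """Current cycle built from a truncated Fourier (sine) series, zero-mean over the cycle."""
    w = 2 * np.pi / t_end
    return sum(a * np.sin((k + 1) * w * t) for k, a in enumerate(coeffs))

def simulate(dUdT, coeffs):
    I = current_profile(coeffs)
    T = np.empty_like(t); T[0] = T_amb
    for k in range(len(t) - 1):                   # explicit Euler integration
        dT = (I[k] ** 2 * R + I[k] * T[k] * dUdT - h * (T[k] - T_amb)) / C
        T[k + 1] = T[k] + dt * dT
    return T

def fisher_info(coeffs, dUdT=2e-4, sigma=0.05):
    """Scalar Fisher information of dUdT from temperature measurements with noise sigma (K)."""
    eps = 1e-6
    sens = (simulate(dUdT + eps, coeffs) - simulate(dUdT - eps, coeffs)) / (2 * eps)
    return np.sum(sens ** 2) / sigma ** 2

# Compare two candidate cycles; the optimized protocol maximizes this quantity over the coefficients.
print(fisher_info([20.0, 0.0, 0.0]), fisher_info([10.0, 15.0, 5.0]))
```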
Nyberg working with ACE in U.S. Laboratory
2013-08-18
ISS036-E-035770 (18 Aug. 2013) --- NASA astronaut Karen Nyberg, Expedition 36 flight engineer, works with new test samples for the Advanced Colloids Experiment, or ACE, housed in the Light Microscopy Module (LMM) inside the Fluids Integrated Rack of the International Space Station's Destiny laboratory. Results from ACE will help researchers understand how to optimize stabilizers to extend the shelf life of products like laundry detergent, paint, ketchup and even salad dressing.
Nyberg working with ACE in U.S. Laboratory
2013-08-18
ISS036-E-035767 (18 Aug. 2013) --- NASA astronaut Karen Nyberg, Expedition 36 flight engineer, works with new test samples for the Advanced Colloids Experiment, or ACE, housed in the Light Microscopy Module (LMM) inside the Fluids Integrated Rack of the International Space Station's Destiny laboratory. Results from ACE will help researchers understand how to optimize stabilizers to extend the shelf life of products like laundry detergent, paint, ketchup and even salad dressing.
Nyberg working with ACE in U.S. Laboratory
2013-08-18
ISS036-E-035780 (18 Aug. 2013) --- NASA astronaut Karen Nyberg, Expedition 36 flight engineer, works with new test samples for the Advanced Colloids Experiment, or ACE, housed in the Light Microscopy Module (LMM) inside the Fluids Integrated Rack of the International Space Station's Destiny laboratory. Results from ACE will help researchers understand how to optimize stabilizers to extend the shelf life of products like laundry detergent, paint, ketchup and even salad dressing.
Analysis of Photothermal Characterization of Layered Materials: Design of Optimal Experiments
NASA Technical Reports Server (NTRS)
Cole, Kevin D.
2003-01-01
In this paper numerical calculations are presented for the steady-periodic temperature in layered materials and functionally-graded materials to simulate photothermal methods for the measurement of thermal properties. No laboratory experiments were performed. The temperature is found from a new Green's function formulation which is particularly well-suited to machine calculation. The simulation method is verified by comparison with literature data for a layered material. The method is applied to a class of two-component functionally-graded materials and results for temperature and sensitivity coefficients are presented. An optimality criterion, based on the sensitivity coefficients, is used for choosing what experimental conditions will be needed for photothermal measurements to determine the spatial distribution of thermal properties. This method for optimal experiment design is completely general and may be applied to any photothermal technique and to any functionally-graded material.
Using the Tritium Plasma Experiment to evaluate ITER PFC safety
NASA Astrophysics Data System (ADS)
Longhurst, Glen R.; Anderl, Robert A.; Bartlit, John R.; Causey, Rion A.; Haines, John R.
The Tritium Plasma Experiment was assembled at Sandia National Laboratories, Livermore to investigate interactions between dense plasmas at low energies and plasma-facing component materials. This apparatus has the unique capability of replicating plasma conditions in a tokamak divertor with particle flux densities of 2 × 10^19 ions/(cm^2 s) and a plasma temperature of about 15 eV using a plasma that includes tritium. With the closure of the Tritium Research Laboratory at Livermore, the experiment was moved to the Tritium Systems Test Assembly facility at Los Alamos National Laboratory. An experimental program has been initiated there using the Tritium Plasma Experiment to examine safety issues related to tritium in plasma-facing components, particularly the ITER divertor. Those issues include tritium retention and release characteristics, tritium permeation rates and transient times to coolant streams, surface modification and erosion by the plasma, the effects of thermal loads and cycling, and particulate production. A considerable lack of data exists in these areas for many of the materials, especially beryllium, being considered for use in ITER. Not only will basic material behavior with respect to safety issues in the divertor environment be examined, but innovative techniques for optimizing performance with respect to tritium safety by material modification and process control will be investigated. Supplementary experiments will be carried out at the Idaho National Engineering Laboratory and Sandia National Laboratory to expand and clarify results obtained on the Tritium Plasma Experiment.
Ray, Chad A; Patel, Vimal; Shih, Judy; Macaraeg, Chris; Wu, Yuling; Thway, Theingi; Ma, Mark; Lee, Jean W; Desilva, Binodh
2009-02-20
Developing a process that generates robust immunoassays that can be used to support studies with tight timelines is a common challenge for bioanalytical laboratories. Design of experiments (DOEs) is a tool that has been used by many industries for the purpose of optimizing processes. The approach is capable of identifying critical factors and their interactions with a minimal number of experiments. The challenge for implementing this tool in the bioanalytical laboratory is to develop a user-friendly approach that scientists can understand and apply. We have successfully addressed these challenges by eliminating the screening design, introducing automation, and applying a simple mathematical approach for the output parameter. A modified central composite design (CCD) was applied to three ligand binding assays. The intra-plate factors selected were coating, detection antibody concentration, and streptavidin-HRP concentrations. The inter-plate factors included incubation times for each step. The objective was to maximize the logS/B (S/B) of the low standard to the blank. The maximum desirable conditions were determined using JMP 7.0. To verify the validity of the predictions, the logS/B prediction was compared against the observed logS/B during pre-study validation experiments. The three assays were optimized using the multi-factorial DOE. The total error for all three methods was less than 20% which indicated method robustness. DOE identified interactions in one of the methods. The model predictions for logS/B were within 25% of the observed pre-study validation values for all methods tested. The comparison between the CCD and hybrid screening design yielded comparable parameter estimates. The user-friendly design enables effective application of multi-factorial DOE to optimize ligand binding assays for therapeutic proteins. The approach allows for identification of interactions between factors, consistency in optimal parameter determination, and reduced method development time.
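A minimal generator for the central composite design geometry mentioned above, in coded units for the three intra-plate factors (the actual factor levels and the split-plot handling of the inter-plate incubation times are not reproduced):

```python
from itertools import product

def central_composite(k, alpha=None, n_center=4):
    """Coded-unit CCD: 2^k factorial points, 2k axial (star) points at +/-alpha, plus center points."""
    alpha = alpha if alpha is not None else (2 ** k) ** 0.25   # rotatable design
    factorial = [list(p) for p in product((-1.0, 1.0), repeat=k)]
    axial = []
    for i in range(k):
        for a in (-alpha, alpha):
            pt = [0.0] * k
            pt[i] = a
            axial.append(pt)
    center = [[0.0] * k for _ in range(n_center)]
    return factorial + axial + center

# Three intra-plate factors: coating, detection antibody, and streptavidin-HRP concentrations.
for run in central_composite(3):
    print(run)     # 8 factorial + 6 axial + 4 center = 18 runs; the response is modeled as logS/B
```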
Optimism as a predictor of the effects of laboratory-induced stress on fears and hope.
Kimhi, Shaul; Eshel, Yohanan; Shahar, Eldad
2013-01-01
The objective of the current study is to explore optimism as a predictor of personal and collective fear, as well as hope, following laboratory-induced stress. Students (N = 107; 74 female, 33 male) were assigned randomly to either the experimental group (stress--political violence video clip) or the control group (no stress--nature video clip). Questionnaires of fear and hope were administered immediately after the experiment (Time 1) and 3 weeks later (Time 2). Structural equation modeling indicated the following: (a) Optimism significantly predicted both fear and hope in the stress group at Time 1, but not in the no-stress group. (b) Optimism predicted hope but not fear at Time 2 in the stress group. (c) Hope at Time 1 significantly predicted hope at Time 2, in both the stress and the no-stress groups. (d) Gender did not significantly predict fear at Time 1 in the stress group, despite a significant difference between genders. This study supports previous studies indicating that optimism plays an important role in people's coping with stress. However, our data raise the question of whether optimism by itself, or environmental stress by itself, can accurately predict stress responses.
Todd, Christopher A; Greene, Kelli M; Yu, Xuesong; Ozaki, Daniel A; Gao, Hongmei; Huang, Yunda; Wang, Maggie; Li, Gary; Brown, Ronald; Wood, Blake; D'Souza, M Patricia; Gilbert, Peter; Montefiori, David C; Sarzotti-Kelsoe, Marcella
2012-01-31
Recent advances in assay technology have led to major improvements in how HIV-1 neutralizing antibodies are measured. A luciferase reporter gene assay performed in TZM-bl (JC53bl-13) cells has been optimized and validated. Because this assay has been adopted by multiple laboratories worldwide, an external proficiency testing program was developed to ensure data equivalency across laboratories performing this neutralizing antibody assay for HIV/AIDS vaccine clinical trials. The program was optimized by conducting three independent rounds of testing, with an increased level of stringency from the first to third round. Results from the participating domestic and international laboratories improved each round as factors that contributed to inter-assay variability were identified and minimized. Key contributors to increased agreement were experience among laboratories and standardization of reagents. A statistical qualification rule was developed using a simulation procedure based on the three optimization rounds of testing, where a laboratory qualifies if at least 25 of the 30 ID50 values lie within the acceptance ranges. This ensures no more than a 20% risk that a participating laboratory fails to qualify when it should, as defined by the simulation procedure. Five experienced reference laboratories were identified and tested a series of standardized reagents to derive the acceptance ranges for pass-fail criteria. This Standardized Proficiency Testing Program is the first available for the evaluation and documentation of assay equivalency for laboratories performing HIV-1 neutralizing antibody assays and may provide guidance for the development of future proficiency testing programs for other assay platforms. Copyright © 2011 Elsevier B.V. All rights reserved.
The HYCOM (HYbrid Coordinate Ocean Model) Data Assimilative System
2007-06-01
Affiliations include Planning Systems Inc., Stennis Space Center, MS, USA; SHOM/CMO, Toulouse, France; and Los Alamos National Laboratory, Los Alamos, NM, USA. Received 1 October 2004. The work, carried out by the University of Miami, the Naval Research Laboratory (NRL), Los Alamos, NOAA/NCEP, NOAA/AOML, NOAA/PMEL, and PSI, is part of the Global Ocean Data Assimilation Experiment (GODAE), a coordinated international effort. The model combines all three approaches, and the optimal distribution is chosen at every time step.
[The modeling of the ricochet shot fired from a light weapon].
Gusentsov, A O; Chuchko, V A; Kil'dyushev, E M; Tumanov, E V
The objective of the present study was to choose the optimal method for modeling the ricochet of a bullet after it hits a target under the conditions of a laboratory experiment. The study required the design and construction of an original device for modeling the rebound effect of a light-firearm shot under experimental conditions. The device was tested under laboratory conditions. The trials have demonstrated the possibility of using barriers of different weight and dimensions in the above device, their positioning and fixation depending on the purpose of the experiment, and dynamic alteration of the experimental conditions with due regard for the safety and security arrangements to protect the health and life of the experimenters, without compromising the statistical significance and scientific validity of the results of the experiments.
Immobilized Lactase in the Biochemistry Laboratory
NASA Astrophysics Data System (ADS)
Allison, Matthew J.; Bering, C. Larry
1998-10-01
Immobilized enzymes have many practical applications. They may be used in clinical, industrial, and biotechnological laboratories and in many clinical diagnostic kits. For educational purposes, use of immobilized enzymes can easily be taught at the undergraduate or even secondary level. We have developed an immobilized enzyme experiment that combines many practical techniques used in the biochemistry laboratory and fits within a three-hour time frame. In this experiment, lactase from over-the-counter tablets for patients with lactose intolerance is immobilized in polyacrylamide, which is then milled into small beads and placed into a chromatography column. A lactose solution is added to the column and the eluant is assayed using the glucose oxidase assay, available as a kit. We have determined the optimal conditions to give the greatest turnover of lactose while allowing the immobilized enzymes to be active for long periods at room temperature.
2013-08-18
ISS036-E-033948 (18 Aug. 2013) --- NASA astronaut Karen Nyberg, Expedition 36 flight engineer, works with new test samples for the Advanced Colloids Experiment, or ACE, housed in the Light Microscopy Module (LMM) inside the Fluids Integrated Rack of the International Space Station's Destiny laboratory. Results from ACE will help researchers understand how to optimize stabilizers to extend the shelf life of products like laundry detergent, paint, ketchup and even salad dressing.
Recent Progress on the magnetic turbulence experiment at the Bryn Mawr Plasma Laboratory
NASA Astrophysics Data System (ADS)
Schaffner, D. A.; Cartagena-Sanchez, C. A.; Johnson, H. K.; Fahim, L. E.; Fiedler-Kawaguchi, C.; Douglas-Mann, E.
2017-10-01
Recent progress is reported on the construction, implementation and testing of the magnetic turbulence experiment at the Bryn Mawr Plasma Laboratory (BMPL). The experiment at the BMPL consists of a long (approximately 300 μs) coaxial plasma gun discharge that injects magnetic helicity into a flux-conserving chamber in a process akin to sustained slow formation of spheromaks. A 24 cm by 2 m cylindrical chamber has been constructed with a high-density axial port array to enable detailed simultaneous spatial measurements of magnetic and plasma fluctuations. Careful positioning of the magnetic structure produced by the three separately pulsed coils (one internal, two external) is performed to optimize for continuous injection of turbulent magnetized plasma. High-frequency calibration of magnetic probes is also underway using a power amplifier.
NASA Astrophysics Data System (ADS)
Kurniati, D. R.; Rohman, I.
2018-05-01
This study aims to analyze the concepts and science process skills involved in the bomb calorimeter experiment as a basis for developing a virtual bomb calorimeter laboratory. The study employed the research and development (R&D) method to answer the proposed problems; this paper discusses the concept and process-skill analysis. The essential concepts and process skills associated with the bomb calorimeter were analyzed by optimizing the bomb calorimeter experiment. The concept analysis identified seven fundamental concepts to be addressed in developing the virtual laboratory: internal energy, burning heat, perfect combustion, incomplete combustion, the calorimeter constant, the bomb calorimeter, and the Black principle. Because the concepts of the bomb calorimeter and of perfect and incomplete combustion are intended to represent the real situation and contain controllable variables, they are displayed in the virtual laboratory as simulations, while the remaining four concepts are presented as animations because they involve no variables to be controlled. The process-skill analysis identified four notable skills to be developed: observation, experimental design, interpretation, and communication.
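A short worked example of two of the concepts listed above, the calorimeter constant and the heat of combustion, with illustrative numbers (only the benzoic acid calibration value is a standard constant; the masses and temperature rises are invented):

```python
# Calibration: burning benzoic acid of known internal energy of combustion fixes C_cal,
# then an unknown sample's molar internal energy of combustion follows from its own dT.
DELTA_U_BENZOIC = -26.43e3            # J/g, standard calibration value for benzoic acid
m_benzoic, dT_cal = 1.000, 2.75       # g, K  (illustrative data)
C_cal = -DELTA_U_BENZOIC * m_benzoic / dT_cal          # calorimeter constant, J/K
print(f"C_cal = {C_cal:.0f} J/K")

m_sample, M_sample, dT_sample = 0.850, 114.2, 3.10     # g, g/mol (assumed molar mass), K
q_released = C_cal * dT_sample                         # heat absorbed by the calorimeter
delta_U = -q_released * M_sample / m_sample            # J/mol, internal energy of combustion
print(f"delta_U_comb = {delta_U / 1e3:.0f} kJ/mol")
```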
Beam pointing angle optimization and experiments for vehicle laser Doppler velocimetry
NASA Astrophysics Data System (ADS)
Fan, Zhe; Hu, Shuling; Zhang, Chunxi; Nie, Yanju; Li, Jun
2015-10-01
Beam pointing angle (BPA) is one of the key parameters that affects the operation performance of the laser Doppler velocimetry (LDV) system. By considering velocity sensitivity and echo power, for the first time, the optimized BPA of vehicle LDV is analyzed. Assuming the mounting error is within ±1.0 deg and that the reflectivity and roughness vary for different scenarios, the optimized BPA is obtained in the range from 29 to 43 deg. Accordingly, the velocity sensitivity is in the range of 1.25 to 1.76 MHz/(m/s), and the percentage of normalized echo power at the optimized BPA with respect to that at 0 deg is greater than 53.49%. Laboratory experiments with a rotating table were performed at BPAs of 10, 35, and 66 deg, and the results coincide with the theoretical analysis. Further, a vehicle experiment with the optimized BPA of 35 deg was conducted and compared with a microwave radar (accuracy of ±0.5% full-scale output). The root-mean-square error of the LDV's results is smaller than the Microstar II's, 0.0202 and 0.1495 m/s corresponding to the LDV and the Microstar II, respectively, and the mean velocity discrepancy is 0.032 m/s. It is also proven that with the optimized BPA both high velocity sensitivity and acceptable echo power can be guaranteed simultaneously.
NASA Astrophysics Data System (ADS)
Reutterer, Bernd; Traxler, Lukas; Bayer, Natascha; Drauschke, Andreas
2016-04-01
Selective Laser Sintering (SLS) is considered one of the most important additive manufacturing processes due to component stability and its broad range of usable materials. However, the influence of the different process parameters on mechanical workpiece properties is still poorly studied, leading to the fact that further optimization is necessary to increase workpiece quality. In order to investigate the impact of various process parameters, laboratory experiments are implemented to improve the understanding of the SLS limitations and advantages at an educational level. Experiments are based on two different workstations used to teach students the fundamentals of SLS. First, a 50 W CO2 laser workstation is used to investigate the interaction of the laser beam with the material as the process parameters are varied, using a single-layered test piece. Second, the FORMIGA P110 laser sintering system from EOS is used to print different 3D test pieces in dependence on various process parameters. Finally, quality attributes are tested, including warpage, dimensional accuracy and tensile strength. For dimension measurements and evaluation of the surface structure, a telecentric lens in combination with a camera is used. A tensile test machine allows testing of the tensile strength and the interpretation of stress-strain curves. The developed laboratory experiments are suitable for teaching students the influence of processing parameters. In this context, they will be able to optimize the input parameters depending on the component to be manufactured and to increase the overall quality of the final workpiece.
[Thermodynamic forecasting of reagents composition for soils decontamination].
Nikolaev, V P; Nikolaevskiĭ, V B; Chirkina, I V; Shcheglov, M Iu
2009-01-01
Based on thermodynamic studies, the authors conducted laboratory experiments to find the optimal composition of a leaching reagent solution for the decontamination of soils contaminated with Cs-137 and to determine the activity coefficient of caesium sulfate microquantities in macrocomponent solutions. The method could be used for modelling the phase equilibria and relocation of radionuclides in soils.
Modeling the effects of high-G stress on pilots in a tracking task
NASA Technical Reports Server (NTRS)
Korn, J.; Kleinman, D. L.
1978-01-01
Air-to-air tracking experiments were conducted at the Aerospace Medical Research Laboratories using both fixed and moving base dynamic environment simulators. The obtained data, which includes longitudinal error of a simulated air-to-air tracking task as well as other auxiliary variables, was analyzed using an ensemble averaging method. In conjunction with these experiments, the optimal control model is applied to model a human operator under high-G stress.
NASA Astrophysics Data System (ADS)
Vasilarou, Argyro-Maria G.; Georgiou, Constantinos A.
2000-10-01
The glucose oxidase-horseradish peroxidase coupled reaction using phenol and 4-aminoantipyrine is used for the kinetic determination of glucose in drinks and beverages. This laboratory experiment demonstrates the implementation of reaction rate kinetic methods of analysis, the use of enzymes as selective analytical reagents for the determination of substrates, the kinetic masking of ascorbic acid interference, and the analysis of glucose in drinks and beverages. The method is optimized for student use in the temperature range of 18-28 °C and can be used in low-budget laboratories equipped with an inexpensive visible photometer. The mixed enzyme-chromogen solution that is used is stable for two months. Precision ranged from 5.1 to 12% RSD for analyses conducted during a period of two months by 48 students.
Development of a Split Bitter-type Magnet System for Dusty Plasma Experiments
NASA Astrophysics Data System (ADS)
Bates, Evan; Romero-Talamas, Carlos A.; Birmingham, William J.; Rivera, William F.
2014-10-01
A 10 Tesla Bitter-type magnet system is under development at the Dusty Plasma Laboratory of the University of Maryland, Baltimore County (UMBC). We present here an optimization technique that uses differential evolution to minimize the ohmic heating produced by the coils, while constraining the magnetic field in the experimental volume. The code gives us the optimal dimensions for the coil system, including coil length, turn thickness, disk radii, resistance, and total current required for a constant magnetic field. Finite element parametric optimization is then used to establish the optimal design for water cooling holes. Placement of the cooling holes will also take into consideration the magnetic forces acting on the copper alloy disks to ensure the material strength is not compromised during operation. The proposed power and cooling water delivery subsystems for the coils are also presented. Upon completion and testing of the magnet system, planned experiments include the propagation of magnetized waves in dusty plasma crystals under various boundary conditions, and viscosity in rotational shear flow, among others.
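A minimal sketch of the differential-evolution step described above, using scipy and a crude thin-solenoid stand-in for the coil model (the geometry parameterization, bounds, and penalty weight are assumptions; the laboratory code uses its own field and heating model):

```python
import numpy as np
from scipy.optimize import differential_evolution

MU0 = 4e-7 * np.pi
B_TARGET = 10.0          # tesla required at the magnet center
RHO = 1.7e-8             # ohm*m, copper resistivity

def ohmic_power(x):
    """Crude Bitter-coil stand-in: x = (inner radius m, radial build m, length m, current A)."""
    r_in, dr, length, current = x
    n_turns = (dr / 0.002) * (length / 0.002)           # 2 mm x 2 mm effective turn cross-section
    b_center = MU0 * n_turns * current / length         # thin-solenoid field estimate
    r_mean = r_in + dr / 2
    resistance = RHO * (2 * np.pi * r_mean * n_turns) / (0.002 * 0.002)
    power = resistance * current ** 2                   # ohmic heating to be minimized
    penalty = 1e9 * max(0.0, B_TARGET - b_center) ** 2  # enforce the field constraint
    return power + penalty

bounds = [(0.02, 0.06), (0.02, 0.10), (0.05, 0.30), (100.0, 2000.0)]
result = differential_evolution(ohmic_power, bounds, seed=1, tol=1e-8, maxiter=400)
print(result.x, result.fun)
```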
DOT National Transportation Integrated Search
2009-08-01
The development and evaluation of low-cracking high-performance concrete (LC-HPC) for use in bridge decks : is described based on laboratory test results and experience gained during the construction of 14 bridges. This report : emphasizes the materi...
DOT National Transportation Integrated Search
2009-08-01
The development and evaluation of low-cracking high-performance concrete (LC-HPC) for use in bridge decks : is described based on laboratory test results and experience gained during the construction of 14 bridges. This report : emphasizes the materi...
Slade, Michael C; Raker, Jeffrey R; Kobilka, Brandon; Pohl, Nicola L B
2014-01-14
A multi-session research-like module has been developed for use in the undergraduate organic teaching laboratory curriculum. Students are tasked with planning and executing the synthesis of a novel fluorous dye molecule and using it to explore a fluorous affinity chromatography separation technique, which is the first implementation of this technique in a teaching laboratory. Key elements of the project include gradually introducing students to the use of the chemical literature to facilitate their searching, as well as deliberate constraints designed to force them to think critically about reaction design and optimization in organic chemistry. The project also introduces students to some advanced laboratory practices such as Schlenk techniques, degassing of reaction mixtures, affinity chromatography, and microwave-assisted chemistry. This provides students a teaching laboratory experience that closely mirrors authentic synthetic organic chemistry practice in laboratories throughout the world.
Gerdes, Lars; Iwobi, Azuka; Busch, Ulrich; Pecoraro, Sven
2016-01-01
Digital PCR in droplets (ddPCR) is an emerging method for a growing number of applications in DNA (and RNA) analysis. Special requirements when establishing ddPCR for the analysis of genetically modified organisms (GMO) in a laboratory include the choice between validated official qPCR methods and the optimization of these assays for a ddPCR format. Differentiation between droplets with a positive reaction and negative droplets, that is, setting an appropriate threshold, can be crucial for a correct measurement. This holds true in particular when independent transgene and plant-specific reference gene copy numbers have to be combined to determine the content of GM material in a sample. Droplets whose fluorescence units range between those of explicit positive and negative droplets are called ‘rain’. Signals of such droplets can hinder analysis and the correct setting of a threshold. In this manuscript, a computer-based algorithm has been carefully designed to evaluate assay performance and provide objective criteria for assay optimization. Optimized assays in return minimize the impact of rain on ddPCR analysis. We developed an Excel-based ‘experience matrix’ that reflects the assay parameters of GMO ddPCR tests performed in our laboratory. Parameters considered include singleplex/duplex ddPCR, assay volume, thermal cycler, probe manufacturer, oligonucleotide concentration, annealing/elongation temperature, and a droplet separation evaluation. We additionally propose an objective droplet separation value which is based on both the absolute fluorescence signal distance between the positive and negative droplet populations and the variation within these droplet populations. The proposed performance classification in the experience matrix can be used to rate different assays for the same GMO target, thus enabling employment of the best-suited assay parameters. Main optimization parameters include annealing/extension temperature and oligonucleotide concentrations. The droplet separation value allows for easy and reproducible assay performance evaluation. The combination of the separation value with the experience matrix simplifies the choice of adequate assay parameters for a given GMO event. PMID:27077048
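The abstract does not give the exact formula for the droplet separation value, so the sketch below is only an illustration of the idea: score how far apart the positive and negative droplet populations sit relative to their internal spread, for a given threshold. The droplet fluorescence values are simulated.

```python
# Minimal sketch (assumed formula, not the authors' definition): a resolution-style
# "separation value" for ddPCR droplet populations split at a chosen threshold.
import numpy as np

def separation_value(fluorescence, threshold):
    """Score separation of positive and negative droplet populations."""
    f = np.asarray(fluorescence, dtype=float)
    pos, neg = f[f > threshold], f[f <= threshold]
    if len(pos) < 2 or len(neg) < 2:
        return 0.0
    # distance between population means, normalized by within-population spread
    return (pos.mean() - neg.mean()) / (2.0 * (pos.std(ddof=1) + neg.std(ddof=1)))

# toy example: well-separated populations with a little 'rain' in between
rng = np.random.default_rng(0)
droplets = np.concatenate([
    rng.normal(2000, 150, 14000),   # negatives
    rng.normal(9000, 300, 1500),    # positives
    rng.uniform(3000, 8000, 60),    # rain
])
print(round(separation_value(droplets, threshold=5000), 2))
```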
Janetzki, Sylvia; Panageas, Katherine S; Ben-Porat, Leah; Boyer, Jean; Britten, Cedrik M; Clay, Timothy M; Kalos, Michael; Maecker, Holden T; Romero, Pedro; Yuan, Jianda; Kast, W Martin; Hoos, Axel
2008-03-01
The Cancer Vaccine Consortium of the Sabin Vaccine Institute (CVC/SVI) is conducting an ongoing large-scale immune monitoring harmonization program through its members and affiliated associations. This effort was brought to life as an external validation program by conducting an international Elispot proficiency panel with 36 laboratories in 2005, followed by a second panel with 29 participating laboratories in 2006, allowing application of lessons learned from the first panel. Critical protocol choices, as well as standardization and validation practices among laboratories, were assessed through detailed surveys. Although panel participants had to follow general guidelines in order to allow comparison of results, each laboratory was able to use its own protocols, materials, and reagents. The second panel recorded an overall significantly improved performance, as measured by the ability to detect all predefined responses correctly. Protocol choices and laboratory practices, which can have a dramatic effect on the overall assay outcome, were identified and led to the following recommendations: (A) establish a laboratory SOP for Elispot testing procedures, including (A1) a counting method for apoptotic cells for determining adequate cell dilution for plating and (A2) overnight rest of cells prior to plating and incubation; (B) use only pre-tested serum optimized for a low background and high signal; (C) establish a laboratory SOP for plate reading, including (C1) human auditing during the reading process and (C2) adequate adjustments for technical artifacts; and (D) only allow trained personnel, certified per laboratory SOPs, to conduct assays. The recommendations described under (A) were found to make a statistically significant difference in assay performance, while the remaining recommendations are based on practical experience confirmed by the panel results, which could not be statistically tested. These results provide initial harmonization guidelines to optimize Elispot assay performance for the immunotherapy community. Further optimization is in process with ongoing panels.
NASA Technical Reports Server (NTRS)
1975-01-01
The acquisition and tracking links of shuttle to Molniya satellite and shuttle to ground are established. Link parameters and tolerances are analyzed. A 10-micromillimeter optomechanical subsystem brassboard model was designed and measured for optical properties and weight optimization. The design incorporates an afocal rotating Gregorian telescope in a two-gimbal beryllium structure with beam steering control mechanisms. Parameters for both the optomechanical subsystem and spaceborne terminals are included.
Selecting the best design for nonstandard toxicology experiments.
Webb, Jennifer M; Smucker, Byran J; Bailer, A John
2014-10-01
Although many experiments in environmental toxicology use standard statistical experimental designs, there are situations that arise where no such standard design is natural or applicable because of logistical constraints. For example, the layout of a laboratory may suggest that each shelf serve as a block, with the number of experimental units per shelf either greater than or less than the number of treatments in a way that precludes the use of a typical block design. In such cases, an effective and powerful alternative is to employ optimal experimental design principles, a strategy that produces designs with precise statistical estimates. Here, a D-optimal design was generated for an experiment in environmental toxicology that has 2 factors, 16 treatments, and constraints similar to those described above. After initial consideration of a randomized complete block design and an intuitive cyclic design, it was decided to compare a D-optimal design and a slightly more complicated version of the cyclic design. Simulations were conducted generating random responses under a variety of scenarios that reflect conditions motivated by a similar toxicology study, and the designs were evaluated via D-efficiency as well as by a power analysis. The cyclic design performed well compared to the D-optimal design. © 2014 SETAC.
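A minimal sketch of how two candidate block designs can be compared on the D-criterion (the determinant of the information matrix X'X), in the spirit of the comparison described above; the 4-treatment, 4-block layouts and the effects coding are invented for illustration and are not the study's 16-treatment designs.

```python
# Illustrative sketch, not the authors' code: relative D-criterion of two
# candidate block designs under a treatment + block effects model.
import numpy as np

def model_matrix(assignments, n_treatments, n_blocks):
    """Effects-coded design matrix with intercept, treatment and block effects."""
    rows = []
    for trt, blk in assignments:
        t = np.zeros(n_treatments - 1)
        if trt < n_treatments - 1:
            t[trt] = 1.0
        else:
            t[:] = -1.0
        b = np.zeros(n_blocks - 1)
        if blk < n_blocks - 1:
            b[blk] = 1.0
        else:
            b[:] = -1.0
        rows.append(np.concatenate(([1.0], t, b)))
    return np.array(rows)

def d_criterion(X):
    """log|X'X| scaled by the number of parameters (larger is better)."""
    sign, logdet = np.linalg.slogdet(X.T @ X)
    return -np.inf if sign <= 0 else logdet / X.shape[1]

# toy comparison: 4 treatments in 4 blocks of 3 units each (not a complete block design)
design_a = [(0,0),(1,0),(2,0),(1,1),(2,1),(3,1),(2,2),(3,2),(0,2),(3,3),(0,3),(1,3)]
design_b = [(0,0),(1,0),(2,0),(0,1),(1,1),(3,1),(0,2),(2,2),(3,2),(1,3),(2,3),(3,3)]
for name, d in [("cyclic-like", design_a), ("alternative", design_b)]:
    print(name, round(d_criterion(model_matrix(d, 4, 4)), 3))
```

In practice the same criterion, together with the simulation-based power analysis the authors describe, would be applied to the full 2-factor, 16-treatment layout.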
Zhao, Jian; Hu, Dong-mei; Yu, Da-de; Dong, Ming-liang; Li, Yun; Fan, Ying-ming; Wang, Yan-wei; Zhang, Jin-feng
2016-05-01
Comprehensive laboratory courses, which enable students to aptly apply theoretic knowledge and master experiment skills, play an important role in the present educational reform of laboratory courses. We utilized human ABO blood type as the experimental subject, and designed the experiment--"Molecular Genotyping of Human ABO Blood Type and Analysis of Population Genetic Equilibrium". In the experiment, DNA in mucosal cells is extracted from students' saliva, and each student's genotype is identified using a series of molecular genetics technologies, including PCR amplification of target fragments, enzymatic digestion, and electrophoretic separation. Then, taking the whole class as an analogous Mendel population, a survey of genotype frequency of ABO blood type is conducted, followed with analyses of various population genetic parameters using Popgene. Through the open laboratory course, students can not only master molecular genetic experimental skills, but also improve their understanding of theoretic knowledge through independent design and optimization of molecular techniques. After five years of research and practice, a stable experimental system of molecular genetics has been established to identify six genotypes of ABO blood types, namely I(A)I(A), I(A)i, I(B)I(B), I(B)i, I(A)I(B) and ii. Laboratory courses of molecular and population genetics have been integrated by calculating the frequencies of the six genotypes and three multiple alleles and testing population genetic equilibrium. The goal of the open laboratory course with independent design and implementation by the students has been achieved. This laboratory course has proved effective and received good reviews from the students. It could be applied as a genetics laboratory course for the biology majors directly, and its ideas and methods could be promoted and applied to other biological laboratory courses.
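A hedged sketch of the population-genetics step of the course: estimating ABO allele frequencies from class genotype counts by gene counting and testing Hardy-Weinberg equilibrium with a chi-square statistic (the abstract mentions Popgene for this step; the counts below are invented).

```python
# Hedged sketch (not the course's actual script): ABO allele frequencies and a
# Hardy-Weinberg chi-square test from hypothetical class genotype counts.
from scipy.stats import chi2

# genotype counts (keys: IA/IA, IA/i, IB/IB, IB/i, IA/IB, ii) -- invented example
counts = {"AA": 3, "AO": 10, "BB": 2, "BO": 8, "AB": 4, "OO": 13}
n = sum(counts.values())

# allele frequencies by gene counting
p_a = (2 * counts["AA"] + counts["AO"] + counts["AB"]) / (2 * n)
p_b = (2 * counts["BB"] + counts["BO"] + counts["AB"]) / (2 * n)
p_o = 1.0 - p_a - p_b

# expected genotype counts under Hardy-Weinberg proportions
expected = {
    "AA": n * p_a**2,        "AO": n * 2 * p_a * p_o,
    "BB": n * p_b**2,        "BO": n * 2 * p_b * p_o,
    "AB": n * 2 * p_a * p_b, "OO": n * p_o**2,
}

chi_sq = sum((counts[g] - expected[g]) ** 2 / expected[g] for g in counts)
df = 6 - 1 - 2     # genotype classes minus 1, minus 2 independently estimated alleles
p_value = chi2.sf(chi_sq, df)
print(f"p(A)={p_a:.3f} p(B)={p_b:.3f} p(O)={p_o:.3f}  chi2={chi_sq:.2f}  p={p_value:.3f}")
```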
CFD optimization of continuous stirred-tank (CSTR) reactor for biohydrogen production.
Ding, Jie; Wang, Xu; Zhou, Xue-Fei; Ren, Nan-Qi; Guo, Wan-Qian
2010-09-01
There has been little work on the optimal configuration of biohydrogen production reactors. This paper describes three-dimensional computational fluid dynamics (CFD) simulations of gas-liquid flow in a laboratory-scale continuous stirred-tank reactor used for biohydrogen production. To evaluate the role of hydrodynamics in reactor design and optimize the reactor configuration, an optimized impeller design has been constructed and validated with CFD simulations of the normal and optimized impeller over a range of speeds and the numerical results were also validated by examination of residence time distribution. By integrating the CFD simulation with an ethanol-type fermentation process experiment, it was shown that impellers with different type and speed generated different flow patterns, and hence offered different efficiencies for biohydrogen production. The hydrodynamic behavior of the optimized impeller at speeds between 50 and 70 rev/min is most suited for economical biohydrogen production. Copyright 2010 Elsevier Ltd. All rights reserved.
Synthesis of noble metal nanoparticles
NASA Astrophysics Data System (ADS)
Bahadory, Mozhgan
Improved methods were developed for the synthesis of noble metal nanoparticles. Laboratory experiments were designed to introduce nanotechnology into the undergraduate curriculum. An optimal set of conditions for the synthesis of clear yellow colloidal silver was investigated. Silver nanoparticles were obtained by borohydride reduction of silver nitrate, a method which produces particles with an average size of 12+/-2 nm, determined by transmission electron microscopy (TEM). The plasmon absorbance is at 397 nm and the peak width at half maximum (PWHM) is 70-75 nm. The relationship between aggregation and optical properties was determined, along with a method to protect the particles using polyvinylpyrrolidone (PVP). A laboratory experiment was designed in which students synthesize yellow colloidal silver, estimate particle size using visible spectroscopy, and study aggregation effects. The synthesis of the less stable copper nanoparticles is more difficult because copper nanoparticles are easily oxidized. Four methods were used for the synthesis of copper nanoparticles: chemical reduction with sodium borohydride, sodium borohydride with potassium iodide, isopropyl alcohol with cetyltrimethylammonium bromide (CTAB), and reducing sugars. The latter method was also the basis for an undergraduate laboratory experiment. For each reaction, the dependence of the stability of the copper nanoparticles on reagent concentrations, additives, relative amounts of reactants, and temperature was explored. Atomic force microscopy (AFM), TEM, and UV-visible spectroscopy were used to characterize the copper nanoparticles. A laboratory experiment to produce copper nanoparticles from household chemicals was developed.
He, Zhiqiao; Song, Shuang; Ying, Haiping; Xu, Lejin; Chen, Jianmeng
2007-07-01
The degradation of p-aminophenol (PAP) in aqueous solution by sonolysis, by ozonation, and by a combination of both was investigated in laboratory-scale experiments. Operating parameters such as pH, temperature, ultrasonic energy density, and ozone dose were optimized with regard to the efficiency of PAP removal. The concentration of PAP during the reaction was determined by high-pressure liquid chromatography. The concentrations of ammonium and nitrate ions were monitored during the degradation. Intermediate products such as 4-iminocyclohexa-2,5-dien-1-one, phenol, but-2-enedioic acid, and acetic acid were detected by gas chromatography coupled with mass spectrometry. The degradation rate of PAP was higher in the combined system than in the linear combination of the separate experiments. The degradation efficiency decreased rapidly when n-butanol was added to the combined reaction system, indicating that radical reactions might proceed during the laboratory experiments.
NASA Astrophysics Data System (ADS)
Wang, Kaiwei; Wang, Xiaoping
2017-08-01
In order to enhance practical education and hands-on experience in optoelectronics and to eliminate the overlapping content that previously existed in the experiment sections of several different courses, a lab course, "Applied Optoelectronics Laboratory", has been established in the College of Optical Science and Engineering, Zhejiang University. The course consists of two sections, i.e., basic experiments and project design. In section 1, basic experiments provide hands-on experience with most of the fundamental concepts taught in the corresponding courses. These basic experiments include the study of common light sources such as the He-Ne laser, semiconductor laser, solid-state laser, and LED, and the testing and analysis of optical detectors based on the photovoltaic effect, the photoconductive effect, the photoemissive effect, and array detectors. In section 2, the course encourages students to form teams and build a stand-alone optical system that realizes a specific function by taking advantage of the basic knowledge learned in section 1. Through these measures, students acquire both basic knowledge and practical application skills, and their interest in science is developed.
EVAL mission requirements, phase 1
NASA Technical Reports Server (NTRS)
1976-01-01
The aspects of NASA's applications mission were enhanced by utilization of shuttle/spacelab, and payload groupings which optimize the cost of achieving the mission goals were defined. Preliminary Earth Viewing Application Laboratory (EVAL) missions, experiments, sensors, and sensor groupings were developed. The major technological EVAL themes and objectives which NASA will be addressing during the 1980 to 2000 time period were investigated. Missions/experiments which addressed technique development, sensor development, application development, and/or operational data collection were considered as valid roles for EVAL flights.
Foam generation and sample composition optimization for the FOAM-C experiment of the ISS
NASA Astrophysics Data System (ADS)
Carpy, R.; Picker, G.; Amann, B.; Ranebo, H.; Vincent-Bonnieu, S.; Minster, O.; Winter, J.; Dettmann, J.; Castiglione, L.; Höhler, R.; Langevin, D.
2011-12-01
At the end of 2009 and in early 2010, a sealed cell for foam generation and observation was designed and manufactured at the Astrium Friedrichshafen facilities. With the use of this cell, different sample compositions of "wet foams" have been optimized for mixtures of chemicals such as water, dodecanol, pluronic, aethoxisclerol, glycerol, CTAB, SDS, as well as glass beads. This development is performed in the frame of the breadboarding development activities of the Experiment Container FOAM-C for operation in the Fluid Science Laboratory on the ISS. The sample cell supports multiple observation methods such as diffusing-wave and diffuse transmission spectrometry, time-resolved correlation spectroscopy [1], and microscope observation; all of these methods are applied in the cell with a relatively small experiment volume (<3 cm3). These units will be on-orbit replaceable sets that will allow processing of multiple sample compositions (more than 40).
Development of an electromechanical principle for wet and dry milling
NASA Astrophysics Data System (ADS)
Halbedel, Bernd; Kazak, Oleg
2018-05-01
The paper presents a novel electromechanical principle for wet and dry milling of different materials, in which the milling beads are moved by a time- and spatially varying magnetic field. A possibility to optimize the milling process in such a milling machine by simulating the vector gradient distribution of the electromagnetic field in the process chamber is presented. The mathematical model and simulation methods, based on standard software packages, are worked out. The results of numerical simulations and experimental measurements of the electromagnetic field in the working chamber of a developed and manufactured laboratory plant correlate well with each other. Using the obtained operating parameters, dry milling experiments with crushed cement clinker and wet milling experiments with organic agents were performed in the laboratory plant, and the results are discussed.
NASA Astrophysics Data System (ADS)
Ren, Wei; Wang, Shujun; Lü, Mingsheng; Wang, Xiaobei; Fang, Yaowei; Jiao, Yuliang; Hu, Jianen
2016-03-01
We adopted the response surface methodology using single factor and orthogonal experiments to optimize four types of antimicrobial agents that could inhibit biofilm formation by Streptococcus mutans, which is commonly found in the human oral cavity and causes tooth decay. The objective was to improve the function of marine Arthrobacter oxydans KQ11 dextranase mouthwash (designed and developed by our laboratory). The experiment was conducted in a three-level, four-variable central composite design to determine the best combination of ZnSO4, lysozyme, citric acid and chitosan. The optimized antibacterial agents were 2.16 g/L ZnSO4, 14 g/L lysozyme, 4.5 g/L citric acid and 5 g/L chitosan. The biofilm formation inhibition reached 84.49%. In addition, microscopic observation of the biofilm was performed using scanning electron microscopy and confocal laser scanning microscopy. The optimized formula was tested in marine dextranase Arthrobacter oxydans KQ11 mouthwash and enhanced the inhibition of S. mutans. This work may be promoted for the design and development of future marine dextranase oral care products.
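For readers unfamiliar with the design, the sketch below generates the coded runs of a three-level (face-centered), four-variable central composite design of the type described above; the mapping of coded levels to concentrations is hypothetical and not the study's factor ranges.

```python
# Illustrative sketch (not the study's design matrix): coded runs of a
# face-centered central composite design with factorial, axial and center points.
from itertools import product

def face_centered_ccd(n_factors=4, n_center=3):
    """Coded levels (-1, 0, +1) for a face-centered central composite design."""
    corners = [list(run) for run in product((-1, 1), repeat=n_factors)]
    axial = []
    for i in range(n_factors):
        for level in (-1, 1):
            run = [0] * n_factors
            run[i] = level
            axial.append(run)
    centers = [[0] * n_factors for _ in range(n_center)]
    return corners + axial + centers

design = face_centered_ccd()
print(f"{len(design)} runs")                 # 16 corners + 8 axial + 3 centers = 27
# map the coded ZnSO4 level (-1, 0, +1) to hypothetical g/L values, for example:
zn_levels = {-1: 1.0, 0: 2.0, 1: 3.0}
print([zn_levels[run[0]] for run in design[:5]])
```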
Long term fault system reorganization of convergent and strike-slip systems
NASA Astrophysics Data System (ADS)
Cooke, M. L.; McBeck, J.; Hatem, A. E.; Toeneboehn, K.; Beyer, J. L.
2017-12-01
Laboratory and numerical experiments representing deformation over many earthquake cycles demonstrate that fault evolution includes episodes of fault reorganization that optimize work on the fault system. Consequently, the mechanical and kinematic efficiencies of fault systems do not increase monotonically through their evolution. New fault configurations can optimize the external work required to accommodate deformation, suggesting that changes in system efficiency can drive fault reorganization. Laboratory evidence and numerical results show that fault reorganization within accretion, strike-slip and oblique convergent systems is associated with increasing efficiency due to increased fault slip (frictional work and seismic energy) and commensurate decreased off-fault deformation (internal work and work against gravity). Between episodes of fault reorganization, fault systems may become less efficient as they produce increasing off fault deformation. For example, laboratory and numerical experiments show that the interference and interaction between different fault segments may increase local internal work or that increasing convergence can increase work against gravity produced by a fault system. This accumulation of work triggers fault reorganization as stored work provides the energy required to grow new faults that reorganize the system to a more efficient configuration. The results of laboratory and numerical experiments reveal that we should expect crustal fault systems to reorganize following periods of increasing inefficiency, even in the absence of changes to the tectonic regime. In other words, fault reorganization doesn't require a change in tectonic loading. The time frame of fault reorganization depends on fault system configuration, strain rate and processes that relax stresses within the crust. For example, stress relaxation may keep pace with stress accumulation, which would limit the increase in the internal work and gravitational work so that irregularities can persist along active fault systems without reorganization of the fault system. Consequently, steady state behavior, for example with constant fault slip rates, may arise either in systems with high degree of stress-relaxation or occur only within the intervals between episodes of fault reorganization.
Measurement of lifespan in Drosophila melanogaster.
Linford, Nancy J; Bilgir, Ceyda; Ro, Jennifer; Pletcher, Scott D
2013-01-07
Aging is a phenomenon that results in steady physiological deterioration in nearly all organisms in which it has been examined, leading to reduced physical performance and increased risk of disease. Individual aging is manifest at the population level as an increase in age-dependent mortality, which is often measured in the laboratory by observing lifespan in large cohorts of age-matched individuals. Experiments that seek to quantify the extent to which genetic or environmental manipulations impact lifespan in simple model organisms have been remarkably successful for understanding the aspects of aging that are conserved across taxa and for inspiring new strategies for extending lifespan and preventing age-associated disease in mammals. The vinegar fly, Drosophila melanogaster, is an attractive model organism for studying the mechanisms of aging due to its relatively short lifespan, convenient husbandry, and facile genetics. However, demographic measures of aging, including age-specific survival and mortality, are extraordinarily susceptible to even minor variations in experimental design and environment, and the maintenance of strict laboratory practices for the duration of aging experiments is required. These considerations, together with the need to practice careful control of genetic background, are essential for generating robust measurements. Indeed, there are many notable controversies surrounding inference from longevity experiments in yeast, worms, flies and mice that have been traced to environmental or genetic artifacts(1-4). In this protocol, we describe a set of procedures that have been optimized over many years of measuring longevity in Drosophila using laboratory vials. We also describe the use of the dLife software, which was developed by our laboratory and is available for download (http://sitemaker.umich.edu/pletcherlab/software). dLife accelerates throughput and promotes good practices by incorporating optimal experimental design, simplifying fly handling and data collection, and standardizing data analysis. We will also discuss the many potential pitfalls in the design, collection, and interpretation of lifespan data, and we provide steps to avoid these dangers.
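A minimal sketch, independent of dLife, of the demographic quantities discussed above: age-specific survival and mortality computed from a cohort of individual lifespans. The cohort below is simulated, not fly data.

```python
# Hedged sketch: life-table style age-specific survival and mortality from a
# cohort of lifespans. Bin width and the simulated cohort are invented.
import numpy as np

def life_table(lifespans_days, bin_days=5):
    """Return age bins, fraction surviving, and age-specific mortality per bin."""
    deaths = np.asarray(lifespans_days, dtype=float)
    n0 = len(deaths)
    edges = np.arange(0, deaths.max() + bin_days, bin_days)
    alive_at_start = np.array([(deaths >= t).sum() for t in edges[:-1]])
    died_in_bin = np.array([((deaths >= lo) & (deaths < hi)).sum()
                            for lo, hi in zip(edges[:-1], edges[1:])])
    survival = alive_at_start / n0
    # age-specific mortality: fraction of those entering the bin that die in it
    with np.errstate(divide="ignore", invalid="ignore"):
        mortality = np.where(alive_at_start > 0, died_in_bin / alive_at_start, np.nan)
    return edges[:-1], survival, mortality

rng = np.random.default_rng(42)
cohort = rng.gumbel(loc=55, scale=8, size=200).clip(min=1)   # toy lifespans [days]
ages, surv, mort = life_table(cohort)
for a, s, q in zip(ages, surv, mort):
    print(f"day {int(a):3d}: survival {s:.2f}, mortality {q:.2f}")
```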
Optimization of the tungsten oxide technique for measurement of atmospheric ammonia
NASA Technical Reports Server (NTRS)
Brown, Kenneth G.
1987-01-01
Hollow tubes coated with tungstic acid have been shown to be of value in the determination of ammonia and nitric acid in ambient air. Practical application of this technique was demonstrated utilizing an automated sampling system for in-flight collection and analysis of atmospheric samples. Due to time constraints these previous measurements were performed on tubes that had not been well characterized in the laboratory. As a result the experimental precision could not be accurately estimated. Since the technique was being compared to other techniques for measuring these compounds, it became necessary to perform laboratory tests which would establish the reliability of the technique. This report is a summary of these laboratory experiments as they are applied to the determination of ambient ammonia concentration.
Early-Time HANE Simulation and Experiment
1987-05-01
we suggest attacking the question of structure with target variation. Irregular structure in laboratory simulations of HANEs can arise from ... plasma instabilities even when the illuminating laser beam and target are optimized to yield a very uniform initial expansion. Irregularities ... irregular structure be purposely explored. In particular we encourage the use of a wide variety of inhomogeneous targets. Target irregularities
ERIC Educational Resources Information Center
Parker, Patrick D.; Beers, Brandon; Vergne, Matthew J.
2017-01-01
Laboratory experiments were developed to introduce students to the quantitation of drugs of abuse by high performance liquid chromatography-tandem mass spectrometry (LC-MS/MS). Undergraduate students were introduced to internal standard quantitation and the LC-MS/MS method optimization for cocaine. Cocaine extracted from paper currency was…
[Simultaneous desulfurization and denitrification by TiO2/ACF under different irradiation].
Han, Jing; Zhao, Yi
2009-04-15
The supported TiO2 photocatalysts were prepared in the laboratory, and experiments on simultaneous desulfurization and denitrification were carried out in a self-designed photocatalytic reactor. The optimal experimental conditions were established, and the efficiencies of simultaneous desulfurization and denitrification under two different light sources were compared. The results show that the oxygen content of the flue gas, the reaction temperature, the flue gas humidity, and the irradiation intensity are the most essential factors for photocatalysis. For TiO2/ACF, removal efficiencies of 99.7% for SO2 and 64.3% for NO are obtained at the optimal experimental conditions under UV irradiation, and removal efficiencies of 97.5% for SO2 and 49.6% for NO are achieved at the optimal experimental conditions under visible light irradiation. The results of five parallel experiments indicate that the standard deviation S of the parallel data is small. A removal mechanism for SO2 and NO under the two light sources is proposed based on ion chromatography analysis of the absorption liquid.
Efficiency Measurement of VANDLE Modules
NASA Astrophysics Data System (ADS)
Peters, William; Matei, C.; Cizewski, J. A.; O'Malley, P. D.; Spassova, I.; Bardayan, D.; Blackmon, J. C.; Brune, C.; Massey, T.; Grzywacz, R. K.; Madurga, M.; Sarazin, F.; Raiola, F.
2010-02-01
The Versatile Array of Neutron Detectors at Low Energy (VANDLE) is a new array of plastic scintillator bars being developed at the Holifield Radioactive Ion Beam Facility (HRIBF) at Oak Ridge National Laboratory (ORNL). The modular design enables optimization of different configurations for particular experiments, such as (d,n) and beta-delayed neutron-decay experiments, with rare ion beams. Two prototype modules were moved to the Edwards Accelerator Laboratory at Ohio University to measure their efficiency using a calibrated ^27Al(d,n) reaction as a neutron source. Results show that one bar with a cross section of 3x3 cm^2 is over 25% efficient for neutrons around 1 MeV, with sensitivity down to 100 keV neutrons. Other design features, such as wrapping and coupling, will be presented, as well as results from resolution tests.
Development of Solvent Extraction Approach to Recycle Enriched Molybdenum Material
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tkac, Peter; Brown, M. Alex; Sen, Sujat
2016-06-01
Argonne National Laboratory, in cooperation with Oak Ridge National Laboratory and NorthStar Medical Technologies, LLC, is developing a recycling process for a solution containing valuable Mo-100 or Mo-98 enriched material. Previously, Argonne had developed a recycle process using a precipitation technique. However, this process is labor intensive and can lead to production of large volumes of highly corrosive waste. This report discusses an alternative process to recover enriched Mo in the form of ammonium heptamolybdate by using solvent extraction. Small-scale experiments determined the optimal conditions for effective extraction of high Mo concentrations. Methods were developed for removal of ammonium chloride from the molybdenum product of the solvent extraction process. In large-scale experiments, very good purification from potassium and other elements was observed, with very high recovery yields (~98%).
SCOUT: small chamber for optical UV tests
NASA Astrophysics Data System (ADS)
Pancrazzi, M.; Landini, F.; Romoli, M.; Totaro, M.; Pennelli, G.
2017-11-01
SCOUT is the acronym of the new facility developed within the XUVLab laboratory of the Department of Physics and Astronomy of the University of Florence. SCOUT stands for "Small Chamber for Optical UV Tests" and has been designed to perform practical and fast measurements for experiments requiring an evacuated environment. SCOUT has been conceived, designed, and manufactured with particular attention to its flexibility and adaptability. The functionality and capabilities of SCOUT have recently been tested in a measurement campaign to characterize an innovative wire-grid polarizer optimized to work in transmission in the UV band. This paper provides a description of the overall manufactured system and its performance and shows the additional resources available at the XUVLab laboratory in Florence that make SCOUT usable by any compact (within 1 m) optical experiment investigating the UV band of the spectrum.
Research on a new wave energy absorption device
NASA Astrophysics Data System (ADS)
Lu, Zhongyue; Shang, Jianzhong; Luo, Zirong; Sun, Chongfei; Zhu, Yiming
2018-01-01
To reduce the impact of global warming and of the energy crisis caused by the pollution from energy combustion, research on renewable and clean energy sources becomes more and more important. This paper presents the design of a new wave energy absorption device and introduces its mechanical structure. The flow tube model is analyzed, and the formulation of the proposed method is presented. To verify the principle of the wave absorbing device, an experiment was carried out in a laboratory environment, and the results of the experiment can be applied to optimize the structural design for output power.
NASA Astrophysics Data System (ADS)
Smits, K. M.; Drumheller, Z. W.; Lee, J. H.; Illangasekare, T. H.; Regnery, J.; Kitanidis, P. K.
2015-12-01
Aquifers around the world show troubling signs of irreversible depletion and seawater intrusion as climate change, population growth, and urbanization lead to reduced natural recharge rates and overuse. Scientists and engineers have begun to revisit the technology of managed aquifer recharge and recovery (MAR) as a means to increase the reliability of the diminishing and increasingly variable groundwater supply. Unfortunately, MAR systems remain fraught with operational challenges related to the quality and quantity of recharged and recovered water, stemming from a lack of data-driven, real-time control. This research seeks to develop and validate a general simulation-based control optimization algorithm that relies on real-time data collected through embedded sensors and that can be used to ease the operational challenges of MAR facilities. Experiments to validate the control algorithm were conducted at the laboratory scale in a two-dimensional synthetic aquifer under both homogeneous and heterogeneous packing configurations. The synthetic aquifer used well-characterized technical sands and the electrical conductivity signal of an inorganic conservative tracer as a surrogate measure for water quality. The synthetic aquifer was outfitted with an array of sensors and an autonomous pumping system. Experimental results verified the feasibility of the approach and suggested that the system can improve the operation of MAR facilities. The dynamic parameter inversion reduced the average error between the simulated and observed pressures by 12.5 to 71.4%. The control optimization algorithm ran smoothly and generated optimal control decisions. Overall, results suggest that with some improvements to the inversion and interpolation algorithms, which can be further advanced through testing with laboratory experiments using sensors, the concept can successfully improve the operation of MAR facilities.
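The abstract does not spell out the inversion scheme, so the following is only a toy illustration of the idea of parameter inversion against sensor data: a hydraulic conductivity is re-estimated by minimizing the squared mismatch between simulated and "observed" pressure heads for an invented one-dimensional Darcy column.

```python
# Minimal sketch (invented model and data, not the study's algorithm): estimating
# hydraulic conductivity K by least-squares fit of simulated to observed heads.
import numpy as np
from scipy.optimize import minimize_scalar

def simulated_heads(K, x, h0=1.0, q=2.0e-5):
    """Steady 1-D Darcy flow: head declines linearly with distance at rate q/K."""
    return h0 - (q / K) * x

x_sensors = np.linspace(0.2, 1.8, 8)                    # sensor positions [m]
K_true = 4.0e-4                                         # "unknown" conductivity [m/s]
observed = simulated_heads(K_true, x_sensors) + np.random.default_rng(5).normal(0, 0.002, 8)

def misfit(K):
    return np.sum((simulated_heads(K, x_sensors) - observed) ** 2)

result = minimize_scalar(misfit, bounds=(1e-5, 1e-2), method="bounded")
print(f"estimated K = {result.x:.2e} m/s (true {K_true:.2e})")
```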
The optimization of total laboratory automation by simulation of a pull-strategy.
Yang, Taho; Wang, Teng-Kuan; Li, Vincent C; Su, Chia-Lo
2015-01-01
Laboratory results are essential for physicians to diagnose medical conditions. Because of the critical role of medical laboratories, an increasing number of hospitals use total laboratory automation (TLA) to improve laboratory performance. Although the benefits of TLA are well documented, systems occasionally become congested, particularly when hospitals face peak demand. This study optimizes TLA operations. Firstly, value stream mapping (VSM) is used to identify the non-value-added time. Subsequently, batch processing control and parallel scheduling rules are devised and a pull mechanism that comprises a constant work-in-process (CONWIP) is proposed. Simulation optimization is then used to optimize the design parameters and to ensure a small inventory and a shorter average cycle time (CT). For empirical illustration, this approach is applied to a real case. The proposed methodology significantly improves the efficiency of laboratory work and leads to a reduction in patient waiting times and increased service level.
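A minimal sketch of the CONWIP pull mechanism on an invented two-station serial line: samples are released into the line only while the work-in-process is below a cap, which trims the in-system cycle time relative to an uncapped push release. The parameters and line structure are assumptions, not the hospital's simulation model.

```python
# Hedged sketch: push release vs. a CONWIP cap on a two-station serial line.
# Only in-system cycle time is reported; waiting in the backlog before release
# is deliberately excluded, mirroring how WIP inside the line is the controlled quantity.
import random

def average_cycle_time(n_jobs=20000, wip_cap=None, mean_arrival=1.0,
                       mean_service=(0.8, 0.8), seed=7):
    """With a CONWIP cap, a sample enters only after the sample `wip_cap`
    positions ahead of it has left the line."""
    random.seed(seed)
    arrivals, t = [], 0.0
    for _ in range(n_jobs):
        t += random.expovariate(1.0 / mean_arrival)
        arrivals.append(t)
    done1, done2, release = [0.0], [0.0], []
    for i, a in enumerate(arrivals):
        gate = done2[i - wip_cap + 1] if (wip_cap and i >= wip_cap) else 0.0
        r = max(a, gate)
        release.append(r)
        done1.append(max(r, done1[-1]) + random.expovariate(1.0 / mean_service[0]))
        done2.append(max(done1[-1], done2[-1]) + random.expovariate(1.0 / mean_service[1]))
    return sum(d - r for d, r in zip(done2[1:], release)) / n_jobs

print("push release   :", round(average_cycle_time(), 2), "time units")
print("CONWIP (cap=5) :", round(average_cycle_time(wip_cap=5), 2), "time units")
```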
Factors that impact clinical laboratory scientists' commitment to their work organizations.
Bamberg, Richard; Akroyd, Duane; Moore, Ti'eshia M
2008-01-01
To assess the predictive ability of various aspects of the work environment for organizational commitment. A questionnaire measuring three dimensions of organizational commitment along with five aspects of work environment and 10 demographic and work setting characteristics was sent to a national, convenience sample of clinical laboratory professionals. All persons obtaining the CLS certification by NCA from January 1, 1997 to December 31, 2006. Only respondents who worked full-time in a clinical laboratory setting were included in the database. Levels of affective, normative, and continuance organizational commitment, organizational support, role clarity, role conflict, transformational leadership behavior of supervisor, and organizational type, total years work experience in clinical laboratories, and educational level of respondents. Questionnaire items used either a 7-point or 5-point Likert response scale. Based on multiple regression analysis for the 427 respondents, organizational support and transformational leadership behavior were found to be significant positive predictors of affective and normative organizational commitment. Work setting (non-hospital laboratory) and total years of work experience in clinical laboratories were found to be significant positive predictors of continuance organizational commitment. Overall the organizational commitment levels for all three dimensions were at the neutral rating or below in the slightly disagree range. The results indicate a less than optimal level of organizational commitment to employers, which were predominantly hospitals, by CLS practitioners. This may result in continuing retention problems for hospital laboratories. The results offer strategies for improving organizational commitment via the significant predictors.
Pickard, Dawn
2007-01-01
We have developed experiments and materials to model human genetics using rapid cycling Brassica rapa, also known as Fast Plants. Because of their self-incompatibility for pollination and the genetic diversity within strains, B. rapa can serve as a relevant model for human genetics in teaching laboratory experiments. The experiment presented here is a paternity exclusion project in which a child is born with a known mother but two possible alleged fathers. Students use DNA markers (microsatellites) to perform paternity exclusion on these subjects. Realistic DNA marker analysis can be challenging to implement within the limitations of an instructional lab, but we have optimized the experimental methods to work in a teaching lab environment and to maximize the “hands-on” experience for the students. The genetic individuality of each B. rapa plant, revealed by analysis of polymorphic microsatellite markers, means that each time students perform this project, they obtain unique results that foster independent thinking in the process of data interpretation. PMID:17548880
Microfluidics for High School Chemistry Students.
Hemling, Melissa; Crooks, John A; Oliver, Piercen M; Brenner, Katie; Gilbertson, Jennifer; Lisensky, George C; Weibel, Douglas B
2014-01-14
We present a laboratory experiment that introduces high school chemistry students to microfluidics while teaching fundamental properties of acid-base chemistry. The procedure enables students to create microfluidic systems using nonspecialized equipment that is available in high school classrooms and reagents that are safe, inexpensive, and commercially available. The experiment is designed to ignite creativity and confidence about experimental design in a high school chemistry class. This experiment requires a computer program (e.g., PowerPoint), Shrinky Dink film, a readily available silicone polymer, weak acids, bases, and a colorimetric pH indicator. Over the span of five 45-min class periods, teams of students design and prepare devices in which two different pH solutions mix in a predictable way to create five different pH solutions. Initial device designs are instructive but rarely optimal. During two additional half-class periods, students have the opportunity to use their initial observations to redesign their microfluidic systems to optimize the outcome. The experiment exposes students to cutting-edge science and the design process, and solidifies introductory chemistry concepts including laminar flow, neutralization of weak acids-bases, and polymers.
Development of fire shutters based on numerical optimizations
NASA Astrophysics Data System (ADS)
Novak, Ondrej; Kulhavy, Petr; Martinec, Tomas; Petru, Michal; Srb, Pavel
2018-06-01
This article deals with a prototype concept, a real experiment, and a numerical simulation of a layered industrial fire shutter based on new insulating composite materials. The real fire shutter has been developed and optimized in the laboratory and subsequently tested in a certified test room. A simulation of the whole concept has been carried out as a non-premixed combustion process in the commercial finite-volume software PyroSim. A combustion model based on a stoichiometrically defined mixture of gas and the tested layered samples showed good agreement with the experimental results, i.e., the thermal distribution inside the sample and the heat release rate that passed through it.
Optimization of wood plastic composite decks
NASA Astrophysics Data System (ADS)
Ravivarman, S.; Venkatesh, G. S.; Karmarkar, A.; Shivkumar N., D.; Abhilash R., M.
2018-04-01
Wood Plastic Composite (WPC) is a new class of natural fibre based composite material that contains a plastic matrix reinforced with wood fibres or wood flour. In the present work, Wood Plastic Composite was prepared with 70 wt% of wood flour reinforced in a polypropylene matrix. Mechanical characterization of the composite was done by carrying out laboratory tests such as tensile and flexural tests as per the American Society for Testing and Materials (ASTM) standards. A Computer Aided Design (CAD) model of the laboratory test specimen (tensile test) was created, and explicit finite element analysis was carried out on the finite element model in the non-linear explicit FE code LS-DYNA. The piecewise linear plasticity (MAT 24) material model was identified as a suitable model in the LS-DYNA material library for describing the material behavior of the developed composite. The composite structures for decking applications in the construction industry were then optimized for cross-sectional area and distance between two successive supports (span length) by carrying out various numerical experiments in LS-DYNA. The optimized WPC deck (elliptical channel-2 E10) has 45% lower weight than the baseline model (solid cross-section) considered in this study, while its load-carrying capacity meets the acceptance criteria (allowable deflection and stress) for outdoor decking applications.
Frasson, L; Neubert, J; Reina, S; Oldfield, M; Davies, B L; Rodriguez Y Baena, F
2010-01-01
The popularity of minimally invasive surgical procedures is driving the development of novel, safer and more accurate surgical tools. In this context a multi-part probe for soft tissue surgery is being developed in the Mechatronics in Medicine Laboratory at Imperial College, London. This study reports an optimization procedure using finite element methods, for the identification of an interlock geometry able to limit the separation of the segments composing the multi-part probe. An optimal geometry was obtained and the corresponding three-dimensional finite element model validated experimentally. Simulation results are shown to be consistent with the physical experiments. The outcome of this study is an important step in the provision of a novel miniature steerable probe for surgery.
NASA Technical Reports Server (NTRS)
Scott, Elaine P.
1993-01-01
Thermal stress analyses are an important aspect in the development of aerospace vehicles such as the National Aero-Space Plane (NASP) and the High-Speed Civil Transport (HSCT) at NASA-LaRC. These analyses require knowledge of the temperature within the structures which consequently necessitates the need for thermal property data. The initial goal of this research effort was to develop a methodology for the estimation of thermal properties of aerospace structural materials at room temperature and to develop a procedure to optimize the estimation process. The estimation procedure was implemented utilizing a general purpose finite element code. In addition, an optimization procedure was developed and implemented to determine critical experimental parameters to optimize the estimation procedure. Finally, preliminary experiments were conducted at the Aircraft Structures Branch (ASB) laboratory.
Fei, Xunchang; Zekkos, Dimitrios; Raskin, Lutgarde
2016-09-01
The energy conversion potential of municipal solid waste (MSW) disposed of in landfills remains largely untapped because of the slow and variable rate of biogas generation, delayed and inefficient biogas collection, leakage of biogas, and landfill practices and infrastructure that are not geared toward energy recovery. A database consisting of methane (CH4) generation data, the major constituent of biogas, from 49 laboratory experiments and field monitoring data from 57 landfills was developed. Three CH4 generation parameters, i.e., waste decay rate (k), CH4 generation potential (L0), and time until maximum CH4 generation rate (tmax), were calculated for each dataset using U.S. EPA's Landfill Gas Emission Model (LandGEM). Factors influencing the derived parameters in laboratory experiments and landfills were investigated using multi-linear regression analysis. Total weight of waste (W) was correlated with biodegradation conditions through a ranked classification scheme. k increased with increasing percentage of readily biodegradable waste (Br0 (%)) and waste temperature, and reduced with increasing W, an indicator of less favorable biodegradation conditions. The values of k obtained in the laboratory were commonly significantly higher than those in landfills and those recommended by LandGEM. The mean value of L0 was 98 and 88L CH4/kg waste for laboratory and field studies, respectively, but was significantly affected by waste composition with ranges from 10 to 300L CH4/kg. tmax increased with increasing percentage of biodegradable waste (B0) and W. The values of tmax in landfills were higher than those in laboratory experiments or those based on LandGEM's recommended parameters. Enhancing biodegradation conditions in landfill cells has a greater impact on improving k and tmax than increasing B0. Optimizing the B0 and Br0 values of landfilled waste increases L0 and reduces tmax. Copyright © 2015 Elsevier Ltd. All rights reserved.
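For orientation, the sketch below implements a simplified annual form of the first-order decay relation underlying LandGEM, with k and L0 in the units used above (L0 in L CH4/kg equals m3 CH4/Mg); the waste placement schedule and parameter values are illustrative, not those derived in the study.

```python
# Simplified sketch of LandGEM-style first-order decay (annual lumping rather
# than LandGEM's sub-year deciles); all numbers below are illustrative.
import math

def methane_generation(waste_by_year, k=0.05, L0=100.0, horizon=50):
    """Annual CH4 generation [m3/yr] from yearly waste placements [Mg/yr]."""
    rates = []
    for t in range(1, horizon + 1):
        q = sum(k * L0 * m * math.exp(-k * (t - placed))
                for placed, m in waste_by_year.items() if t >= placed)
        rates.append(q)
    return rates

# toy landfill: 10,000 Mg of waste placed in each of the first 10 years
waste = {year: 10_000.0 for year in range(1, 11)}
q = methane_generation(waste, k=0.05, L0=100.0)
t_max = q.index(max(q)) + 1
print(f"peak CH4 rate {max(q):,.0f} m3/yr in year {t_max}")
```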
Breakthrough: Fermilab Accelerator Technology
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
2012-04-23
There are more than 30,000 particle accelerators in operation around the world. At Fermilab, scientists are collaborating with other laboratories and industry to optimize the manufacturing processes for a new type of powerful accelerator that uses superconducting niobium cavities. Experimenting with unique polishing materials, a Fermilab team has now developed an efficient and environmentally friendly way of creating cavities that can propel particles with more than 30 million volts per meter.
Smooth Constrained Heuristic Optimization of a Combinatorial Chemical Space
2015-05-01
ARL-TR-7294, May 2015, US Army Research Laboratory: Smooth Constrained Heuristic Optimization of a Combinatorial Chemical Space, by Berend Christopher...
Schirmer, Emily B; Golden, Kathryn; Xu, Jin; Milling, Jesse; Murillo, Alec; Lowden, Patricia; Mulagapati, Srihariraju; Hou, Jinzhao; Kovalchin, Joseph T; Masci, Allyson; Collins, Kathryn; Zarbis-Papastoitsis, Gregory
2013-08-01
Through a parallel approach of tracking product quality through fermentation and purification development, a robust process was designed to reduce the levels of product-related species. Three biochemically similar product-related species were identified as byproducts of host-cell enzymatic activity. To modulate intracellular proteolytic activity, key fermentation parameters (temperature, pH, trace metals, EDTA levels, and carbon source) were evaluated through bioreactor optimization, while balancing negative effects on growth, productivity, and oxygen demand. The purification process was based on three non-affinity steps and resolved product-related species by exploiting small charge differences. Using statistical design of experiments for elution conditions, a high-resolution cation exchange capture column was optimized for resolution and recovery. Further reduction of product-related species was achieved by evaluating a matrix of conditions for a ceramic hydroxyapatite column. The optimized fermentation process was transferred from the 2-L laboratory scale to the 100-L pilot scale and the purification process was scaled accordingly to process the fermentation harvest. The laboratory- and pilot-scale processes resulted in similar process recoveries of 60 and 65%, respectively, and in a product that was of equal quality and purity to that of small-scale development preparations. The parallel approach for up- and downstream development was paramount in achieving a robust and scalable clinical process. Copyright © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Routine Digital Pathology Workflow: The Catania Experience
Fraggetta, Filippo; Garozzo, Salvatore; Zannoni, Gian Franco; Pantanowitz, Liron; Rossi, Esther Diana
2017-01-01
Introduction: Successful implementation of whole slide imaging (WSI) for routine clinical practice has been accomplished in only a few pathology laboratories worldwide. We report the transition to an effective and complete digital surgical pathology workflow in the pathology laboratory at Cannizzaro Hospital in Catania, Italy. Methods: All (100%) permanent histopathology glass slides were digitized at ×20 using Aperio AT2 scanners. Compatible stain and scanning slide racks were employed to streamline operations. eSlide Manager software was bidirectionally interfaced with the anatomic pathology laboratory information system. Virtual slide trays connected to the two-dimensional (2D) barcode tracking system allowed pathologists to confirm that they were correctly assigned slides and that all tissues on these glass slides were scanned. Results: Over 115,000 glass slides were digitized with a scan fail rate of around 1%. Drying glass slides before scanning minimized them sticking to scanner racks. Implementation required introduction of a 2D barcode tracking system and modification of histology workflow processes. Conclusion: Our experience indicates that effective adoption of WSI for primary diagnostic use was more dependent on optimizing preimaging variables and integration with the laboratory information system than on information technology infrastructure and ensuring pathologist buy-in. Implementation of digital pathology for routine practice not only leveraged the benefits of digital imaging but also creates an opportunity for establishing standardization of workflow processes in the pathology laboratory. PMID:29416914
Optimizing electricity consumption: A case of function learning.
Guath, Mona; Millroth, Philip; Juslin, Peter; Elwin, Ebba
2015-12-01
A popular way to improve consumers' control over their electricity consumption is by providing outcome feedback on the cost with in-home displays. Research on function learning, however, suggests that outcome feedback may not always be ideal for learning, especially if the feedback signal is noisy. In this study, we relate research on function learning to in-home displays and use a laboratory task simulating a household to investigate the role of outcome feedback and function learning on electricity optimization. Three function training schemes (FTSs) are presented that convey specific properties of the functions that relate the electricity consumption to the utility and cost. In Experiment 1, we compared learning from outcome feedback with 3 FTSs, 1 of which allowed maximization of the utility while keeping the budget, despite no feedback about the total monthly cost. In Experiment 2, we explored the combination of this FTS and outcome feedback. The results suggested that electricity optimization may be facilitated if feedback learning is preceded by a brief period of function training. (c) 2015 APA, all rights reserved).
Singh, Kunwar P; Rai, Premanjali; Pandey, Priyanka; Sinha, Sarita
2012-01-01
The present research aims to investigate the individual and interactive effects of the chlorine dose/dissolved organic carbon ratio, pH, temperature, bromide concentration, and reaction time on trihalomethanes (THMs) formation in surface water (a drinking water source) during disinfection by chlorination in a prototype laboratory-scale simulation, and to develop a model for the prediction and optimization of THMs levels in chlorinated water for their effective control. A five-factor Box-Behnken experimental design combined with response surface and optimization modeling was used for predicting the THMs levels in chlorinated water. The adequacy of the selected model and the statistical significance of the regression coefficients, independent variables, and their interactions were tested by analysis of variance and t-test statistics. The THMs levels predicted by the model were very close to the experimental values (R² = 0.95). The maximum THMs formation (highest risk) level predicted by optimization modeling (192 μg/l) was very close to the experimental value (186.8 ± 1.72 μg/l) determined in laboratory experiments. The pH of the water, followed by reaction time and temperature, were the most significant factors affecting THMs formation during chlorination. The developed model can be used to determine the optimum characteristics of raw water and chlorination conditions for maintaining THMs levels within the safe limit.
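As a minimal illustration of the response-surface idea described above (not the authors' actual five-factor model; all factor levels, responses, and names below are invented), the following Python sketch fits a quadratic surface to Box-Behnken-style coded data for two factors and locates the coded setting with the highest predicted THM level.

import numpy as np

# Hypothetical coded levels (-1, 0, +1) for two factors (say pH and reaction time) and invented THM responses (ug/L).
X = np.array([[-1, -1], [-1, 1], [1, -1], [1, 1], [0, 0], [-1, 0], [1, 0], [0, -1], [0, 1]], dtype=float)
y = np.array([95.0, 120.0, 130.0, 175.0, 140.0, 110.0, 150.0, 115.0, 160.0])

def design_matrix(X):
    # Full quadratic response-surface terms: 1, x1, x2, x1*x2, x1^2, x2^2
    x1, x2 = X[:, 0], X[:, 1]
    return np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1**2, x2**2])

beta, *_ = np.linalg.lstsq(design_matrix(X), y, rcond=None)        # least-squares fit of the surface
pred = design_matrix(X) @ beta
r2 = 1.0 - np.sum((y - pred)**2) / np.sum((y - y.mean())**2)

g = np.linspace(-1.0, 1.0, 201)                                     # grid search over the coded design space
grid = np.array([(a, b) for a in g for b in g])
worst = grid[np.argmax(design_matrix(grid) @ beta)]
print(f"R^2 = {r2:.3f}; highest predicted THMs at coded (factor1, factor2) = {worst}")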
Giannoli, Jean-Marc; Szymanowicz, Anton
2011-01-01
We propose a set of recommendations and practices to optimize the use of internal quality control (IQC) for medical biology examinations. The fundamentals are reviewed: the definition of an analytical series, IQC at one or more levels, Westgard alert and rejection rules, the practical remedial actions for the technician to take, and the corrective and preventive actions to be implemented by the biologist. We have also formalized three flowcharts to guide technicians in their daily practice and ensure the analytical quality of the investigations carried out. These decision trees reflect the experience of an accredited professional laboratory attentive to the ongoing improvement of IQC. This article can provide useful assistance to biologists preparing for accreditation, and also aims to foster the contribution of a reliable medical biology laboratory to the appropriate management of patients.
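As a simple illustration of the kind of IQC decision logic discussed above (a generic sketch, not the article's flowcharts; the target mean, SD, and control values are invented), the Python fragment below flags a control series using two common Westgard rules: 1-3s (reject when one control exceeds 3 SD) and 2-2s (reject when two consecutive controls exceed 2 SD on the same side).

def westgard_flags(values, mean, sd):
    # Return a list of (index, rule) violations for the 1-3s and 2-2s rules.
    z = [(v - mean) / sd for v in values]
    flags = []
    for i, zi in enumerate(z):
        if abs(zi) > 3:                                              # 1-3s rejection rule
            flags.append((i, "1-3s"))
        if i > 0 and abs(zi) > 2 and abs(z[i - 1]) > 2 and (zi > 0) == (z[i - 1] > 0):
            flags.append((i, "2-2s"))                                # 2-2s rejection rule
    return flags

# Invented control results for one analyte (target mean 5.0, SD 0.2).
print(westgard_flags([5.1, 5.05, 5.45, 5.5, 4.2], mean=5.0, sd=0.2))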
(Preoxidation cleaning optimization for crystalline silicon)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1991-01-01
A series of controlled experiments has been performed in Sandia's Photovoltaic Device Fabrication Laboratory to evaluate the effect of various chemical surface treatments on the recombination lifetime of crystalline silicon wafers subjected to a high-temperature dry oxidation. From this series of experiments we have deduced a relatively simple yet effective cleaning sequence. We have also evaluated the effect of different chemical damage-removal etches for improving the recombination lifetime and surface smoothness of mechanically lapped wafers. This paper presents the methodology used, the experimental results obtained, and our experience with using this process on a continuing basis over a period of many months. 7 refs., 4 figs., 1 tab.
Optimization of In-Cylinder Pressure Filter for Engine Research
2017-06-01
ARL-TR-8034 ● JUN 2017 ● US Army Research Laboratory. Optimization of In-Cylinder Pressure Filter for Engine Research, by Kenneth S Kim, Michael T Szedlmayer, Kurt M Kruger, and Chol-Bum M...
Four experimental demonstrations of active vibration control for flexible structures
NASA Technical Reports Server (NTRS)
Phillips, Doug; Collins, Emmanuel G., Jr.
1990-01-01
Laboratory experiments designed to test prototype active-vibration-control systems under development for future flexible space structures are described, summarizing previously reported results. The control-synthesis technique employed for all four experiments was the maximum-entropy optimal-projection (MEOP) method (Bernstein and Hyland, 1988). Consideration is given to: (1) a pendulum experiment on large-amplitude LF dynamics; (2) a plate experiment on broadband vibration suppression in a two-dimensional structure; (3) a multiple-hexagon experiment combining the factors studied in (1) and (2) to simulate the complexity of a large space structure; and (4) the NASA Marshall ACES experiment on a lightweight deployable 45-foot beam. Extensive diagrams, drawings, graphs, and photographs are included. The results are shown to validate the MEOP design approach, demonstrating that good performance is achievable using relatively simple low-order decentralized controllers.
Divergence of gastropod life history in contrasting thermal environments in a geothermal lake.
Johansson, M P; Ermold, F; Kristjánsson, B K; Laurila, A
2016-10-01
Experiments using natural populations have provided mixed support for thermal adaptation models, probably because the conditions are often confounded with additional environmental factors like seasonality. The contrasting geothermal environments within Lake Mývatn, northern Iceland, provide a unique opportunity to evaluate thermal adaptation models using closely located natural populations. We conducted laboratory common garden and field reciprocal transplant experiments to investigate how thermal origin influences the life history of Radix balthica snails originating from stable cold (6 °C), stable warm (23 °C) thermal environments or from areas with seasonal temperature variation. Supporting thermal optimality models, warm-origin snails survived poorly at 6 °C in the common garden experiment and better than cold-origin and seasonal-origin snails in the warm habitat in the reciprocal transplant experiment. Contrary to thermal adaptation models, growth rate in both experiments was highest in the warm populations irrespective of temperature, indicating cogradient variation. The optimal temperatures for growth and reproduction were similar irrespective of origin, but cold-origin snails always had the lowest performance, and seasonal-origin snails often performed at an intermediate level compared to snails originating in either stable environment. Our results indicate that central life-history traits can differ in their mode of evolution, with survival following the predictions of thermal optimality models, whereas ecological constraints have shaped the evolution of growth rates in local populations. © 2016 European Society For Evolutionary Biology. Journal of Evolutionary Biology © 2016 European Society For Evolutionary Biology.
Generalized Optimal-State-Constraint Extended Kalman Filter (OSC-EKF)
2017-02-01
ARL-TR-7948 • FEB 2017 • US Army Research Laboratory. Generalized Optimal-State-Constraint Extended Kalman Filter (OSC-EKF), by James M Maley, Kevin...
Alternatives to Pyrotechnic Distress Signals; Additional Signal Evaluation
2017-06-01
This report is the fourth in a series that details work in which a series of laboratory experiments was conducted to determine the optimal signal color and temporal pattern for identification against a variety of ... "Practice" trials began at approximately 2030 local time, the actual Test 1 observation trials began at approximately 2130, and the series of trials finished at ... (Contact: M.J. Lewandowski, 860-271-2692, email: M.J.Lewandowski@uscg.mil)
Rosa, Rossana; Zavala, Bruno; Cain, Natalie; Anjan, Shweta; Aragon, Laura; Abbo, Lilian M
2018-03-01
Antimicrobial stewardship programs can optimize the management of Staphylococcus aureus bacteremia by integrating information technology and microbiology laboratory resources. This study describes our experience implementing an intervention consisting of real-time feedback and the use of an electronic order set for the management of S. aureus bacteremia. Infect Control Hosp Epidemiol 2018;39:346-349.
2005-01-01
Suppression of Superconductivity in Granular Metals, Igor Beloborodov, Argonne National Laboratory, USA: We investigate the suppression of ... Various strategies for extending coherence times of superconducting qubits have been proposed. We analyze the effect of fluctuations on a qubit operated at an optimal point in the free-induction decay and the spin-echo-like experiments. Motivated by the recent experimental findings we ...
Laser Assisted CVD Growth of AlN and GaN
1990-08-31
RESEARCH FACILITIES: The work is being performed in the Howard University Laser Laboratory, a free-standing building ... would be used to optimize computer models of the laser-induced CVD reactor. FACILITIES AND EQUIPMENT - ADDITIONAL COST SHARING: This year Howard University has provided $45,000 for the purchase of an excimer laser to be shared by Dr. Crye for the diode laser probe experiments and another Assistant ...
NASA Astrophysics Data System (ADS)
Uhlemann, Sebastian; Wilkinson, Paul B.; Maurer, Hansruedi; Wagner, Florian M.; Johnson, Timothy C.; Chambers, Jonathan E.
2018-07-01
Within geoelectrical imaging, the choice of measurement configurations and electrode locations is known to control the image resolution. Previous work has shown that optimized survey designs can provide a model resolution that is superior to standard survey designs. This paper demonstrates a methodology to optimize resolution within a target area, while limiting the number of required electrodes, thereby selecting optimal electrode locations. This is achieved by extending previous work on the `Compare-R' algorithm, which by calculating updates to the resolution matrix optimizes the model resolution in a target area. Here, an additional weighting factor is introduced that allows measurement configurations that can be acquired on a given set of electrodes to be preferentially added. The performance of the optimization is tested on two synthetic examples and verified with a laboratory study. The effect of the weighting factor is investigated using an acquisition layout comprising a single line of electrodes. The results show that an increasing weight decreases the area of improved resolution, but leads to a smaller number of electrode positions. Imaging results superior to a standard survey design were achieved using 56 per cent fewer electrodes. The performance was also tested on a 3-D acquisition grid, where superior resolution within a target at the base of an embankment was achieved using 22 per cent fewer electrodes than a comparable standard survey. The effect of the underlying resistivity distribution on the performance of the optimization was investigated and it was shown that even strong resistivity contrasts only have minor impact. The synthetic results were verified in a laboratory tank experiment, where notable image improvements were achieved. This work shows that optimized surveys can be designed that have a resolution superior to standard survey designs, while requiring significantly fewer electrodes. This methodology thereby provides a means for improving the efficiency of geoelectrical imaging.
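The published Compare-R algorithm works by updating the model resolution matrix; purely as a loose, generic illustration of the electrode-reuse weighting idea (not the published method; the configurations, values, and weight below are invented), the sketch greedily ranks candidate four-electrode configurations by a score that trades an assumed information value against the number of new electrode positions each configuration would require.

def select_configs(candidates, values, n_select, w=0.5):
    # Greedy selection preferring configurations that reuse already-deployed electrodes.
    used, chosen = set(), []
    for _ in range(n_select):
        def score(i):
            new = len(set(candidates[i]) - used)                     # electrode positions not yet deployed
            return values[i] - w * new                               # weight w penalizes new positions
        best = max((i for i in range(len(candidates)) if i not in chosen), key=score)
        chosen.append(best)
        used |= set(candidates[best])
    return chosen, sorted(used)

candidates = [(1, 4, 2, 3), (1, 8, 4, 6), (2, 5, 3, 4), (7, 10, 8, 9)]   # (A, B, M, N) electrode indices
print(select_configs(candidates, values=[1.0, 0.9, 0.8, 1.2], n_select=2))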
Optimizing EDELWEISS detectors for low-mass WIMP searches
NASA Astrophysics Data System (ADS)
Arnaud, Q.; Armengaud, E.; Augier, C.; Benoît, A.; Bergé, L.; Billard, J.; Broniatowski, A.; Camus, P.; Cazes, A.; Chapellier, M.; Charlieux, F.; de Jésus, M.; Dumoulin, L.; Eitel, K.; Foerster, N.; Gascon, J.; Giuliani, A.; Gros, M.; Hehn, L.; Jin, Y.; Juillard, A.; Kleifges, M.; Kozlov, V.; Kraus, H.; Kudryavtsev, V. A.; Le-Sueur, H.; Maisonobe, R.; Marnieros, S.; Navick, X.-F.; Nones, C.; Olivieri, E.; Pari, P.; Paul, B.; Poda, D.; Queguiner, E.; Rozov, S.; Sanglard, V.; Scorza, S.; Siebenborn, B.; Vagneron, L.; Weber, M.; Yakushev, E.; EDELWEISS Collaboration
2018-01-01
The physics potential of EDELWEISS detectors for the search of low-mass weakly interacting massive particles (WIMPs) is studied. Using a data-driven background model, projected exclusion limits are computed using frequentist and multivariate analysis approaches, namely, profile likelihood and boosted decision tree. Both current and achievable experimental performances are considered. The optimal strategy for detector optimization depends critically on whether the emphasis is put on WIMP masses below or above ~5 GeV/c². The projected sensitivity for the next phase of the EDELWEISS-III experiment at the Modane Underground Laboratory (LSM) for low-mass WIMP search is presented. By 2018 an upper limit on the spin-independent WIMP-nucleon cross section of σ_SI = 7×10⁻⁴² cm² is expected for a WIMP mass in the range 2-5 GeV/c². The requirements for a future hundred-kilogram-scale experiment designed to reach the bounds imposed by the coherent scattering of solar neutrinos are also described. By improving the ionization resolution down to 50 eVee, we show that such an experiment installed in an even lower background environment (e.g., at SNOLAB) together with an exposure of 1000 kg·yr should allow us to observe about 80 ...
Naydenova, Vessela; Badova, Mariyana; Vassilev, Stoyan; Iliev, Vasil; Kaneva, Maria; Kostov, Georgi
2014-03-04
Two mathematical models were developed for studying the effect of main fermentation temperature (TMF), immobilized cell mass (MIC) and original wort extract (OE) on beer fermentation with alginate-chitosan microcapsules with a liquid core. During the experiments, the investigated parameters were varied in order to find the optimal conditions for beer fermentation with immobilized cells. The basic beer characteristics, i.e. extract, ethanol, biomass concentration, pH and colour, as well as the concentration of aldehydes and vicinal diketones, were measured. The results suggested that the process parameters represented a powerful tool in controlling the fermentation time. Subsequently, the optimized process parameters were used to produce beer in laboratory batch fermentation. The system productivity was also investigated and the data were used for the development of another mathematical model. PMID:26019512
NASA Astrophysics Data System (ADS)
Ryan, A. J.; Christensen, P. R.
2016-12-01
Laboratory measurements have been necessary to interpret thermal data of planetary surfaces for decades. We present a novel radiometric laboratory method to determine temperature-dependent thermal conductivity of complex regolith simulants under high vacuum and across a wide range of temperatures. Here, we present our laboratory method, strategy, and initial results. This method relies on radiometric temperature measurements instead of contact measurements, eliminating the need to disturb the sample with thermal probes. We intend to determine the conductivity of grains that are up to 2 cm in diameter and to parameterize the effects of angularity, sorting, layering, composition, and cementation. These results will support the efforts of the OSIRIS-REx team in selecting a site on asteroid Bennu that is safe and meets grain size requirements for sampling. Our system consists of a cryostat vacuum chamber with an internal liquid nitrogen dewar. A granular sample is contained in a cylindrical cup that is 4 cm in diameter and 1 to 6 cm deep. The surface of the sample is exposed to vacuum and is surrounded by a black liquid nitrogen cold shroud. Once the system has equilibrated at 80 K, the base of the sample cup is rapidly heated to 450 K. An infrared camera observes the sample from above to monitor its temperature change over time. We have built a time-dependent finite element model of the experiment in COMSOL Multiphysics. Boundary temperature conditions and all known material properties (including surface emissivities) are included to replicate the experiment as closely as possible. The Optimization module in COMSOL is specifically designed for parameter estimation. Sample thermal conductivity is assumed to be a quadratic or cubic polynomial function of temperature. We thus use gradient-based optimization methods in COMSOL to vary the polynomial coefficients in an effort to reduce the least squares error between the measured and modeled sample surface temperature.
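The actual analysis uses a finite-element model of the apparatus in COMSOL; purely as a structural sketch of the parameter-estimation step (a lumped-capacitance stand-in for the full heat-transfer model, with all values invented), the code below recovers polynomial coefficients of a temperature-dependent conductance k(T) by least-squares matching of a simulated surface-temperature history.

import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import least_squares

T_base, T0, C = 450.0, 80.0, 100.0          # heater temperature, initial sample temperature, lumped heat capacity (invented)

def surface_temp(coeffs, t_eval):
    # Toy forward model: dT/dt = k(T) * (T_base - T) / C, with k(T) = a + b*T + c*T^2
    a, b, c = coeffs
    rhs = lambda t, T: (a + b * T[0] + c * T[0]**2) * (T_base - T[0]) / C
    return solve_ivp(rhs, (0.0, t_eval[-1]), [T0], t_eval=t_eval).y[0]

t = np.linspace(0.0, 3000.0, 60)
true = (0.02, 1e-4, 2e-7)                    # "unknown" coefficients used to generate the fake measurement
data = surface_temp(true, t) + np.random.default_rng(0).normal(0.0, 0.3, t.size)

# Least-squares recovery of the k(T) polynomial coefficients from the noisy surface-temperature history.
fit = least_squares(lambda p: surface_temp(p, t) - data, x0=(0.01, 1e-5, 1e-8), bounds=(0.0, [1.0, 1e-2, 1e-5]))
print("estimated k(T) coefficients:", fit.x)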
Automation in biological crystallization.
Stewart, Patrick Shaw; Mueller-Dieckmann, Jochen
2014-06-01
Crystallization remains the bottleneck in the crystallographic process leading from a gene to a three-dimensional model of the encoded protein or RNA. Automation of the individual steps of a crystallization experiment, from the preparation of crystallization cocktails for initial or optimization screens to the imaging of the experiments, has been the response to address this issue. Today, large high-throughput crystallization facilities, many of them open to the general user community, are capable of setting up thousands of crystallization trials per day. It is thus possible to test multiple constructs of each target for their ability to form crystals on a production-line basis. This has improved success rates and made crystallization much more convenient. High-throughput crystallization, however, cannot relieve users of the task of producing samples of high quality. Moreover, the time gained from eliminating manual preparations must now be invested in the careful evaluation of the increased number of experiments. The latter requires a sophisticated data and laboratory information-management system. A review of the current state of automation at the individual steps of crystallization with specific attention to the automation of optimization is given. PMID:24915074
NASA Astrophysics Data System (ADS)
Lemke, Raymond
2015-06-01
The focus of this talk is on magnetically driven, liner implosion experiments on the Z machine (Z) in which a solid, metal tube is shocklessly compressed to multi-megabar pressure. The goal of the experiments is to collect velocimetry data that can be used in conjunction with a new optimization based analysis technique to infer the principal isentrope of the tube material over a range of pressures. For the past decade, shock impact and ramp loading experiments on Z have used planar platforms exclusively. While producing state-of-the-art results for material science, it is difficult to produce drive pressures greater than 6 Mbar in the divergent planar geometry. In contrast, a cylindrical liner implosion is convergent; magnetic drive pressures approaching 50 Mbar are possible with the available current on Z (~ 20 MA). In our cylindrical experiments, the liner comprises an inner tube composed of the sample material (e.g., Ta) of unknown equation of state, and an outer tube composed of aluminum (Al) that serves as the current carrying cathode. Internal to the sample are fielded multiple PDV (Photonic Doppler Velocimetry) probes that measure velocity of the inner free surface of the imploding sample. External to the composite liner, at much larger radius, is an Al tube that is the return current anode. VISAR (velocity interferometry system for any reflector) probes measure free surface velocity of the exploding anode. Using the latter, MHD and optimization codes are employed to solve an inverse problem that yields the current driving the liner implosion. Then, the drive current, PDV velocity, MHD and optimization codes, are used to solve another inverse problem that yields pressure vs. density on approximately the principal isentrope of the sample material. Results for Ta, Re, and Cu compressed to ~ 10 Mbar are presented. Sandia National Laboratories is a multiprogram laboratory managed and operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the US DOE's National Nuclear Security Administration under contract DE-AC04-94AL85000.
Abou Mrad, Ninette; Duvernay, Fabrice; Theulé, Patrice; Chiavassa, Thierry; Danger, Grégoire
2014-08-19
This contribution presents an original analytical system for studying volatile organic compounds (VOC) coming from the heating and/or irradiation of interstellar/cometary ice analogues (VAHIIA system) through laboratory experiments. The VAHIIA system brings solutions to three analytical constraints regarding chromatography analysis: the low desorption kinetics of VOC (many hours) in the vacuum chamber during laboratory experiments, the low pressure under which they sublime (10⁻⁹ mbar), and the presence of water in ice analogues. The VAHIIA system which we developed, calibrated, and optimized is composed of two units. The first is a preconcentration unit providing the VOC recovery. This unit is based on a cryogenic trapping which allows VOC preconcentration and provides an adequate pressure allowing their subsequent transfer to an injection unit. The latter is a gaseous injection unit allowing the direct injection into the GC-MS of the VOC previously transferred from the preconcentration unit. The feasibility of the online transfer through this interface is demonstrated. Nanomoles of VOC can be detected with the VAHIIA system, and the variability in replicate measurements is lower than 13%. The advantages of the GC-MS in comparison to infrared spectroscopy are pointed out, the GC-MS allowing an unambiguous identification of compounds coming from complex mixtures. Beyond the application to astrophysical subjects, these analytical developments can be used for all systems requiring vacuum/cryogenic environments.
EuroFlow standardization of flow cytometer instrument settings and immunophenotyping protocols
Kalina, T; Flores-Montero, J; van der Velden, V H J; Martin-Ayuso, M; Böttcher, S; Ritgen, M; Almeida, J; Lhermitte, L; Asnafi, V; Mendonça, A; de Tute, R; Cullen, M; Sedek, L; Vidriales, M B; Pérez, J J; te Marvelde, J G; Mejstrikova, E; Hrusak, O; Szczepański, T; van Dongen, J J M; Orfao, A
2012-01-01
The EU-supported EuroFlow Consortium aimed at innovation and standardization of immunophenotyping for diagnosis and classification of hematological malignancies by introducing 8-color flow cytometry with fully standardized laboratory procedures and antibody panels in order to achieve maximally comparable results among different laboratories. This required the selection of optimal combinations of compatible fluorochromes and the design and evaluation of adequate standard operating procedures (SOPs) for instrument setup, fluorescence compensation and sample preparation. Additionally, we developed software tools for the evaluation of individual antibody reagents and antibody panels. Each section describes what has been evaluated experimentally versus adopted based on existing data and experience. Multicentric evaluation demonstrated high levels of reproducibility based on strict implementation of the EuroFlow SOPs and antibody panels. Overall, the 6 years of extensive collaborative experiments and the analysis of hundreds of cell samples of patients and healthy controls in the EuroFlow centers have provided for the first time laboratory protocols and software tools for fully standardized 8-color flow cytometric immunophenotyping of normal and malignant leukocytes in bone marrow and blood; this has yielded highly comparable data sets, which can be integrated in a single database. PMID:22948490
A fine resolution multifrequency polarimetric FM radar
NASA Technical Reports Server (NTRS)
Bredow, J.; Gogineni, S.; Leung, T.; Moore, R. K.
1988-01-01
A fine-resolution polarimetric FM SAR was developed for optimization of polarimetric SARs and interpretation of SAR data via controlled experiments with surface-based sensors. The system is designed for collecting polarimetric data at 5.3 and 10 GHz over incidence angles from 0 to 60 deg. Features of the system include broad bandwidth to obtain fine range resolution, phase stabilization and linearization loop circuitry, and digital signal processing capability. The system is used in a research program to collect polarimetric backscatter data from artificial sea ice. Research and design trade-offs, laboratory and field evaluation, as well as results from experiments on artificial sea ice are presented.
Status of the KLOE-2 experiment
NASA Astrophysics Data System (ADS)
Di Cicco, Alessandro
2015-06-01
The KLOE-2 experiment at the Frascati National Laboratory of the INFN is undergoing commissioning, together with the e+e- collider DAΦNE. The KLOE apparatus, consisting of a huge Drift Chamber and an Electromagnetic Calorimeter working in a 0.5 T axial magnetic field, has been upgraded with the insertion of an Inner Tracker, two low-angle calorimeters (CCALT and QCALT) and low-angle taggers (LET and HET) for γγ-physics. Cosmic-ray muon and collision data are being acquired in order to optimize the sub-detector operation in view of the new data taking campaign. The first results from the ongoing commissioning of the KLOE-2 detector will be shown.
Deetz, C O; Scott, M G; Ladenson, J H; Seyoum, M; Hassan, A; Kreisel, F H; Nguyen, T T; Frater, J L
2013-02-01
With proper logistical support and sponsorship, a laboratory in an industrialized nation might be able to act as a reference laboratory for clinicians based in a developing country. We built on previous experience in the clinical laboratory to see whether a specialized histopathology service (hematopathology) could be provided to a developing country without the expertise or experience to do it in country. Over a 13-year period, 582 cases from 579 individuals were analyzed. Principal pathologic findings included acute leukemia in 84 cases (14%), dyspoiesis in one or more of the hematopoietic lineages in 65 cases (11%, including three cases with high-grade myelodysplasia), 23 cases (4%) with findings suspicious for a chronic myeloproliferative disorder, 35 cases (6%) with findings suspicious for a lymphoproliferative disorder, and infectious organisms (presumably Leishmania in most instances) in 9 cases (1%). Specimens from 45 cases (8%) were unsatisfactory owing to extreme hemodilution and/or specimen degeneration. With proper support, a medical laboratory in an industrialized nation may serve as a reference facility for a developing nation. The use of existing infrastructure may be remarkably effective to achieve optimal turnaround time. Although the lack of ancillary studies and follow-up biopsies limits the ability to achieve a definitive diagnosis in many cases, this must be viewed in the context of the limited ability to diagnose or manage hematopoietic neoplasia in developing nations. © 2012 Blackwell Publishing Ltd.
Simulations of ultrafast x-ray laser experiments
NASA Astrophysics Data System (ADS)
Fortmann-Grote, C.; Andreev, A. A.; Appel, K.; Branco, J.; Briggs, R.; Bussmann, M.; Buzmakov, A.; Garten, M.; Grund, A.; Huebl, A.; Jurek, Z.; Loh, N. D.; Nakatsutsumi, M.; Samoylova, L.; Santra, R.; Schneidmiller, E. A.; Sharma, A.; Steiniger, K.; Yakubov, S.; Yoon, C. H.; Yurkov, M. V.; Zastrau, U.; Ziaja-Motyka, B.; Mancuso, A. P.
2017-06-01
Simulations of experiments at modern light sources, such as optical laser laboratories, synchrotrons, and free electron lasers, become increasingly important for the successful preparation, execution, and analysis of these experiments investigating ever more complex physical systems, e.g. biomolecules, complex materials, and ultra-short lived states of matter at extreme conditions. We have implemented a platform for complete start-to-end simulations of various types of photon science experiments, tracking the radiation from the source through the beam transport optics to the sample or target under investigation, its interaction with and scattering from the sample, and registration in a photon detector. This tool allows researchers and facility operators to simulate their experiments and instruments under real life conditions, identify promising and unattainable regions of the parameter space and ultimately make better use of valuable beamtime. In this paper, we present an overview about status and future development of the simulation platform and discuss three applications: 1.) Single-particle imaging of biomolecules using x-ray free electron lasers and optimization of x-ray pulse properties, 2.) x-ray scattering diagnostics of hot dense plasmas in high power laser-matter interaction and identification of plasma instabilities, and 3.) x-ray absorption spectroscopy in warm dense matter created by high energy laser-matter interaction and pulse shape optimization for low-isentrope dynamic compression.
Optimization of time-course experiments for kinetic model discrimination.
Lages, Nuno F; Cordeiro, Carlos; Sousa Silva, Marta; Ponces Freire, Ana; Ferreira, António E N
2012-01-01
Systems biology relies heavily on the construction of quantitative models of biochemical networks. These models must have predictive power to help unveil the underlying molecular mechanisms of cellular physiology, but it is also paramount that they are consistent with the data resulting from key experiments. Often, it is possible to find several models that describe the data equally well, but provide significantly different quantitative predictions regarding particular variables of the network. In those cases, one is faced with a problem of model discrimination, the procedure of rejecting inappropriate models from a set of candidates in order to elect one as the best model to use for prediction. In this work, a method is proposed to optimize the design of enzyme kinetic assays with the goal of selecting a model among a set of candidates. We focus on models with systems of ordinary differential equations as the underlying mathematical description. The method provides a design where an extension of the Kullback-Leibler distance, computed over the time courses predicted by the models, is maximized. Given the asymmetric nature of this measure, a generalized differential evolution algorithm for multi-objective optimization problems was used. The kinetics of yeast glyoxalase I (EC 4.4.1.5) was chosen as a difficult test case to evaluate the method. Although a single-substrate kinetic model is usually considered, a two-substrate mechanism has also been proposed for this enzyme. We designed an experiment capable of discriminating between the two models by optimizing the initial substrate concentrations of glyoxalase I, in the presence of the subsequent pathway enzyme, glyoxalase II (EC 3.1.2.6). This discriminatory experiment was conducted in the laboratory and the results indicate a two-substrate mechanism for the kinetics of yeast glyoxalase I.
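As a toy numerical sketch of the design idea (generic rate laws and invented parameter values, not the actual glyoxalase kinetics or the published algorithm), the code below integrates two candidate models of product formation, computes a symmetrized Kullback-Leibler-style divergence between their normalized time-course increments, and grid-searches the initial substrate concentration that best separates the two models.

import numpy as np
from scipy.integrate import solve_ivp

def product_course(model, S0, t):
    # Integrate dP/dt for a toy rate law, with S = S0 - P; Vmax and Km are invented.
    Vmax, Km = 1.0, 0.5
    if model == "MM":                        # single-substrate Michaelis-Menten-like law
        rate = lambda _t, P: Vmax * (S0 - P[0]) / (Km + (S0 - P[0]))
    else:                                    # alternative law with second-order substrate dependence
        rate = lambda _t, P: Vmax * (S0 - P[0])**2 / (Km**2 + (S0 - P[0])**2)
    return solve_ivp(rate, (0.0, t[-1]), [0.0], t_eval=t).y[0]

def symmetric_kl(p, q, eps=1e-12):
    p, q = np.clip(p, eps, None), np.clip(q, eps, None)
    p, q = p / p.sum(), q / q.sum()
    return float(np.sum(p * np.log(p / q) + q * np.log(q / p)))

t = np.linspace(0.0, 10.0, 50)
best_S0 = max(np.linspace(0.1, 2.0, 20),
              key=lambda S0: symmetric_kl(np.diff(product_course("MM", S0, t)),
                                          np.diff(product_course("2nd", S0, t))))
print("initial substrate concentration giving the most discriminating time courses:", round(best_S0, 2))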
Close games versus blowouts: Optimal challenge reinforces one's intrinsic motivation to win.
Meng, Liang; Pei, Guanxiong; Zheng, Jiehui; Ma, Qingguo
2016-12-01
When immersed in intrinsically motivating activities, individuals actively seek optimal challenge, which generally brings the most satisfaction as they play hard and finally win. To better simulate real-life scenarios in the controlled laboratory setting, a two-player online StopWatch (SW) game was developed, whose format is similar to that of a badminton tournament. During the game, a male opponent played by a confederate ensured that the same-sex participant paired with him won both matches, one with a wide margin (the lack of challenge condition) and another with a narrow one (the optimal challenge condition). Electrophysiological data were recorded during the entire experiment. An enlarged Stimulus-preceding negativity (SPN) was observed in the optimal challenge condition, indicating a more concentrated anticipatory attention toward the feedback and a stronger intrinsic motivation during close games. Thus, this study provided original neural evidence for predictions of Self-determination theory (SDT) and Flow theory, and confirmed and emphasized the significant role of optimal challenge in promoting one's intrinsic motivation to win. Copyright © 2016 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Roslund, Jonathan; Shir, Ofer M.; Bäck, Thomas; Rabitz, Herschel
2009-10-01
Optimization of quantum systems by closed-loop adaptive pulse shaping offers a rich domain for the development and application of specialized evolutionary algorithms. Derandomized evolution strategies (DESs) are presented here as a robust class of optimizers for experimental quantum control. The combination of stochastic and quasi-local search embodied by these algorithms is especially amenable to the inherent topology of quantum control landscapes. Implementation of DES in the laboratory results in efficiency gains of up to ~9 times that of the standard genetic algorithm, and thus is a promising tool for optimization of unstable or fragile systems. The statistical learning upon which these algorithms are predicated also provides the means for obtaining a control problem's Hessian matrix with no additional experimental overhead. The forced optimal covariance adaptive learning (FOCAL) method is introduced to enable retrieval of the Hessian matrix, which can reveal information about the landscape's local structure and dynamic mechanism. Exploitation of such algorithms in quantum control experiments should enhance their efficiency and provide additional fundamental insights.
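As a minimal evolution-strategy sketch (a basic (1+1)-ES with a 1/5th-success step-size rule, which is simpler than the derandomized strategies discussed above; the objective function and all parameters are invented), the fragment below maximizes a stand-in for a measured control yield over a vector of pulse-shaper phases.

import numpy as np

rng = np.random.default_rng(1)

def yield_signal(phases):
    # Invented stand-in for the experimentally measured control objective.
    return np.exp(-np.sum((phases - 0.7)**2))

x, sigma = rng.uniform(-np.pi, np.pi, 8), 0.5        # 8 phase parameters, initial mutation step size
best = yield_signal(x)
for _ in range(400):
    trial = x + sigma * rng.normal(size=x.size)      # mutate all phases at once
    f = yield_signal(trial)
    if f > best:                                     # (1+1) selection: keep the better of parent and offspring
        x, best = trial, f
        sigma *= 1.5                                 # Rechenberg-style 1/5th-success step-size adaptation
    else:
        sigma *= 1.5 ** -0.25
print(f"best objective {best:.4f} after 400 evaluations")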
Particle-in-cell numerical simulations of a cylindrical Hall thruster with permanent magnets
NASA Astrophysics Data System (ADS)
Miranda, Rodrigo A.; Martins, Alexandre A.; Ferreira, José L.
2017-10-01
The cylindrical Hall thruster (CHT) is a propulsion device that offers high propellant utilization and performance at smaller dimensions and lower power levels than traditional Hall thrusters. In this paper we present first results of a numerical model of a CHT. This model solves particle and field dynamics self-consistently using a particle-in-cell approach. We describe a number of techniques applied to reduce the execution time of the numerical simulations. The specific impulse and thrust computed from our simulations are in agreement with laboratory experiments. This simplified model will allow for a detailed analysis of different thruster operational parameters and obtain an optimal configuration to be implemented at the Plasma Physics Laboratory at the University of Brasília.
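A central ingredient of particle-in-cell models such as the one described above is the particle push; the fragment below is a generic Boris-rotation pusher for a single ion in static, uniform E and B fields (the field values, charge-to-mass ratio, and time step are illustrative assumptions, not taken from the paper).

import numpy as np

def boris_push(x, v, qm, E, B, dt):
    # Advance one particle by dt using the standard Boris rotation (half E kick, B rotation, half E kick).
    v_minus = v + 0.5 * qm * E * dt
    t = 0.5 * qm * B * dt
    s = 2.0 * t / (1.0 + np.dot(t, t))
    v_prime = v_minus + np.cross(v_minus, t)
    v_plus = v_minus + np.cross(v_prime, s)
    v_new = v_plus + 0.5 * qm * E * dt
    return x + v_new * dt, v_new

qm = 7.3e5                                           # approximate charge-to-mass ratio of a singly charged xenon ion, C/kg
x, v = np.zeros(3), np.zeros(3)
E, B = np.array([0.0, 0.0, 1.0e4]), np.array([0.02, 0.0, 0.0])   # invented crossed fields, V/m and T
for _ in range(1000):
    x, v = boris_push(x, v, qm, E, B, dt=1e-8)
print("ion position after 10 microseconds:", x)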
Optimizing the Ar-Xe infrared laser on the Naval Research Laboratory's Electra generator
DOE Office of Scientific and Technical Information (OSTI.GOV)
Apruzese, J. P.; Giuliani, J. L.; Wolford, M. F.
2008-07-01
The Ar-Xe infrared laser has been investigated in several series of experiments carried out on the Naval Research Laboratory's Electra generator. Our primary goals were to optimize the efficiency of the laser (within Electra's capabilities) and to gain understanding of the main physical processes underlying the laser's output as a function of controllable parameters such as Xe fraction, power deposition, and gas pressure. We find that the intrinsic efficiency maximizes at ~3% at a total pressure of 2.5 atm, Xe fraction of 1%, and electron beam power deposition density of 50-100 kW cm⁻³. We deployed an interferometer to measure the electron density during lasing; the ionization fractions of 10⁻⁵-10⁻⁴ that it detected well exceed previous theoretical estimates. Some trends in the data as a function of beam power and xenon fraction are not fully understood. The as-yet incomplete picture of Ar-Xe laser physics is likely traceable in large part to significant uncertainties still present in many important rates influencing the atomic and molecular kinetics.
Lopez-Alvarez, Blady; Torres-Palma, Ricardo A; Ferraro, Franklin; Peñuela, Gustavo
2012-01-01
The degradation of the pesticide carbofuran (CBF) using solar photo-Fenton treatment, at both the laboratory and the pilot scale, was evaluated. At the laboratory scale, in a suntest reactor, the Fe²⁺ concentration and H₂O₂ concentration were evaluated and optimized using the surface response methodology and the Pareto diagram. Under optimal conditions experiments were performed to evaluate the evolution of the substrate removal, oxidation, subsequent mineralization, toxicity and the formation of chloride ions during the treatment. The analysis and evolution of five CBF by-products as well as several control and reactivity tests at the density functional theory level were used to depict a general scheme of the main degradation pathway of CBF via the photo-Fenton system. Finally, at the pilot scale, a sample of the commercial CBF product Furadan was eliminated after 420 min by the photo-Fenton system using direct sunlight. Under these conditions, after 900 min 89% of toxicity (1/E₅₀ on Vibrio fischeri bacteria), 97% of chemical oxygen demand, and 90% of dissolved organic carbon were removed.
Personalized Learning: From Neurogenetics of Behaviors to Designing Optimal Language Training
Wong, Patrick C. M.; Vuong, Loan; Liu, Kevin
2016-01-01
Variability in drug responsivity has prompted the development of Personalized Medicine, which has shown great promise in utilizing genotypic information to develop safer and more effective drug regimens for patients. Similarly, individual variability in learning outcomes has puzzled researchers who seek to create optimal learning environments for students. “Personalized Learning” seeks to identify genetic, neural and behavioral predictors of individual differences in learning and aims to use predictors to help create optimal teaching paradigms. Evidence for Personalized Learning can be observed by connecting research in pharmacogenomics, cognitive genetics and behavioral experiments across domains of learning, which provides a framework for conducting empirical studies from the laboratory to the classroom and holds promise for addressing learning effectiveness in the individual learners. Evidence can also be seen in the subdomain of speech learning, thus providing initial support for the applicability of Personalized Learning to language. PMID:27720749
The PADME calorimeters for missing mass dark photon searches
NASA Astrophysics Data System (ADS)
Ferrarotto, F.
2018-03-01
In this paper we will present the design and expected performance for the Electromagnetic and Small Angle Calorimeters (ECAL, SAC) of the PADME experiment. The design of the calorimeters has been optimized for the detection of the final state γ from the annihilation production (and subsequent "invisible" decay) of a "Dark Photon" produced by a positron beam on a thin, low-Z target. Beam tests were performed in 2016 and 2017 at the INFN Frascati National Laboratories Linac Beam Test Facility (BTF) with positron beams of energy 100–400 MeV, and results are presented. The PADME experiment will be built at the INFN Frascati National Laboratories by the end of 2017 and will be taking data in 2018 (and possibly also 2019). At the moment the collaboration is composed of the following institutions: INFN Roma and "La Sapienza" University of Roma, INFN Frascati, INFN Lecce and University of Salento, MTA Atomki Debrecen, University of Sofia, Cornell University, U.S. William and Mary College.
Parallel collisionless shocks forming in simulations of the LAPD experiment
NASA Astrophysics Data System (ADS)
Weidl, Martin S.; Jenko, Frank; Niemann, Chris; Winske, Dan
2016-10-01
Research on parallel collisionless shocks, most prominently occurring in the Earth's bow shock region, has so far been limited to satellite measurements and simulations. However, the formation of collisionless shocks depends on a wide range of parameters and scales, which can be accessed more easily in a laboratory experiment. Using a kJ-class laser, an ongoing experimental campaign at the Large Plasma Device (LAPD) at UCLA is expected to produce the first laboratory measurements of the formation of a parallel collisionless shock. We present hybrid kinetic/MHD simulations that show how beam instabilities in the background plasma can be driven by ablating carbon ions from a target, causing non-linear density oscillations which develop into a propagating shock front. The free-streaming carbon ions can excite both the resonant right-hand instability and the non-resonant firehose mode. We analyze their respective roles and discuss optimizing their growth rates to speed up the process of shock formation.
Müller, Alexander; Weiss, Stefan C; Beisswenger, Judith; Leukhardt, H Georg; Schulz, Wolfgang; Seitz, Wolfram; Ruck, Wolfgang K L; Weber, Walter H
2012-03-01
During the treatment of surface water to drinking water, ozonation is often used for disinfection and to remove organic trace substances, whereby oxidation by-products can be formed. Here we use the example of tolyltriazole to describe an approach for identifying relevant oxidation by-products in the laboratory and subsequently detecting them in an industrial-scale process. The identification process involves ozonation experiments with pure substances at laboratory level (concentration range mg L⁻¹). The reaction solutions from different ozone contact times were analyzed by high performance liquid chromatography - quadrupole time-of-flight mass spectrometry (HPLC-QTOF-MS) in full scan mode. Various approaches were used to detect the oxidation by-products: (i) target searches of postulated oxidation by-products, (ii) comparisons of chromatograms (e.g., UV/VIS) of the different samples, and (iii) color-coded abundance time courses (kinetics) of all detected compounds, illustrated as a kind of heat map. MS/MS, H/D exchange, and derivatization experiments were used for structure elucidation of the detected by-products. Because of the low concentrations of contaminants (ng L⁻¹ range) in the untreated water, transferring results from laboratory experiments to the industrial scale required the use of HPLC-MS/MS with sample enrichment (e.g., solid-phase extraction). In cases where reference substances were not available or oxidation by-products without clear structures were detected, reaction solutions from laboratory experiments were used to optimize the analytical method to detect ng L⁻¹ levels in the samples from the industrial processes. We demonstrated the effectiveness of the methodology using the industrial chemicals 4- and 5-methyl-1H-benzotriazole (4- and 5-MBT) as an example. Moreover, we not only tentatively identified several oxidation by-products in the laboratory experiments, but also detected three of the eleven reaction products in the outlet of the full-scale ozonation unit. Copyright © 2011 Elsevier Ltd. All rights reserved.
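As a small illustration of the color-coded abundance-versus-contact-time display mentioned above (invented data and feature names, plotted with matplotlib; not the authors' software), the sketch below draws a heat map of normalized peak areas for a handful of detected features across ozone contact times.

import numpy as np
import matplotlib.pyplot as plt

contact_times = [0, 5, 10, 20, 40]                   # ozone contact time, minutes (invented)
features = ["parent compound", "OBP-1", "OBP-2", "OBP-3"]
areas = np.array([[1.00, 0.55, 0.30, 0.10, 0.02],    # invented, normalized peak areas per feature and contact time
                  [0.00, 0.20, 0.45, 0.60, 0.40],
                  [0.00, 0.05, 0.15, 0.35, 0.55],
                  [0.00, 0.02, 0.06, 0.12, 0.20]])

fig, ax = plt.subplots()
im = ax.imshow(areas, aspect="auto", cmap="viridis")
ax.set_xticks(range(len(contact_times)))
ax.set_xticklabels([f"{t} min" for t in contact_times])
ax.set_yticks(range(len(features)))
ax.set_yticklabels(features)
fig.colorbar(im, ax=ax, label="normalized peak area")
plt.show()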
Integration of Diagnostic Microbiology in a Model of Total Laboratory Automation.
Da Rin, Giorgio; Zoppelletto, Maira; Lippi, Giuseppe
2016-02-01
Although automation has become widely utilized in certain areas of diagnostic testing, its adoption in diagnostic microbiology has proceeded much more slowly. To describe our real-world experience of integrating an automated instrument for diagnostic microbiology (Walk-Away Specimen Processor, WASPLab) within a model of total laboratory automation (TLA). The implementation process was divided into 2 phases. The former period, lasting approximately 6 weeks, entailed the installation of the WASPLab processor to operate as a stand-alone instrumentation, whereas the latter, lasting approximately 2 weeks, involved physical connection of the WASPLab with the automation. Using the WASPLab instrument in conjunction with the TLA model, we obtained a time savings equivalent to the work of 1.2 full-time laboratory technicians for diagnostic microbiology. The connection of WASPLab to TLA allowed its management by a generalist or clinical chemistry technician, with no need for microbiology skills on the part of either worker. Hence, diagnostic microbiology could be performed by the staff that is already using the TLA, extending their activities to include processing urgent clinical chemistry and hematology specimens. The time to result was also substantially improved. According to our experience, using the WASPLab instrument as part of a TLA in diagnostic microbiology holds great promise for optimizing laboratory workflow and improving the quality of testing. © American Society for Clinical Pathology, 2015. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
Magnetostrophic balance as the optimal state for turbulent magnetoconvection
King, Eric M.; Aurnou, Jonathan M.
2015-01-01
The magnetic fields of Earth and other planets are generated by turbulent convection in the vast oceans of liquid metal within them. Although direct observation is not possible, this liquid metal circulation is thought to be dominated by the controlling influences of planetary rotation and magnetic fields through the Coriolis and Lorentz forces. Theory famously predicts that planetary dynamo systems naturally settle into the so-called magnetostrophic state, where the Coriolis and Lorentz forces partially cancel, and convection is optimally efficient. Although this magnetostrophic theory correctly predicts the strength of Earth’s magnetic field, no laboratory experiments have reached the magnetostrophic regime in turbulent liquid metal convection. Furthermore, computational dynamo simulations have as yet failed to produce a magnetostrophic dynamo, which has led some to question the existence of the magnetostrophic state. Here, we present results from the first, to our knowledge, turbulent, magnetostrophic convection experiments using the liquid metal gallium. We find that turbulent convection in the magnetostrophic regime is, in fact, maximally efficient. The experimental results clarify these previously disparate results, suggesting that the dynamically optimal magnetostrophic state is the natural expression of turbulent planetary dynamo systems. PMID:25583512
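The balance between Lorentz and Coriolis forces in such experiments is conventionally summarized by the Elsasser number, a standard dimensionless group (quoted here in its textbook form rather than this paper's specific notation), with the magnetostrophic regime corresponding to values of order unity:

\Lambda = \frac{\sigma B^{2}}{\rho \Omega}

where σ is the electrical conductivity of the liquid metal, B the imposed magnetic field strength, ρ the fluid density, and Ω the rotation rate.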
Adding value to laboratory medicine: a professional responsibility.
Beastall, Graham H
2013-01-01
Laboratory medicine is a medical specialty at the centre of healthcare. When used optimally laboratory medicine generates knowledge that can facilitate patient safety, improve patient outcomes, shorten patient journeys and lead to more cost-effective healthcare. Optimal use of laboratory medicine relies on dynamic and authoritative leadership outside as well as inside the laboratory. The first responsibility of the head of a clinical laboratory is to ensure the provision of a high quality service across a wide range of parameters culminating in laboratory accreditation against an international standard, such as ISO 15189. From that essential baseline the leadership of laboratory medicine at local, national and international level needs to 'add value' to ensure the optimal delivery, use, development and evaluation of the services provided for individuals and for groups of patients. A convenient tool to illustrate added value is use of the mnemonic 'SCIENCE'. This tool allows added value to be considered in seven domains: standardisation and harmonisation; clinical effectiveness; innovation; evidence-based practice; novel applications; cost-effectiveness; and education of others. The assessment of added value in laboratory medicine may be considered against a framework that comprises three dimensions: operational efficiency; patient management; and patient behaviours. The profession and the patient will benefit from sharing examples of adding value to laboratory medicine.
The credibility crisis in research: Can economics tools help?
Gall, Thomas; Ioannidis, John P. A.; Maniadis, Zacharias
2017-01-01
The issue of nonreplicable evidence has attracted considerable attention across biomedical and other sciences. This concern is accompanied by an increasing interest in reforming research incentives and practices. How to optimally perform these reforms is a scientific problem in itself, and economics has several scientific methods that can help evaluate research reforms. Here, we review these methods and show their potential. Prominent among them are mathematical modeling and laboratory experiments that constitute affordable ways to approximate the effects of policies with wide-ranging implications. PMID:28445470
Low frequency seismic noise acquisition and analysis with tunable monolithic horizontal sensors
NASA Astrophysics Data System (ADS)
Acernese, Fausto; De Rosa, Rosario; Giordano, Gerardo; Romano, Rocco; Vilasi, Silvia; Barone, Fabrizio
2011-04-01
In this paper we describe the scientific data recorded by mechanical monolithic horizontal sensor prototypes located in the Gran Sasso Laboratory of the INFN. The mechanical monolithic sensors, developed at the University of Salerno, are placed, in thermally insulating enclosures, onto concrete slabs connected to the bedrock. The main goal of this experiment is to seismically characterize the sites in the frequency band 10⁻⁴-10 Hz and to get all the necessary information to optimize the sensor.
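As a minimal sketch of the band-limited site characterization described above (synthetic data only; the sampling rate, signal, and units are invented), the fragment below estimates the power spectral density of a seismometer-like channel with Welch's method and restricts it to the 10⁻⁴-10 Hz band of interest.

import numpy as np
from scipy.signal import welch

fs = 50.0                                            # sampling rate, Hz (invented)
t = np.arange(0, 6 * 3600, 1.0 / fs)                 # six hours of synthetic data
x = 1e-7 * np.sin(2 * np.pi * 0.2 * t) + 1e-8 * np.random.default_rng(0).normal(size=t.size)

f, Pxx = welch(x, fs=fs, nperseg=int(fs * 3600))     # one-hour segments to resolve low frequencies
band = (f >= 1e-4) & (f <= 10.0)
print("PSD bins falling in the 1e-4 to 10 Hz band:", int(band.sum()))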
Parametrization of turbulence models using 3DVAR data assimilation in laboratory conditions
NASA Astrophysics Data System (ADS)
Olbert, A. I.; Nash, S.; Ragnoli, E.; Hartnett, M.
2013-12-01
In this research the 3DVAR data assimilation scheme is implemented in the numerical model DIVAST in order to optimize the performance of the numerical model by selecting an appropriate turbulence scheme and tuning its parameters. Two turbulence closure schemes, the Prandtl mixing length model and the two-equation k-ε model, were incorporated into DIVAST and examined with respect to their universality of application, complexity of solutions, computational efficiency and numerical stability. A square harbour with one symmetrical entrance subject to tide-induced flows was selected to investigate the structure of turbulent flows. The experimental part of the research was conducted in a tidal basin. A significant advantage of such a laboratory experiment is a fully controlled environment where domain setup and forcing are user-defined. The research shows that the Prandtl mixing length model and the two-equation k-ε model, with default parameterization predefined according to literature recommendations, overestimate eddy viscosity, which in turn results in a significant underestimation of velocity magnitudes in the harbour. The data assimilation of the model-predicted velocity and laboratory observations significantly improves model predictions for both turbulence models by adjusting modelled flows in the harbour to match de-errored observations. Such analysis gives an optimal solution based on which numerical model parameters can be estimated. The process of turbulence model optimization by reparameterization and tuning towards the optimal state led to new constants that may potentially be applied to complex turbulent flows, such as rapidly developing flows or recirculating flows. This research further demonstrates how 3DVAR can be utilized to identify and quantify shortcomings of the numerical model and consequently to improve forecasting by correct parameterization of the turbulence models. Such improvements may greatly benefit physical oceanography, in terms of understanding and monitoring of coastal systems, and the engineering sector, through applications in coastal structure design, marine renewable energy and pollutant transport.
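As a compact sketch of the analysis step that underlies 3DVAR in the linear-Gaussian case (generic matrix form with invented toy matrices, not the DIVAST configuration), the code below computes x_a = x_b + B Hᵀ (H B Hᵀ + R)⁻¹ (y - H x_b) for a tiny state vector.

import numpy as np

x_b = np.array([0.10, 0.20, 0.15, 0.05])             # background (model) velocities at 4 grid points (invented)
y = np.array([0.18, 0.12])                           # two observed velocities (invented)
H = np.array([[1.0, 0.0, 0.0, 0.0],                  # observation operator: picks grid points 1 and 3
              [0.0, 0.0, 1.0, 0.0]])
B = 0.01 * np.exp(-np.abs(np.subtract.outer(np.arange(4), np.arange(4))) / 2.0)   # correlated background errors
R = 0.005 * np.eye(2)                                # observation error covariance

K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)         # gain matrix
x_a = x_b + K @ (y - H @ x_b)                        # analysis state
print("analysis state:", x_a)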
2017-11-01
ARL-TR-8225, US Army Research Laboratory, November 2017. Methodology for Designing and Developing a New Ultra-Wideband Antenna Based on Bio-Inspired Optimization Techniques.
Further development of the dynamic gas temperature measurement system. Volume 1: Technical efforts
NASA Technical Reports Server (NTRS)
Elmore, D. L.; Robinson, W. W.; Watkins, W. B.
1986-01-01
A compensated dynamic gas temperature thermocouple measurement method was experimentally verified. Dynamic gas temperature signals from a flow passing through a chopped-wheel signal generator and an atmospheric pressure laboratory burner were measured by the dynamic temperature sensor and other fast-response sensors. Compensated data from dynamic temperature sensor thermoelements were compared with fast-response sensors. Results from the two experiments are presented as time-dependent waveforms and spectral plots. Comparisons between compensated dynamic temperature sensor spectra and a commercially available optical fiber thermometer compensated spectra were made for the atmospheric burner experiment. Increases in precision of the measurement method require optimization of several factors, and directions for further work are identified.
Experience with custom processors in space flight applications
NASA Technical Reports Server (NTRS)
Fraeman, M. E.; Hayes, J. R.; Lohr, D. A.; Ballard, B. W.; Williams, R. L.; Henshaw, R. M.
1991-01-01
The Applied Physics Laboratory (APL) has developed a magnetometer instrument for a Swedish satellite named Freja, with launch scheduled for August 1992 on a Chinese Long March rocket. The magnetometer controller utilized a custom microprocessor designed at APL with the Genesil silicon compiler. The processor evolved from our experience with an older bit-slice design and two prior single-chip efforts. The architecture of our microprocessor greatly lowered software development costs because it was optimized to provide an interactive and extensible programming environment hosted by the target hardware. Radiation tolerance of the microprocessor was also tested and was adequate for Freja's mission: 20 kRad(Si) total dose and very infrequent latch-up and single-event upsets.
Plasma density characterization at SPARC_LAB through Stark broadening of Hydrogen spectral lines
NASA Astrophysics Data System (ADS)
Filippi, F.; Anania, M. P.; Bellaveglia, M.; Biagioni, A.; Chiadroni, E.; Cianchi, A.; Di Giovenale, D.; Di Pirro, G.; Ferrario, M.; Mostacci, A.; Palumbo, L.; Pompili, R.; Shpakov, V.; Vaccarezza, C.; Villa, F.; Zigler, A.
2016-09-01
Plasma-based acceleration techniques are of great interest for future, compact accelerators due to their high accelerating gradient. Both particle-driven and laser-driven Plasma Wakefield Acceleration experiments are foreseen at the SPARC_LAB Test Facility (INFN National Laboratories of Frascati, Italy), with the aim to accelerate high-brightness electron beams. In order to optimize the efficiency of the acceleration in the plasma and preserve the quality of the accelerated beam, the knowledge of the plasma electron density is mandatory. The Stark broadening of the Hydrogen spectral lines is one of the candidates used to characterize plasma density. The implementation of this diagnostic for plasma-based experiments at SPARC_LAB is presented.
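As a rough illustration of how Stark broadening becomes a density measurement: for hydrogen Balmer lines the electron density scales approximately as the 3/2 power of the Stark full width at half maximum, n_e ≈ C · Δλ^(3/2), with the coefficient taken from tabulated line-shape calculations. The sketch below is a generic, hedged rendering of that scaling; the coefficient value and line choice are placeholders, not the SPARC_LAB calibration.

```python
def electron_density_from_stark(fwhm_nm, coeff_cm3_nm32):
    """Approximate electron density (cm^-3) from the Stark FWHM (nm) of a
    hydrogen Balmer line, using the common power-law scaling
        n_e ~ C * FWHM**1.5
    The coefficient C must be taken from tabulated Stark-broadening data
    for the chosen line and temperature; the value below is illustrative."""
    return coeff_cm3_nm32 * fwhm_nm ** 1.5

C_ILLUSTRATIVE = 1.0e16   # cm^-3 nm^-1.5, placeholder coefficient
print(f"n_e ≈ {electron_density_from_stark(1.2, C_ILLUSTRATIVE):.2e} cm^-3")
```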
Development of guided inquiry-based laboratory worksheet on topic of heat of combustion
NASA Astrophysics Data System (ADS)
Sofiani, D.; Nurhayati; Sunarya, Y.; Suryatna, A.
2018-03-01
Chemistry curriculum reform shows an explicit shift from a traditional approach to scientific inquiry. This study aims to develop a guided inquiry-based laboratory worksheet on the topic of heat of combustion. Implementation of this topic in the high school laboratory is new because previously some teachers only focused the experiment on determining the heat of neutralization. The method used in this study was development research consisting of three stages: define, design, and develop. In the define stage, curriculum analysis and material analysis were performed. In the design stage, laboratory optimization and product preparation were conducted. In the development stage, the product was evaluated by the experts and tested with a total of 20 eleventh-grade students. The instruments used in this study were an assessment sheet and a students' response questionnaire. The assessment results showed that the guided inquiry-based laboratory worksheet has very good quality based on the aspects of content, language, and graphics. The students reacted positively to the use of this guided inquiry-based worksheet, as demonstrated by the results from the questionnaire. The implication of this study is that laboratory activities should be directed toward the development of scientific inquiry skills in order to enhance students' competences as well as the quality of science education.
LaPeyre, Megan K.; Gossman, B.; La Peyre, Jerome F.
2009-01-01
In coastal Louisiana, the development of large-scale freshwater diversion projects has led to controversy over their effects on oyster resources. Using controlled laboratory experiments in combination with a field study, we examined the effects of pulsed freshwater events (freshet) of different magnitude, duration, and rate of change on oyster resources. Laboratory and field evidence indicate that low salinity events (<5 psu) decreased Perkinsus marinus infection intensities. Furthermore, when salinity was low (<5 psu), parasite infection intensities continued to decrease even as temperatures exceeded 20°C. At the same time, oyster growth was positively correlated with salinity. To maximize oyster production, data indicate that both low and high salinity events will be necessary.
Titanium clip ball joint: a partial ossicular reconstruction prosthesis.
Beutner, Dirk; Luers, Jan Christoffer; Bornitz, Matthias; Zahnert, Thomas; Huttenbrink, Karl-Bernd
2011-06-01
To describe a new titanium clip prosthesis for partial ossicular reconstruction with a micro ball joint in the headplate for compensation of tympanic membrane displacements. Laboratory experiments followed by 18 consecutive patients. A micro ball joint was implemented into the headplate of a titanium middle ear prosthesis. First, the new prosthesis was tested in the laboratory in temporal bone experiments. Second, the new prosthesis was clinically installed in 18 patients. Results of laser Doppler vibrometry and force measurements in the laboratory experiments, analysis of a questionnaire, and preoperative and postoperative pure tone audiometry. The frictional resistance in the joint was measured to be 12 mN, which should allow for adequate mobility under physiologic conditions. The effective sound transmission of the prosthesis was demonstrated by laser Doppler vibrometry. Intraoperatively, the installation of the prosthesis was always straightforward, with headplate-prosthesis shaft angles between 60 and 90 degrees. Postoperatively, pure tone audiometry revealed satisfying hearing results with a remaining average air-bone gap of 18.2 dB over the frequencies 500, 1,000, 2,000, and 3,000 Hz. No signs of prosthesis dislocation were discovered within the follow-up period of approximately 6 months. The experimental data show that the new modified prosthesis headplate fulfills the requirements necessary for sound transmission. The joint allows the plate to follow movements of the tympanic membrane. This characteristic, in conjunction with the proven clip design, ensures optimal prosthesis placement and effectiveness.
Research of facial feature extraction based on MMC
NASA Astrophysics Data System (ADS)
Xue, Donglin; Zhao, Jiufen; Tang, Qinhong; Shi, Shaokun
2017-07-01
Based on the maximum margin criterion (MMC), a new algorithm for statistically uncorrelated optimal discriminant vectors and a new algorithm for orthogonal optimal discriminant vectors for feature extraction are proposed. The purpose of the maximum margin criterion is to maximize the inter-class scatter while simultaneously minimizing the intra-class scatter after the projection. Compared with the original MMC method and the principal component analysis (PCA) method, the proposed methods are better at reducing or eliminating the statistical correlation between features and improving the recognition rate. The experimental results on the Olivetti Research Laboratory (ORL) face database show that the new statistically uncorrelated maximum margin criterion (SUMMC) feature extraction method is better in terms of recognition rate and stability. In addition, the relations between the maximum margin criterion and the Fisher criterion for feature extraction are revealed.
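For context, the core MMC computation is an eigendecomposition: projection directions maximize tr(Wᵀ(S_b − S_w)W), i.e. they are the leading eigenvectors of the between-class minus within-class scatter. The sketch below shows only that basic MMC step, not the statistically uncorrelated or orthogonal variants proposed in the paper; the usage names are placeholders.

```python
import numpy as np

def mmc_projection(X, labels, n_components):
    """Basic maximum margin criterion (MMC) feature extraction:
    return the n_components leading eigenvectors of (S_b - S_w)."""
    X = np.asarray(X, dtype=float)
    labels = np.asarray(labels)
    mean_all = X.mean(axis=0)
    d = X.shape[1]
    S_b = np.zeros((d, d))      # between-class scatter
    S_w = np.zeros((d, d))      # within-class scatter
    for c in np.unique(labels):
        Xc = X[labels == c]
        mc = Xc.mean(axis=0)
        diff = (mc - mean_all)[:, None]
        S_b += len(Xc) * diff @ diff.T
        S_w += (Xc - mc).T @ (Xc - mc)
    # (S_b - S_w) is symmetric, so eigh returns real eigenpairs.
    eigvals, eigvecs = np.linalg.eigh(S_b - S_w)
    order = np.argsort(eigvals)[::-1]       # largest margin first
    return eigvecs[:, order[:n_components]]

# Usage (placeholder names): W = mmc_projection(train_faces, train_ids, 40)
```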
Optimization of plasma amplifiers
Sadler, James D.; Trines, Raoul M. G. M.; Tabak, Max; ...
2017-05-24
Here, plasma amplifiers offer a route to side-step limitations on chirped pulse amplification and generate laser pulses at the power frontier. They compress long pulses by transferring energy to a shorter pulse via the Raman or Brillouin instabilities. We present an extensive kinetic numerical study of the three-dimensional parameter space for the Raman case. Further particle-in-cell simulations find the optimal seed pulse parameters for experimentally relevant constraints. The high-efficiency self-similar behavior is observed only for seeds shorter than the linear Raman growth time. A test case similar to an upcoming experiment at the Laboratory for Laser Energetics is found to maintain good transverse coherence and high-energy efficiency. Effective compression of a 10 kJ, nanosecond-long driver pulse is also demonstrated in a 15-cm-long amplifier.
Optimization of plasma amplifiers
NASA Astrophysics Data System (ADS)
Sadler, James D.; Trines, Raoul M. G. M.; Tabak, Max; Haberberger, Dan; Froula, Dustin H.; Davies, Andrew S.; Bucht, Sara; Silva, Luís O.; Alves, E. Paulo; Fiúza, Frederico; Ceurvorst, Luke; Ratan, Naren; Kasim, Muhammad F.; Bingham, Robert; Norreys, Peter A.
2017-05-01
Plasma amplifiers offer a route to side-step limitations on chirped pulse amplification and generate laser pulses at the power frontier. They compress long pulses by transferring energy to a shorter pulse via the Raman or Brillouin instabilities. We present an extensive kinetic numerical study of the three-dimensional parameter space for the Raman case. Further particle-in-cell simulations find the optimal seed pulse parameters for experimentally relevant constraints. The high-efficiency self-similar behavior is observed only for seeds shorter than the linear Raman growth time. A test case similar to an upcoming experiment at the Laboratory for Laser Energetics is found to maintain good transverse coherence and high-energy efficiency. Effective compression of a 10 kJ, nanosecond-long driver pulse is also demonstrated in a 15-cm-long amplifier.
Improved detonation modeling with CHEETAH
DOE Office of Scientific and Technical Information (OSTI.GOV)
Heller, A.
1997-11-01
A Livermore software program called CHEETAH, an important, even indispensable tool for energetic materials researchers worldwide, was made more powerful in the summer of 1997 with the release of CHEETAH 2.0, an advanced version that simulates a wider variety of detonations. Derived from more than 40 years of experiments on high explosives at Lawrence Livermore and Los Alamos national laboratories, CHEETAH predicts the results from detonating a mixture of specified reactants. It operates by solving thermodynamic equations to predict detonation products and such properties as temperature, pressure, volume, and total energy released. The code is prized by synthesis chemists and other researchers because it allows them to vary the starting molecules and conditions to optimize the desired performance properties. One of the Laboratory's most popular computer codes, CHEETAH is used at more than 200 sites worldwide, including ones in England, Canada, Sweden, Switzerland, and France. Most sites are defense-related, although a few users, such as Japanese fireworks researchers, are in the civilian sector.
Process Development in the Teaching Laboratory
NASA Astrophysics Data System (ADS)
Klein, Leonard C.; Dana, Susanne M.
1998-06-01
Many experiences in high school and undergraduate laboratories are well-tested cookbook recipes that have already been designed to yield optimal results; the well-known synthesis of aspirin is such an example. In this project for advanced placement or second-year high school chemistry students, students mimic the process development in industrial laboratories by investigating the effect of varying conditions in the synthesis of aspirin. The class decides on criteria that should be explored (quantity of catalyst, temperature of reaction, etc.). The class is then divided into several teams with each team assigned a variable to study. Each team must submit a proposal describing how they will explore the variable before they start their study. After data on yield and purity have been gathered and evaluated, students discuss which method is most desirable, based on their agreed-upon criteria. This exercise provides an opportunity for students to review many topics from the course (rate of reaction, limiting reagents, Beer's Law) while participating in a cooperative exercise designed to imitate industrial process development.
Optimizing laboratory animal stress paradigms: The H-H* experimental design.
McCarty, Richard
2017-01-01
Major advances in behavioral neuroscience have been facilitated by the development of consistent and highly reproducible experimental paradigms that have been widely adopted. In contrast, many different experimental approaches have been employed to expose laboratory mice and rats to acute versus chronic intermittent stress. An argument is advanced in this review that more consistent approaches to the design of chronic intermittent stress experiments would provide greater reproducibility of results across laboratories and greater reliability relating to various neural, endocrine, immune, genetic, and behavioral adaptations. As an example, the H-H* experimental design incorporates control, homotypic (H), and heterotypic (H*) groups and allows for comparisons across groups, where each animal is exposed to the same stressor, but that stressor has vastly different biological and behavioral effects depending upon each animal's prior stress history. Implementation of the H-H* experimental paradigm makes possible a delineation of transcriptional changes and neural, endocrine, and immune pathways that are activated in precisely defined stressor contexts. Copyright © 2016 Elsevier Ltd. All rights reserved.
Multi-Objective Design Of Optimal Greenhouse Gas Observation Networks
NASA Astrophysics Data System (ADS)
Lucas, D. D.; Bergmann, D. J.; Cameron-Smith, P. J.; Gard, E.; Guilderson, T. P.; Rotman, D.; Stolaroff, J. K.
2010-12-01
One of the primary scientific functions of a Greenhouse Gas Information System (GHGIS) is to infer GHG source emission rates and their uncertainties by combining measurements from an observational network with atmospheric transport modeling. Certain features of the observational networks that serve as inputs to a GHGIS (for example, sampling location and frequency) can greatly impact the accuracy of the retrieved GHG emissions. Observation System Simulation Experiments (OSSEs) provide a framework to characterize emission uncertainties associated with a given network configuration. By minimizing these uncertainties, OSSEs can be used to determine optimal sampling strategies. Designing a real-world GHGIS observing network, however, will involve multiple, conflicting objectives; there will be trade-offs between sampling density, coverage and measurement costs. To address these issues, we have added multi-objective optimization capabilities to OSSEs. We demonstrate these capabilities by quantifying the trade-offs between retrieval error and measurement costs for a prototype GHGIS, and deriving GHG observing networks that are Pareto optimal. [LLNL-ABS-452333: This work performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344.]
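A minimal sketch of the multi-objective step described above: given candidate network configurations scored by retrieval error and measurement cost, keep only those not dominated by any other candidate (the Pareto set). The names and numbers below are illustrative, not the OSSE machinery itself.

```python
def pareto_front(candidates):
    """Return the non-dominated (error, cost, label) candidates: a candidate
    is kept unless another candidate is at least as good in both objectives
    and strictly better in one."""
    front = []
    for i, (err_i, cost_i, cfg_i) in enumerate(candidates):
        dominated = any(
            (err_j <= err_i and cost_j <= cost_i) and
            (err_j < err_i or cost_j < cost_i)
            for j, (err_j, cost_j, _) in enumerate(candidates) if j != i
        )
        if not dominated:
            front.append((err_i, cost_i, cfg_i))
    return front

# Illustrative candidates: (retrieval error, annual cost, network label)
nets = [(0.9, 1.0, "sparse"), (0.5, 3.0, "regional"),
        (0.3, 8.0, "dense"), (0.6, 7.0, "redundant")]
print(pareto_front(nets))   # "redundant" is dominated by "regional"
```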
NASA Technical Reports Server (NTRS)
Patten, W. N.; Robertshaw, H. H.; Pierpont, D.; Wynn, R. H.
1989-01-01
A new, near-optimal feedback control technique is introduced that is shown to provide excellent vibration attenuation for those distributed parameter systems that are often encountered in the areas of aeroservoelasticity and large space systems. The technique relies on a novel solution methodology for the classical optimal control problem. Specifically, the quadratic regulator control problem for a flexible vibrating structure is first cast in a weak functional form that admits an approximate solution. The necessary conditions (first-order) are then solved via a time finite-element method. The procedure produces a low dimensional, algebraic parameterization of the optimal control problem that provides a rigorous basis for a discrete controller with a first-order like hold output. Simulation has shown that the algorithm can successfully control a wide variety of plant forms including multi-input/multi-output systems and systems exhibiting significant nonlinearities. In order to firmly establish the efficacy of the algorithm, a laboratory control experiment was implemented to provide planar (bending) vibration attenuation of a highly flexible beam (with a first clamped-free mode of approximately 0.5 Hz).
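For orientation, the classical quadratic regulator problem that the weak-form, time-finite-element method approximates can be stated and solved compactly for a toy model. The sketch below uses the standard continuous-time LQR solution via the algebraic Riccati equation; it illustrates only the underlying problem, not the near-optimal algorithm of the paper, and all matrices and values are made up.

```python
import numpy as np
from scipy.linalg import solve_continuous_are

def lqr_gain(A, B, Q, R):
    """Continuous-time LQR: minimize the integral of x'Qx + u'Ru subject to
    x_dot = A x + B u.  Returns the state-feedback gain K (u = -K x)."""
    P = solve_continuous_are(A, B, Q, R)
    return np.linalg.solve(R, B.T @ P)

# Toy single flexible mode near 0.5 Hz with light damping (illustrative).
w = 2 * np.pi * 0.5
A = np.array([[0.0, 1.0], [-w**2, -0.02 * w]])
B = np.array([[0.0], [1.0]])
Q = np.diag([w**2, 1.0])     # penalize displacement and rate energy
R = np.array([[0.1]])        # penalize actuator effort
print(lqr_gain(A, B, Q, R))
```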
Effective utilization of clinical laboratories.
Murphy, J; Henry, J B
1978-11-01
Effective utilization of clinical laboratories requires that underutilization, overutilization, and malutilization be appreciated and eliminated or reduced. Optimal patient care service, although subjective to a major extent, is reflected in terms of outcome and cost. Increased per diem charges, reduced hospital stay, and increased laboratory workload over the past decade all require each laboratory to examine its internal operations to achieve economy and efficiency as well as maximal effectiveness. Increased research and development, an active managerial role on the part of pathologists, internal self-assessment, and an aggressive response to sophisticated scientific and clinical laboratory data base requirements are not only desirable but essential. The importance of undergraduate and graduate medical education in laboratory medicine to insure understanding as well as effective utilization is stressed. The costs and limitations as well as the accuracy, precision, sensitivity, specificity, and pitfalls of measurements and examinations must also be fully appreciated. Medical malpractice and defensive medicine and the use of critical values, emergency and routine services, and an active clinical role by the pathologist are of the utmost value in assuring effective utilization of the laboratory. A model for the optimal use of the laboratory including economy and efficiency has been achieved in the blood bank in regard to optimal hemotherapy for elective surgery, assuring superior patient care in a cost effective and safe manner.
Optimization of a chemical identification algorithm
NASA Astrophysics Data System (ADS)
Chyba, Thomas H.; Fisk, Brian; Gunning, Christin; Farley, Kevin; Polizzi, Amber; Baughman, David; Simpson, Steven; Slamani, Mohamed-Adel; Almassy, Robert; Da Re, Ryan; Li, Eunice; MacDonald, Steve; Slamani, Ahmed; Mitchell, Scott A.; Pendell-Jones, Jay; Reed, Timothy L.; Emge, Darren
2010-04-01
A procedure to evaluate and optimize the performance of a chemical identification algorithm is presented. The Joint Contaminated Surface Detector (JCSD) employs Raman spectroscopy to detect and identify surface chemical contamination. JCSD measurements of chemical warfare agents, simulants, toxic industrial chemicals, interferents and bare surface backgrounds were made in the laboratory and under realistic field conditions. A test data suite, developed from these measurements, is used to benchmark algorithm performance throughout the improvement process. In any one measurement, one of many possible targets can be present along with interferents and surfaces. The detection results are expressed as a 2-category classification problem so that Receiver Operating Characteristic (ROC) techniques can be applied. The limitations of applying this framework to chemical detection problems are discussed along with means to mitigate them. Algorithmic performance is optimized globally using robust Design of Experiments and Taguchi techniques. These methods require figures of merit to trade off between false alarms and detection probability. Several figures of merit, including the Matthews Correlation Coefficient and the Taguchi Signal-to-Noise Ratio are compared. Following the optimization of global parameters which govern the algorithm behavior across all target chemicals, ROC techniques are employed to optimize chemical-specific parameters to further improve performance.
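Of the figures of merit mentioned, the Matthews Correlation Coefficient has a simple closed form; the snippet below is a small illustrative helper (not part of the JCSD code) that computes it from a 2-category confusion matrix.

```python
import math

def matthews_corrcoef(tp, fp, tn, fn):
    """Matthews Correlation Coefficient for a binary (detect / no-detect)
    confusion matrix; returns 0 when any marginal total is empty."""
    denom = math.sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
    return 0.0 if denom == 0 else (tp * tn - fp * fn) / denom

# Example: 90 correct detections, 5 false alarms, 80 correct rejections,
# 10 missed targets (numbers are illustrative).
print(round(matthews_corrcoef(90, 5, 80, 10), 3))
```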
NASA Astrophysics Data System (ADS)
Guenet, Bertrand; Esteban Moyano, Fernando; Peylin, Philippe; Ciais, Philippe; Janssens, Ivan A.
2016-03-01
Priming of soil carbon decomposition encompasses different processes through which the decomposition of native (already present) soil organic matter is amplified through the addition of new organic matter, with new inputs typically being more labile than the native soil organic matter. Evidence for priming comes from laboratory and field experiments, but to date there is no estimate of its impact at global scale and under the current anthropogenic perturbation of the carbon cycle. Current soil carbon decomposition models do not include priming mechanisms, thereby introducing uncertainty when extrapolating short-term local observations to ecosystem and regional to global scale. In this study we present a simple conceptual model of decomposition priming, called PRIM, able to reproduce laboratory (incubation) and field (litter manipulation) priming experiments. Parameters for this model were first optimized against data from 20 soil incubation experiments using a Bayesian framework. The optimized parameter values were evaluated against another set of soil incubation data independent from the ones used for calibration and the PRIM model reproduced the soil incubations data better than the original, CENTURY-type soil decomposition model, whose decomposition equations are based only on first-order kinetics. We then compared the PRIM model and the standard first-order decay model incorporated into the global land biosphere model ORCHIDEE (Organising Carbon and Hydrology In Dynamic Ecosystems). A test of both models was performed at ecosystem scale using litter manipulation experiments from five sites. Although both versions were equally able to reproduce observed decay rates of litter, only ORCHIDEE-PRIM could simulate the observed priming (R2 = 0.54) in cases where litter was added or removed. This result suggests that a conceptually simple and numerically tractable representation of priming adapted to global models is able to capture the sign and magnitude of the priming of litter and soil organic matter.
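To make the priming idea concrete, a schematic two-pool formulation (not the exact PRIM equations) lets the decomposition rate of native soil carbon increase with the amount of fresh organic carbon, in contrast with pure first-order kinetics. Parameter names and values below are illustrative assumptions only.

```python
import numpy as np

def step_pools(fresh, soc, dt, k_fresh=0.05, k_soc=0.001, c_prime=0.2):
    """One Euler step (per day) of a schematic priming model: fresh carbon
    decays first-order, while native SOC decays at a rate amplified by the
    presence of fresh carbon (the priming effect)."""
    d_fresh = -k_fresh * fresh
    d_soc = -k_soc * (1.0 + c_prime * (1.0 - np.exp(-fresh))) * soc
    return fresh + dt * d_fresh, soc + dt * d_soc

# Compare SOC loss over one year with and without a fresh-litter addition.
fresh, soc = 2.0, 100.0        # kg C m^-2, illustrative
soc_ctrl = 100.0               # control run without fresh input
for _ in range(365):
    fresh, soc = step_pools(fresh, soc, 1.0)
    _, soc_ctrl = step_pools(0.0, soc_ctrl, 1.0)
print(soc_ctrl - soc)          # extra (primed) SOC loss, kg C m^-2
```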
NASA Astrophysics Data System (ADS)
Guenet, B.; Moyano, F. E.; Peylin, P.; Ciais, P.; Janssens, I. A.
2015-10-01
Priming of soil carbon decomposition encompasses different processes through which the decomposition of native (already present) soil organic matter is amplified through the addition of new organic matter, with new inputs typically being more labile than the native soil organic matter. Evidence for priming comes from laboratory and field experiments, but to date there is no estimate of its impact at global scale and under the current anthropogenic perturbation of the carbon cycle. Current soil carbon decomposition models do not include priming mechanisms, thereby introducing uncertainty when extrapolating short-term local observations to ecosystem and regional to global scale. In this study we present a simple conceptual model of decomposition priming, called PRIM, able to reproduce laboratory (incubation) and field (litter manipulation) priming experiments. Parameters for this model were first optimized against data from 20 soil incubation experiments using a Bayesian framework. The optimized parameter values were evaluated against another set of soil incubation data independent from the ones used for calibration and the PRIM model reproduced the soil incubations data better than the original, CENTURY-type soil decomposition model, whose decomposition equations are based only on first order kinetics. We then compared the PRIM model and the standard first order decay model incorporated into the global land biosphere model ORCHIDEE. A test of both models was performed at ecosystem scale using litter manipulation experiments from 5 sites. Although both versions were equally able to reproduce observed decay rates of litter, only ORCHIDEE-PRIM could simulate the observed priming (R2 = 0.54) in cases where litter was added or removed. This result suggests that a conceptually simple and numerically tractable representation of priming adapted to global models is able to capture the sign and magnitude of the priming of litter and soil organic matter.
Improving Sensorimotor Function and Adaptation using Stochastic Vestibular Stimulation
NASA Technical Reports Server (NTRS)
Galvan, R. C.; Bloomberg, J. J.; Mulavara, A. P.; Clark, T. K.; Merfeld, D. M.; Oman, C. M.
2014-01-01
Astronauts experience sensorimotor changes during adaptation to G-transitions that occur when entering and exiting microgravity. Post space flight, these sensorimotor disturbances can include postural and gait instability, visual performance changes, manual control disruptions, spatial disorientation, and motion sickness, all of which can hinder the operational capabilities of the astronauts. Crewmember safety would be significantly increased if sensorimotor changes brought on by gravitational changes could be mitigated and adaptation could be facilitated. The goal of this research is to investigate and develop the use of electrical stochastic vestibular stimulation (SVS) as a countermeasure to augment sensorimotor function and facilitate adaptation. For this project, SVS will be applied via electrodes on the mastoid processes at imperceptible amplitude levels. We hypothesize that SVS will improve sensorimotor performance through the phenomenon of stochastic resonance, which occurs when the response of a nonlinear system to a weak input signal is optimized by the application of a particular nonzero level of noise. In line with the theory of stochastic resonance, a specific optimal level of SVS will be found and tested for each subject [1]. Three experiments are planned to investigate the use of SVS in sensory-dependent tasks and performance. The first experiment will aim to demonstrate stochastic resonance in the vestibular system through perception-based motion recognition thresholds obtained using a 6-degree-of-freedom Stewart platform in the Jenks Vestibular Laboratory at Massachusetts Eye and Ear Infirmary. A range of SVS amplitudes will be applied to each subject and the subject-specific optimal SVS level will be identified as that which results in the lowest motion recognition threshold, through previously established, well-developed methods [2,3,4]. The second experiment will investigate the use of optimal SVS in facilitating sensorimotor adaptation to system disturbances. Subjects will adapt to wearing minifying glasses, resulting in decreased vestibular ocular reflex (VOR) gain. The VOR gain will then be intermittently measured while the subject readapts to normal vision, with and without optimal SVS. We expect that optimal SVS will cause a steepening of the adaptation curve. The third experiment will test the use of optimal SVS in an operationally relevant aerospace task, using the tilt translation sled at NASA Johnson Space Center, a test platform capable of recreating the tilt-gain and tilt-translation illusions associated with landing of a spacecraft post-space flight. In this experiment, a perception-based manual control measure will be used to compare performance with and without optimal SVS. We expect performance to improve in this task when optimal SVS is applied. The ultimate goal of this work is to systematically investigate and further understand the potential benefits of stochastic vestibular stimulation in the context of human space flight so that it may be used in the future as a component of a comprehensive countermeasure plan for adaptation to G-transitions.
Thiex, Nancy
2016-01-01
A previously validated method for the determination of nitrogen release patterns of slow- and controlled-release fertilizers (SRFs and CRFs, respectively) was submitted to the Expert Review Panel (ERP) for Fertilizers for consideration of First Action Official Method(SM) status. The ERP evaluated the single-laboratory validation results and recommended the method for First Action Official Method status and provided recommendations for achieving Final Action. The 180 day soil incubation-column leaching technique was demonstrated to be a robust and reliable method for characterizing N release patterns from SRFs and CRFs. The method was reproducible, and the results were only slightly affected by variations in environmental factors such as microbial activity, soil moisture, temperature, and texture. The release of P and K were also studied, but at fewer replications than for N. Optimization experiments on the accelerated 74 h extraction method indicated that temperature was the only factor found to substantially influence nutrient-release rates from the materials studied, and an optimized extraction profile was established as follows: 2 h at 25°C, 2 h at 50°C, 20 h at 55°C, and 50 h at 60°C.
Li, Liang; Mustafi, Debarshi; Fu, Qiang; Tereshko, Valentina; Chen, Delai L.; Tice, Joshua D.; Ismagilov, Rustem F.
2006-01-01
High-throughput screening and optimization experiments are critical to a number of fields, including chemistry and structural and molecular biology. The separation of these two steps may introduce false negatives and a time delay between initial screening and subsequent optimization. Although a hybrid method combining both steps may address these problems, miniaturization is required to minimize sample consumption. This article reports a “hybrid” droplet-based microfluidic approach that combines the steps of screening and optimization into one simple experiment and uses nanoliter-sized plugs to minimize sample consumption. Many distinct reagents were sequentially introduced as ≈140-nl plugs into a microfluidic device and combined with a substrate and a diluting buffer. Tests were conducted in ≈10-nl plugs containing different concentrations of a reagent. Methods were developed to form plugs of controlled concentrations, index concentrations, and incubate thousands of plugs inexpensively and without evaporation. To validate the hybrid method and demonstrate its applicability to challenging problems, crystallization of model membrane proteins and handling of solutions of detergents and viscous precipitants were demonstrated. By using 10 μl of protein solution, ≈1,300 crystallization trials were set up within 20 min by one researcher. This method was compatible with growth, manipulation, and extraction of high-quality crystals of membrane proteins, demonstrated by obtaining high-resolution diffraction images and solving a crystal structure. This robust method requires inexpensive equipment and supplies, should be especially suitable for use in individual laboratories, and could find applications in a number of areas that require chemical, biochemical, and biological screening and optimization. PMID:17159147
NASA Astrophysics Data System (ADS)
Waisman, E. M.; Reisman, D. B.; Stoltzfus, B. S.; Stygar, W. A.; Cuneo, M. E.; Haill, T. A.; Davis, J.-P.; Brown, J. L.; Seagle, C. T.; Spielman, R. B.
2016-06-01
The Thor pulsed power generator is being developed at Sandia National Laboratories. The design consists of up to 288 decoupled and transit time isolated capacitor-switch units, called "bricks," that can be individually triggered to achieve a high degree of pulse tailoring for magnetically driven isentropic compression experiments (ICE) [D. B. Reisman et al., Phys. Rev. Spec. Top.-Accel. Beams 18, 090401 (2015)]. The connecting transmission lines are impedance matched to the bricks, allowing the capacitor energy to be efficiently delivered to an ICE strip-line load with peak pressures of over 100 GPa. Thor will drive experiments to explore equation of state, material strength, and phase transition properties of a wide variety of materials. We present an optimization process for producing tailored current pulses, a requirement for many material studies, on the Thor generator. This technique, which is unique to the novel "current-adder" architecture used by Thor, entirely avoids the iterative use of complex circuit models to converge to the desired electrical pulse. We begin with magnetohydrodynamic simulations for a given material to determine its time dependent pressure and thus the desired strip-line load current and voltage. Because the bricks are connected to a central power flow section through transit-time isolated coaxial cables of constant impedance, the brick forward-going pulses are independent of each other. We observe that the desired equivalent forward-going current driving the pulse must be equal to the sum of the individual brick forward-going currents. We find a set of optimal brick delay times by requiring that the L2 norm of the difference between the brick-sum current and the desired forward-going current be a minimum. We describe the optimization procedure for the Thor design and show results for various materials of interest.
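The delay-optimization step described above lends itself to a compact illustration: because the bricks add linearly, the summed forward-going current for a trial set of delays can be compared with the desired waveform and the L2 misfit minimized over the delays. The sketch below is a generic rendering of that idea with made-up waveforms and parameters; it is not Sandia's design code.

```python
import numpy as np
from scipy.optimize import minimize

t = np.linspace(0.0, 2e-6, 2000)                     # time grid, s

def brick_current(t, delay, tau=150e-9, i_peak=40e3):
    """Illustrative single-brick forward-going current: a half-sine pulse of
    duration 2*tau starting at `delay` (shape and amplitude are placeholders)."""
    s = np.clip(t - delay, 0.0, None)
    phase = np.clip(np.pi * s / (2 * tau), 0.0, np.pi)
    return i_peak * np.sin(phase)

def misfit(delays, i_target):
    """L2 misfit between the summed brick currents and the desired current."""
    i_sum = sum(brick_current(t, d) for d in delays)
    return np.sum((i_sum - i_target) ** 2)

n_bricks = 8
i_target = 8 * 40e3 * np.exp(-((t - 1.0e-6) / 0.4e-6) ** 2)   # illustrative target
d0 = np.linspace(0.2e-6, 1.2e-6, n_bricks)                     # initial delay guess
res = minimize(misfit, d0, args=(i_target,), method="Nelder-Mead",
               options={"maxiter": 5000})
print(res.x)                                                   # optimized brick delays
```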
Waisman, E M; Reisman, D B; Stoltzfus, B S; Stygar, W A; Cuneo, M E; Haill, T A; Davis, J-P; Brown, J L; Seagle, C T; Spielman, R B
2016-06-01
The Thor pulsed power generator is being developed at Sandia National Laboratories. The design consists of up to 288 decoupled and transit time isolated capacitor-switch units, called "bricks," that can be individually triggered to achieve a high degree of pulse tailoring for magnetically driven isentropic compression experiments (ICE) [D. B. Reisman et al., Phys. Rev. Spec. Top.-Accel. Beams 18, 090401 (2015)]. The connecting transmission lines are impedance matched to the bricks, allowing the capacitor energy to be efficiently delivered to an ICE strip-line load with peak pressures of over 100 GPa. Thor will drive experiments to explore equation of state, material strength, and phase transition properties of a wide variety of materials. We present an optimization process for producing tailored current pulses, a requirement for many material studies, on the Thor generator. This technique, which is unique to the novel "current-adder" architecture used by Thor, entirely avoids the iterative use of complex circuit models to converge to the desired electrical pulse. We begin with magnetohydrodynamic simulations for a given material to determine its time dependent pressure and thus the desired strip-line load current and voltage. Because the bricks are connected to a central power flow section through transit-time isolated coaxial cables of constant impedance, the brick forward-going pulses are independent of each other. We observe that the desired equivalent forward-going current driving the pulse must be equal to the sum of the individual brick forward-going currents. We find a set of optimal brick delay times by requiring that the L2 norm of the difference between the brick-sum current and the desired forward-going current be a minimum. We describe the optimization procedure for the Thor design and show results for various materials of interest.
Leymarie, Vincent; Flandrin, Georges; Noguera, Maria Elena; Leymarie, Florence; Lioure, Bruno; Daliphard, Sylvie
2006-09-01
Although modern communication technology is well developed, telehematology does not readily lend itself to practical laboratory use. Multicenter therapeutic protocols may offer preferential opportunities. The cytologists of the AML-2001 protocol established an innovative organization to demonstrate the reliability of the diagnostic assessment of acute myeloid leukemia through a rapid and decentralized exchange of information via the internet and to define the conditions optimizing expert diagnosis. Telediagnosis appears to be a powerful tool for cytological review and other issues.
Pitfalls in PCR troubleshooting: Expect the unexpected?
Schrick, Livia; Nitsche, Andreas
2015-01-01
PCR is a well-understood and established laboratory technique often used in molecular diagnostics. Huge experience has been accumulated over the last years regarding the design of PCR assays and their set-up, including in-depth troubleshooting to obtain the optimal PCR assay for each purpose. Here we report a PCR troubleshooting that came up with a surprising result never observed before. With this report we hope to sensitize the reader to this peculiar problem and to save troubleshooting efforts in similar situations, especially in time-critical and ambitious diagnostic settings. PMID:27077041
NASA Astrophysics Data System (ADS)
Jamroz, Ben; Julien, Keith; Knobloch, Edgar
2008-12-01
Taking advantage of disparate spatio-temporal scales relevant to astrophysics and laboratory experiments, we derive asymptotically exact reduced partial differential equation models for the magnetorotational instability. These models extend recent single-mode formulations leading to saturation in the presence of weak dissipation, and are characterized by a back-reaction on the imposed shear. Numerical simulations performed for a broad class of initial conditions indicate an initial phase of growth dominated by the optimal (fastest growing) magnetorotational instability fingering mode, followed by a vertical coarsening to a box-filling mode.
Aparicio, Juan Daniel; Raimondo, Enzo Emanuel; Gil, Raúl Andrés; Benimeli, Claudia Susana; Polti, Marta Alejandra
2018-01-15
The objective of the present work was to establish optimal biological and physicochemical parameters in order to simultaneously remove lindane and Cr(VI) at high and/or low pollutant concentrations from the soil by an actinobacteria consortium formed by Streptomyces sp. M7, MC1, A5, and Amycolatopsis tucumanensis AB0. Also, the final aim was to treat real soils contaminated with Cr(VI) and/or lindane from the Northwest of Argentina employing the optimal biological and physicochemical conditions. In this sense, after determining the optimal inoculum concentration (2 g kg⁻¹), an experimental design model with four factors (temperature, moisture, initial concentration of Cr(VI) and lindane) was employed for predicting the system behavior during the bioremediation process. According to the response optimizer, the optimal moisture level was 30% for all bioremediation processes. However, the optimal temperature was different for each situation: for low initial concentrations of both pollutants, the optimal temperature was 25°C; for low initial concentrations of Cr(VI) and high initial concentrations of lindane, the optimal temperature was 30°C; and for high initial concentrations of Cr(VI), the optimal temperature was 35°C. In order to confirm the model adequacy and the validity of the optimization procedure, experiments were performed in six real contaminated soil samples. The defined actinobacteria consortium reduced the contaminant concentrations in five of the six samples, working at laboratory scale and employing the optimal conditions obtained through the factorial design. Copyright © 2017 Elsevier B.V. All rights reserved.
Favre, B; Bonche, J P; Meheir, H; Peyrin, J O
1990-02-01
For many years, a number of laboratories have been working on the applications of very low field NMR. In 1985, our laboratory presented the first NMR images using the earth's magnetic field. However, the use of this technique was limited by the weakness of the signal and the disturbing effects of the environment on the signal-to-noise ratio and on the homogeneity of the static magnetic field. Therefore, experiments had to be performed in places with low environmental disturbances, such as open country or large parks. In 1986, we installed a new station in Lyon, in the town's hostile environment. Good NMR signals can now be obtained (with a signal-to-noise ratio better than 200 and a time constant T2 better than 3 s for 200-ml water samples at a temperature of about 40 degrees C). The new station is located on the terrace roof of our faculty building. Gradient coils were used to correct the local inhomogeneities of the earth's magnetic field. We show FIDs and MR images of water-filled tubes made with or without these improvements.
Design and Development of Turbodrill Blade Used in Crystallized Section
Yu, Wang; Jianyi, Yao; Zhijun, Li
2014-01-01
Turbodrill is a type of hydraulic axial turbomachinery which has a multistage blade consisting of stators and rotors. In this paper, a turbodrill blade that can be applied in crystallized sections under high temperature and pressure conditions is developed. On the basis of the Euler equations, the law of energy transfer is analyzed and the output characteristics of the turbodrill blade are proposed. Moreover, considering the properties of the layer and the borehole conditions, the radial size, the geometrical dimensions, and the blade profile are optimized. A computational model of a single-stage blade is built in ANSYS CFD, into which the three-dimensional model of the turbodrill is imported. In light of the distribution of the pressure and flow fields, the functions of the turbodrill blade are improved and optimized. The turbodrill blade optimization model was verified based on laboratory experiments. The results show that the design meets the requirements of deep hard-rock mineral exploration applications and provides good references for further study. PMID:25276857
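The energy-transfer analysis referenced above rests on the Euler turbomachinery equation: the specific work exchanged per stage equals the change in the product of blade speed and tangential (swirl) velocity across the rotor. A minimal worked form, with purely illustrative numbers, is sketched below.

```python
import math

def euler_specific_work(u1, c_theta1, u2, c_theta2):
    """Euler turbomachinery equation: specific work extracted by a turbine
    rotor (J/kg) = U1*Ctheta1 - U2*Ctheta2, where U is the blade speed and
    Ctheta the tangential component of the absolute flow velocity."""
    return u1 * c_theta1 - u2 * c_theta2

# Illustrative single turbodrill stage (all values assumed, not measured):
rpm, r_mean = 600.0, 0.04                 # shaft speed and mean radius (m)
u = 2 * math.pi * rpm / 60.0 * r_mean     # blade speed, same at inlet/outlet
w = euler_specific_work(u, 12.0, u, 3.0)  # swirl reduced from 12 to 3 m/s
print(f"specific work ≈ {w:.1f} J/kg")
```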
Jacchia, Sara; Nardini, Elena; Savini, Christian; Petrillo, Mauro; Angers-Loustau, Alexandre; Shim, Jung-Hyun; Trijatmiko, Kurniawan; Kreysa, Joachim; Mazzara, Marco
2015-02-18
In this study, we developed, optimized, and in-house validated a real-time PCR method for the event-specific detection and quantification of Golden Rice 2, a genetically modified rice with provitamin A in the grain. We optimized and evaluated the performance of the taxon (targeting rice Phospholipase D α2 gene)- and event (targeting the 3' insert-to-plant DNA junction)-specific assays that compose the method as independent modules, using haploid genome equivalents as unit of measurement. We verified the specificity of the two real-time PCR assays and determined their dynamic range, limit of quantification, limit of detection, and robustness. We also confirmed that the taxon-specific DNA sequence is present in single copy in the rice genome and verified its stability of amplification across 132 rice varieties. A relative quantification experiment evidenced the correct performance of the two assays when used in combination.
Zahed, Mohammad Ali; Aziz, Hamidi Abdul; Mohajeri, Leila; Mohajeri, Soraya; Kutty, Shamsul Rahman Mohamed; Isa, Mohamed Hasnain
2010-12-15
Response surface methodology (RSM) was employed to optimize nitrogen and phosphorus concentrations for removal of n-alkanes from crude oil contaminated seawater samples in batch reactors. Erlenmeyer flasks were used as bioreactors, each containing 250 mL of dispersed crude oil contaminated seawater, indigenous acclimatized microorganisms and different amounts of nitrogen and phosphorus based on a central composite design (CCD). Samples were extracted and analyzed according to US-EPA protocols using a gas chromatograph. During 28 days of bioremediation, a maximum of 95% total aliphatic hydrocarbons removal was observed. The obtained Model F-value of 267.73 and probability F<0.0001 implied the model was significant. Numerical condition optimization via a quadratic model predicted 98% n-alkanes removal for a 20-day laboratory bioremediation trial using nitrogen and phosphorus concentrations of 13.62 and 1.39 mg/L, respectively. In actual experiments, 95% removal was observed under these conditions. Copyright © 2010 Elsevier B.V. All rights reserved.
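The RSM step amounts to fitting a second-order polynomial in the two nutrient concentrations to the measured removal and then maximizing it over the studied region. The sketch below shows that generic procedure with fabricated example data; it is not the study's CCD dataset.

```python
import numpy as np
from scipy.optimize import minimize

def fit_quadratic_surface(N, P, y):
    """Least-squares fit of y = b0 + b1*N + b2*P + b3*N^2 + b4*P^2 + b5*N*P."""
    X = np.column_stack([np.ones_like(N), N, P, N**2, P**2, N * P])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta

def predict(beta, N, P):
    return beta @ np.array([1.0, N, P, N**2, P**2, N * P])

# Fabricated illustrative design points (mg/L) and % n-alkane removal.
N = np.array([5.0, 5.0, 20.0, 20.0, 12.5, 12.5, 12.5, 2.0, 23.0])
P = np.array([0.5, 2.5, 0.5, 2.5, 1.5, 0.2, 2.8, 1.5, 1.5])
y = np.array([70.0, 75.0, 80.0, 78.0, 93.0, 72.0, 74.0, 68.0, 79.0])

beta = fit_quadratic_surface(N, P, y)
# Maximize predicted removal inside the studied region.
res = minimize(lambda v: -predict(beta, *v), x0=[12.0, 1.5],
               bounds=[(2.0, 23.0), (0.2, 2.8)])
print(res.x, -res.fun)   # optimal (N, P) and predicted removal
```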
2012-07-01
detection only condition followed either face detection only or dual task, thus ensuring that participants were practiced in face detection before...
Multicellular microorganisms: laboratory versus nature.
Palková, Zdena
2004-05-01
Our present in-depth knowledge of the physiology and regulatory mechanisms of microorganisms has arisen from our ability to remove them from their natural, complex ecosystems into pure liquid cultures. These cultures are grown under optimized laboratory conditions and allow us to study microorganisms as individuals. However, microorganisms naturally grow in conditions that are far from optimal, which causes them to become organized into multicellular communities that are better protected against the harmful environment. Moreover, this multicellular existence allows individual cells to differentiate and acquire specific properties, such as forming resistant spores, which benefit the whole population. The relocation of natural microorganisms to the laboratory can result in their adaptation to these favourable conditions, which is accompanied by complex changes that include the repression of some protective mechanisms that are essential in nature. Laboratory microorganisms that have been cultured for long periods under optimized conditions might therefore differ markedly from those that exist in natural ecosystems.
Validating models of target acquisition performance in the dismounted soldier context
NASA Astrophysics Data System (ADS)
Glaholt, Mackenzie G.; Wong, Rachel K.; Hollands, Justin G.
2018-04-01
The problem of predicting real-world operator performance with digital imaging devices is of great interest within the military and commercial domains. There are several approaches to this problem, including: field trials with imaging devices, laboratory experiments using imagery captured from these devices, and models that predict human performance based on imaging device parameters. The modeling approach is desirable, as both field trials and laboratory experiments are costly and time-consuming. However, the data from these experiments is required for model validation. Here we considered this problem in the context of dismounted soldiering, for which detection and identification of human targets are essential tasks. Human performance data were obtained for two-alternative detection and identification decisions in a laboratory experiment in which photographs of human targets were presented on a computer monitor and the images were digitally magnified to simulate range-to-target. We then compared the predictions of different performance models within the NV-IPM software package: Targeting Task Performance (TTP) metric model and the Johnson model. We also introduced a modification to the TTP metric computation that incorporates an additional correction for target angular size. We examined model predictions using NV-IPM default values for a critical model constant, V50, and we also considered predictions when this value was optimized to fit the behavioral data. When using default values, certain model versions produced a reasonably close fit to the human performance data in the detection task, while for the identification task all models substantially overestimated performance. When using fitted V50 values the models produced improved predictions, though the slopes of the performance functions were still shallow compared to the behavioral data. These findings are discussed in relation to the models' designs and parameters, and the characteristics of the behavioral paradigm.
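For context, both model families reduce to a probability-versus-resolvable-detail curve calibrated by V50. A commonly quoted form of the target task performance function is P(V) = (V/V50)^E / (1 + (V/V50)^E), with E itself a weak function of V/V50. The sketch below implements that published form as a reference point only; treat the exponent expression as an assumption to be checked against the NV-IPM documentation rather than a definitive statement.

```python
def ttpf(v, v50):
    """Target task performance function: probability of task completion as a
    function of the resolvable-detail metric V and the 50%-probability point
    V50.  The exponent E = 1.51 + 0.24*(V/V50) is the commonly published
    form; verify against NV-IPM documentation before relying on it."""
    ratio = v / v50
    e = 1.51 + 0.24 * ratio
    return ratio ** e / (1.0 + ratio ** e)

# Example: probability passes through 0.5 exactly at V = V50.
for v in (0.5, 1.0, 2.0, 4.0):
    print(v, round(ttpf(v, v50=1.0), 3))
```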
NASA Technical Reports Server (NTRS)
Barker, L. Keith; Mckinney, William S., Jr.
1989-01-01
The Laboratory Telerobotic Manipulator (LTM) is a seven-degree-of-freedom robot arm. Two of the arms were delivered to Langley Research Center for ground-based research to assess the use of redundant degree-of-freedom robot arms in space operations. Resolved-rate control equations for the LTM are derived. The equations are based on a scheme developed at the Oak Ridge National Laboratory for computing optimized joint angle rates in real time. The optimized joint angle rates actually represent a trade-off, as the hand moves, between small rates (least-squares solution) and those rates which work toward satisfying a specified performance criterion of joint angles. In singularities where the optimization scheme cannot be applied, alternate control equations are devised. The equations developed were evaluated using a real-time computer simulation to control a 3-D graphics model of the LTM.
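The redundancy-resolution idea mentioned above is commonly written as a pseudoinverse (least-squares) solution plus a null-space term that pushes the joints toward a secondary criterion without disturbing the commanded hand motion. The sketch below is that textbook form, not the LTM flight code; the Jacobian, secondary-objective gradient and values are placeholders.

```python
import numpy as np

def resolved_rate(J, xdot, grad_h, k=0.5):
    """Redundant-arm resolved-rate step: joint rates = least-squares solution
    J^+ xdot plus a null-space component (I - J^+ J) * k * grad_h that works
    toward a secondary joint-angle criterion h(q) without changing xdot."""
    J_pinv = np.linalg.pinv(J)
    n = J.shape[1]
    null_proj = np.eye(n) - J_pinv @ J
    return J_pinv @ xdot + null_proj @ (k * grad_h)

# Illustrative 6x7 Jacobian for a 7-DOF arm, a desired hand twist, and a
# secondary gradient steering joints toward mid-range (all values made up).
J = np.random.default_rng(0).standard_normal((6, 7))
xdot = np.array([0.05, 0.0, 0.0, 0.0, 0.0, 0.01])
grad_h = -np.linspace(-0.3, 0.3, 7)
print(resolved_rate(J, xdot, grad_h))
```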
Gurm, Hitinder S; Hosman, Carrie; Bates, Eric R; Share, David; Hansen, Ben B
2015-02-01
Eptifibatide, a small-molecule glycoprotein IIb/IIIa inhibitor, is conventionally administered as a bolus plus infusion. A growing number of clinicians are using a strategy of catheterization laboratory-only eptifibatide (an off-label use) as procedural pharmacotherapy for patients undergoing percutaneous coronary intervention although the comparative effectiveness of this approach is unknown. We compared the in-hospital outcome of patients undergoing percutaneous coronary intervention across 47 hospitals and treated with eptifibatide bolus plus infusion with those treated with a catheterization laboratory-only regimen. We used optimal matching to link the use of catheterization laboratory-only eptifibatide with clinical outcomes, including mortality, myocardial infarction, bleeding, and need for transfusion. Of the 84 678 percutaneous coronary interventions performed during 2010 to 2011, and meeting our inclusion criteria, eptifibatide was administered to 21 296 patients. Of these, a catheterization laboratory-only regimen was used in 4511 patients, whereas 16 785 patients were treated with bolus plus infusion. In the optimally matched analysis, compared with bolus plus infusion, a catheterization laboratory-only regimen was associated with a reduction in bleeding (optimally matched adjusted odds ratio, 0.74; 95% confidence interval, 0.58-0.93; P=0.014) and need for transfusion (optimally matched adjusted odds ratio, 0.70; 95% confidence interval, 0.52-0.92; P=0.012), with no difference in mortality or myocardial infarction. A catheterization laboratory-only eptifibatide regimen is commonly used in clinical practice and is associated with a significant reduction in bleeding complications in patients undergoing contemporary percutaneous coronary intervention. © 2015 American Heart Association, Inc.
Panaccione, G; Vobornik, I; Fujii, J; Krizmancic, D; Annese, E; Giovanelli, L; Maccherozzi, F; Salvador, F; De Luisa, A; Benedetti, D; Gruden, A; Bertoch, P; Polack, F; Cocco, D; Sostero, G; Diviacco, B; Hochstrasser, M; Maier, U; Pescia, D; Back, C H; Greber, T; Osterwalder, J; Galaktionov, M; Sancrotti, M; Rossi, G
2009-04-01
We report the main characteristics of the advanced photoelectric effect experiments beamline, operational at Elettra storage ring, featuring a fully independent double branch scheme obtained by the use of chicane undulators and able to keep polarization control in both linear and circular mode. The paper describes the novel technical solutions adopted, namely, (a) the design of a quasiperiodic undulator resulting in optimized suppression of higher harmonics over a large photon energy range (10-100 eV), (b) the thermal stability of optics under high heat load via cryocoolers, and (c) the end station interconnected setup allowing full access to off-beam and on-beam facilities and, at the same time, the integration of users' specialized sample growth chambers or modules.
Reciprocal Space Mapping of Macromolecular Crystals in the Home Laboratory
NASA Technical Reports Server (NTRS)
Snell, Edward H.; Fewster, P. F.; Andrew, Norman; Boggon, T. J.; Judge, Russell A.; Pusey, Marc A.
1999-01-01
Reciprocal space mapping techniques are used widely by the materials science community to provide physical information about their crystal samples. We have used similar methods at synchrotron sources to look at the quality of macromolecular crystals produced both on the ground and under microgravity conditions. The limited nature of synchrotron time has led us to explore the use of a high resolution materials research diffractometer to perform similar measurements in the home laboratory. Although the available intensity is much reduced due to the beam conditioning necessary for high reciprocal space resolution, lower resolution data can be collected in the same detail as the synchrotron source. Experiments can be optimized at home to make most benefit from the synchrotron time available. Preliminary results including information on the mosaicity and the internal strains from reciprocal space maps will be presented.
SU-F-T-670: From the OR to the Radiobiology Lab: The Journey of a Small X-Ray Source
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lehmann, J; The University of Sydney, Sydney, NSW; The University of Newcastle, Newcastle, NSW
Purpose: Irradiation of small animal tumor models within laboratories is vital to radiobiological experiments. Often the animals are not able to be brought back into the lab after being taken out for irradiation. Cell biology laboratories benefit from irradiation capability available around the clock without regard to patient load in an associated radiotherapy clinic. Commercial systems are available, but bulky and expensive. Methods: An intraoperative kV irradiation system (IntraBeam™) designed to deliver spherical dose distributions to surgical cavities has been repurposed for the irradiation of cell plates and small laboratory animals. An applicator has been altered to allow for simple, open fields. Special collimators are being developed. BEAMnrc Monte Carlo simulations with the "NRC swept BEAM" source model have been performed to characterize the dose distributions, to develop optimal collimators and as a basis for dose prescription. Measurements with radiochromic film and with an ionization chamber were performed to characterize the beam and to validate the simulations. Results: Using its highest setting (50 kV and 40 µA) the x-ray unit is capable of delivering dose rates over 1 Gy/min homogeneously to standard cell plates even without an optimized collimator. Smaller areas (tumors in animals) can be irradiated with significantly higher dose rates (> 20 Gy/min) depending on distance of the source to the tumor. The HVL was found to be 0.21 mm Al, which means the shielding requirements for the device are easily achievable in the lab. Conclusion: A mobile irradiation facility is feasible. It will allow easier access to radiation for radiobiology experiments. The modified system is versatile in that for cell plates homogenous irradiations can be achieved through distance from the source, while for high dose rate small field irradiations the source can be brought in close proximity to the target.
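The quoted HVL of 0.21 mm Al translates directly into an effective attenuation coefficient and shielding thickness via μ = ln 2 / HVL and I/I₀ = exp(−μx). The arithmetic below is a back-of-the-envelope illustration for aluminium only; a real shielding design would use the appropriate material data and beam spectrum.

```python
import math

HVL_AL_MM = 0.21                 # reported half-value layer in aluminium
mu = math.log(2) / HVL_AL_MM     # effective attenuation coefficient, mm^-1

def transmitted_fraction(thickness_mm):
    """Fraction of beam intensity remaining after `thickness_mm` of aluminium,
    assuming simple exponential attenuation with the effective mu above."""
    return math.exp(-mu * thickness_mm)

def thickness_for_attenuation(factor):
    """Aluminium thickness (mm) needed to reduce the intensity by `factor`."""
    return math.log(factor) / mu

print(f"mu ≈ {mu:.2f} /mm")
print(f"2 mm Al transmits ≈ {transmitted_fraction(2.0):.1%}")
print(f"1000x reduction needs ≈ {thickness_for_attenuation(1000):.1f} mm Al")
```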
Fernando, Elizabeth H; Dicay, Michael; Stahl, Martin; Gordon, Marilyn H; Vegso, Andrew; Baggio, Cristiane; Alston, Laurie; Lopes, Fernando; Baker, Kristi; Hirota, Simon; McKay, Derek M; Vallance, Bruce; MacNaughton, Wallace K
2017-11-01
Cancer cell lines have been the mainstay of intestinal epithelial experimentation for decades, due primarily to their immortality and ease of culture. However, because of the inherent biological abnormalities of cancer cell lines, many cellular biologists are currently transitioning away from these models and toward more representative primary cells. This has been particularly challenging, but recent advances in the generation of intestinal organoids have brought the routine use of primary cells within reach of most epithelial biologists. Nevertheless, even with the proliferation of publications that use primary intestinal epithelial cells, there is still a considerable amount of trial and error required for laboratories to establish a consistent and reliable method to culture three-dimensional (3D) intestinal organoids and primary epithelial monolayers. We aim to minimize the time other laboratories spend troubleshooting the technique and present a standard method for culturing primary epithelial cells. Therefore, we have described our optimized, high-yield, cost-effective protocol to grow 3D murine colonoids for more than 20 passages and our detailed methods to culture these cells as confluent monolayers for at least 14 days, enabling a wide variety of potential future experiments. By supporting and expanding on the current literature of primary epithelial culture optimization and detailed use in experiments, we hope to help enable the widespread adoption of these innovative methods and allow consistency of results obtained across laboratories and institutions. NEW & NOTEWORTHY Primary intestinal epithelial monolayers are notoriously difficult to maintain in culture, even with the recent advances in the field. We describe, in detail, the protocols required to maintain three-dimensional cultures of murine colonoids and passage these primary epithelial cells to confluent monolayers in a standardized, high-yield and cost-effective manner. Copyright © 2017 the American Physiological Society.
The role of veterinary research laboratories in the provision of veterinary services.
Verwoerd, D W
1998-08-01
Veterinary research laboratories play an essential role in the provision of veterinary services in most countries. These laboratories are the source of new knowledge, innovative ideas and improved technology for the surveillance, prevention and control of animal diseases. In addition, many laboratories provide diagnostic and other services. To ensure the optimal integration of various veterinary activities, administrators must understand the functions and constraints of research laboratories. Therefore, a brief discussion is presented of the following: organisational structures; methods for developing research programmes; outputs of research scientists and how these are measured; the management of quality assurance; and the funding of research. Optimal collaboration can only be attained by understanding the environment in which a research scientist functions and the motivational issues at stake.
Myneni, Sahiti; Patel, Vimla L.; Bova, G. Steven; Wang, Jian; Ackerman, Christopher F.; Berlinicke, Cynthia A.; Chen, Steve H.; Lindvall, Mikael; Zack, Donald J.
2016-01-01
This paper describes a distributed collaborative effort between industry and academia to systematize data management in an academic biomedical laboratory. The heterogeneous and voluminous nature of research data created in biomedical laboratories makes information management difficult and research unproductive. One such collaborative effort was evaluated over a period of four years using data collection methods including ethnographic observations, semi-structured interviews, web-based surveys, progress reports, conference call summaries, and face-to-face group discussions. Data were analyzed using qualitative methods of data analysis to 1) characterize specific problems faced by biomedical researchers with traditional information management practices, 2) identify intervention areas to introduce a new research information management system called Labmatrix, and finally to 3) evaluate and delineate important general collaboration (intervention) characteristics that can optimize outcomes of an implementation process in biomedical laboratories. Results emphasize the importance of end user perseverance, human-centric interoperability evaluation, and demonstration of return on investment of effort and time of laboratory members and industry personnel for the success of the implementation process. In addition, there is an intrinsic learning component associated with the implementation process of an information management system. Technology transfer experience in a complex environment such as the biomedical laboratory can be eased with the use of information systems that support human and cognitive interoperability. Such informatics features can also contribute to successful collaboration and hopefully to scientific productivity. PMID:26652980
Bordoloi, Shreemoyee; Nath, Suresh K; Gogoi, Sweety; Dutta, Robin K
2013-09-15
A three-step treatment process involving (i) mild alkaline pH-conditioning by NaHCO₃; (ii) oxidation of arsenite and ferrous ions by KMnO₄, itself precipitating as insoluble MnO₂ under the pH condition; and (iii) coagulation by FeCl₃ has been used for simultaneous removal of arsenic and iron ions from water. The treated water is filtered after a residence time of 1-2 h. Laboratory batch experiments were performed to optimize the doses. A field trial was performed with an optimized recipe at 30 households and 5 schools in some highly arsenic-affected villages in Assam, India. Simultaneous removal of arsenic from an initial 0.1-0.5 mg/L to about 5 μg/L and of iron from an initial 0.3-5.0 mg/L to less than 0.1 mg/L has been achieved, along with a final pH between 7.0 and 7.5, after a residence time of 1 h. The process also removes other heavy elements, if present, without leaving any additional toxic residue. The small quantity of solid sludge containing mainly ferrihydrite with adsorbed arsenate passes the toxicity characteristic leaching procedure (TCLP) test. The estimated recurring cost is approximately USD 0.16 per m³ of purified water. High efficiency, extremely low cost, safety, no requirement for power, and simplicity of operation make the technique well suited for rural application. Copyright © 2013 Elsevier B.V. All rights reserved.
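The quoted removal figures and recurring cost translate into simple arithmetic; a minimal sketch follows, in which the per-household daily water demand is an assumed illustrative value, not a figure from the study.

```python
def percent_removal(initial: float, final: float) -> float:
    """Fraction of a contaminant removed, expressed as a percentage."""
    return 100.0 * (initial - final) / initial

# Figures reported in the abstract (mg/L)
arsenic_in, arsenic_out = 0.5, 0.005   # 0.5 mg/L reduced to ~5 ug/L
iron_in, iron_out = 5.0, 0.1

print(f"Arsenic removal: {percent_removal(arsenic_in, arsenic_out):.1f} %")
print(f"Iron removal:    {percent_removal(iron_in, iron_out):.1f} %")

# Recurring cost estimate; daily household demand is an assumption for illustration only
cost_per_m3 = 0.16          # USD per m^3 of purified water (from the abstract)
household_demand_m3 = 0.04  # ~40 L/day, assumed
print(f"Approximate cost: {cost_per_m3 * household_demand_m3 * 365:.2f} USD per household per year")
```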
Educating Laboratory Science Learners at a Distance Using Interactive Television
ERIC Educational Resources Information Center
Reddy, Christopher
2014-01-01
Laboratory science classes offered to students learning at a distance require a methodology that allows for the completion of tactile activities. Literature describes three different methods of solving the distance laboratory dilemma: kit-based laboratory experience, computer-based laboratory experience, and campus-based laboratory experience,…
New method of processing heat treatment experiments with numerical simulation support
NASA Astrophysics Data System (ADS)
Kik, T.; Moravec, J.; Novakova, I.
2017-08-01
This work describes the benefits of combining modern software for numerical simulation of welding processes with laboratory research. A new method of processing heat treatment experiments is proposed that yields relevant input data for numerical simulation of the heat treatment of large parts. Using experiments on small test samples, it is now possible to simulate cooling conditions comparable to the cooling of larger parts. Results from this method of testing make the boundary conditions used for the real cooling process more accurate and can also be used to improve software databases and optimize computational models. The aim is to refine the computation of temperature fields for large hardened parts, based on a new method for determining the temperature dependence of the heat transfer coefficient into the quenching medium for a particular material, a defined maximum thickness of the processed part, and given cooling conditions. The paper also presents an example comparing standard and modified (according to the newly suggested methodology) heat transfer coefficient data and their influence on the simulation results. It shows how even small changes influence mainly the distributions of temperature, metallurgical phases, hardness, and stresses. This experiment also provides not only input data and data enabling optimization of the computational model, but also verification data. The greatest advantage of the described method is its independence of the type of cooling medium used.
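One common way to back-calculate a temperature-dependent heat transfer coefficient from a small-sample cooling curve is a lumped-capacitance energy balance, m·cp·dT/dt = -h·A·(T - T∞). The sketch below illustrates that idea only; the sample properties and the synthetic cooling curve are placeholder assumptions, not the methodology or data of the paper.

```python
import numpy as np

# Measured cooling curve of a small test sample (synthetic stand-in for a measurement)
t = np.linspace(0.0, 60.0, 61)            # time, s
T = 25.0 + 825.0 * np.exp(-t / 12.0)      # temperature, degC

# Assumed sample and quenchant properties (illustrative values only)
m, cp, A = 0.050, 460.0, 0.004            # kg, J/(kg K), m^2
T_inf = 25.0                              # quenching medium temperature, degC

# Lumped-capacitance balance: m*cp*dT/dt = -h*A*(T - T_inf)  ->  solve for h at each time
dTdt = np.gradient(T, t)
h = -m * cp * dTdt / (A * (T - T_inf))

for Ti, h_i in zip(T[::15], h[::15]):
    print(f"T = {Ti:6.1f} degC  ->  h ~ {h_i:7.1f} W/(m^2 K)")
```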
Implementation of Epic Beaker Anatomic Pathology at an Academic Medical Center.
Blau, John Larry; Wilford, Joseph D; Dane, Susan K; Karandikar, Nitin J; Fuller, Emily S; Jacobsmeier, Debbie J; Jans, Melissa A; Horning, Elisabeth A; Krasowski, Matthew D; Ford, Bradley A; Becker, Kent R; Beranek, Jeanine M; Robinson, Robert A
2017-01-01
Beaker is a relatively new laboratory information system (LIS) offered by Epic Systems Corporation as part of its suite of health-care software and bundled with its electronic medical record, EpicCare. It is divided into two modules, Beaker anatomic pathology (Beaker AP) and Beaker Clinical Pathology. In this report, we describe our experience implementing Beaker AP version 2014 at an academic medical center with a go-live date of October 2015. This report covers preimplementation preparations and challenges beginning in September 2014, issues discovered soon after go-live in October 2015, and some post go-live optimizations, using data from meetings, debriefings, and the project closure document. We share specific issues that we encountered during implementation, including difficulties with the proposed frozen section workflow, developing a shared specimen source dictionary, and implementation of the standard Beaker workflow in a large institution with trainees. We share specific strategies that we used to overcome these issues for a successful Beaker AP implementation. Several areas of the laboratory required adaptation of the default Beaker build parameters to meet the needs of the workflow in a busy academic medical center. In a few areas, our laboratory was unable to use the Beaker functionality to support our workflow, and we have continued to use paper or have altered our workflow. In spite of several difficulties that required creative solutions before go-live, the implementation has been successful based on satisfaction surveys completed by pathologists and others who use the software. However, optimization of Beaker workflows has continued to be an ongoing process after go-live to the present time. The Beaker AP LIS can be successfully implemented at an academic medical center but requires significant forethought, creative adaptation, and continued shared management of the ongoing product by institutional and departmental information technology staff as well as laboratory managers to meet the needs of the laboratory.
Edwards, Thea M; Morgan, Howard E; Balasca, Coralia; Chalasani, Naveen K; Yam, Lauren; Roark, Alison M
2018-01-01
The Yeast Estrogen Screen (YES) is used to detect estrogenic ligands in environmental samples and has been broadly applied in studies of endocrine disruption. Estrogenic ligands include both natural and manmade "Environmental Estrogens" (EEs) found in many consumer goods including Personal Care Products (PCPs), plastics, pesticides, and foods. EEs disrupt hormone signaling in humans and other animals, potentially reducing fertility and increasing disease risk. Despite the importance of EEs and other Endocrine Disrupting Chemicals (EDCs) to public health, endocrine disruption is not typically included in undergraduate curricula. This shortcoming is partly due to a lack of relevant laboratory activities that illustrate the principles involved while also being accessible to undergraduate students. This article presents an optimized YES for quantifying ligands in personal care products that bind estrogen receptors alpha (ERα) and/or beta (ERβ). The method incorporates one of the two colorimetric substrates (ortho-nitrophenyl-β-D-galactopyranoside (ONPG) or chlorophenol red-β-D-galactopyranoside (CPRG)) that are cleaved by β-galactosidase, a 6-day refrigerated incubation step to facilitate use in undergraduate laboratory courses, an automated application for LacZ calculations, and R code for the associated 4-parameter logistic regression analysis. The protocol has been designed to allow undergraduate students to develop and conduct experiments in which they screen products of their choosing for estrogen mimics. In the process, they learn about endocrine disruption, cell culture, receptor binding, enzyme activity, genetic engineering, statistics, and experimental design. Simultaneously, they also practice fundamental and broadly applicable laboratory skills, such as: calculating concentrations; making solutions; demonstrating sterile technique; serially diluting standards; constructing and interpolating standard curves; identifying variables and controls; collecting, organizing, and analyzing data; constructing and interpreting graphs; and using common laboratory equipment such as micropipettors and spectrophotometers. Thus, implementing this assay encourages students to engage in inquiry-based learning while exploring emerging issues in environmental science and health.
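The dose-response analysis described is a 4-parameter logistic (4PL) regression, supplied as R code with the protocol. A minimal Python sketch of an equivalent fit, using placeholder standard-curve values rather than assay data:

```python
import numpy as np
from scipy.optimize import curve_fit

def four_pl(x, bottom, top, ec50, hill):
    """4-parameter logistic dose-response curve (response increases with concentration)."""
    return bottom + (top - bottom) / (1.0 + (ec50 / x) ** hill)

# Placeholder standard curve: estradiol concentration (nM) vs. beta-galactosidase signal
conc = np.array([0.01, 0.03, 0.1, 0.3, 1.0, 3.0, 10.0])
response = np.array([0.05, 0.08, 0.21, 0.55, 1.10, 1.45, 1.55])

params, _ = curve_fit(four_pl, conc, response, p0=[0.05, 1.6, 0.5, 1.0], maxfev=10000)
bottom, top, ec50, hill = params
print(f"EC50 ~ {ec50:.3f} nM, Hill slope ~ {hill:.2f}")

# Interpolate an unknown sample's response back onto the fitted standard curve
y_unknown = 0.8
x_unknown = ec50 / (((top - bottom) / (y_unknown - bottom)) - 1.0) ** (1.0 / hill)
print(f"Estimated concentration of unknown: {x_unknown:.3f} nM")
```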
NASA Astrophysics Data System (ADS)
Sanchez, M. J.; Santamarina, C.; Gai, X., Sr.; Teymouri, M., Sr.
2017-12-01
Stability and behavior of Hydrate Bearing Sediments (HBS) are characterized by the metastable character of the gas hydrate structure, which strongly depends on thermo-hydro-chemo-mechanical (THCM) actions. Hydrate formation, dissociation and methane production from hydrate bearing sediments are coupled THCM processes that involve, among other phenomena, exothermic formation and endothermic dissociation of hydrate and ice phases, mixed fluid flow and large changes in fluid pressure. The analysis of available data from past field and laboratory experiments, and the optimization of future field production studies, require a formal and robust numerical framework able to capture the very complex behavior of this type of soil. A comprehensive fully coupled THCM formulation has been developed and implemented into a finite element code to tackle problems involving gas hydrate sediments. Special attention is paid to the geomechanical behavior of HBS, and particularly to their response upon hydrate dissociation under loading. The numerical framework has been validated against recent experiments conducted under controlled conditions in the laboratory that challenge the proposed approach and highlight the complex interaction among THCM processes in HBS. The performance of the models in these case studies is highly satisfactory. Finally, the numerical code is applied to analyze the behavior of gas hydrate soils under field-scale conditions, exploring different features of material behavior under possible reservoir conditions.
High-Z plasma facing components in fusion devices: boundary conditions and operational experiences
NASA Astrophysics Data System (ADS)
Neu, R.
2006-04-01
In present-day fusion devices, optimization of performance and experimental freedom motivates the use of low-Z plasma facing materials (PFMs). However, in a future fusion reactor, a sufficient lifetime of the first wall components is essential for economic reasons. Additionally, tritium retention has to be small to meet safety requirements. Tungsten appears to be the most realistic material choice for reactor plasma facing components (PFCs) because it exhibits the lowest erosion, but many other criteria have to be fulfilled simultaneously in a reactor. Results from present-day devices and from laboratory experiments confirm the advantages of high-Z PFMs but also point to operational restrictions when using them as PFCs. These restrictions are associated with the central impurity concentration, which is determined by the sputtering yield, the penetration of the impurities, and their transport within the confined plasma. The restrictions could preclude successful operation of a reactor, but remedies exist to ameliorate their impact. Some price has to be paid in terms of reduced performance but, lacking materials or concepts that could substitute for high-Z PFCs, emphasis has to be put on the development and optimization of reactor-relevant scenarios that incorporate these experiences and measures.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kobayshi, Takeshi; Nishiyama, Yusuke; Pruski, Marek
The main focus of this chapter is to address experimental strategies on the subject by providing a hands-on guide to fast MAS experiments, with a particular focus on indirect detection. Although our experience is limited to our respective laboratories in Ames and Yokohama, we hope that our descriptions of experimental setups and optimization procedures are sufficiently general to be applicable to all modern instruments. The chapter is organized as follows. Section 2 below briefly introduces the fast MAS technology and its main advantages. In Section 3, we describe the hardware associated with this remarkable technology and provide practical advice on its use, including procedures for loading and unloading the samples, maintaining the probe, reducing t1 noise, etc. In Section 4, we describe the principles and hands-on aspects of experiments involving the indirect detection of spin-1/2 and 14N nuclei.
Practical witness for electronic coherences.
Johnson, Allan S; Yuen-Zhou, Joel; Aspuru-Guzik, Alán; Krich, Jacob J
2014-12-28
The origin of the coherences in two-dimensional spectroscopy of photosynthetic complexes remains disputed. Recently, it has been shown that in the ultrashort-pulse limit, oscillations in a frequency-integrated pump-probe signal correspond exclusively to electronic coherences, and thus such experiments can be used to form a test for electronic vs. vibrational oscillations in such systems. Here, we demonstrate a method for practically implementing such a test, whereby pump-probe signals are taken at several different pulse durations and used to extrapolate to the ultrashort-pulse limit. We present analytic and numerical results determining requirements for pulse durations and the optimal choice of pulse central frequency, which can be determined from an absorption spectrum. Our results suggest that for numerous systems, the required experiment could be implemented by many ultrafast spectroscopy laboratories using pulses of tens of femtoseconds in duration. Such experiments could resolve the standing debate over the nature of coherences in photosynthetic complexes.
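The proposed test reduces to extrapolating frequency-integrated pump-probe oscillation amplitudes, measured at several finite pulse durations, back to the ultrashort-pulse limit. A minimal sketch of that extrapolation step, with synthetic amplitudes and an assumed low-order polynomial dependence on pulse duration (both illustrative, not the paper's analytic result):

```python
import numpy as np

# Oscillation amplitude of the frequency-integrated pump-probe signal measured at
# several pulse durations (synthetic, illustrative data)
durations_fs = np.array([80.0, 60.0, 45.0, 30.0, 20.0])
amplitudes = np.array([0.21, 0.34, 0.46, 0.58, 0.66])

# Fit a low-order polynomial in pulse duration and extrapolate to tau -> 0
coeffs = np.polyfit(durations_fs, amplitudes, deg=2)
amp_ultrashort = np.polyval(coeffs, 0.0)

print(f"Extrapolated oscillation amplitude in the ultrashort-pulse limit: {amp_ultrashort:.2f}")
# Under this test's assumptions, a clearly nonzero extrapolated amplitude would point to
# electronic rather than purely vibrational coherence.
```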
Identification of the students' critical thinking skills through biochemistry laboratory work report
NASA Astrophysics Data System (ADS)
Anwar, Yunita Arian Sani; Senam, Laksono, Endang W.
2017-08-01
This work aims to (1) identify the critical thinking skills of students based on their ability to write laboratory work reports, and (2) analyze the implementation of biochemistry laboratory work. The method of quantitative content analysis was employed. Quantitative data took the form of critical thinking skill scores obtained from the assessment of students' laboratory work reports, together with questionnaire data. The Hoyo rubric was used to measure critical thinking skills with 10 indicators, namely clarity, accuracy, precision, consistency, relevance, evidence, reason, depth, breadth, and fairness. The research sample consisted of 105 students (35 male, 70 female) of Mataram University who took a Biochemistry course and 2 lecturers of the Biochemistry course. The results showed that students' critical thinking skills, as reflected in their laboratory work reports, were still weak. Analysis of the questionnaire showed that three issues posed the biggest problems during laboratory work: limited lecturer involvement in the laboratory work, suboptimal integration of the laboratory work with classroom learning, and suboptimal use of the laboratory work as a means of training critical thinking skills.
Oberoi, Harinder Singh; Vadlani, Praveen V; Saida, Lavudi; Bansal, Sunil; Hughes, Joshua D
2011-07-01
Dried and ground banana peel biomass (BP) after hydrothermal sterilization pretreatment was used for ethanol production using simultaneous saccharification and fermentation (SSF). Central composite design (CCD) was used to optimize concentrations of cellulase and pectinase, temperature and time for ethanol production from BP using SSF. Analysis of variance showed a high coefficient of determination (R(2)) value of 0.92 for ethanol production. On the basis of model graphs and numerical optimization, the validation was done in a laboratory batch fermenter with cellulase, pectinase, temperature and time of nine cellulase filter paper unit/gram cellulose (FPU/g-cellulose), 72 international units/gram pectin (IU/g-pectin), 37 °C and 15 h, respectively. The experiment using optimized parameters in batch fermenter not only resulted in higher ethanol concentration than the one predicted by the model equation, but also saved fermentation time. This study demonstrated that both hydrothermal pretreatment and SSF could be successfully carried out in a single vessel, and use of optimized process parameters helped achieve significant ethanol productivity, indicating commercial potential for the process. To the best of our knowledge, ethanol concentration and ethanol productivity of 28.2 g/l and 2.3 g/l/h, respectively from banana peels have not been reported to date. Copyright © 2011 Elsevier Ltd. All rights reserved.
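A central composite design of this kind is normally analyzed by fitting a second-order (quadratic) response surface to the measured response. A minimal sketch with two coded factors and synthetic yields (levels, responses, and coefficients are illustrative, not the study's data):

```python
import numpy as np

# Coded levels for two factors (e.g., cellulase dose and temperature) and a synthetic response
X1 = np.array([-1, 1, -1, 1, -1.41, 1.41, 0, 0, 0, 0, 0])
X2 = np.array([-1, -1, 1, 1, 0, 0, -1.41, 1.41, 0, 0, 0])
y  = np.array([14.2, 18.9, 16.5, 22.1, 13.8, 21.0, 15.2, 19.4, 24.8, 25.1, 24.6])

# Second-order model: y = b0 + b1*x1 + b2*x2 + b11*x1^2 + b22*x2^2 + b12*x1*x2
A = np.column_stack([np.ones_like(X1), X1, X2, X1**2, X2**2, X1 * X2])
coeffs, *_ = np.linalg.lstsq(A, y, rcond=None)

y_hat = A @ coeffs
ss_res = np.sum((y - y_hat) ** 2)
ss_tot = np.sum((y - y.mean()) ** 2)
print(f"R^2 of the quadratic response surface: {1 - ss_res / ss_tot:.3f}")
print("Coefficients (b0, b1, b2, b11, b22, b12):", np.round(coeffs, 3))
```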
Optimizing Requirements Decisions with KEYS
NASA Technical Reports Server (NTRS)
Jalali, Omid; Menzies, Tim; Feather, Martin
2008-01-01
Recent work with NASA's Jet Propulsion Laboratory has allowed for external access to five of JPL's real-world requirements models, anonymized to conceal proprietary information, but retaining their computational nature. Experimentation with these models, reported herein, demonstrates a dramatic speedup in the computations performed on them. These models have a well defined goal: select mitigations that retire risks which, in turn, increases the number of attainable requirements. Such a non-linear optimization is a well-studied problem. However identification of not only (a) the optimal solution(s) but also (b) the key factors leading to them is less well studied. Our technique, called KEYS, shows a rapid way of simultaneously identifying the solutions and their key factors. KEYS improves on prior work by several orders of magnitude. Prior experiments with simulated annealing or treatment learning took tens of minutes to hours to terminate. KEYS runs much faster than that; e.g for one model, KEYS ran 13,000 times faster than treatment learning (40 minutes versus 0.18 seconds). Processing these JPL models is a non-linear optimization problem: the fewest mitigations must be selected while achieving the most requirements. Non-linear optimization is a well studied problem. With this paper, we challenge other members of the PROMISE community to improve on our results with other techniques.
Performance of Optimized Actuator and Sensor Arrays in an Active Noise Control System
NASA Technical Reports Server (NTRS)
Palumbo, D. L.; Padula, S. L.; Lyle, K. H.; Cline, J. H.; Cabell, R. H.
1996-01-01
Experiments have been conducted in NASA Langley's Acoustics and Dynamics Laboratory to determine the effectiveness of optimized actuator/sensor architectures and controller algorithms for active control of harmonic interior noise. Tests were conducted in a large scale fuselage model - a composite cylinder which simulates a commuter class aircraft fuselage with three sections of trim panel and a floor. Using an optimization technique based on the component transfer functions, combinations of 4 out of 8 piezoceramic actuators and 8 out of 462 microphone locations were evaluated against predicted performance. A combinatorial optimization technique called tabu search was employed to select the optimum transducer arrays. Three test frequencies represent the cases of a strong acoustic and strong structural response, a weak acoustic and strong structural response and a strong acoustic and weak structural response. Noise reduction was obtained using a Time Averaged/Gradient Descent (TAGD) controller. Results indicate that the optimization technique successfully predicted best and worst case performance. An enhancement of the TAGD control algorithm was also evaluated. The principal components of the actuator/sensor transfer functions were used in the PC-TAGD controller. The principal components are shown to be independent of each other while providing control as effective as the standard TAGD.
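Choosing 4 of 8 actuators by tabu search, as described above, can be illustrated with a compact swap-move search. In the sketch below, the objective is a random stand-in for the transfer-function-based performance prediction, and the iteration count and tabu tenure are illustrative assumptions:

```python
import itertools
import random

random.seed(1)
N_ACTUATORS, N_SELECT = 8, 4

# Stand-in objective: predicted noise reduction (dB) for a given actuator subset.
# In the real problem this value would come from the measured component transfer functions.
_scores = {s: random.uniform(0.0, 10.0)
           for s in itertools.combinations(range(N_ACTUATORS), N_SELECT)}

def predicted_reduction(subset):
    return _scores[tuple(sorted(subset))]

def tabu_search(iterations=50, tabu_tenure=3):
    current = set(random.sample(range(N_ACTUATORS), N_SELECT))
    best, best_val = set(current), predicted_reduction(current)
    tabu = []  # recently removed actuators, temporarily forbidden to re-enter
    for _ in range(iterations):
        # Neighborhood: swap one selected actuator i for one unselected, non-tabu actuator j
        moves = [(i, j) for i in current for j in range(N_ACTUATORS)
                 if j not in current and j not in tabu]
        val, i, j = max((predicted_reduction(current - {i} | {j}), i, j) for i, j in moves)
        current = current - {i} | {j}
        tabu = (tabu + [i])[-tabu_tenure:]
        if val > best_val:
            best, best_val = set(current), val
    return sorted(best), best_val

subset, value = tabu_search()
print(f"Best actuator subset found: {subset} (predicted reduction {value:.2f} dB)")
```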
Project management: importance for diagnostic laboratories.
Croxatto, A; Greub, G
2017-07-01
The need for diagnostic laboratories to improve both quality and productivity alongside personnel shortages incites laboratory managers to constantly optimize laboratory workflows, organization, and technology. These continuous modifications of the laboratories should be conducted using efficient project and change management approaches to maximize the opportunities for successful completion of the project. This review aims at presenting a general overview of project management with an emphasis on selected critical aspects. Conventional project management tools and models described in the literature, such as HERMES, together with associated personal experience and educational courses on management, have been used to illustrate this review. This review presents general guidelines of project management and highlights their importance for microbiology diagnostic laboratories. As an example, some critical aspects of project management will be illustrated with a project of automation, as experienced at the laboratories of bacteriology and hygiene of the University Hospital of Lausanne. It is important to define clearly beforehand the objective of a project, its perimeter, its costs, and its time frame, including precise duration estimates of each step. Then, a project management plan including explanations and descriptions on how to manage, execute, and control the project is necessary to continuously monitor the progression of a project to achieve its defined goals. Moreover, a thorough risk analysis with contingency and mitigation measures should be performed at each phase of a project to minimize the impact of project failures. The increasing complexity of modern laboratories means clinical microbiologists must use several management tools, including project and change management, to improve the outcome of major projects and activities. Copyright © 2017 The Author(s). Published by Elsevier Ltd. All rights reserved.
Integrating Reservations and Queuing in Remote Laboratory Scheduling
ERIC Educational Resources Information Center
Lowe, D.
2013-01-01
Remote laboratories (RLs) have become increasingly seen as a useful tool in supporting flexible shared access to scarce laboratory resources. An important element in supporting shared access is coordinating the scheduling of the laboratory usage. Optimized scheduling can significantly decrease access waiting times and improve the utilization level…
OLTARIS: An Efficient Web-Based Tool for Analyzing Materials Exposed to Space Radiation
NASA Technical Reports Server (NTRS)
Slaba, Tony; McMullen, Amelia M.; Thibeault, Sheila A.; Sandridge, Chris A.; Clowdsley, Martha S.; Blatting, Steve R.
2011-01-01
The near-Earth space radiation environment includes energetic galactic cosmic rays (GCR), high intensity proton and electron belts, and the potential for solar particle events (SPE). These sources may penetrate shielding materials and deposit significant energy in sensitive electronic devices on board spacecraft and satellites. Material and design optimization methods may be used to reduce the exposure and extend the operational lifetime of individual components and systems. Since laboratory experiments are expensive and may not cover the range of particles and energies relevant for space applications, such optimization may be done computationally with efficient algorithms that include the various constraints placed on the component, system, or mission. In the present work, the web-based tool OLTARIS (On-Line Tool for the Assessment of Radiation in Space) is presented, and the applicability of the tool for rapidly analyzing exposure levels within either complicated shielding geometries or user-defined material slabs exposed to space radiation is demonstrated. An example approach for material optimization is also presented. Slabs of various advanced multifunctional materials are defined and exposed to several space radiation environments. The materials and thicknesses defining each layer in the slab are then systematically adjusted to arrive at an optimal slab configuration.
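The material-optimization example amounts to a constrained search over layer thicknesses. A minimal sketch of such a loop, using a made-up dose-proxy response and an assumed areal-density budget (not OLTARIS output or its API):

```python
import itertools

# Candidate thicknesses (g/cm^2) for two shield layers, e.g. polyethylene over aluminum
layer1 = [0.0, 5.0, 10.0, 15.0, 20.0]
layer2 = [0.0, 5.0, 10.0, 15.0, 20.0]
BUDGET = 25.0  # total areal density allowed (g/cm^2), assumed

def dose_proxy(t1, t2):
    """Made-up stand-in for a transport-code dose response; decreases with shielding,
    with layer 1 assumed somewhat more effective per unit mass than layer 2."""
    return 100.0 / (1.0 + 0.12 * t1 + 0.07 * t2)

best = min(
    ((dose_proxy(t1, t2), t1, t2)
     for t1, t2 in itertools.product(layer1, layer2)
     if t1 + t2 <= BUDGET),
    key=lambda x: x[0],
)
print(f"Lowest proxy dose {best[0]:.1f} with layer1 = {best[1]} g/cm^2, layer2 = {best[2]} g/cm^2")
```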
Li, Bai; Lin, Mu; Liu, Qiao; Li, Ya; Zhou, Changjun
2015-10-01
Protein folding is a fundamental topic in molecular biology. Conventional experimental techniques for protein structure identification or protein folding recognition require strict laboratory requirements and heavy operating burdens, which have largely limited their applications. Alternatively, computer-aided techniques have been developed to optimize protein structures or to predict the protein folding process. In this paper, we utilize a 3D off-lattice model to describe the original protein folding scheme as a simplified energy-optimal numerical problem, where all types of amino acid residues are binarized into hydrophobic and hydrophilic ones. We apply a balance-evolution artificial bee colony (BE-ABC) algorithm as the minimization solver, which is featured by the adaptive adjustment of search intensity to cater for the varying needs during the entire optimization process. In this work, we establish a benchmark case set with 13 real protein sequences from the Protein Data Bank database and evaluate the convergence performance of BE-ABC algorithm through strict comparisons with several state-of-the-art ABC variants in short-term numerical experiments. Besides that, our obtained best-so-far protein structures are compared to the ones in comprehensive previous literature. This study also provides preliminary insights into how artificial intelligence techniques can be applied to reveal the dynamics of protein folding. Graphical Abstract Protein folding optimization using 3D off-lattice model and advanced optimization techniques.
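In the AB off-lattice picture, each residue is a bead labeled hydrophobic (A) or hydrophilic (B), and folding reduces to minimizing an energy that combines a backbone-bending term with Lennard-Jones-like pair interactions. The sketch below evaluates the commonly used form of that energy (the 1 / 0.5 / -0.5 coupling coefficients are the standard AB-model convention, stated here as an assumption rather than taken from this paper); an optimizer such as a BE-ABC variant would then minimize it over bead coordinates.

```python
import numpy as np

def ab_energy(coords: np.ndarray, seq: str) -> float:
    """Energy of one conformation in the AB off-lattice model.

    coords: (N, 3) bead positions; in the standard model consecutive beads sit at unit distance.
    seq:    string of 'A' (hydrophobic) and 'B' (hydrophilic) residues.
    """
    n = len(seq)
    # Backbone bending term over consecutive bond vectors
    bend = 0.0
    for i in range(1, n - 1):
        v1, v2 = coords[i] - coords[i - 1], coords[i + 1] - coords[i]
        cos_theta = np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2))
        bend += 0.25 * (1.0 - cos_theta)

    def coupling(a: str, b: str) -> float:
        # Strong attraction between two A beads, weaker for B-B, weak repulsion for mixed pairs
        if a == b:
            return 1.0 if a == "A" else 0.5
        return -0.5

    # Lennard-Jones-like term over non-adjacent bead pairs
    lj = 0.0
    for i in range(n - 2):
        for j in range(i + 2, n):
            r = np.linalg.norm(coords[i] - coords[j])
            lj += 4.0 * (r ** -12 - coupling(seq[i], seq[j]) * r ** -6)
    return bend + lj

# Tiny example: a 5-residue sequence on an arbitrary chain with unit bond lengths
seq = "ABABA"
steps = np.array([[1.0, 0.0, 0.0], [0.8, 0.6, 0.0], [0.6, 0.8, 0.0], [0.0, 0.6, 0.8]])
coords = np.vstack([[0.0, 0.0, 0.0], np.cumsum(steps, axis=0)])
print(f"AB off-lattice energy of this conformation: {ab_energy(coords, seq):.3f}")
```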
Optimizing Mass Spectrometry Analyses: A Tailored Review on the Utility of Design of Experiments.
Hecht, Elizabeth S; Oberg, Ann L; Muddiman, David C
2016-05-01
Mass spectrometry (MS) has emerged as a tool that can analyze nearly all classes of molecules, with its scope rapidly expanding in the areas of post-translational modifications, MS instrumentation, and many others. Yet integration of novel analyte preparatory and purification methods with existing or novel mass spectrometers can introduce new challenges for MS sensitivity. The mechanisms that govern detection by MS are particularly complex and interdependent, including ionization efficiency, ion suppression, and transmission. Performance of both off-line and MS methods can be optimized separately or, when appropriate, simultaneously through statistical designs, broadly referred to as "design of experiments" (DOE). The following review provides a tutorial-like guide into the selection of DOE for MS experiments, the practices for modeling and optimization of response variables, and the available software tools that support DOE implementation in any laboratory. This review comes 3 years after the latest DOE review (Hibbert DB, 2012), which provided a comprehensive overview on the types of designs available and their statistical construction. Since that time, new classes of DOE, such as the definitive screening design, have emerged and new calls have been made for mass spectrometrists to adopt the practice. Rather than exhaustively cover all possible designs, we have highlighted the three most practical DOE classes available to mass spectrometrists. This review further differentiates itself by providing expert recommendations for experimental setup and defining DOE entirely in the context of three case-studies that highlight the utility of different designs to achieve different goals. A step-by-step tutorial is also provided.
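As a concrete illustration of the simplest class of design discussed, the sketch below generates a two-level full factorial design for three MS parameters and computes main effects from synthetic responses; the factor names and numbers are illustrative, not drawn from the cited case studies.

```python
import itertools
import numpy as np

factors = ["spray_voltage", "desolvation_temp", "flow_rate"]

# Two-level full factorial design in coded units (-1 = low, +1 = high): 2^3 = 8 runs
design = np.array(list(itertools.product([-1, 1], repeat=len(factors))))

# Synthetic measured responses (e.g., analyte signal intensity) for each run
response = np.array([52.0, 61.0, 49.0, 66.0, 55.0, 70.0, 50.0, 73.0])

# Main effect of each factor: mean response at +1 minus mean response at -1
for k, name in enumerate(factors):
    effect = response[design[:, k] == 1].mean() - response[design[:, k] == -1].mean()
    print(f"Main effect of {name}: {effect:+.2f}")
```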
Design and Sampling Plan Optimization for RT-qPCR Experiments in Plants: A Case Study in Blueberry.
Die, Jose V; Roman, Belen; Flores, Fernando; Rowland, Lisa J
2016-01-01
The qPCR assay has become a routine technology in plant biotechnology and agricultural research. It is unlikely to be technically improved, but there are still challenges which center around minimizing the variability in results and transparency when reporting technical data in support of the conclusions of a study. There are a number of aspects of the pre- and post-assay workflow that contribute to variability of results. Here, through the study of the introduction of error in qPCR measurements at different stages of the workflow, we describe the most important causes of technical variability in a case study using blueberry. In this study, we found that the stage for which increasing the number of replicates would be the most beneficial depends on the tissue used. For example, we would recommend the use of more RT replicates when working with leaf tissue, while the use of more sampling (RNA extraction) replicates would be recommended when working with stems or fruits to obtain the most optimal results. The use of more qPCR replicates provides the least benefit as it is the most reproducible step. By knowing the distribution of error over an entire experiment and the costs at each step, we have developed a script to identify the optimal sampling plan within the limits of a given budget. These findings should help plant scientists improve the design of qPCR experiments and refine their laboratory practices in order to conduct qPCR assays in a more reliable-manner to produce more consistent and reproducible data.
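The idea behind a budget-constrained sampling plan can be illustrated with the standard nested-design variance formula, in which the variance of the final mean is var_sampling/n_s + var_RT/(n_s·n_RT) + var_qPCR/(n_s·n_RT·n_q). The sketch below enumerates allocations under a budget and keeps the lowest-variance plan; the variance components and per-step costs are made-up placeholders, not the blueberry data or the authors' script.

```python
import itertools

# Assumed variance components and per-replicate costs, purely illustrative
var_sampling, var_rt, var_qpcr = 0.30, 0.10, 0.02     # RNA extraction, RT, qPCR steps
cost_sampling, cost_rt, cost_qpcr = 12.0, 4.0, 1.5    # cost units per replicate
BUDGET = 120.0

def plan_variance(n_s, n_rt, n_q):
    """Variance of the mean for a nested design with n_s x n_rt x n_q replicates."""
    return (var_sampling / n_s
            + var_rt / (n_s * n_rt)
            + var_qpcr / (n_s * n_rt * n_q))

def plan_cost(n_s, n_rt, n_q):
    return n_s * (cost_sampling + n_rt * (cost_rt + n_q * cost_qpcr))

best = None
for n_s, n_rt, n_q in itertools.product(range(1, 9), repeat=3):
    if plan_cost(n_s, n_rt, n_q) <= BUDGET:
        var = plan_variance(n_s, n_rt, n_q)
        if best is None or var < best[0]:
            best = (var, n_s, n_rt, n_q)

var, n_s, n_rt, n_q = best
print(f"Optimal plan within budget: {n_s} extractions x {n_rt} RTs x {n_q} qPCR reps "
      f"(variance {var:.4f}, cost {plan_cost(n_s, n_rt, n_q):.0f})")
```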
Turner, Patricia V; Pekow, Cynthia; Vasbinder, Mary Ann; Brabb, Thea
2011-01-01
Administration of substances to laboratory animals requires careful consideration and planning to optimize delivery of the agent to the animal while minimizing potential adverse experiences from the procedure. The equipment selected to deliver substances to animals depends on the length of the study and the nature of the material being administered. This selection provides a significant opportunity for refining animal treatment. Similarly, when substances are administered as solutions or suspensions, attention should be given to selection of vehicles and methods used for preparing the solutions and suspensions. The research team, veterinarian, technical personnel, and IACUC members should be aware of reasons underlying selection of equipment for substance delivery and should consider carefully how substances will be prepared and stored prior to administration to animals. Failure to consider these factors during experimental planning may result in unintentional adverse effects on experimental animals and confounded results. PMID:22330706
DOE Office of Scientific and Technical Information (OSTI.GOV)
Srivastava, V.; Fannin, K.F.; Biljetina, R.
1986-07-01
The Institute of Gas Technology (IGT) conducted a comprehensive laboratory-scale research program to develop and optimize the anaerobic digestion process for producing methane from water hyacinth and sludge blends. This study focused on digester design and operating techniques, which gave improved methane yields and production rates over those observed using conventional digesters. The final digester concept and the operating experience were utilized to design and operate a large-scale experimental test unit (ETU) at Walt Disney World, Florida. This paper describes the novel digester design, operating techniques, and the results obtained in the laboratory. The paper also discusses a kinetic model which predicts methane yield, methane production rate, and digester effluent solids as a function of retention time. This model was successfully utilized to predict the performance of the ETU. 15 refs., 6 figs., 6 tabs.
ICRF Development for the Variable Specific Impulse Magnetoplasma Rocket
NASA Astrophysics Data System (ADS)
Ryan, P. M.; Baity, F. W.; Barber, G. C.; Carter, M. D.; Hoffman, D. J.; Jaeger, E. F.; Taylor, D. J.; Chang-Diaz, F. R.; Squire, J. P.; McCaskill, G.
1997-11-01
The feasibility of using magnetically vectored and rf-heated plasmas for space propulsion (F. R. Chang-Diaz, et al., Bull. Am. Phys. Soc., 41, 1541 (1996)) is being investigated experimentally on an asymmetric magnetic mirror device at the Advanced Space Propulsion Laboratory (ASPL), Johnson Space Center, NASA. Analysis of the antenna interaction with and the wave propagation through the dense plasma propulsion system is being studied at ORNL (Oak Ridge National Laboratory, managed by Lockheed Martin Energy Research Corp. for the U.S. Department of Energy under contract number DE-AC05-96OR22464), using antenna design codes developed for ICH systems and mirror codes developed for the EBT experiment at ORNL. The present modeling effort is directed toward the ASPL experimental device. Antenna optimization and performance, as well as the design considerations for space-qualified rf components and systems (minimizing weight while maximizing reliability), will be presented.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Moen, Christopher D.; Dedrick, Daniel E.; Pratt, Joseph William
2014-03-01
The US Department of Energy (DOE) Office of Energy Efficiency and Renewable Energy (EERE) Fuel Cell Technologies Office (FCTO) is establishing the Hydrogen Fueling Infrastructure Research and Station Technology (H2FIRST) partnership, led by the National Renewable Energy Laboratory (NREL) and Sandia National Laboratories (SNL). FCTO is establishing this partnership and the associated capabilities in support of H2USA, the public/private partnership launched in 2013. The H2FIRST partnership provides the research and technology acceleration support to enable the widespread deployment of hydrogen infrastructure for the robust fueling of light-duty fuel cell electric vehicles (FCEV). H2FIRST will focus on improving private-sector economics, safety, availability and reliability, and consumer confidence for hydrogen fueling. This whitepaper outlines the goals, scope, and activities associated with the H2FIRST partnership.
Monitoring tools of COMPASS experiment at CERN
NASA Astrophysics Data System (ADS)
Bodlak, M.; Frolov, V.; Huber, S.; Jary, V.; Konorov, I.; Levit, D.; Novy, J.; Salac, R.; Tomsa, J.; Virius, M.
2015-12-01
This paper briefly introduces the data acquisition system of the COMPASS experiment and focuses mainly on the part that is responsible for monitoring the nodes in the whole newly developed data acquisition system of this experiment. COMPASS is a high energy particle experiment with a fixed target, located at the SPS of the CERN laboratory in Geneva, Switzerland. The hardware of the data acquisition system has been upgraded to use FPGA cards that are responsible for data multiplexing and event building. The software counterpart of the system includes several processes deployed in a heterogeneous network environment. There are two processes, namely Message Logger and Message Browser, taking care of monitoring. These tools handle messages generated by nodes in the system. While the Message Logger collects and saves messages to the database, the Message Browser serves as a graphical interface over the database containing these messages. For better performance, certain database optimizations have been used. Lastly, results of performance tests are presented.
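A minimal sketch of the Message Logger / Message Browser split, assuming a simple SQLite schema (the table and column names are hypothetical, not the COMPASS DAQ database layout): one function persists messages emitted by DAQ nodes, the other queries them for display.

```python
import sqlite3
import time

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE messages (ts REAL, node TEXT, severity TEXT, body TEXT)")

def log_message(node: str, severity: str, body: str) -> None:
    """Message Logger role: persist a message emitted by a DAQ node."""
    conn.execute("INSERT INTO messages VALUES (?, ?, ?, ?)",
                 (time.time(), node, severity, body))
    conn.commit()

def browse_messages(severity: str = None, limit: int = 20):
    """Message Browser role: fetch recent messages, optionally filtered by severity."""
    query = "SELECT ts, node, severity, body FROM messages"
    params = ()
    if severity is not None:
        query += " WHERE severity = ?"
        params = (severity,)
    query += " ORDER BY ts DESC LIMIT ?"
    return conn.execute(query, params + (limit,)).fetchall()

log_message("multiplexer-03", "WARNING", "input FIFO 80% full")
log_message("event-builder-01", "INFO", "spill 1421 assembled")
for ts, node, sev, body in browse_messages(severity="WARNING"):
    print(f"{ts:.0f} [{sev}] {node}: {body}")
```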
Culturing immobilized plant cells for the TUBUL space experiments on the DELTA and 12S Missions
NASA Astrophysics Data System (ADS)
Sieberer, Björn J.; Emons, Anne Mie C.; Vos, Jan W.
2007-09-01
For the TUBUL experiments during the DELTA mission in April 2004 and the 12S mission in March/April 2006 on board the Soyuz capsule and the International Space Station, we developed a method to culture and chemically fix plant suspension culture cells. The aim of the ten-day experiment was to investigate the effect of microgravity on single plant cells. Fully automated experiment cassettes (Plunger Box Units) were developed by the Centre for Concepts in Mechatronics (Nuenen, the Netherlands). Tobacco BY-2 cells were immobilized in a semi-solid agarose matrix that was reinforced by a nylon mesh. This assembly allowed liquid medium refreshment, oxygen supply and chemical fixation, including a post-fixative wash. The method was optimized for post-flight analysis of cell structure, shape and size, cell division, and the microtubule cytoskeleton. The viability of cells in the agarose matrix was similar to that of cells grown in liquid medium under laboratory conditions; only the stationary growth phase was reached six days later.
NASA Technical Reports Server (NTRS)
Priem, Richard J.
1988-01-01
The purpose of this study is to define the requirements of commercially motivated microgravity combustion experiments and the optimal way for space station to accommodate these requirements. Representatives of commercial organizations, universities and government agencies were contacted. Interest in and needs for microgravity combustion studies are identified for commercial/industrial groups involved in fire safety with terrestrial applications, fire safety with space applications, propulsion and power, industrial burners, or pollution control. From these interests and needs experiments involving: (1) no flow with solid or liquid fuels; (2) homogeneous mixtures of fuel and air; (3) low flow with solid or liquid fuels; (4) low flow with gaseous fuel; (5) high pressure combustion; and (6) special burner systems are described and space station resource requirements for each type of experiment provided. Critical technologies involving the creation of a laboratory environment and methods for combining experimental needs into one experiment in order to obtain effective use of space station are discussed. Diagnostic techniques for monitoring combustion process parameters are identified.
Optimization of confocal laser induced fluorescence for long focal length applications
NASA Astrophysics Data System (ADS)
Jemiolo, Andrew J.; Henriquez, Miguel F.; Thompson, Derek S.; Scime, Earl E.
2017-10-01
Laser induced fluorescence (LIF) is a non-perturbative diagnostic for measuring ion and neutral particle velocities and temperatures in a plasma. The conventional method for single-photon LIF requires intersecting optical paths for light injection and collection. The multiple vacuum windows needed for such measurements are unavailable in many plasma experiments. Confocal LIF eliminates the need for perpendicular intersecting optical paths by using concentric injection and collection paths through a single window. One of the main challenges with using confocal LIF is achieving high resolution measurements at the longer focal lengths needed for many plasma experiments. We present confocal LIF measurements in HELIX, a helicon plasma experiment at West Virginia University, demonstrating spatial resolution dependence on focal length and spatial filtering. By combining aberration mitigating optics with spatial filtering, our results show high resolution measurements at focal lengths of 0.5 m, long enough to access the interiors of many laboratory plasma experiments. This work was supported by U.S. National Science Foundation Grant No. PHY-1360278.
NASA Astrophysics Data System (ADS)
Menicucci, D. F.
1986-01-01
The performance of a photovoltaic (PV) system is affected by its mounting configuration. The optimal configuration is unclear because of lack of experience and data. Sandia National Laboratories, Albuquerque (SNLA), has conducted a controlled field experiment to compare four types of the most common module mounting. The data from the experiment were used to verify the accuracy of PVFORM, a new computer program that simulates PV performance. PVFORM was then used to simulate the performance of identical PV modules on different mounting configurations at 10 sites throughout the US. This report describes the module mounting configurations, the experimental methods used, the specialized statistical techniques used in the analysis, and the final results of the effort. The module mounting configurations are rank ordered at each site according to their annual and seasonal energy production performance, and each is briefly discussed in terms of its advantages and disadvantages in various applications.
PandaX-III neutrinoless double beta decay experiment
NASA Astrophysics Data System (ADS)
Wang, Shaobo; PandaX-III Collaboration
2017-09-01
The PandaX-III experiment uses high pressure Time Projection Chambers (TPCs) to search for neutrinoless double-beta decay of Xe-136 with high energy resolution and sensitivity at the China Jin-Ping underground Laboratory II (CJPL-II). Fine-pitch Microbulk Micromegas will be used for charge amplification and readout in order to reconstruct both the energy and track of the neutrinoless double-beta decay event. In the first phase of the experiment, the detector, which contains 200 kg of 90% Xe-136 enriched gas operated at 10 bar, will be immersed in a large water tank to ensure 5 m of water shielding. For the second phase, a ton-scale experiment with multiple TPCs will be constructed to improve the detection probability and sensitivity. A 20-kg scale prototype TPC with 7 Micromegas modules has been built to optimize the design of Micromegas readout module, study the energy calibration of TPC and develop algorithm of 3D track reconstruction.
The world's first 27 T and 30 T resistive magnets
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bird, M.D.; Bole, S.; Eyssa, Y.M.
1996-07-01
The authors describe in detail a 30 Tesla, 32 mm warm bore, 15 MW resistive magnet which was put into operation at the National High Magnetic Field Laboratory in Tallahassee, FL, in March 1995. The magnet consists of three concentric axially-cooled Bitter stacks connected electrically in series. This magnet employs a substantial new development in Bitter magnet technology which allows high current densities without the usually accompanying high stresses. Details of magnet optimization, design, construction, testing and operation are presented. The authors also report on operating experience with the 27 T magnets.
Colli, Alessandra; Attenkofer, Klaus; Raghothamachar, Balaji; ...
2016-07-14
In this article, we present the first experiment to prove the capabilities of X-ray topography for the direct imaging and analysis of defects, stress, and strain affecting the cell within the laminated photovoltaic (PV) module. Cracks originating from grain boundary structures have been detected, developing along the cleavage planes of the crystal. The strain affecting the cell is clearly visualized through the bending of the metallization line images and can be easily mapped. While the recording conditions need to be optimized to maximize image contrast, this experiment demonstrates how synchrotron facilities can enable the PV industry and research community to characterize full PV modules. Appropriate development of the technique could also lead to future use of laboratory-level X-ray sources.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Panaccione, G.; Vobornik, I.; Fujii, J.
2009-04-15
We report the main characteristics of the advanced photoelectric effect experiments beamline, operational at the Elettra storage ring, featuring a fully independent double branch scheme obtained by the use of chicane undulators and able to keep polarization control in both linear and circular mode. The paper describes the novel technical solutions adopted, namely, (a) the design of a quasiperiodic undulator resulting in optimized suppression of higher harmonics over a large photon energy range (10-100 eV), (b) the thermal stability of optics under high heat load via cryocoolers, and (c) the end station interconnected setup allowing full access to off-beam and on-beam facilities and, at the same time, the integration of users' specialized sample growth chambers or modules.
The Improvement Cycle: Analyzing Our Experience
NASA Technical Reports Server (NTRS)
Pajerski, Rose; Waligora, Sharon
1996-01-01
NASA's Software Engineering Laboratory (SEL), one of the earliest pioneers in the areas of software process improvement and measurement, has had a significant impact on the software business at NASA Goddard. At the heart of the SEL's improvement program is a belief that software products can be improved by optimizing the software engineering process used to develop them and a long-term improvement strategy that facilitates small incremental improvements that accumulate into significant gains. As a result of its efforts, the SEL has incrementally reduced development costs by 60%, decreased error rates by 85%, and reduced cycle time by 25%. In this paper, we analyze the SEL's experiences on three major improvement initiatives to better understand the cyclic nature of the improvement process and to understand why some improvements take much longer than others.
BEopt - Building Energy Optimization (NREL - National Renewable Energy Laboratory). The BEopt (Building Energy Optimization) software provides capabilities to evaluate residential building designs and identify optimal… The sequential search optimization technique used by BEopt finds minimum-cost building designs at different…
Walther, Cornelia; Kellner, Martin; Berkemeyer, Matthias; Brocard, Cécile; Dürauer, Astrid
2017-10-21
Escherichia coli stores large amounts of highly pure product within inclusion bodies (IBs). To take advantage of this beneficial feature, after cell disintegration, the first step to optimal product recovery is efficient IB preparation. This step is also important in evaluating upstream optimization and process development, due to the potential impact of bioprocessing conditions on product quality and on the nanoscale properties of IBs. Proper IB preparation is often neglected, due to laboratory-scale methods requiring large amounts of materials and labor. Miniaturization and parallelization can accelerate analyses of individual processing steps and provide a deeper understanding of up- and downstream processing interdependencies. Consequently, reproducible, predictive microscale methods are in demand. In the present study, we complemented a recently established high-throughput cell disruption method with a microscale method for preparing purified IBs. This preparation provided results comparable to laboratory-scale IB processing, regarding impurity depletion, and product loss. Furthermore, with this method, we performed a "design of experiments" study to demonstrate the influence of fermentation conditions on the performance of subsequent downstream steps and product quality. We showed that this approach provided a 300-fold reduction in material consumption for each fermentation condition and a 24-fold reduction in processing time for 24 samples.
Rosebraugh, Matthew R; Widness, John A; Nalbant, Demet; Cress, Gretchen; Veng-Pedersen, Peter
2014-02-01
Preterm very-low-birth-weight (VLBW) infants weighing <1.5 kg at birth develop anemia, often requiring multiple red blood cell transfusions (RBCTx). Because laboratory blood loss is a primary cause of anemia leading to RBCTx in VLBW infants, our purpose was to simulate the extent to which RBCTx can be reduced or eliminated by reducing laboratory blood loss in combination with pharmacodynamically optimized erythropoietin (Epo) treatment. Twenty-six VLBW ventilated infants receiving RBCTx were studied during the first month of life. RBCTx simulations were based on previously published RBCTx criteria and data-driven Epo pharmacodynamic optimization of literature-derived RBC life span and blood volume data corrected for phlebotomy loss. Simulated pharmacodynamic optimization of Epo administration and reduction in phlebotomy by ≥ 55% predicted a complete elimination of RBCTx in 1.0-1.5 kg infants. In infants <1.0 kg with 100% reduction in simulated phlebotomy and optimized Epo administration, a 45% reduction in RBCTx was predicted. The mean blood volume drawn from all infants was 63 ml/kg: 33% required for analysis and 67% discarded. When reduced laboratory blood loss and optimized Epo treatment are combined, marked reductions in RBCTx in ventilated VLBW infants were predicted, particularly among those with birth weights >1.0 kg.
Demircik, Filiz; Klonoff, David; Musholt, Petra B; Ramljak, Sanja; Pfützner, Andreas
2016-10-01
Devices employing electrochemistry-based correction algorithms (EBCAs) are optimized for patient use and require special handling procedures when tested in the laboratory. This study investigated the impact of sample handling on the results of an accuracy and hematocrit interference test performed with BG*Star, iBG*Star, OneTouch Verio Pro and Accu-Chek Aviva versus YSI Stat 2300. Venous heparinized whole blood was manipulated to contain three different blood glucose concentrations (64-74, 147-163, and 313-335 mg/dL) and three different hematocrit levels (30%, 45%, and 60%). Sample preparation was done by either a very EBCA-experienced laboratory testing team (A), a group experienced with other meters but not EBCAs (B), or a team inexperienced with meter testing (C). Team A ensured physiological pO2 and specific sample handling requirements, whereas teams B and C did not consider pO2. Each sample was tested four times with each device. In a separate experiment, a different group similar to group B performed the experiment before (D1) and after (D2) appropriate sample handling training. Mean absolute deviation from YSI was calculated as a metric for all groups and devices. Mean absolute relative difference was 4.3% with team A (B: 9.2%, C: 5.2%). Team B had much higher readings and team C produced 100% of "sample composition" errors with high hematocrit levels. In the separate experiment, group D showed a result similar to group B before the training and improved significantly when considering the sample handling requirements (D1: 9.4%, D2: 4.5%, P < 0.05). Laboratory performance testing of EBCA devices should only be performed by trained staff considering specific sample handling requirements. The results suggest that healthcare centers should evaluate EBCA-based devices with capillary blood from patients in accordance with the instructions for use to achieve reliable results.
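Mean absolute relative difference (MARD) against the laboratory reference is the headline accuracy statistic here; a minimal sketch of its computation, with made-up meter/reference pairs rather than data from this study:

```python
import numpy as np

# Paired measurements: glucose meter readings vs. YSI reference values (mg/dL), illustrative only
meter = np.array([68.0, 72.0, 151.0, 160.0, 320.0, 330.0])
reference = np.array([70.0, 74.0, 155.0, 150.0, 335.0, 318.0])

# Mean absolute relative difference, expressed as a percentage of the reference value
mard = 100.0 * np.mean(np.abs(meter - reference) / reference)
print(f"MARD = {mard:.1f} %")
```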
NASA Astrophysics Data System (ADS)
Maher, Pamela A.
Technology in college classrooms has gone from being an enhancement to the learning experience to being something expected by both instructors and students. This design-based research investigation takes technology one step further, putting the tools used to teach directly in the hands of students. The study examined the affordances and constraints of two simulation tools for use in introductory astronomy courses. The variety of experiences participants had using the two tools, a virtual reality headset and a fulldome immersive planetarium simulation, to manipulate a lunar surface flyby was identified using a multi-method research approach with N = 67 participants. Participants were recruited from classes of students taking astronomy over one academic year at a two-year college. Participants manipulated a lunar flyby using a virtual reality headset and a motion sensor device in the college fulldome planetarium. Data were collected in the form of two post-treatment questionnaires using Likert-type scales and one small group interview. The small group interview was intended to elicit the various experiences participants had using the tools. Responses were analyzed quantitatively for optimal flyby speed and qualitatively for salient themes, using data reduction informed by a methodological framework of phenomenography. Analysis of the Immersion Questionnaire and Simulator Sickness Questionnaire data, performed using SPSS software, determined the optimal flyby speed for college students manipulating the Moon to be 0.04 times the radius of the Earth (3,959 miles) per second, or about 160 miles per second. A variety of participant experiences were revealed using MAXQDA software to code the positive and negative remarks participants made when engaged in the use of each tool. Both tools offer potential to actively engage students with astronomy content in college lecture and laboratory courses.
Co-Optimization of Fuels and Engines | Transportation Research | NREL
This NREL web page describes the laboratory's work with eight other national laboratories and industry on the Co-Optimization of Fuels & Engines (Co-Optima) initiative and summarizes Co-Optima research activities and accomplishments.
NASA Technical Reports Server (NTRS)
Patten, William Neff
1989-01-01
There is an evident need to discover a means of establishing reliable, implementable controls for systems that are plagued by nonlinear and, or uncertain, model dynamics. The development of a generic controller design tool for tough-to-control systems is reported. The method utilizes a moving grid, time infinite element based solution of the necessary conditions that describe an optimal controller for a system. The technique produces a discrete feedback controller. Real time laboratory experiments are now being conducted to demonstrate the viability of the method. The algorithm that results is being implemented in a microprocessor environment. Critical computational tasks are accomplished using a low cost, on-board, multiprocessor (INMOS T800 Transputers) and parallel processing. Progress to date validates the methodology presented. Applications of the technique to the control of highly flexible robotic appendages are suggested.
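The report's moving-grid, time-element solver is not reproduced here, but the end product it describes is a discrete feedback law of the form u = -Kx. As a hedged illustration of that general idea only, and not of the paper's method, the sketch below computes such a gain for a hypothetical discretized double-integrator plant by iterating the standard discrete Riccati equation; the plant matrices, cost weights, and step size are assumptions.

```python
import numpy as np

def dlqr_gain(A, B, Q, R, iters=500):
    """Discrete-time LQR gain via fixed-point iteration of the Riccati equation.

    Returns K such that u = -K x minimizes sum(x'Qx + u'Ru) for x_{k+1} = A x_k + B u_k.
    """
    P = Q.copy()
    for _ in range(iters):
        K = np.linalg.solve(R + B.T @ P @ B, B.T @ P @ A)
        P = Q + A.T @ P @ (A - B @ K)
    return K

# Hypothetical double-integrator plant discretized with step dt
dt = 0.05
A = np.array([[1.0, dt], [0.0, 1.0]])
B = np.array([[0.5 * dt**2], [dt]])
Q = np.diag([10.0, 1.0])   # state weighting (assumed)
R = np.array([[0.1]])      # control weighting (assumed)
K = dlqr_gain(A, B, Q, R)
print("feedback gain K =", K)
```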
Optical Sensor/Actuator Locations for Active Structural Acoustic Control
NASA Technical Reports Server (NTRS)
Padula, Sharon L.; Palumbo, Daniel L.; Kincaid, Rex K.
1998-01-01
Researchers at NASA Langley Research Center have extensive experience using active structural acoustic control (ASAC) for aircraft interior noise reduction. One aspect of ASAC involves the selection of optimum locations for microphone sensors and force actuators. This paper explains the importance of sensor/actuator selection, reviews optimization techniques, and summarizes experimental and numerical results. Three combinatorial optimization problems are described. Two involve the determination of the number and position of piezoelectric actuators, and the other involves the determination of the number and location of the sensors. For each case, a solution method is suggested, and typical results are examined. The first case, a simplified problem with simulated data, is used to illustrate the method. The second and third cases are more representative of the potential of the method and use measured data. The three case studies and laboratory test results establish the usefulness of the numerical methods.
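The abstract does not give the optimization algorithms themselves, so the following is only a generic sketch of the kind of combinatorial selection problem it describes: greedily choosing a small set of actuator locations that best cancels a disturbance at the microphone sensors in a least-squares sense. The transfer matrix, disturbance vector, and subset size are invented for the example and are not the NASA Langley data.

```python
import numpy as np

def greedy_actuator_selection(H, p, k):
    """Greedily pick k actuator columns of H that best cancel disturbance p.

    At each step, the candidate whose addition gives the smallest least-squares
    residual ||p + H_S u|| (u chosen optimally for the current subset S) is kept.
    """
    selected = []
    remaining = list(range(H.shape[1]))
    for _ in range(k):
        best_j, best_res = None, np.inf
        for j in remaining:
            cols = selected + [j]
            u, *_ = np.linalg.lstsq(H[:, cols], -p, rcond=None)
            res = np.linalg.norm(p + H[:, cols] @ u)
            if res < best_res:
                best_j, best_res = j, res
        selected.append(best_j)
        remaining.remove(best_j)
    return selected, best_res

# Hypothetical transfer matrix: 12 microphone sensors x 20 candidate actuators
rng = np.random.default_rng(0)
H = rng.normal(size=(12, 20))
p = rng.normal(size=12)            # disturbance pressure at the microphones
locs, residual = greedy_actuator_selection(H, p, k=4)
print("chosen actuators:", locs, "residual:", round(float(residual), 3))
```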
Mazzitelli, S; Tosi, A; Balestra, C; Nastruzzi, C; Luca, G; Mancuso, F; Calafiore, R; Calvitti, M
2008-09-01
The optimization, through a Design of Experiments (DoE) approach, of a microencapsulation procedure for isolated neonatal porcine islets (NPI) is described. The applied method is based on the generation of monodisperse droplets by a vibrational nozzle. An alginate/polyornithine encapsulation procedure, developed and validated in our laboratory for almost a decade, was used to encapsulate the pancreatic islets. We analyzed different experimental parameters, including the frequency of vibration, the amplitude of vibration, the polymer pumping rate, and the distance between the nozzle and the gelling bath. We produced calcium-alginate gel microbeads with excellent morphological characteristics as well as a very narrow size distribution. The automatically produced microcapsules did not alter the morphology, viability, or functional properties of the enveloped NPI. The optimization of this automatic procedure may provide a novel approach to obtaining a large number of batches possibly suitable for large-scale production of immunoisolated NPI for in vivo cell transplantation procedures in humans.
Digital PCR Modeling for Maximal Sensitivity, Dynamic Range and Measurement Precision
Majumdar, Nivedita; Wessel, Thomas; Marks, Jeffrey
2015-01-01
The great promise of digital PCR is the potential for unparalleled precision enabling accurate measurements for genetic quantification. A challenge associated with digital PCR experiments, when testing unknown samples, is to perform experiments at dilutions allowing the detection of one or more targets of interest at a desired level of precision. While theory states that optimal precision (Po) is achieved by targeting ~1.59 mean copies per partition (λ), and that dynamic range (R) includes the space spanning one positive (λL) to one negative (λU) result from the total number of partitions (n), these results are tempered for the practitioner seeking to construct digital PCR experiments in the laboratory. A mathematical framework is presented elucidating the relationships between precision, dynamic range, number of partitions, interrogated volume, and sensitivity in digital PCR. The impact that false reaction calls and volumetric variation have on sensitivity and precision is next considered. The resultant effects on sensitivity and precision are established via Monte Carlo simulations reflecting the real-world likelihood of encountering such scenarios in the laboratory. The simulations provide insight to the practitioner on how to adapt experimental loading concentrations to counteract any one of these conditions. The framework is augmented with a method of extending the dynamic range of digital PCR, with and without increasing n, via the use of dilutions. An example experiment demonstrating the capabilities of the framework is presented enabling detection across 3.33 logs of starting copy concentration. PMID:25806524
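The Poisson relationships the abstract refers to can be sketched numerically. The code below is a minimal illustration, not the authors' framework: it estimates the mean copies per partition (λ) from the fraction of positive partitions and, via a delta-method approximation of the binomial counting error, shows that relative precision is best near λ ≈ 1.59. The partition count and example readout are made-up values.

```python
import numpy as np

def lambda_hat(positives, n):
    """Estimate mean copies per partition from the fraction of positive partitions."""
    p = positives / n
    return -np.log(1.0 - p)

def relative_precision(lam, n):
    """Approximate CV of the lambda estimate (delta method on binomial counts)."""
    return np.sqrt(np.exp(lam) - 1.0) / (lam * np.sqrt(n))

n = 20000                                  # number of partitions (assumed, chip dependent)
lams = np.linspace(0.05, 6.0, 1200)
best = lams[np.argmin(relative_precision(lams, n))]
print(f"precision is best near lambda = {best:.2f} copies/partition")   # ~1.59

# Example readout: 12,000 positive partitions out of 20,000 (invented numbers)
lam = lambda_hat(12000, n)
print(f"estimated lambda = {lam:.3f}, CV = {100 * relative_precision(lam, n):.2f}%")
```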
NASA Technical Reports Server (NTRS)
Peters, Mark; Boisvert, Ben; Escala, Diego
2009-01-01
Explicit integration of aviation weather forecasts with the National Airspace System (NAS) structure is needed to improve the development and execution of operationally effective weather impact mitigation plans and has become increasingly important due to NAS congestion and associated increases in delay. This article considers several contemporary weather-air traffic management (ATM) integration applications: the use of probabilistic forecasts of visibility at San Francisco, the Route Availability Planning Tool to facilitate departures from the New York airports during thunderstorms, the estimation of en route capacity in convective weather, and the application of mixed-integer optimization techniques to air traffic management when the en route and terminal capacities are varying with time because of convective weather impacts. Our operational experience at San Francisco and New York, coupled with very promising initial results of traffic flow optimizations, suggests that weather-ATM integrated systems warrant significant research and development investment. However, they will need to be refined through rapid prototyping at facilities with supportive operational users. We have discussed key elements of an emerging aviation weather research area: the explicit integration of aviation weather forecasts with NAS structure to improve the effectiveness and timeliness of weather impact mitigation plans. Our insights are based on operational experiences with Lincoln Laboratory-developed integrated weather sensing and processing systems, and derivative early prototypes of explicit ATM decision support tools such as the RAPT in New York City. The technical components of this effort involve improving meteorological forecast skill, tailoring the forecast outputs to the problem of estimating airspace impacts, developing models to quantify airspace impacts, and prototyping automated tools that assist in the development of objective broad-area ATM strategies, given probabilistic weather forecasts. Lincoln Laboratory studies and prototype demonstrations in this area are helping to define the weather-assimilated decision-making system that is envisioned as a key capability for the multi-agency Next Generation Air Transportation System [1]. The Laboratory's work in this area has involved continuing, operations-based evolution of both weather forecasts and models for weather impacts on the NAS. Our experience has been that the development of usable ATM technologies that address weather impacts must proceed via rapid prototyping at facilities whose users are highly motivated to participate in system evolution.
The External Quality Assessment Scheme (EQAS): Experiences of a medium sized accredited laboratory.
Bhat, Vivek; Chavan, Preeti; Naresh, Chital; Poladia, Pratik
2015-06-15
We present our experiences with EQAS, analyze the result discrepancies, review the corrective actions, and put forth strategies for risk identification and prevention of potential errors in a medical laboratory. For hematology, EQAS samples (blood, peripheral and reticulocyte smears) were received quarterly each year. All blood samples were processed on an HMX hematology analyzer by Beckman-Coulter. For clinical chemistry, lyophilized samples were received and processed on Siemens Dimension Xpand and RXL analyzers. For microbiology, EQAS samples were received quarterly each year as lyophilized strains along with smears and serological samples. In hematology, no outliers were noted for reticulocyte and peripheral smear examination; only one outlier was noted for CBC. In clinical chemistry, outliers (SDI ≥ 2) were noted for 23 parameters in 7 samples out of a total of 36 samples (756 parameters) processed. Thirteen of these were attributed to random error, three to transcriptional error, and seven to systematic error. In microbiology, one discrepancy was noted in isolate identification and in the grading of smears for AFB by Ziehl-Neelsen stain. EQAS, along with IQC, is a very important tool for maintaining optimal quality of services. Copyright © 2015 Elsevier B.V. All rights reserved.
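The SDI ≥ 2 outlier criterion used above is a simple calculation: the number of peer-group standard deviations by which a laboratory's result deviates from the peer-group mean. A minimal sketch follows; the peer-group mean and SD are made-up values, not figures from this laboratory.

```python
def sdi(lab_result, peer_mean, peer_sd):
    """Standard deviation index: peer-group SDs between the lab result and the peer mean."""
    return (lab_result - peer_mean) / peer_sd

# Hypothetical glucose EQAS sample: peer mean 98 mg/dL, peer SD 4 mg/dL
result = 107.0
score = sdi(result, 98.0, 4.0)
print(f"SDI = {score:.2f} -> {'outlier (|SDI| >= 2)' if abs(score) >= 2 else 'acceptable'}")
```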
Panorama: A Targeted Proteomics Knowledge Base
2015-01-01
Panorama is a web application for storing, sharing, analyzing, and reusing targeted assays created and refined with Skyline [1], an increasingly popular Windows client software tool for targeted proteomics experiments. Panorama allows laboratories to store and organize curated results contained in Skyline documents with fine-grained permissions, which facilitates distributed collaboration and secure sharing of published and unpublished data via a web-browser interface. It is fully integrated with the Skyline workflow and supports publishing a document directly to a Panorama server from the Skyline user interface. Panorama captures the complete Skyline document information content in a relational database schema. Curated results published to Panorama can be aggregated and exported as chromatogram libraries. These libraries can be used in Skyline to pick optimal targets in new experiments and to validate peak identification of target peptides. Panorama is open-source and freely available. It is distributed as part of LabKey Server [2], an open source biomedical research data management system. Laboratories and organizations can set up Panorama locally by downloading and installing the software on their own servers. They can also request freely hosted projects on https://panoramaweb.org, a Panorama server maintained by the Department of Genome Sciences at the University of Washington. PMID:25102069
Young, William F; Stanson, Anthony W
2009-01-01
Adrenal venous sampling (AVS) is the criterion standard to distinguish between unilateral and bilateral adrenal disease in patients with primary aldosteronism. The keys to successful AVS include appropriate patient selection, careful patient preparation, focused technical expertise, a defined protocol, and accurate data interpretation. The use of AVS should be based on patient preferences, patient age, clinical comorbidities, and the clinical probability of finding an aldosterone-producing adenoma. AVS is optimally performed in the fasting state in the morning. AVS is an intricate procedure because the right adrenal vein is small and may be difficult to locate; the success rate depends on the proficiency of the angiographer. The key factors that determine successful catheterization of both adrenal veins are experience, dedication, and repetition. With experience, and by concentrating the expertise in 1 or 2 radiologists at a referral centre, the AVS success rate can be as high as 96%. A centre-specific, written protocol is mandatory. The protocol should be developed by an interested group of endocrinologists, radiologists, and laboratory personnel. Safeguards should be in place to prevent mislabelling of the blood tubes in the radiology suite and to prevent sample mix-up in the laboratory.
Myneni, Sahiti; Patel, Vimla L; Bova, G Steven; Wang, Jian; Ackerman, Christopher F; Berlinicke, Cynthia A; Chen, Steve H; Lindvall, Mikael; Zack, Donald J
2016-04-01
This paper describes a distributed collaborative effort between industry and academia to systematize data management in an academic biomedical laboratory. The heterogeneous and voluminous nature of research data created in biomedical laboratories makes information management difficult and research unproductive. One such collaborative effort was evaluated over a period of four years using data collection methods including ethnographic observations, semi-structured interviews, web-based surveys, progress reports, conference call summaries, and face-to-face group discussions. Data were analyzed using qualitative methods to (1) characterize specific problems faced by biomedical researchers with traditional information management practices, (2) identify intervention areas for introducing a new research information management system called Labmatrix, and (3) evaluate and delineate important general collaboration (intervention) characteristics that can optimize the outcomes of an implementation process in biomedical laboratories. The results emphasize the importance of end-user perseverance, human-centric interoperability evaluation, and demonstration of the return on investment of the effort and time of laboratory members and industry personnel for the success of the implementation process. In addition, there is an intrinsic learning component associated with the implementation process of an information management system. The technology transfer experience in a complex environment such as the biomedical laboratory can be eased by the use of information systems that support human and cognitive interoperability. Such informatics features can also contribute to successful collaboration and, hopefully, to scientific productivity. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
Optimization of a sample processing protocol for recovery of Bacillus anthracis spores from soil
Silvestri, Erin E.; Feldhake, David; Griffin, Dale; Lisle, John T.; Nichols, Tonya L.; Shah, Sanjiv; Pemberton, A; Schaefer III, Frank W
2016-01-01
Following a release of Bacillus anthracis spores into the environment, there is a potential for lasting environmental contamination in soils. There is therefore a need for detection protocols for B. anthracis in environmental matrices. However, identification of B. anthracis within a soil is a difficult task. Processing soil samples helps to remove debris, chemical components, and biological impurities that can interfere with microbiological detection. This study aimed to optimize a previously used indirect processing protocol, which included a series of washing and centrifugation steps. Optimization of the protocol included identifying an ideal extraction diluent and varying the number of wash steps, the initial centrifugation speed, and the sonication and shaking mechanisms. The optimized protocol was demonstrated at two laboratories in order to evaluate the recovery of spores from loamy and sandy soils. The new protocol demonstrated an improved limit of detection for loamy and sandy soils over the non-optimized protocol, with an approximate matrix limit of detection of 14 spores/g of soil. There were no significant differences overall between the two laboratories for either soil type, suggesting that the processing protocol will be robust enough to use at multiple laboratories while achieving comparable recoveries.
Experiences with the hydraulic design of the high specific speed Francis turbine
NASA Astrophysics Data System (ADS)
Obrovsky, J.; Zouhar, J.
2014-03-01
The high specific speed Francis turbine is still a suitable alternative for the refurbishment of older hydro power plants with lower heads and worse cavitation conditions. This paper introduces the design process for this kind of turbine, together with a comparison of results from homologous model tests performed in the hydraulic laboratory of ČKD Blansko Engineering. The turbine runner was designed using an optimization algorithm while considering the high specific speed hydraulic profile; that is, the hydraulic profiles of the spiral case, the distributor and the draft tube were taken from a Kaplan turbine. The optimization was run as an automatic cycle based on both a simplex optimization method and a genetic algorithm. The number of blades is shown to be the parameter that changes the resulting specific speed of the turbine between ns = 425 and 455, together with the cavitation characteristics. Minimizing cavitation on the blade surface as well as on the inlet edge of the runner blade was taken into account during the design process. The results of the CFD analyses as well as the model tests are presented in the paper.
Using the Computer as a Laboratory Instrument.
ERIC Educational Resources Information Center
Collings, Peter J.; Greenslade, Thomas B., Jr.
1989-01-01
Reports experiences during a two-year period in introducing the computer to the laboratory and students to the computer as a laboratory instrument. Describes a working philosophy, data acquisition system, and experiments. Summarizes the laboratory procedures of nine experiments, covering mechanics, heat, electromagnetism, and optics. (YP)
Exploratory study of the acceptance of two individual practical classes with remote labs
NASA Astrophysics Data System (ADS)
Tirado-Morueta, Ramón; Sánchez-Herrera, Reyes; Márquez-Sánchez, Marco A.; Mejías-Borrero, Andrés; Andujar-Márquez, José Manuel
2018-03-01
Remote lab experiences are proliferating in higher education, although there are still few studies that manage to build a theoretical framework for the educational assessment and design of this technology. In order to explore the extent to which facilitating proximity to the laboratory and autonomy in the experiment makes remote laboratories a technology accepted by students, two different remote labs with similar educational conditions were used. A sample of 98 undergraduate students from a degree course in Energy Engineering was used for this study; 57 of these students ran experiments in an electrical machines laboratory and 41 in a photovoltaic systems laboratory. The data suggest that, under conditions that facilitate proximity to the laboratory and autonomy in carrying out the experiment, the experience was positively valued by students in both laboratories. The data also suggest that the type of laboratory and experiment influences the usability (autonomy and lab proximity) perceived by students.
NASA Technical Reports Server (NTRS)
Eaton, L. R.; Greco, E. V.
1973-01-01
The experiment program definition and preliminary laboratory concept studies on the zero-G cloud physics laboratory are reported. This program involves the definition and development of an atmospheric cloud physics laboratory and the selection and delineation of a set of candidate experiments that must utilize the unique environment of zero gravity or near zero gravity.
NASA Astrophysics Data System (ADS)
Lo, Wei-Cheng; Sposito, Garrison; Huang, Yu-Han
2012-03-01
Seismic stimulation, the application of low-frequency stress-pulsing to the boundary of a porous medium containing water and a non-aqueous fluid to enhance the removal of the latter, shows great promise for both contaminated groundwater remediation and enhanced oil recovery, but theory to elucidate the underlying mechanisms lags significantly behind the progress achieved in experimental research. We address this conceptual lacuna by formulating a boundary-value problem to describe pore-pressure pulsing at seismic frequencies that is based on the continuum theory of poroelasticity for an elastic porous medium permeated by two immiscible fluids. An exact analytical solution is presented that is applied numerically using elasticity parameters and hydraulic data relevant to recent proof-of-principle laboratory experiments investigating the stimulation-induced mobilization of trichloroethene (TCE) in water flowing through a compressed sand core. The numerical results indicated that significant stimulation-induced increases of the TCE concentration in the effluent can be expected from pore-pressure pulsing in the frequency range of 25-100 Hz, which is in good agreement with what was observed in the laboratory experiments. Sensitivity analysis of our numerical results revealed that the TCE concentration in the effluent increases with the porous medium framework compressibility and the pulsing pressure. Increasing compressibility also leads to an optimal stimulation response at lower frequencies, whereas changing the pulsing pressure does not affect the optimal stimulation frequency. Within the context of our model, the dominant physical cause for the enhancement of non-aqueous fluid mobility by seismic stimulation is the dilatory motion of the porous medium, in which the solid and fluid phases undergo opposite displacements, resulting in stress-induced changes of the pore volume.
JPL control/structure interaction test bed real-time control computer architecture
NASA Technical Reports Server (NTRS)
Briggs, Hugh C.
1989-01-01
The Control/Structure Interaction Program is a technology development program for spacecraft that exhibit interactions between the control system and structural dynamics. The program objectives include the development of new design concepts, such as active structures, and new tools, such as a combined structure and control optimization algorithm, and their verification in ground and possibly flight tests. A focus mission spacecraft was designed based upon a space interferometer and is the basis for the design of the ground test article. The ground test bed objectives include verification of the spacecraft design concepts, the active structure elements, and certain design tools such as the new combined structures and controls optimization tool. In anticipation of CSI technology flight experiments, the test bed control electronics must emulate the computational capacity and control architectures of space-qualifiable systems as well as the command and control networks that will be used to connect investigators with the flight experiment hardware. The test bed facility electronics were functionally partitioned into three units: a laboratory data acquisition system for structural parameter identification and performance verification; an experiment supervisory computer to oversee the experiment, monitor the environmental parameters, and perform data logging; and a multilevel real-time control computing system. The design of the test bed electronics is presented along with hardware and software component descriptions. The system should break new ground in experimental control electronics and is of interest to anyone working on the verification of control concepts for large structures.
NASA Technical Reports Server (NTRS)
DeYoung, J. A.; McKinley, A.; Davis, J. A.; Hetzel, P.; Bauch, A.
1996-01-01
Eight laboratories are participating in an international two-way satellite time and frequency transfer (TWSTFT) experiment. Regular time and frequency transfers have been performed over a period of almost two years, including both European and transatlantic time transfers. The performance of the regular TWSTFT sessions over an extended period has demonstrated conclusively the usefulness of the TWSTFT method for routine international time and frequency comparisons. Regular measurements are performed three times per week, resulting in a regular but unevenly spaced data set. A method is presented that allows estimates of the values of σ_y(τ) to be formed from these data. In order to maximize the efficient use of paid satellite time, an investigation to determine the optimal length of a single TWSTFT session is presented. The optimal experiment length is determined by evaluating how long white phase modulation (PM) instabilities remain the dominant noise source during the typical 300-second sampling times currently used. A detailed investigation of the frequency transfers realized via the transatlantic TWSTFT links UTC(USNO)-UTC(NPL), UTC(USNO)-UTC(PTB), and UTC(PTB)-UTC(NPL) is presented. The investigation focuses on the frequency instabilities realized, a three-cornered-hat resolution of the σ_y(τ) values, and a comparison of the transatlantic and inter-European determinations of UTC(PTB)-UTC(NPL). Future directions of this TWSTFT experiment are outlined.
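The three-cornered-hat separation mentioned above follows directly from the pairwise (Allan) variances of the three link comparisons, under the assumption that the noise in the three laboratories is uncorrelated. The sketch below is a generic illustration with invented variance values, not data from this experiment.

```python
import numpy as np

def three_cornered_hat(var_ab, var_ac, var_bc):
    """Separate individual variances from the three pairwise (Allan) variances.

    var_xy is the variance of the comparison between clocks/links X and Y;
    the solution assumes the three underlying noise processes are uncorrelated.
    """
    var_a = 0.5 * (var_ab + var_ac - var_bc)
    var_b = 0.5 * (var_ab + var_bc - var_ac)
    var_c = 0.5 * (var_ac + var_bc - var_ab)
    return var_a, var_b, var_c

# Hypothetical pairwise Allan variances at one averaging time tau
sab, sac, sbc = 2.5e-29, 3.1e-29, 2.0e-29
va, vb, vc = three_cornered_hat(sab, sac, sbc)
# Report deviations; a negative variance (possible with noisy inputs) is flagged as NaN
print([np.sqrt(v) if v > 0 else float("nan") for v in (va, vb, vc)])
```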
Chang, Yaw-Jen; Chang, Cheng-Hao
2016-06-01
Based on the principle of immobilized metal affinity chromatography (IMAC), it has been found that a Ni-Co alloy-coated protein chip is able to immobilize functional proteins with a His-tag attached. In this study, an intelligent computational approach was developed to improve the performance and repeatability of a Ni-Co alloy-coated protein chip. The approach started from a set of L18 orthogonal-array experiments. Based on the experimental data, a fabrication process model of the Ni-Co protein chip was established using an artificial neural network, and an optimal fabrication condition was then obtained using the Taguchi genetic algorithm. The result was validated experimentally and compared with a nitrocellulose chip. The experimental outcomes revealed that the Ni-Co alloy-coated chip fabricated using the proposed approach had the best performance and repeatability compared with the Ni-Co chips of the L18 orthogonal array design and the nitrocellulose chip. Moreover, the low fluorescent background of the chip surface gives more precise fluorescence detection. Based on a small number of experiments, the proposed intelligent computational approach can significantly reduce experimental cost and improve product quality. © 2015 Society for Laboratory Automation and Screening.
Optimal background matching camouflage.
Michalis, Constantine; Scott-Samuel, Nicholas E; Gibson, David P; Cuthill, Innes C
2017-07-12
Background matching is the most familiar and widespread camouflage strategy: avoiding detection by having a similar colour and pattern to the background. Optimizing background matching is straightforward in a homogeneous environment, or when the habitat has very distinct sub-types and there is divergent selection leading to polymorphism. However, most backgrounds have continuous variation in colour and texture, so what is the best solution? Not all samples of the background are likely to be equally inconspicuous, and laboratory experiments on birds and humans support this view. Theory suggests that the most probable background sample (in the statistical sense), at the size of the prey, would, on average, be the most cryptic. We present an analysis, based on realistic assumptions about low-level vision, that estimates the distribution of background colours and visual textures, and predicts the best camouflage. We present data from a field experiment that tests and supports our predictions, using artificial moth-like targets under bird predation. Additionally, we present analogous data for humans, under tightly controlled viewing conditions, searching for targets on a computer screen. These data show that, in the absence of predator learning, the best single camouflage pattern for heterogeneous backgrounds is the most probable sample. © 2017 The Authors.
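The paper's central claim, that the most probable background sample (in the statistical sense) tends to be the most cryptic, can be illustrated with a toy density estimate over background patches. The sketch below is not the authors' vision-based analysis: it uses crude colour/contrast features, a Gaussian kernel density estimate, and a synthetic background purely to show the idea of picking the modal patch.

```python
import numpy as np
from scipy.stats import gaussian_kde

def most_probable_patch(image, patch=16, n_samples=2000, rng=None):
    """Pick the patch whose simple colour/texture features are most typical
    of the background, i.e. the mode of the estimated feature density."""
    rng = rng if rng is not None else np.random.default_rng(0)
    h, w, _ = image.shape
    feats, coords = [], []
    for _ in range(n_samples):
        y = rng.integers(0, h - patch)
        x = rng.integers(0, w - patch)
        p = image[y:y + patch, x:x + patch]
        # crude features: mean per channel plus overall contrast (std)
        feats.append(np.r_[p.reshape(-1, 3).mean(axis=0), p.std()])
        coords.append((y, x))
    feats = np.array(feats).T                # shape (n_features, n_samples)
    density = gaussian_kde(feats)(feats)     # density of each sample's features
    best = int(np.argmax(density))
    return coords[best], feats[:, best]

# Synthetic "bark-like" background just to exercise the function
rng = np.random.default_rng(1)
bg = rng.normal(0.5, 0.1, size=(256, 256, 3)).clip(0, 1)
(y, x), f = most_probable_patch(bg, rng=rng)
print("most probable patch at", (y, x), "features", np.round(f, 3))
```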
DOE Office of Scientific and Technical Information (OSTI.GOV)
Waisman, E. M.; Reisman, D. B.; Stoltzfus, B. S.
2016-06-15
The Thor pulsed power generator is being developed at Sandia National Laboratories. The design consists of up to 288 decoupled and transit time isolated capacitor-switch units, called “bricks,” that can be individually triggered to achieve a high degree of pulse tailoring for magnetically driven isentropic compression experiments (ICE) [D. B. Reisman et al., Phys. Rev. Spec. Top.–Accel. Beams 18, 090401 (2015)]. The connecting transmission lines are impedance matched to the bricks, allowing the capacitor energy to be efficiently delivered to an ICE strip-line load with peak pressures of over 100 GPa. Thor will drive experiments to explore equation of state, material strength, and phase transition properties of a wide variety of materials. We present an optimization process for producing tailored current pulses, a requirement for many material studies, on the Thor generator. This technique, which is unique to the novel “current-adder” architecture used by Thor, entirely avoids the iterative use of complex circuit models to converge to the desired electrical pulse. We begin with magnetohydrodynamic simulations for a given material to determine its time dependent pressure and thus the desired strip-line load current and voltage. Because the bricks are connected to a central power flow section through transit-time isolated coaxial cables of constant impedance, the brick forward-going pulses are independent of each other. We observe that the desired equivalent forward-going current driving the pulse must be equal to the sum of the individual brick forward-going currents. We find a set of optimal brick delay times by requiring that the L2 norm of the difference between the brick-sum current and the desired forward-going current be a minimum. We describe the optimization procedure for the Thor design and show results for various materials of interest.
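The delay-finding step lends itself to a compact numerical sketch: sum time-shifted single-brick forward-going waveforms and minimize the L2 misfit to a target current over the trigger delays. The waveform shape, target pulse, brick count, and optimizer settings below are all invented for illustration and are not Thor circuit values or the authors' procedure.

```python
import numpy as np
from scipy.optimize import minimize

t = np.linspace(0.0, 4e-6, 800)                       # time base, seconds

def brick_current(t, delay, i_peak=200e3, tau=0.8e-6, period=2.4e-6):
    """Toy forward-going current of one brick: damped sine starting at `delay`."""
    s = np.clip(t - delay, 0.0, None)
    return i_peak * np.exp(-s / tau) * np.sin(2 * np.pi * s / period) * (t >= delay)

def summed_current(delays):
    return sum(brick_current(t, d) for d in delays)

# Toy target: a ramped pulse we would like the brick sum to reproduce
target = 1.2e6 * np.clip(t / 2.5e-6, 0, 1) ** 2

def misfit(delays):
    # L2 norm of (sum of brick forward-going currents) minus (desired current)
    return np.linalg.norm(summed_current(delays) - target)

n_bricks = 8
x0 = np.linspace(0.0, 1.5e-6, n_bricks)               # initial guess for trigger delays
res = minimize(misfit, x0, method="Nelder-Mead",
               options={"maxiter": 5000, "xatol": 1e-9, "fatol": 1e3})
print("optimized delays (ns):", np.round(res.x * 1e9, 1))
print("residual L2 misfit:", f"{res.fun:.3e}")
```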
Laboratory automation in clinical bacteriology: what system to choose?
Croxatto, A; Prod'hom, G; Faverjon, F; Rochais, Y; Greub, G
2016-03-01
Automation was introduced many years ago in several diagnostic disciplines such as chemistry, haematology and molecular biology. The first laboratory automation system for clinical bacteriology was released in 2006, and it rapidly proved its value by increasing productivity, allowing a continuous increase in sample volumes despite limited budgets and personnel shortages. Today, two major manufacturers, BD Kiestra and Copan, are commercializing partial or complete laboratory automation systems for bacteriology. The laboratory automation systems are rapidly evolving to provide improved hardware and software solutions to optimize laboratory efficiency. However, the complex parameters of the laboratory and automation systems must be considered to determine the best system for each given laboratory. We address several topics on laboratory automation that may help clinical bacteriologists to understand the particularities and operative modalities of the different systems. We present (a) a comparison of the engineering and technical features of the various elements composing the two different automated systems currently available, (b) the system workflows of partial and complete laboratory automation, which define the basis for laboratory reorganization required to optimize system efficiency, (c) the concept of digital imaging and telebacteriology, (d) the connectivity of laboratory automation to the laboratory information system, (e) the general advantages and disadvantages as well as the expected impacts provided by laboratory automation and (f) the laboratory data required to conduct a workflow assessment to determine the best configuration of an automated system for the laboratory activities and specificities. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.
Longenecker, R J; Galazyuk, A V
2012-11-16
Recently, prepulse inhibition of the acoustic startle reflex (ASR) has become a popular technique for tinnitus assessment in laboratory animals. This method confers a significant advantage over the previously used, time-consuming behavioral approaches utilizing basic mechanisms of conditioning. Although this technique has been successfully used to assess tinnitus in different laboratory animals, many of the finer details of this methodology have not been described in enough detail to be replicated, yet they are critical for tinnitus assessment. Here we provide a detailed description of key procedures and methodological issues to guide newcomers through the process of learning to correctly apply gap detection techniques for tinnitus assessment in laboratory animals. The major categories of these issues include: refinement of hardware for best performance, optimization of stimulus parameters, behavioral considerations, and identification of optimal strategies for data analysis. This article is part of a Special Issue entitled: Tinnitus Neuroscience. Copyright © 2012. Published by Elsevier B.V.
UHPC for Blast and Ballistic Protection, Explosion Testing and Composition Optimization
NASA Astrophysics Data System (ADS)
Bibora, P.; Drdlová, M.; Prachař, V.; Sviták, O.
2017-10-01
The realization of high performance concrete resistant to detonation is the aim and expected outcome of the presented project, which is oriented toward the development of construction materials for larger structures such as protective walls and bunkers. The use of high-strength concrete (HSC / HPC, "high strength / performance concrete") and ultra-high-performance fiber-reinforced concrete (UHPC / UHPFC, "Ultra High Performance Fiber Reinforced Concrete") seems optimal for this research purpose. The paper describes the research phase of the project, in which we focused on the selection of specific raw materials and chemical additives, including determining the most suitable type and amount of distributed fiber reinforcement. The composition of the UHPC was optimized during laboratory manufacture of test specimens to obtain the best desired physical-mechanical properties of the developed high performance concretes. In connection with the laboratory testing, explosion field tests of UHPC specimens were performed and the explosion resistance of laboratory-produced UHPC test boards was investigated.
Development of optimized PPP insulated pipe-cable systems in the commercial voltage range
DOE Office of Scientific and Technical Information (OSTI.GOV)
Allam, E.M.; McKean, A.L.
1992-05-01
The primary objectives of this project included the development of an alternate domestic source of Paper-Polypropylene-Paper (PPP) laminate and the development of optimized designs for PPP-insulated pipe-type cable systems in the commercial voltage range. The development of a domestic source of PPP laminate was successfully completed. This laminate was utilized throughout the program for fabrication of full-size prototype cables submitted for laboratory qualification tests. Selected cables at rated voltages of 138, 230 and 345kV have been designed, fabricated and subjected to the series of qualification tests leading to full laboratory qualification. An optimized design of 2000 kcmil, 345kV cable insulated with 600 mils of domestic PPP laminate was fabricated and successfully passed all laboratory qualification tests. This cable design was subsequently installed at Waltz Mill to undergo the series of field tests leading to full commercial qualification.
Optimization of cDNA microarrays procedures using criteria that do not rely on external standards.
Bruland, Torunn; Anderssen, Endre; Doseth, Berit; Bergum, Hallgeir; Beisvag, Vidar; Laegreid, Astrid
2007-10-18
The measurement of gene expression using microarray technology is a complicated process in which a large number of factors can be varied. Due to the lack of standard calibration samples such as are used in traditional chemical analysis, it may be difficult to evaluate whether changes made to the microarray procedure actually improve the identification of truly differentially expressed genes. The purpose of the present work is to report the optimization of several steps in the microarray process, both in laboratory practice and in data processing, using criteria that do not rely on external standards. We performed a cDNA microarray experiment including RNA from samples with high expected differential gene expression, termed "high contrasts" (rat cell lines AR42J and NRK52E), compared to self-self hybridization, and optimized a pipeline to maximize the number of genes found to be differentially expressed in the "high contrasts" RNA samples by estimating the false discovery rate (FDR) using a null distribution obtained from the self-self experiment. The proposed high-contrast versus self-self method (HCSSM) requires only four microarrays per evaluation. The effects of blocking reagent dose, filtering, and background correction methodologies were investigated. In our experiments, a dose of 250 ng LNA (locked nucleic acid) dT blocker, no background correction, and weight-based filtering gave the largest number of differentially expressed genes. The choice of background correction method had a stronger impact on the estimated number of differentially expressed genes than the choice of filtering method. Cross-platform microarray (Illumina) analysis was used to validate that the increase in the number of differentially expressed genes found by HCSSM was real. The results show that HCSSM can be a useful and simple approach to optimizing microarray procedures without including external standards. Our optimization method is highly applicable both to long oligo-probe microarrays, which have become commonly used for well-characterized organisms such as man, mouse, and rat, and to cDNA microarrays, which are still of importance for organisms with incomplete genome sequence information, such as many bacteria, plants, and fish.
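The FDR estimate at the heart of HCSSM uses the self-self hybridization as an empirical null. The sketch below illustrates that idea only, with synthetic test statistics; the gene counts, thresholds, and effect sizes are invented and are not the study's data.

```python
import numpy as np

def empirical_fdr(contrast_stats, selfself_stats, threshold):
    """Estimate FDR at a threshold using self-self statistics as the empirical null.

    Expected false positives are the null exceedances scaled to the size of the
    test set, divided by the number of test statistics exceeding the threshold.
    """
    contrast = np.abs(np.asarray(contrast_stats))
    null = np.abs(np.asarray(selfself_stats))
    n_called = int(np.sum(contrast >= threshold))
    expected_false = np.sum(null >= threshold) * len(contrast) / len(null)
    return (expected_false / n_called if n_called else 0.0), n_called

# Synthetic example: 10,000 genes, 5% truly changed in the "high contrast" arrays
rng = np.random.default_rng(0)
null_stats = rng.normal(0, 1, 10_000)                       # self-self hybridization
contrast = rng.normal(0, 1, 10_000)
contrast[:500] += rng.choice([-3, 3], 500)                  # spiked-in true changes
for c in (2.0, 2.5, 3.0):
    fdr, n = empirical_fdr(contrast, null_stats, c)
    print(f"threshold {c}: {n} genes called, estimated FDR {fdr:.2%}")
```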
Kwong, Jason C; Chua, Kyra; Charles, Patrick G P
2012-06-01
Community-associated methicillin-resistant Staphylococcus aureus (MRSA) is a rare but significant cause of community-acquired pneumonia (CAP). A number of virulence determinants have been implicated in the development of severe community MRSA pneumonia, which is characterized by multilobar cavitating necrosis in patients without the usual risk factors for pneumonia. Optimal management is uncertain and is extrapolated from anecdotal experience with small case series, randomized studies of hospital-acquired pneumonia, and laboratory investigations using in vitro experiments and animal models of MRSA pneumonia. Adequate clinical suspicion, early diagnosis, and administration of appropriate antibiotics are necessary for the best patient outcomes, although some patients will still do poorly even with early anti-MRSA therapy. Vancomycin or linezolid have been recommended as first-line therapy, possibly in combination with other antibiotics. Newer antibiotics such as ceftaroline are still being evaluated.
Jacobsen, Sonja; Patel, Pranav; Schmidt-Chanasit, Jonas; Leparc-Goffart, Isabelle; Teichmann, Anette; Zeller, Herve; Niedrig, Matthias
2016-03-01
Since the re-emergence of Chikungunya virus (CHIKV) in Reunion in 2005 and the recent outbreak in the Caribbean islands, with expansion to the Americas, CHIK diagnostics have become very important. We evaluated the performance of laboratories worldwide in the molecular and serological diagnosis of CHIK. A panel of 12 samples for molecular testing and 13 samples for serology was provided to 60 laboratories in 40 countries to evaluate the sensitivity and specificity of molecular and serological testing. The panel for molecular diagnostic testing was analysed by 56 laboratories, which returned 60 data sets of results, whereas 56 and 60 data sets were returned by the participating laboratories for IgG and IgM diagnostics, respectively. Of the 60 molecular data sets, 23 performed optimally, 7 acceptably, and 30 required improvement. For IgM detection, only one laboratory among 50 data sets showed optimal performance, 9 data sets were acceptable, and the rest required improvement. Of 46 IgG serology data sets, 20 showed optimal performance, 2 were acceptable, and 24 required improvement. The evaluation of some of the diagnostic performances allows linking the quality of results to the in-house methods or commercial assays used. The external quality assurance for CHIK diagnostics provides a good overview of laboratory performance regarding the sensitivity and specificity of the molecular and serological diagnostics required for quick and reliable analysis of samples from suspected CHIK patients. Nearly half of the laboratories have to improve their diagnostic profile to achieve better performance. Copyright © 2016. Published by Elsevier B.V. All rights reserved.
The chemistry teaching laboratory: The student perspective
NASA Astrophysics Data System (ADS)
Polles, John Steven
In this study, I investigated the student/learner's experiences in the chemistry teaching laboratory and the meaning that she or he derived from these experiences. This study sought to answer these questions: (1) What was the student's experience in the teaching laboratory? (2) What aspects of the laboratory experience did the student value? and (3) What beliefs did the student hold concerning the role of the laboratory experience in developing her or his understanding of chemistry? Students involved in an introductory chemistry course at Purdue University were asked to complete a two-part questionnaire consisting of 16 scaled-response and 5 free-response items, and 685 did so. Fourteen students also participated in a semi-structured individual interview. The questionnaire and interview were designed to probe the students' perceived experience and answer the above questions. I found that students possess strong conceptions of the laboratory experience: a pre-conception that colors their experience from the outset, and a post-conception that is a mix of positive and negative reflections. I also found that the learner holds a deep, implicit value in the laboratory experience. The other major finding was that the students' lived experience is dramatically shaped by external agencies, primarily the faculty (and, by extension, the teaching assistants). There is much debate in the extant literature over the learning value of the science teaching laboratory, but it is all from the perspective of faculty, curriculum designers, and administrators. This study adds the students' voice to the argument.
A Feasibility Study of the Wheel Electrostatic Spectrometer
NASA Technical Reports Server (NTRS)
Johansen, Michael Ryan; Phillips, James Ralph; Kelley, Joshua David; Mackey, Paul J.; Holbert, Eirik; Clements, Gregory R.; Calle, Carlos I.
2014-01-01
Mars rover missions rely on time-consuming, power-exhausting processes to analyze the Martian regolith. A low power electrostatic sensor in the wheels of a future Mars rover could be used to quickly determine when the rover is driving over a different type of regolith. The Electrostatics and Surface Physics Laboratory at NASA's Kennedy Space Center developed the Wheel Electrostatic Spectrometer as a feasibility study to investigate this option. In this paper, we discuss recent advances in this technology to increase the repeatability of the tribocharging experiments, along with supporting data. In addition, we discuss the development of a static elimination tool optimized for Martian conditions.
An improved numerical model for wave rotor design and analysis
NASA Technical Reports Server (NTRS)
Paxson, Daniel E.; Wilson, Jack
1993-01-01
A numerical model has been developed which can predict both the unsteady flows within a wave rotor and the steady averaged flows in the ports. The model is based on the assumptions of one-dimensional, unsteady, and perfect gas flow. Besides the dominant wave behavior, it is also capable of predicting the effects of finite tube opening time, leakage from the tube ends, and viscosity. The relative simplicity of the model makes it useful for design, optimization, and analysis of wave rotor cycles for any application. This paper discusses some details of the model and presents comparisons between the model and two laboratory wave rotor experiments.
An improved numerical model for wave rotor design and analysis
NASA Technical Reports Server (NTRS)
Paxson, Daniel E.; Wilson, Jack
1992-01-01
A numerical model has been developed which can predict both the unsteady flows within a wave rotor and the steady averaged flows in the ports. The model is based on the assumptions of one-dimensional, unsteady, and perfect gas flow. Besides the dominant wave behavior, it is also capable of predicting the effects of finite tube opening time, leakage from the tube ends, and viscosity. The relative simplicity of the model makes it useful for design, optimization, and analysis of wave rotor cycles for any application. This paper discusses some details of the model and presents comparisons between the model and two laboratory wave rotor experiments.
Demodulation System for Fiber Optic Bragg Grating Dynamic Pressure Sensing
NASA Technical Reports Server (NTRS)
Lekki, John D.; Adamovsky, Grigory; Floyd, Bertram
2001-01-01
Fiber optic Bragg gratings have been used for years to measure quasi-static phenomena. In aircraft engine applications there is a need to measure dynamic signals such as variable pressures. In order to monitor these pressures a detection system with broad dynamic range is needed. This paper describes an interferometric demodulator that was developed and optimized for this particular application. The signal to noise ratio was maximized through temporal coherence analysis. The demodulator was incorporated in a laboratory system that simulates conditions to be measured. Several pressure sensor configurations incorporating a fiber optic Bragg grating were also explored. The results of the experiments are reported in this paper.
Observation of Stronger-than-Binary Correlations with Entangled Photonic Qutrits
NASA Astrophysics Data System (ADS)
Hu, Xiao-Min; Liu, Bi-Heng; Guo, Yu; Xiang, Guo-Yong; Huang, Yun-Feng; Li, Chuan-Feng; Guo, Guang-Can; Kleinmann, Matthias; Vértesi, Tamás; Cabello, Adán
2018-05-01
We present the first experimental confirmation of the quantum-mechanical prediction of stronger-than-binary correlations. These are correlations that cannot be explained under the assumption that the occurrence of a particular outcome of an n ≥ 3-outcome measurement is due to a two-step process in which, in the first step, some classical mechanism precludes n − 2 of the outcomes and, in the second step, a binary measurement generates the outcome. Our experiment uses pairs of photonic qutrits distributed between two laboratories, where randomly chosen three-outcome measurements are performed. We report a violation by 9.3 standard deviations of the optimal inequality for nonsignaling binary correlations.
Investigations to improve carbon dioxide control with amine and molecular sieve type sorbers
NASA Technical Reports Server (NTRS)
Bertrand, J. F.; Brose, H. F.; Kester, F. L.; Lunde, P. J.
1972-01-01
The optimization trends and operating parameters of an integral molecular sieve bed heat exchanger were investigated. The optimum combination of substrate and coating for the HS-B porous polymer was determined based on the CO2 dynamic capacity in the presence of water vapor. Full size HS-B canister performance was evaluated. An Amine CO2 Concentrator utilizing IR-45 sorber material and available Manned Orbiting Laboratory hardware was designed, fabricated and tested for use as an experiment in the NASA 90-day space simulator test of 1970. It supported four men in the simulator for 71 days out of the 90-day test duration.
NASA Astrophysics Data System (ADS)
Gatlin, Todd Adam
Graduate teaching assistants (GTAs) play a prominent role in chemistry laboratory instruction at research-based universities. They teach almost all undergraduate chemistry laboratory courses. However, their role in laboratory instruction has often been overlooked in educational research. Interest in chemistry GTAs has focused on training and their perceived expectations, but less attention has been paid to their experiences or their potential benefits from teaching. This work was designed to investigate GTAs' experiences in and benefits from laboratory instructional environments. This dissertation includes three related studies on GTAs' experiences teaching in general chemistry laboratories. Qualitative methods were used for each study. First, phenomenological analysis was used to explore GTAs' experiences in an expository laboratory program. Post-teaching interviews were the primary data source. GTAs' experiences were described in three dimensions: doing, knowing, and transferring. Gains available to GTAs revolved around general teaching skills. However, no gains specifically related to scientific development were found in this laboratory format. Case-study methods were then used to explore and illustrate the ways GTAs develop a GTA self-image, that is, the way they see themselves as instructors. Two general chemistry laboratory programs that represent two very different instructional frameworks were chosen as the context of this study. The first program used a cooperative project-based approach. The second program used weekly, verification-type activities. End-of-semester interviews were collected and served as the primary data source. A follow-up case study of a new cohort of GTAs in the cooperative problem-based laboratory was undertaken to investigate changes in GTAs' self-images over the course of one semester. Pre-semester and post-semester interviews served as the primary data source. Findings suggest that GTAs' construction of their self-image is shaped through the interaction of 1) prior experiences, 2) training, 3) beliefs about the nature of knowledge, 4) beliefs about the nature of laboratory work, and 5) involvement in the laboratory setting. Furthermore, GTAs' self-images are malleable and susceptible to change through their laboratory teaching experiences. Overall, this dissertation contributes to chemistry education by providing a model useful for exploring GTAs' development of a self-image in laboratory teaching. This work may assist laboratory instructors and coordinators in reconsidering, when applicable, GTA training and support. It also holds considerable implications for how teaching experiences are conceptualized as part of the chemistry graduate education experience. Findings suggest that appropriate teaching experiences may contribute toward better preparing graduate students for their journey in becoming scientists.
Spell, Rachelle M.; Guinan, Judith A.; Miller, Kristen R.; Beck, Christopher W.
2014-01-01
Incorporating authentic research experiences in introductory biology laboratory classes would greatly expand the number of students exposed to the excitement of discovery and the rigor of the scientific process. However, the essential components of an authentic research experience and the barriers to their implementation in laboratory classes are poorly defined. To guide future reform efforts in this area, we conducted a national survey of biology faculty members to determine 1) their definitions of authentic research experiences in laboratory classes, 2) the extent of authentic research experiences currently experienced in their laboratory classes, and 3) the barriers that prevent incorporation of authentic research experiences into these classes. Strikingly, the definitions of authentic research experiences differ among faculty members and tend to emphasize either the scientific process or the discovery of previously unknown data. The low level of authentic research experiences in introductory biology labs suggests that more development and support is needed to increase undergraduate exposure to research experiences. Faculty members did not cite several barriers commonly assumed to impair pedagogical reform; however, their responses suggest that expanded support for development of research experiences in laboratory classes could address the most common barrier. PMID:24591509
Developing an online chemistry laboratory for non-chemistry majors
NASA Astrophysics Data System (ADS)
Poole, Jacqueline H.
Distance education, also known as online learning, offers student-centered, self-directed educational opportunities. This style of learning is expanding in scope and is increasingly being accepted throughout the academic curriculum because of its flexibility for the student and its cost-effectiveness for the institution. Nevertheless, the introduction of online science courses, including chemistry and physics, has lagged behind because of the challenge of re-creating the hands-on laboratory learning experience. This dissertation examines the effectiveness of the design of a series of chemistry laboratory experiments for possible online delivery that provide students with simulated hands-on experiences. One class of college Chemistry 101 students conducted chemistry experiments inside and outside of the physical laboratory using instructions on Blackboard and Late Nite Labs(TM). Learning outcomes measured by (a) pretests, (b) written laboratory reports, (c) posttest assessments, (d) student reactions as determined by a questionnaire, and (e) a focus group interview were used to compare both types of laboratory experiences. The research findings indicated that learning outcomes achieved by students outside of the traditional physical laboratory were statistically greater than those achieved through equivalent face-to-face instruction in the traditional laboratory. Evidence from student reactions comparing both laboratory formats (online and traditional face-to-face) indicated a student preference for the online laboratory format. The results are an initial contribution to the design of a complete sequence of experiments that can be performed independently by online students outside of the traditional face-to-face laboratory and that will satisfy the laboratory requirement for the two-semester college Chemistry 101 laboratory course.
The Master level optics laboratory at the Institute of Optics
NASA Astrophysics Data System (ADS)
Adamson, Per
2017-08-01
The master-level optics laboratory is a biannual, intensive laboratory course in the fields of geometrical, physical, and modern optics. The course is intended for master-level students, although Ph.D. advisors often recommend it to their advisees as well. The students are required to complete five standard laboratory experiments and an independent project during a semester. The goals of the laboratory experiments are for the students to gain hands-on experience setting up optical laboratory equipment, to collect and analyze data, and to communicate key results. The experimental methods, analysis, and results of the standard experiments are submitted in a journal-style report, while an oral presentation is given for the independent project.
Nudging Cooperation in a Crowd Experiment
Niella, Tamara; Stier-Moses, Nicolás; Sigman, Mariano
2016-01-01
We examine the hypothesis that, driven by a competition heuristic, people do not even consider whether a cooperative strategy may be better. As a paradigmatic example of this behavior we propose the zero-sum game fallacy, according to which people believe that resources are fixed even when they are not. We demonstrate that people only cooperate if the competitive heuristic is explicitly overridden, in an experiment in which participants play two rounds of a game in which competition is suboptimal. The spontaneous behavior observed for most players was to compete. Participants were then explicitly reminded that the competing strategy may not be optimal. This minor intervention boosted cooperation, implying that competition does not result from a lack of trust or willingness to cooperate but instead from an inability to inhibit the competition bias. This activity was performed in a controlled laboratory setting and also as a crowd experiment. Understanding the psychological underpinnings of these behaviors may help us improve cooperation and thus may have vast practical consequences for our society. PMID:26797425
Laboratory experiments of heat and moisture fluxes through supraglacial debris
NASA Astrophysics Data System (ADS)
Nicholson, Lindsey; Mayer, Christoph; Wirbel, Anna
2014-05-01
Inspired by earlier work (Reznichenko et al., 2010), we have carried out experiments within a climate chamber to explore the best ways to measure the heat and moisture fluxes through supraglacial debris. Sample ice blocks were prepared with debris cover of varying lithology, grain size, and thickness, and were instrumented with a combination of Gemini TinyTag temperature/relative humidity sensors and Decagon soil moisture sensors in order to monitor the heat and moisture fluxes through the overlying debris material when exposed to specified solar-lamp radiation and laminar airflow within the temperature-controlled climate chamber. Experimental results can be used to determine the optimal set-up for numerical models of heat and moisture flux through supraglacial debris and also indicate the performance limitations of such sensors that can be expected in field installations. Reznichenko, N., Davies, T., Shulmeister, J. and McSaveney, M. (2010) Effects of debris on ice-surface melting rates: an experimental study. Journal of Glaciology, Volume 56, Number 197, 384-394.
Extinction of cue-evoked drug-seeking relies on degrading hierarchical instrumental expectancies
Hogarth, Lee; Retzler, Chris; Munafò, Marcus R.; Tran, Dominic M.D.; Troisi, Joseph R.; Rose, Abigail K.; Jones, Andrew; Field, Matt
2014-01-01
There has long been a need for a behavioural intervention that attenuates cue-evoked drug-seeking, but the optimal method remains obscure. To address this, we report three approaches to extinguishing cue-evoked drug-seeking measured in a Pavlovian-to-instrumental transfer design, in non-treatment-seeking adult smokers and alcohol drinkers. The results showed that the ability of a drug stimulus to transfer control over a separately trained drug-seeking response was not affected by the stimulus undergoing Pavlovian extinction training in experiment 1, but was abolished by the stimulus undergoing discriminative extinction training in experiment 2, and was abolished by explicit verbal instructions stating that the stimulus did not signal a more effective response-drug contingency in experiment 3. These data suggest that cue-evoked drug-seeking is mediated by a propositional hierarchical instrumental expectancy that the drug-seeking response is more likely to be rewarded in that stimulus. Methods which degraded this hierarchical expectancy were effective in the laboratory, and so may have therapeutic potential. PMID:25011113
Some factors affecting performance of rats in the traveling salesman problem.
Bellizzi, C; Goldsteinholm, K; Blaser, R E
2015-11-01
The traveling salesman problem (TSP) is used to measure the efficiency of spatial route selection. Among researchers in cognitive psychology and neuroscience, it has been utilized to examine the mechanisms of decision making, planning, and spatial navigation. While both human and non-human animals produce good solutions to the TSP, the solution strategies engaged by non-human species are not well understood. We conducted two experiments on the TSP using Long-Evans laboratory rats as subjects. The first experiment examined the role of arena walls in route selection. Rats tend to display thigmotaxis in testing conditions comparable to the TSP, which could produce results similar to a convex hull type strategy suggested for humans. The second experiment examined the role of turn angle between targets along the optimal route, to determine whether rats exhibit a preferential turning bias. Our results indicated that both thigmotaxis and preferential turn angles do affect performance in the TSP, but neither is sufficient as a predictor of route choice in this task.
NASA Astrophysics Data System (ADS)
Hill, K. W.; Bitter, M.; Delgado-Aparicio, L.; Efthimion, P.; Pablant, N.; Lu, J.; Beiersdorfer, P.; Chen, H.; Magee, E.
2014-10-01
A high-resolution 1D imaging x-ray spectrometer concept, comprising a spherically bent crystal and a 2D pixelated detector, is being optimized for diagnostics of small sources such as high energy density physics (HEDP) and synchrotron radiation or x-ray free electron laser experiments. This instrument is used on tokamak experiments for measurement of spatial profiles of Doppler ion temperature and plasma flow velocity, as well as electron temperature. Laboratory measurements demonstrate a resolving power, E/ΔE, of 10,000 and a spatial resolution better than 10 μm. Good performance is obtained for Bragg angles ranging from 23 to 63 degrees. Initial tests of the instrument on HEDP plasmas are being performed with a goal of developing spatially resolved ion and electron temperature diagnostics. This work was performed under the auspices of the US DOE by PPPL under Contract DE-AC02-09CH11466 and by LLNL under Contract DE-AC52-07NA27344.
Goldstein, Zil; Corneil, Trevor A; Greene, Dina N
2017-08-01
Transgender is an umbrella term used to describe individuals who identify with a gender incongruent with or variant from their sex recorded at birth. Affirming gender identity through a variety of social, medical, and surgical interventions is critical to the mental health of transgender individuals. In recent years, awareness surrounding transgender identities has increased, which has highlighted the health disparities that parallel this demographic. These disparities are reflected in the experience of transgender patients and their providers when seeking clinical laboratory services. Little is known about the effect of gender-affirming hormone therapy and surgery on optimal laboratory test interpretation. Efforts to diminish the health disparities encountered by transgender individuals and their providers can be advanced by increasing social and clinical awareness regarding sex/gender incongruence and by gaining insight into the physiological manifestations and laboratory interpretations of gender-affirming strategies. This review summarizes the knowledge required to understand transgender healthcare, including current clinical interventions for gender dysphoria. Particular attention is paid to the subsequent impact of these interventions on laboratory test utilization and interpretation. Common nomenclature and system barriers are also discussed. Understanding gender incongruence, the clinical changes associated with gender transition, and the systemic barriers that maintain a gender/sex binary is key to providing adequate healthcare to the transgender community. Transgender-appropriate reference interval studies are virtually absent from the medical literature and should be explored. The laboratory has an important role in improving the physiological understanding, electronic medical system recognition, and overall social awareness of the transgender community. © 2017 American Association for Clinical Chemistry.
Development of Accessible Laboratory Experiments for Students with Visual Impairments
ERIC Educational Resources Information Center
Kroes, KC; Lefler, Daniel; Schmitt, Aaron; Supalo, Cary A.
2016-01-01
The hands-on laboratory experiments are frequently what spark students' interest in science. Students who are blind or have low vision (BLV) typically do not get the same experience while participating in hands-on activities due to accessibility. Over the course of approximately nine months, common chemistry laboratory experiments were adapted and…
Do-It-Yourself Experiments for the Instructional Laboratory
ERIC Educational Resources Information Center
Craig, Norman C.; Hill, Cortland S.
2012-01-01
A new design for experiments in the general chemistry laboratory incorporates a "do-it-yourself" component for students. In this design, students perform proven experiments to gain experience with techniques for about two-thirds of a laboratory session and then spend the last part in the do-it-yourself component, applying the techniques to an…
NASA Astrophysics Data System (ADS)
Sapriadil, S.; Setiawan, A.; Suhandi, A.; Malik, A.; Safitri, D.; Lisdiani, S. A. S.; Hermita, N.
2018-05-01
Communication is a skill that is greatly needed in the 21st century, and preparing students to use it when learning physics is therefore important. The focus of this research was optimizing students' scientific communication skills after applying a higher-order thinking virtual laboratory (HOTVL) on the topic of electric circuits. The research employed an experimental design, specifically a posttest-only control group design. The subjects, senior high school students, were selected using purposive sampling; a sample of seventy (70) students participated, with thirty-five (35) students assigned to each of the control and experimental groups. The results showed that students who used the higher-order thinking virtual laboratory (HOTVL) in laboratory activities had higher scientific communication skills than students who used a verification virtual laboratory.
A teaching intervention for reading laboratory experiments in college-level introductory chemistry
NASA Astrophysics Data System (ADS)
Kirk, Maria Kristine
The purpose of this study was to determine the effects that a pre-laboratory guide, conceptualized as a "scientific story grammar," has on college chemistry students' learning when they read an introductory chemistry laboratory manual and perform the experiments in the chemistry laboratory. The participants (N = 56) were students enrolled in four existing general chemistry laboratory sections taught by two instructors at a women's liberal arts college. The pre-laboratory guide consisted of eight questions about the experiment, including the purpose, chemical species, variables, chemical method, procedure, and hypothesis. The effects of the intervention were compared with those of the traditional pre-laboratory assignment for the eight chemistry experiments. Measures included quizzes, tests, chemistry achievement test, science process skills test, laboratory reports, laboratory average, and semester grade. The covariates were mathematical aptitude and prior knowledge of chemistry and science processes, on which the groups differed significantly. The study captured students' perceptions of their experience in general chemistry through a survey and interviews with eight students. The only significant differences in the treatment group's performance were in some subscores on lecture items and laboratory items on the quizzes. An apparent induction period was noted, in that significant measures occurred in mid-semester. Voluntary study with the pre-laboratory guide by control students precluded significant differences on measures given later in the semester. The groups' responses to the survey were similar. Significant instructor effects on three survey items were corroborated by the interviews. The researcher's students were more positive about their pre-laboratory tasks, enjoyed the laboratory sessions more, and were more confident about doing chemistry experiments than the laboratory instructor's groups due to differences in scaffolding by the instructors.
NASA Astrophysics Data System (ADS)
Kano, Masayuki; Miyazaki, Shin'ichi; Ishikawa, Yoichi; Hiyoshi, Yoshihisa; Ito, Kosuke; Hirahara, Kazuro
2015-10-01
Data assimilation is a technique that optimizes the parameters used in a numerical model, with a constraint of model dynamics, to achieve a better fit to observations. The optimized parameters can be utilized for subsequent prediction with a numerical model, and the predicted physical variables are presumably closer to the observations that will become available in the future, at least compared to those obtained without optimization through data assimilation. In this work, an adjoint data assimilation system is developed for optimizing a relatively large number of spatially inhomogeneous frictional parameters during the afterslip period, in which the physical constraints are a quasi-dynamic equation of motion and a laboratory-derived rate- and state-dependent friction law that describe the temporal evolution of slip velocity at subduction zones. The observed variable is the estimated slip velocity on the plate interface. Before applying this method to real data assimilation for the afterslip of the 2003 Tokachi-oki earthquake, a synthetic data assimilation experiment is conducted to examine the feasibility of optimizing the frictional parameters in the afterslip area. It is confirmed that the current system is capable of optimizing the frictional parameters A-B, A, and L by adopting the physical constraint based on a numerical model, provided that observations capture the acceleration and decay phases of slip on the plate interface. On the other hand, it is unlikely that the frictional parameters can be constrained in regions where the amplitude of afterslip is less than 1.0 cm d-1. Next, real data assimilation for the 2003 Tokachi-oki earthquake is conducted to incorporate slip velocity data inferred from time-dependent inversion of Global Navigation Satellite System time-series. The optimized values of A-B, A, and L are O(10 kPa), O(10^2 kPa), and O(10 mm), respectively. The optimized frictional parameters yield a better fit to the observations and a better prediction skill of slip velocity afterwards. A further experiment also shows the importance of employing a fine-mesh model. This work will contribute to further understanding of the frictional properties of plate interfaces and lead to a forecasting system that provides useful information on the possibility of consequent earthquakes.
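The assimilation idea in this abstract, optimizing frictional parameters against observed slip velocities and then predicting forward, can be illustrated with a much simpler stand-in than the adjoint system it describes. The sketch below fits a commonly used rate-strengthening afterslip form, V(t) = V0/(1 + t/tau), to synthetic slip-velocity data; V0 and tau are hypothetical placeholders for the frictional quantities (A-B, A, L) optimized in the study, and all numbers are invented for illustration.

```python
# Minimal sketch (not the authors' adjoint system): fit afterslip parameters by
# least squares using a simplified rate-strengthening afterslip model in which
# slip velocity decays as V(t) = V0 / (1 + t/tau).
import numpy as np
from scipy.optimize import curve_fit

def afterslip_velocity(t, v0, tau):
    """Slip velocity (cm/day) at time t (days) after the mainshock."""
    return v0 / (1.0 + t / tau)

# Synthetic "observations", mimicking slip velocities inferred from GNSS inversion.
rng = np.random.default_rng(0)
t_obs = np.linspace(1.0, 300.0, 60)                 # days after the earthquake
v_obs = afterslip_velocity(t_obs, v0=5.0, tau=40.0) + rng.normal(0.0, 0.1, 60)

# Optimize the model parameters against the observations.
(v0_hat, tau_hat), _ = curve_fit(afterslip_velocity, t_obs, v_obs, p0=(1.0, 10.0))
print(f"estimated V0 = {v0_hat:.2f} cm/day, tau = {tau_hat:.1f} days")

# Predict velocities at later times, analogous to the forecasting step.
t_future = np.linspace(300.0, 600.0, 30)
print(afterslip_velocity(t_future, v0_hat, tau_hat)[:5])
```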
Measuring meaningful learning in the undergraduate chemistry laboratory
NASA Astrophysics Data System (ADS)
Galloway, Kelli R.
The undergraduate chemistry laboratory has been an essential component in chemistry education for over a century. The literature includes reports on investigations of singular aspects of laboratory learning, attempts to measure the efficacy of reformed laboratory curricula, and studies of faculty goals for laboratory learning, which found that instructors share common goals for students: to learn laboratory skills, techniques, and experimental design, and to develop critical thinking skills. These findings are important for improving teaching and learning in the undergraduate chemistry laboratory, but research is needed to connect the faculty goals to student perceptions. This study was designed to explore students' ideas about learning in the undergraduate chemistry laboratory. Novak's Theory of Meaningful Learning was used as a guide for the data collection and analysis choices for this research. Novak's theory states that in order for meaningful learning to occur, the cognitive, affective, and psychomotor domains must be integrated. The psychomotor domain is inherent in the chemistry laboratory, but the extent to which the cognitive and affective domains are integrated is unknown. For meaningful learning to occur in the laboratory, students must actively integrate both the cognitive and affective domains into the "doing" of their laboratory work. The Meaningful Learning in the Laboratory Instrument (MLLI) was designed to measure students' cognitive and affective expectations and experiences within the context of conducting experiments in the undergraduate chemistry laboratory. Evidence for the validity and reliability of the data generated by the MLLI was collected from multiple quantitative studies: a one-semester study at one university, a one-semester study at 15 colleges and universities across the United States, and a longitudinal study in which the MLLI was administered six times during two years of general and organic chemistry laboratory courses. Results from these studies revealed students' narrow cognitive expectations for learning, which went largely unmet by their experiences, alongside diverse affective expectations and experiences. Concurrently, a qualitative study was carried out to describe and characterize students' cognitive and affective experiences in the undergraduate chemistry laboratory. Students were video recorded while performing one of their regular laboratory experiments and then interviewed about their experiences. The students' descriptions of their learning experiences were characterized by an overreliance on following the experimental procedure correctly rather than developing process-oriented problem-solving skills. Future research could use the MLLI to intentionally compare different types of laboratory curricula or environments.
Temporary threshold shift after impulse-noise during video game play: laboratory data.
Spankovich, C; Griffiths, S K; Lobariñas, E; Morgenstein, K E; de la Calle, S; Ledon, V; Guercio, D; Le Prell, C G
2014-03-01
Prevention of temporary threshold shift (TTS) after laboratory-based exposure to pure tones, broadband noise, and narrowband noise signals has been achieved, but prevention of TTS under these experimental conditions may not accurately reflect protection against hearing loss following impulse noise. This study used a controlled laboratory-based TTS paradigm that incorporated impulsive stimuli into the exposure protocol; development of this model could provide a novel platform for assessing proposed therapeutics. Participants played a video game that delivered gunfire-like sound through headphones as part of a target practice game. Effects were measured using audiometric threshold evaluations and distortion product otoacoustic emissions (DPOAEs). The sound level and number of impulses presented were sequentially increased throughout the study. Participants were normal-hearing students at the University of Florida who provided written informed consent prior to participation. TTS was not reliably induced by any of the exposure conditions assessed here. However, there was significant individual variability, and a subset of subjects showed TTS under some exposure conditions. A subset of participants demonstrated reliable threshold shifts under some conditions. Additional experiments are needed to better understand and optimize stimulus parameters that influence TTS after simulated impulse noise.
3D-PTV around Operational Wind Turbines
NASA Astrophysics Data System (ADS)
Brownstein, Ian; Dabiri, John
2016-11-01
Laboratory studies and numerical simulations of wind turbines are typically constrained in how they can inform operational turbine behavior. Laboratory experiments are usually unable to match both pertinent parameters of full-scale wind turbines, the Reynolds number (Re) and the tip speed ratio, using scaled-down models. Additionally, numerical simulations of the flow around wind turbines are constrained by the large domain size and high Re that need to be simulated. When these simulations are performed, turbine geometry is typically simplified, resulting in flow structures near the rotor not being well resolved. In order to bypass these limitations, a quantitative flow visualization method was developed to take in situ measurements of the flow around wind turbines at the Field Laboratory for Optimized Wind Energy (FLOWE) in Lancaster, CA. The apparatus constructed was able to seed an approximately 9 m x 9 m x 5 m volume in the wake of the turbine using artificial snow. Quantitative measurements were obtained by tracking the evolution of the artificial snow using a four-camera setup. The methodology for calibrating and collecting data, as well as preliminary results detailing the flow around a 2 kW vertical-axis wind turbine (VAWT), will be presented.
Chen, Ruifeng; Zhu, Lijun; Lv, Lihuo; Yao, Su; Li, Bin; Qian, Junqing
2017-06-01
Optimization of the extraction and purification of the compatible solute ectoine from Halomonas elongata cell fermentation was investigated in laboratory tests for a large-scale commercial production project. After culturing H. elongata cells in the developed medium at 28 °C for 23-30 h, we obtained an average ectoine yield and biomass of 15.9 g/L and 92.9 (OD 600), respectively. Cell lysis was performed with acid treatment at moderately high temperature (60-70 °C). The downstream processing operations were designed as follows: filtration, desalination, cation exchange, extraction of the crude product, and three rounds of refining. Of these, cation exchange and extraction of the crude product achieved high average recovery rates of 95 and 96%, whereas substantial losses of 19 and 15% were observed during filtration and desalination, respectively. Combined with the recovery of ectoine from the mother liquor of the three refining rounds, the average overall yield (relative to the amount of ectoine synthesized in the cells) and the purity of the final product were 43% and over 98%, respectively. However, the key factor affecting production efficiency was not yield but the time required for extraction of the crude product, in particular the crystallization step from water, which took 24-72 h depending on the production scale. Although the method described here cannot compete with other investigations in terms of productivity and simplicity at laboratory scale, this study achieved a higher purity of ectoine and provides downstream processes capable of operating at industrial scale.
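The step recoveries quoted in this abstract combine multiplicatively into the overall yield. The short sketch below uses the stated values (19% and 15% losses in filtration and desalination, 95% and 96% recoveries in cation exchange and crude extraction) plus an assumed combined recovery for the three refining rounds, which the abstract does not break down, to show how an overall yield near the reported 43% can arise.

```python
# Sketch of how per-step recoveries combine into an overall yield. The first four
# values come from the abstract; the combined recovery of the three refining rounds
# is a hypothetical placeholder chosen only for illustration.
steps = {
    "filtration":        1.0 - 0.19,
    "desalination":      1.0 - 0.15,
    "cation exchange":   0.95,
    "crude extraction":  0.96,
    "refining (x3, incl. mother-liquor recovery)": 0.68,  # assumed, not from the abstract
}

overall = 1.0
for name, recovery in steps.items():
    overall *= recovery
    print(f"after {name:<45s} cumulative yield = {overall:.2%}")
```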
ERIC Educational Resources Information Center
Ural, Evrim
2016-01-01
The study aims to search the effect of guided inquiry laboratory experiments on students' attitudes towards chemistry laboratory, chemistry laboratory anxiety and their academic achievement in the laboratory. The study has been carried out with 37 third-year, undergraduate science education students, as a part of their Science Education Laboratory…
Solar Collector Design Optimization: A Hands-on Project Case Study
ERIC Educational Resources Information Center
Birnie, Dunbar P., III; Kaz, David M.; Berman, Elena A.
2012-01-01
A solar power collector optimization design project has been developed for use in undergraduate classrooms and/or laboratories. The design optimization depends on understanding the current-voltage characteristics of the starting photovoltaic cells as well as how the cell's electrical response changes with increased light illumination. Students…
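Since this (truncated) abstract points to the cell's current-voltage characteristics and their dependence on illumination as the basis of the design optimization, a small sketch may help. It locates the maximum power point of a single-diode I-V model at several illumination levels; all parameter values (photocurrent, saturation current, ideality factor) are assumed for illustration and are not taken from the article.

```python
# Minimal sketch: locate the maximum power point (MPP) of a photovoltaic cell from a
# single-diode I-V model and show how it shifts with illumination. All parameters
# are assumed values, not data from the article.
import numpy as np

K_B_T_OVER_Q = 0.02585   # thermal voltage at ~300 K, volts

def iv_curve(v, i_photo, i_sat=1e-9, n=1.5):
    """Cell current (A) at voltage v (V) for a given photocurrent i_photo (A)."""
    return i_photo - i_sat * (np.exp(v / (n * K_B_T_OVER_Q)) - 1.0)

v = np.linspace(0.0, 0.75, 2000)
for i_photo in (0.10, 0.20, 0.40):          # increasing illumination levels
    i = iv_curve(v, i_photo)
    p = v * i
    k = np.argmax(p)
    print(f"I_L = {i_photo:.2f} A -> MPP at {v[k]:.3f} V, "
          f"{i[k]*1e3:.1f} mA, {p[k]*1e3:.1f} mW")
```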
NASA Astrophysics Data System (ADS)
Guglielmi, Y.; Cappa, F.; Nussbaum, C.
2015-12-01
Appreciating the sensitivity of fractures and fault zones to fluid-induced deformation in the subsurface is a key question in predicting the integrity of the reservoir/caprock system around fluid manipulations, with applications to reservoir leakage and induced seismicity. It is also of interest for understanding earthquake sources and, more recently, the hydraulic behavior of clay faults under potential reactivation around underground nuclear waste repository sites. Studies of fault and fracture dynamics face two key problems: (1) the up-scaling of laboratory-determined properties and constitutive laws to the reservoir scale, which is not straightforward when considering fault and fracture heterogeneities, and (2) the difficulty of controlling both the induced seismicity and the stimulated zone geometry when a fault is reactivated. Using instruments dedicated to measuring coupled pore pressures and deformations downhole, we conducted academic field experiments to characterize the hydromechanical properties of fractures and fault zones as a function of their multi-scale architecture, and to monitor their dynamic behavior during the earthquake nucleation process. We present experiments on reservoir or cover-rock analogues in underground research laboratories, where experimental conditions can be optimized. A key result of these experiments is to highlight how important aseismic fault activation is compared to the induced seismicity. We show that about 80% of the fault kinematic moment is aseismic and discuss the complex associated variations in the fault friction coefficient. We identify that slip stability and slip velocity are mainly controlled by the rate of permeability/porosity increase, and discuss the conditions for slip nucleation leading to seismic instability.
ERIC Educational Resources Information Center
Goldwasser, M. R.; Leal, O.
1979-01-01
Outlines an approach for instruction in a physical chemistry laboratory which combines traditional and project-like experiments. An outline of laboratory experiments and examples of project-like experiments are included. (BT)
Laboratory Astrophysics Using a Spare XRS Microcalorimeter
NASA Technical Reports Server (NTRS)
Audley, M. Damian; Beiersdorfer, Peter; Porter, Frederick Scott; Brown, Gregory; Boyce, Kevin R.; Brekosky, Regis; Brown, Gregory V.; Gendreau, Keith C.; Gygax, John; Kahn, Steve;
2000-01-01
The XRS instrument on Astro-E is a fully self-contained microcalorimeter x-ray instrument capable of acquiring, optimally filtering, and characterizing events for 32 independent pixels. With the launch of the Astro-E spacecraft, a full flight-spare detector system has been integrated into a laboratory cryostat for use on the electron beam ion trap (EBIT) at Lawrence Livermore National Laboratory. The detector system contains a microcalorimeter array with 32 instrumented pixels heat sunk to 60 mK using an adiabatic demagnetization refrigerator. The instrument has a composite resolution of 8 eV at 1 keV and 12 eV at 6 keV with a minimum of 95% quantum efficiency. This will allow high-spectral-resolution, broadband observations of the collisionally excited plasmas produced in the EBIT experiment. Unique to our instrument are exceptionally well characterized 1000 Angstrom thick aluminum-on-polyimide infrared blocking filters. The detailed transmission function of these filters, including the edge fine structure, has been measured in our laboratory using an erect field grating spectrometer. This will allow the instrument to perform the first broadband absolute flux measurements with the EBIT instrument. The instrument performance as well as the results of preliminary measurements will be discussed. Work performed under the auspices of the U.S. D.o.E. by Lawrence Livermore National Laboratory under contract W-7405-ENG-48 and was supported by the NASA High Energy Astrophysics Supporting Research and Technology Program.
Summary of engineering-scale experiments for the Solar Detoxification of Water project
NASA Astrophysics Data System (ADS)
Pacheco, J. E.; Yellowhorse, L.
1992-03-01
This report contains a summary of large-scale experiments conducted at Sandia National Laboratories under the Solar Detoxification of Water project. The objectives of the work performed were to determine the potential of using solar radiation to destroy organic contaminants in water by photocatalysis and to develop the process and improve its performance. For these experiments, we used parabolic troughs to focus sunlight onto glass pipes mounted at the trough's focus. Water spiked with a contaminant and containing suspended titanium dioxide catalyst was pumped through the illuminated glass pipe, activating the catalyst with the ultraviolet portion of the solar spectrum. The activated catalyst creates oxidizers that attack and destroy the organics. Included in this report are a summary and discussion of the implications of experiments conducted to determine: the effect of process kinetics on the destruction of chlorinated solvents (such as trichloroethylene, perchloroethylene, trichloroethane, methylene chloride, chloroform and carbon tetrachloride), the enhancement due to added hydrogen peroxide, the optimal catalyst loading, the effect of light intensity, the inhibition due to bicarbonates, and catalyst issues.
NASA Technical Reports Server (NTRS)
Vadali, Srinivas R.; Carter, Michael T.
1994-01-01
The Phillips Laboratory at the Edwards Air Force Base has developed the Advanced Space Structures Technology Research Experiment (ASTREX) facility to serve as a testbed for demonstrating the applicability of proven theories to the challenges of spacecraft maneuvers and structural control. This report describes the work performed on the ASTREX test article by Texas A&M University under contract NAS119373 as a part of the Control-Structure Interaction (CSI) Guest Investigator Program. The focus of this work is on maneuvering the ASTREX test article with compressed air thrusters that can be throttled, while attenuating structural excitation. The theoretical foundation for designing the near minimum-time thrust commands is based on the generation of smooth, parameterized optimal open-loop control profiles, and the determination of control laws for final position regulation and tracking using Lyapunov stability theory. Details of the theory, mathematical modeling, model updating, and compensation for the presence of 'real world' effects are described and the experimental results are presented. The results show an excellent match between theory and experiments.
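The report mentions control laws for final position regulation derived from Lyapunov stability theory. The sketch below is a generic single-axis illustration of that idea, not the ASTREX controller: with V = 0.5*k*e^2 + 0.5*J*w^2 and torque u = -k*e - c*w, the Lyapunov rate dV/dt = -c*w^2 is nonpositive, so the pointing error decays. The inertia and gains are assumed values.

```python
# Generic sketch (not the ASTREX controller): a Lyapunov-based regulator for a
# single-axis rest-to-rest slew. Inertia and gains below are assumed values.
import numpy as np

J, k, c = 50.0, 2.0, 20.0              # inertia (kg m^2) and gains, assumed
theta, omega = np.deg2rad(30.0), 0.0   # initial pointing error (rad) and rate (rad/s)
dt = 0.01
for step in range(int(60.0 / dt)):     # simulate 60 s with explicit Euler
    u = -k * theta - c * omega         # Lyapunov-derived PD-type torque
    omega += (u / J) * dt
    theta += omega * dt
print(f"final pointing error = {np.rad2deg(theta):.4f} deg")
```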
Efficiency Optimization for FEL Oscillators,
1987-12-01
A. Serbeto, B. Levush, and T. M. Antonsen, Jr., Laboratory for Plasma and Fusion Energy Studies, University of Maryland, College Park, report UMLPF-88 (December 1987). Approved for public release; distribution unlimited.
(Too) optimistic about optimism: the belief that optimism improves performance.
Tenney, Elizabeth R; Logg, Jennifer M; Moore, Don A
2015-03-01
A series of experiments investigated why people value optimism and whether they are right to do so. In Experiments 1A and 1B, participants prescribed more optimism for someone implementing decisions than for someone deliberating, indicating that people prescribe optimism selectively, when it can affect performance. Furthermore, participants believed optimism improved outcomes when a person's actions had considerable, rather than little, influence over the outcome (Experiment 2). Experiments 3 and 4 tested the accuracy of this belief; optimism improved persistence, but it did not improve performance as much as participants expected. Experiments 5A and 5B found that participants overestimated the relationship between optimism and performance even when their focus was not on optimism exclusively. In summary, people prescribe optimism when they believe it has the opportunity to improve the chance of success-unfortunately, people may be overly optimistic about just how much optimism can do. PsycINFO Database Record (c) 2015 APA, all rights reserved.
Optimal Predator Risk Assessment by the Sonar-Jamming Arctiine Moth Bertholdia trigona
Corcoran, Aaron J.; Wagner, Ryan D.; Conner, William E.
2013-01-01
Nearly all animals face a tradeoff between seeking food and mates and avoiding predation. Optimal escape theory holds that an animal confronted with a predator should only flee when benefits of flight (increased survival) outweigh the costs (energetic costs, lost foraging time, etc.). We propose a model for prey risk assessment based on the predator's stage of attack. Risk level should increase rapidly from when the predator detects the prey to when it commits to the attack. We tested this hypothesis using a predator – the echolocating bat – whose active biosonar reveals its stage of attack. We used a prey defense – clicking used for sonar jamming by the tiger moth Bertholdia trigona – that can be readily studied in the field and laboratory and is enacted simultaneously with evasive flight. We predicted that prey employ defenses soon after being detected and targeted, and that prey defensive thresholds discriminate between legitimate predatory threats and false threats where a nearby prey is attacked. Laboratory and field experiments using playbacks of ultrasound signals and naturally behaving bats, respectively, confirmed our predictions. Moths clicked soon after bats detected and targeted them. Also, B. trigona clicking thresholds closely matched predicted optimal thresholds for discriminating legitimate and false predator threats for bats using search and approach phase echolocation – the period when bats are searching for and assessing prey. To our knowledge, this is the first quantitative study to correlate the sensory stimuli that trigger defensive behaviors with measurements of signals provided by predators during natural attacks in the field. We propose theoretical models for explaining prey risk assessment depending on the availability of cues that reveal a predator's stage of attack. PMID:23671686
Eichmiller, Jessica J; Miller, Loren M; Sorensen, Peter W
2016-01-01
Few studies have examined capture and extraction methods for environmental DNA (eDNA) to identify techniques optimal for detection and quantification. In this study, precipitation, centrifugation and filtration eDNA capture methods and six commercially available DNA extraction kits were evaluated for their ability to detect and quantify common carp (Cyprinus carpio) mitochondrial DNA using quantitative PCR in a series of laboratory experiments. Filtration methods yielded the most carp eDNA, and a glass fibre (GF) filter performed better than a similar pore size polycarbonate (PC) filter. Smaller pore sized filters had higher regression slopes of biomass to eDNA, indicating that they were potentially more sensitive to changes in biomass. Comparison of DNA extraction kits showed that the MP Biomedicals FastDNA SPIN Kit yielded the most carp eDNA and was the most sensitive for detection purposes, despite minor inhibition. The MoBio PowerSoil DNA Isolation Kit had the lowest coefficient of variation in extraction efficiency between lake and well water and had no detectable inhibition, making it most suitable for comparisons across aquatic environments. Of the methods tested, we recommend using a 1.5 μm GF filter, followed by extraction with the MP Biomedicals FastDNA SPIN Kit for detection. For quantification of eDNA, filtration through a 0.2-0.6 μm pore size PC filter, followed by extraction with MoBio PowerSoil DNA Isolation Kit was optimal. These results are broadly applicable for laboratory studies on carps and potentially other cyprinids. The recommendations can also be used to inform choice of methodology for field studies. © 2015 John Wiley & Sons Ltd.
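The copy-number comparisons behind results like these typically rest on a qPCR standard curve. The sketch below shows the conventional conversion from quantification cycle (Cq) to copies per reaction and the associated amplification efficiency; the curve parameters and Cq values are assumed, not data from this study.

```python
# Sketch of qPCR quantification from a standard curve: Cq = slope*log10(copies) + intercept,
# and amplification efficiency = 10**(-1/slope) - 1. All values are assumed.
slope, intercept = -3.40, 38.0        # assumed standard-curve fit
efficiency = 10 ** (-1.0 / slope) - 1.0
print(f"amplification efficiency = {efficiency:.1%}")

for label, cq in {"GF filter extract": 29.5, "PC filter extract": 31.2}.items():
    copies = 10 ** ((cq - intercept) / slope)
    print(f"{label}: Cq {cq:.1f} -> {copies:,.0f} copies/reaction")
```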
Federal Register 2010, 2011, 2012, 2013, 2014
2010-03-30
... in physics, chemistry, mathematics, computer science, or engineering. Institutions should have a 4..., mathematics, computer science, or engineering with work experiences in laboratories or other settings...-0141-01] Professional Research Experience Program in Chemical Science and Technology Laboratory...
A Review of Use of Enantiomers in Homeopathy
Kuzeff, R. M.
2012-01-01
This paper reviews publications of laboratory experiments using pairs of enantiomers in homeopathy. Many molecules in nature have a geometry that enables them to exist as nonsuperimposable mirror images, or enantiomers. Modulation of the toxicity of such molecules offers a possibility for therapeutics, since they target multiple points in biochemical pathways. It was hypothesized that the toxicity of a chemical agent could be counteracted by a homeopathic preparation of the enantiomer of that agent (patents applied for: PCT/AU2003/000219-PCT/AU2008/001611). A diverse body of data, including controlled laboratory studies, supports the conclusion that the toxicity of optical isomers may be inhibited by homeopathic enantiomer preparations. These data were obtained with minimal or no pretesting to determine optimal test solutions. Inhibition of the excitotoxic neurotransmitter L-glutamic acid with homeopathic preparations of D-glutamic acid indicates the latter may be of use for amelioration of symptoms of disturbances of mood. Similarly, a homeopathic preparation of (+)-nicotine may be of use for inhibiting the effects of nicotine in tobacco. PMID:23724294
NASA Astrophysics Data System (ADS)
Taylor, C. N.; Shimada, M.; Merrill, B. J.; Akers, D. W.; Hatano, Y.
2015-08-01
The present work is a continuation of recent research to develop and optimize positron annihilation spectroscopy (PAS) for characterizing neutron-irradiated tungsten. Tungsten samples were exposed to neutrons in the High Flux Isotope Reactor (HFIR) at Oak Ridge National Laboratory and damaged to 0.025 and 0.3 dpa. Subsequently, they were exposed to deuterium plasmas in the Tritium Plasma Experiment (TPE) at Idaho National Laboratory. The implanted deuterium was desorbed by heating the samples to 900 °C, and Doppler broadening (DB)-PAS was performed both before and after heating. Results show that deuterium-impregnated tungsten is identified by a smaller S-parameter, and the S-parameter increases after deuterium desorption. Microstructural changes also occur during sample heating; these effects can be isolated from deuterium desorption by comparing the S-parameters of the deuterium-free back face with those of the deuterium-implanted front face. The applicability of DB-PAS for examining deuterium retention in tungsten is assessed.
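The Doppler-broadening S-parameter referred to here is conventionally the fraction of annihilation-peak counts falling in a narrow central energy window, with a complementary wing (W) parameter. A small sketch of that calculation on a synthetic spectrum follows; the peak shape and window limits are assumed, not those used in the study.

```python
# Sketch of the conventional Doppler-broadening S and W parameters: S is the fraction
# of annihilation-peak counts in a narrow central energy window, W the fraction in the
# wing windows. The spectrum and window widths below are synthetic/assumed.
import numpy as np

energy = np.linspace(505.0, 517.0, 1200)                  # keV
sigma = 1.1                                               # assumed peak width (keV)
counts = 1e5 * np.exp(-0.5 * ((energy - 511.0) / sigma) ** 2)

def window_fraction(lo, hi):
    sel = (energy >= lo) & (energy <= hi)
    return counts[sel].sum() / counts.sum()

s_param = window_fraction(511.0 - 0.8, 511.0 + 0.8)                      # central window
w_param = window_fraction(505.0, 508.0) + window_fraction(514.0, 517.0)  # wings
print(f"S = {s_param:.3f}, W = {w_param:.3f}")
```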
NASA Technical Reports Server (NTRS)
Bulfin, R. L.; Perdue, C. A.
1994-01-01
The Mission Planning Division of the Mission Operations Laboratory at NASA's Marshall Space Flight Center is responsible for scheduling experiment activities for space missions controlled at MSFC. In order to draw statistically relevant conclusions, all experiments must be scheduled at least once and may have repeated performances during the mission. An experiment consists of a series of steps which, when performed, provide results pertinent to the experiment's functional objective. Since these experiments require a set of resources such as crew and power, the task of creating a timeline of experiment activities for the mission is one of resource-constrained scheduling. For each experiment, a computer model is created with detailed information on the steps involved in running the experiment, including crew requirements, processing times, and resource requirements. These models are then loaded into the Experiment Scheduling Program (ESP), which attempts to create a schedule that satisfies all resource constraints. ESP uses a depth-first search technique to place each experiment into a time interval, and a scoring function to evaluate the schedule. The mission planners generate several schedules and choose one with a high value of the scoring function to send through the approval process. The process of approving a mission timeline can take several months. Each timeline must meet the requirements of the scientists, the crew, and various engineering departments, as well as enforce all resource restrictions. No single objective is considered in creating a timeline. The experiment scheduling problem is: given a set of experiments, place each experiment along the mission timeline so that all resource requirements and temporal constraints are met and the timeline is acceptable to all who must approve it. Much work has been done on multicriteria decision making (MCDM). When there are two criteria, schedules which perform well with respect to one criterion will often perform poorly with respect to the other. One schedule dominates another if it performs strictly better on one criterion and no worse on the other. Clearly, dominated schedules are undesirable. A nondominated schedule can be generated by solving a suitable optimization problem. Generally there are two approaches: the first is a hierarchical approach, while the second requires optimizing a weighting or scoring function.
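As a rough illustration of the approach described above (depth-first placement of experiments on a timeline under resource limits, with a scoring function ranking complete timelines), the following sketch schedules a few hypothetical experiments. The experiments, resources, and weights are invented; this is not the actual ESP code.

```python
# Minimal sketch of depth-first placement of experiments on a discretized timeline
# under per-slot resource limits, scored by total priority of scheduled experiments.
HORIZON = 12                                   # timeline slots
CAPACITY = {"crew": 2, "power": 100}           # per-slot resource limits

# (name, duration in slots, per-slot resource needs, priority weight)
EXPERIMENTS = [
    ("materials",  3, {"crew": 1, "power": 60}, 5),
    ("biology",    4, {"crew": 1, "power": 30}, 4),
    ("combustion", 2, {"crew": 2, "power": 50}, 3),
    ("fluids",     3, {"crew": 1, "power": 40}, 2),
]

def fits(usage, start, duration, needs):
    return all(usage[t][r] + amt <= CAPACITY[r]
               for t in range(start, start + duration)
               for r, amt in needs.items())

def place(usage, start, duration, needs, sign):
    for t in range(start, start + duration):
        for r, amt in needs.items():
            usage[t][r] += sign * amt

def search(i, usage, schedule, best):
    if i == len(EXPERIMENTS):
        score = sum(w for (_, _, _, w), s in zip(EXPERIMENTS, schedule) if s is not None)
        if score > best[0]:
            best[0], best[1] = score, list(schedule)
        return
    name, dur, needs, _ = EXPERIMENTS[i]
    for start in range(HORIZON - dur + 1):     # try every feasible start slot
        if fits(usage, start, dur, needs):
            place(usage, start, dur, needs, +1)
            schedule.append(start)
            search(i + 1, usage, schedule, best)
            schedule.pop()
            place(usage, start, dur, needs, -1)
    schedule.append(None)                      # branch: leave this experiment unscheduled
    search(i + 1, usage, schedule, best)
    schedule.pop()

best = [-1, None]
search(0, [dict.fromkeys(CAPACITY, 0) for _ in range(HORIZON)], [], best)
print("best score:", best[0])
print("start slots:", dict(zip((e[0] for e in EXPERIMENTS), best[1])))
```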
Sarcomere mechanics in striated muscles: from molecules to sarcomeres to cells.
Rassier, Dilson E
2017-08-01
Muscle contraction is commonly associated with the cross-bridge and sliding filament theories, which have received strong support from experiments conducted over the years in different laboratories. However, there are studies that cannot be readily explained by the theories, showing 1) a plateau of the force-length relation extended beyond optimal filament overlap, and forces produced at long sarcomere lengths that are higher than those predicted by the sliding filament theory; 2) passive forces at long sarcomere lengths that can be modulated by activation and Ca2+, which changes the force-length relation; and 3) an unexplained high force produced during and after stretch of activated muscle fibers. Some of these studies even propose "new theories of contraction." While some of these observations deserve evaluation, many of these studies present data that lack a rigorous control and experiments that cannot be repeated in other laboratories. This article reviews these issues, looking into studies that have used intact and permeabilized fibers, myofibrils, isolated sarcomeres, and half-sarcomeres. A common mechanism associated with sarcomere and half-sarcomere length nonuniformities and a Ca2+-induced increase in the stiffness of titin is proposed to explain observations that derive from these studies. Copyright © 2017 the American Physiological Society.
Farm Deployable Microbial Bioreactor for Fuel Ethanol Production
DOE Office of Scientific and Technical Information (OSTI.GOV)
Okeke, Benedict
Research was conducted to develop a farm- and field-deployable microbial bioreactor for bioethanol production from biomass. Experiments were conducted to select the most efficient microorganisms for conversion of plant fiber to sugars for fermentation to ethanol. Mixtures of biomass and surface soil samples were collected from selected sites in Alabama Black Belt counties (Macon, Sumter, Choctaw, Dallas, Montgomery, Lowndes) and other areas within the state of Alabama. Experiments were conducted to determine the effects of culture parameters on key biomass-saccharifying enzymes (cellulase, beta-glucosidase, xylanase, and beta-xylosidase). A wide-scale sampling of locally grown fruits in Central Alabama was undertaken to isolate potential xylose-fermenting microorganisms. Yeast isolates were evaluated for xylose fermentation. Selected microorganisms were characterized by DNA-based methods. Factors affecting enzyme production and biomass saccharification were examined and optimized in the laboratory. Methods of biomass pretreatment were compared. Co-production of amylolytic enzymes with cellulolytic-xylanolytic enzymes was evaluated, and co-saccharification of a combination of biomass and starch-rich materials was examined. Simultaneous saccharification and fermentation with and without pre-saccharification was studied. Whole-culture-broth and filtered-culture-broth simultaneous saccharification and fermentation were compared. A bioreactor system was designed and constructed to apply the laboratory results for scale-up of biomass saccharification.
Zhou, Yongqiang; Jeppesen, Erik; Zhang, Yunlin; Shi, Kun; Liu, Xiaohan; Zhu, Guangwei
2016-02-01
Surface drinking water sources have been threatened globally and there have been few attempts to detect point-source contamination in these waters using chromophoric dissolved organic matter (CDOM) fluorescence. To determine the optimal wavelength derived from CDOM fluorescence as an indicator of point-source contamination in drinking waters, a combination of field campaigns in Lake Qiandao and a laboratory wastewater addition experiment was used. Parallel factor (PARAFAC) analysis identified six components, including three humic-like, two tryptophan-like, and one tyrosine-like component. All metrics showed strong correlation with wastewater addition (r(2) > 0.90, p < 0.0001). Both the field campaigns and the laboratory contamination experiment revealed that CDOM fluorescence at 275/342 nm was the most responsive wavelength to the point-source contamination in the lake. Our results suggest that pollutants in Lake Qiandao had the highest concentrations in the river mouths of upstream inflow tributaries and the single wavelength at 275/342 nm may be adapted for online or in situ fluorescence measurements as an early warning of contamination events. This study demonstrates the potential utility of CDOM fluorescence to monitor water quality in surface drinking water sources. Copyright © 2015 Elsevier Ltd. All rights reserved.
Wilson, Preston S; Dunton, Kenneth H
2009-04-01
Previous in situ investigations of seagrass have revealed acoustic phenomena that depend on plant density, tissue gas content, and free bubbles produced by photosynthetic activity, but corresponding predictive models that could be used to optimize acoustic remote sensing, shallow water sonar, and mine hunting applications have not appeared. To begin to address this deficiency, low frequency (0.5-2.5 kHz) acoustic laboratory experiments were conducted on three freshly collected Texas Gulf Coast seagrass species. A one-dimensional acoustic resonator technique was used to assess the biomass and effective acoustic properties of the leaves and rhizomes of Thalassia testudinum (turtle grass), Syringodium filiforme (manatee grass), and Halodule wrightii (shoal grass). Independent biomass and gas content estimates were obtained via microscopic cross-section imagery. The acoustic results were compared to model predictions based on Wood's equation for a two-phase medium. The effective sound speed in the plant-filled resonator was strongly dependent on plant biomass, but the Wood's equation model (based on tissue gas content alone) could not predict the effective sound speed for the low irradiance conditions of the experiment, in which no free bubbles were generated by photosynthesis. The results corroborate previously published results obtained in situ for another seagrass species, Posidonia oceanica.
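Wood's equation, the model baseline used in this study, treats the mixture's compressibility and density as volume-fraction-weighted averages of the phases, giving c_eff = 1/sqrt(rho_eff * kappa_eff). The sketch below computes the resulting effective sound speed for seawater containing a small free-gas fraction; the phase properties are textbook-style assumptions, not values from the experiments.

```python
# Sketch of Wood's equation for a two-phase medium: compressibilities and densities
# combine as volume-fraction-weighted averages. Phase properties are assumed values.
import math

def wood_sound_speed(phases):
    """phases: list of (volume_fraction, density kg/m^3, sound_speed m/s)."""
    rho_eff = sum(phi * rho for phi, rho, _ in phases)
    kappa_eff = sum(phi / (rho * c * c) for phi, rho, c in phases)  # effective compressibility
    return 1.0 / math.sqrt(rho_eff * kappa_eff)

for gas_fraction in (0.0, 1e-4, 1e-3, 1e-2):
    phases = [(1.0 - gas_fraction, 1025.0, 1500.0),   # seawater
              (gas_fraction,       1.2,    340.0)]    # free gas
    print(f"gas fraction {gas_fraction:.0e}: c_eff = {wood_sound_speed(phases):7.1f} m/s")
```

Even a tiny gas fraction collapses the effective sound speed, which is why free bubbles from photosynthesis matter so much in this context.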
Long-Term Coexistence of Rotifer Cryptic Species
Serra, Manuel; Gómez, Africa
2011-01-01
Despite their high morphological similarity, cryptic species often coexist in aquatic habitats, presenting a challenge in the framework of niche differentiation theory and coexistence mechanisms. Here we use a rotifer species complex inhabiting highly unpredictable and fluctuating salt lakes to gain insights into the mechanisms involved in the stable coexistence of cryptic species. We combined molecular barcoding surveys of planktonic populations and paleogenetic analysis of diapausing eggs to reconstruct the current and historical coexistence dynamics of two highly morphologically similar rotifer species, B. plicatilis and B. manjavacas. In addition, we carried out laboratory experiments using clones isolated from eight lakes where both species coexist to explore their clonal growth responses to salinity, a challenging, highly variable, and unpredictable condition in Mediterranean salt lakes. We show that both species have co-occurred in a stable way in one lake, with population fluctuations in which neither species was permanently excluded. The seasonal occurrence patterns of the plankton in two lakes agree with laboratory experiments showing that the two species differ in their optimal salinity. These results suggest that stable species coexistence is mediated by differential responses to salinity and its fluctuating regime. We discuss the role of fluctuating salinity and persistent diapausing egg banks as mechanisms for species coexistence, in accordance with the 'storage effect'. PMID:21738691
Multidimensional Screening as a Pharmacology Laboratory Experience.
ERIC Educational Resources Information Center
Malone, Marvin H.; And Others
1979-01-01
A multidimensional pharmacodynamic screening experiment that addresses drug interaction is included in the pharmacology-toxicology laboratory experience of pharmacy students at the University of the Pacific. The student handout with directions for the procedure is reproduced, drug compounds tested are listed, and laboratory evaluation results are…
Nie, Jianhui; Wang, Wenbo; Wen, Zhiheng; Song, Aijing; Hong, Kunxue; Lu, Shan; Zhong, Ping; Xu, Jianqing; Kong, Wei; Li, Jingyun; Shang, Hong; Ling, Hong; Ruan, Li; Wang, Youchun
2012-11-01
Among neutralizing antibody evaluation assays, the single-cycle pseudovirus infection assay is high-throughput and can provide rapid, sensitive, and reproducible measurements after a single cycle of infection. Cell counts, pseudovirus inoculation levels, the amount of diethylaminoethyl-dextran (DEAE-dextran), and the nonspecific effects of serum and plasma were tested to identify the optimal conditions for a pseudovirus-based neutralizing antibody assay. Optimal conditions for cell count, pseudovirus inoculum, and amount of DEAE-dextran were 1 × 10(4) cells/well, 200 TCID(50)/well, and 15 μg/ml, respectively. Compared with serum samples, high-concentration anticoagulants reduced the relative light unit (RLU) value. The RLU value increased sharply at first but then decreased slowly with dilution of the plasma sample. Test kits containing 10 HIV-1 CRF07/08_BC pseudovirus strains and 10 plasma samples from individuals infected with HIV-1 CRF07/08_BC were assembled into two packages and distributed to nine laboratories with a standard operating procedure included. For the 10 laboratories that evaluated the test, 17 of 44 (37%) laboratory pairs were considered equivalent. A statistical qualification rule was developed based on the testing results from 5 experienced laboratories, whereby a laboratory qualified if at least 83% of its values lay within the acceptable range. Copyright © 2012 Elsevier B.V. All rights reserved.
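For readers unfamiliar with how RLU readings in such assays are reduced to neutralization values, the conventional calculation subtracts the cell-only background and normalizes to the virus-only control. The RLU values in the sketch below are synthetic, not data from the study.

```python
# Sketch of the conventional percent-neutralization calculation for a pseudovirus
# assay. All RLU values here are synthetic placeholders.
virus_control_rlu = 250_000.0     # pseudovirus + cells, no serum
cell_control_rlu = 500.0          # cells only (background)

sample_rlu = {1/20: 12_000.0, 1/60: 45_000.0, 1/180: 110_000.0, 1/540: 190_000.0}

for dilution, rlu in sample_rlu.items():
    neut = 100.0 * (1.0 - (rlu - cell_control_rlu)
                    / (virus_control_rlu - cell_control_rlu))
    print(f"dilution 1:{int(round(1/dilution)):>3d}  neutralization = {neut:5.1f}%")
```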
Improving American Healthcare Through “Clinical Lab 2.0”
Shotorbani, Khosrow; Sharma, Gaurav; Crossey, Michael; Kothari, Tarush; Lorey, Thomas S.; Prichard, Jeffrey W.; Wilkerson, Myra; Fisher, Nancy
2017-01-01
Project Santa Fe was established both to provide thought leadership and to help develop the evidence base for the valuation of clinical laboratory services in the next era of American healthcare. The participants in Project Santa Fe represent major regional health systems that can operationalize laboratory-driven innovations and test their valuation in diverse regional marketplaces in the United States. We provide recommendations from the inaugural March 2016 meeting of Project Santa Fe. Specifically, in the transition from volume-based to value-based health care, clinical laboratories are called upon to provide programmatic leadership in reducing total cost of care through optimization of time-to-diagnosis and time-to-effective therapeutics, optimization of care coordination, and programmatic support of wellness care, screening, and monitoring. This call to action is more than working with industry stakeholders on the basis of our expertise; it is providing leadership in creating the programs that accomplish these objectives. In so doing, clinical laboratories can be effectors in identifying patients at risk for escalation in care, closing gaps in care, and optimizing outcomes of health care innovation. We also hope that, through such activities, the evidence base will be created for the new value propositions of integrated laboratory networks. In the very simplest sense, this effort to create “Clinical Lab 2.0” will establish the impact of laboratory diagnostics on the full 100% spend in American healthcare, not just the 2.5% spend attributed to in vitro diagnostics. In so doing, our aim is to empower regional and local laboratories to thrive under new models of payment in the next era of American health care delivery. PMID:28725789
Harp, Jason Michael; Lessing, Paul Alan; Hoggan, Rita Elaine
2015-06-21
In collaboration with industry, Idaho National Laboratory is investigating uranium silicide for use in future light water reactor fuels as a more accident-resistant alternative to uranium oxide based fuels. Specifically, this project focused on producing uranium silicide (U3Si2) pellets by conventional powder metallurgy with a density greater than 94% of the theoretical density. This work has produced a process to consistently make pellets with the desired density through careful optimization. Milling of the U3Si2 has been optimized and high-phase-purity U3Si2 has been successfully produced. Results are presented from sintering studies and microstructural examinations that illustrate the need for a finely ground, reproducible particle size distribution in the source powder. The optimized process was used to produce pellets for the Accident Tolerant Fuel-1 irradiation experiment. The average density of these pellets was 11.54 ± 0.06 g/cm3. Additional characterization of the pellets by scanning electron microscopy and X-ray diffraction has also been performed. As a result, pellets produced in this work have been encapsulated for irradiation, and irradiation in the Advanced Test Reactor is expected soon.
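As a rough consistency check (assuming the commonly cited theoretical density of U3Si2 of about 12.2 g/cm3, a value not stated in the abstract), the reported average pellet density corresponds to

$$\frac{11.54\ \mathrm{g\,cm^{-3}}}{12.2\ \mathrm{g\,cm^{-3}}} \approx 0.946 \approx 94.6\%\ \text{of theoretical density},$$

which is consistent with the stated target of greater than 94%.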
Wang, Hao; Jiang, Jie; Zhang, Guangjun
2017-04-21
The simultaneous extraction of optical navigation measurements from a target celestial body and star images is essential for autonomous optical navigation. Generally, a single optical navigation sensor cannot obtain well-exposed images of both the target celestial body and stars because their irradiance difference is generally large. Multi-sensor integration or complex image processing algorithms are commonly utilized to solve this problem. This study analyzes and demonstrates the feasibility of imaging the target celestial body and stars, both well exposed, within a single exposure through a single field-of-view (FOV) optical navigation sensor using the well capacity adjusting (WCA) scheme. First, the irradiance characteristics of the celestial body are analyzed. Then, the celestial body edge model and star spot imaging model are established when the WCA scheme is applied. Furthermore, the effect of exposure parameters on the accuracy of star centroiding and edge extraction is analyzed using the proposed model. Optimal exposure parameters are also derived by conducting Monte Carlo simulation to obtain the best performance of the navigation sensor. Finally, laboratory and night sky experiments are performed to validate the correctness of the proposed model and the optimal exposure parameters.
Monte Carlo simulation of a photodisintegration of 3H experiment in Geant4
NASA Astrophysics Data System (ADS)
Gray, Isaiah
2013-10-01
An upcoming experiment involving photodisintegration of 3H at the High Intensity Gamma-Ray Source facility at Duke University has been simulated in the software package Geant4. CAD models of silicon detectors and wire chambers were imported from Autodesk Inventor using the program FastRad and the Geant4 GDML importer. Sensitive detectors were associated with the appropriate logical volumes in the exported GDML file so that changes in detector geometry will be easily manifested in the simulation. Probability distribution functions for the energy and direction of outgoing protons were generated using numerical tables from previous theory, and energies and directions were sampled from these distributions using a rejection sampling algorithm. The simulation will be a useful tool to optimize detector geometry, estimate background rates, and test data analysis algorithms. This work was supported by the Triangle Universities Nuclear Laboratory REU program at Duke University.
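The rejection sampling step mentioned above can be sketched generically as follows; this is not the experiment's simulation code, and the tabulated density, energy range, and bound used here are placeholders.

```python
import random

# Generic rejection-sampling sketch for drawing outgoing-proton energies from a
# tabulated probability density. The density below is a placeholder, not the
# theoretical distribution used in the experiment.
def sample_energy(pdf, e_min, e_max, pdf_max, rng=random.random):
    """pdf: callable returning the (unnormalized) density at energy E [MeV]."""
    while True:
        e = e_min + (e_max - e_min) * rng()   # propose uniformly in [e_min, e_max]
        if rng() * pdf_max <= pdf(e):         # accept with probability pdf(e)/pdf_max
            return e

# Example with a made-up triangular density peaking at 5 MeV.
pdf = lambda e: max(0.0, 1.0 - abs(e - 5.0) / 5.0)
energies = [sample_energy(pdf, 0.0, 10.0, 1.0) for _ in range(5)]
print(energies)
```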
Development of aerogel-lined targets for inertial confinement fusion experiments
DOE Office of Scientific and Technical Information (OSTI.GOV)
Braun, Tom
2013-03-28
This thesis explores the formation of ICF-compatible foam layers inside an ablator shell used for inertial confinement fusion experiments at the National Ignition Facility. In particular, the capability of p-DCPD polymer aerogels to serve as a scaffold for the deuterium-tritium mix was analyzed. Four different factors were evaluated: the dependence of the uniformity of the aerogel layer on parameters such as the thickness and composition of the precursor solution, how to bring the optimal composition inside the ablator shell, the mechanical stability of ultra-low-density p-DCPD aerogel bulk pieces during wetting and freezing with hydrogen, and the wetting behavior of thin polymer foam layers in HDC carbon ablator shells with liquid deuterium. The research for this thesis was done at Lawrence Livermore National Laboratory in cooperation with the Technical University of Munich.
Towards an informative mutant phenotype for every bacterial gene
Deutschbauer, Adam; Price, Morgan N.; Wetmore, Kelly M.; ...
2014-08-11
Mutant phenotypes provide strong clues to the functions of the underlying genes and could allow annotation of the millions of sequenced yet uncharacterized bacterial genes. However, it is not known how many genes have a phenotype under laboratory conditions, how many phenotypes are biologically interpretable for predicting gene function, and what experimental conditions are optimal to maximize the number of genes with a phenotype. To address these issues, we measured the mutant fitness of 1,586 genes of the ethanol-producing bacterium Zymomonas mobilis ZM4 across 492 diverse experiments and found statistically significant phenotypes for 89% of all assayed genes. Thus, in Z. mobilis, most genes have a functional consequence under laboratory conditions. We demonstrate that 41% of Z. mobilis genes have both a strong phenotype and a similar fitness pattern (cofitness) to another gene, and are therefore good candidates for functional annotation using mutant fitness. Among 502 poorly characterized Z. mobilis genes, we identified a significant cofitness relationship for 174. For 57 of these genes without a specific functional annotation, we found additional evidence to support the biological significance of these gene-gene associations, and in 33 instances, we were able to predict specific physiological or biochemical roles for the poorly characterized genes. Last, we identified a set of 79 diverse mutant fitness experiments in Z. mobilis that are nearly as biologically informative as the entire set of 492 experiments. Therefore, our work provides a blueprint for the functional annotation of diverse bacteria using mutant fitness.
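A cofitness analysis of the kind described above amounts to correlating fitness profiles across experiments. The sketch below assumes a genes-by-experiments matrix of fitness values and uses Pearson correlation; the study's exact metric and thresholds may differ.

```python
import numpy as np

# Sketch: cofitness as the Pearson correlation between the fitness patterns of two
# genes across many experiments. The random matrix stands in for real fitness data.
rng = np.random.default_rng(0)
fitness = rng.normal(size=(1586, 492))          # genes x experiments (placeholder)

def cofitness(fitness_matrix, gene_a, gene_b):
    return np.corrcoef(fitness_matrix[gene_a], fitness_matrix[gene_b])[0, 1]

def best_cofit_partner(fitness_matrix, gene):
    corr = np.corrcoef(fitness_matrix)[gene]    # correlations of `gene` vs all genes
    corr[gene] = -np.inf                        # exclude the self-correlation
    return int(np.argmax(corr)), float(np.max(corr))

print(cofitness(fitness, 0, 1))
print(best_cofit_partner(fitness, 0))
```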
Pervious concrete mix optimization for sustainable pavement solution
NASA Astrophysics Data System (ADS)
Barišić, Ivana; Galić, Mario; Netinger Grubeša, Ivanka
2017-10-01
In order to fulfill the requirements of sustainable road construction, new materials for pavement construction are being investigated with the main goal of preserving natural resources and achieving energy savings. One such sustainable pavement material is pervious concrete, a new solution for low-volume pavements. To accommodate the required strength and porosity, the latter a measure of drainage capability, four pervious concrete mixtures were investigated, and results of laboratory tests of compressive strength, flexural strength, and porosity are presented. To define the optimal pervious concrete mixture in view of aggregate and financial savings, an optimization model is utilized and optimal mixtures are defined according to the required strength and porosity characteristics. The laboratory results showed that, comparing single-sized aggregate pervious concrete mixtures, coarse aggregate mixtures result in increased porosity but reduced strengths. The optimal share of coarse aggregate turned out to be 40.21% and the share of fine aggregate 49.79%, achieving the required compressive strength of 25 MPa, flexural strength of 4.31 MPa, and porosity of 21.66%.
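In hedged form, the mix optimization described above can be summarized as a constrained design problem of the following general shape (the objective and the strength/porosity functions below are schematic stand-ins, not the fitted laboratory models):

$$
\begin{aligned}
\min_{x_c,\;x_f}\quad & \text{mixture cost}(x_c, x_f)\\
\text{subject to}\quad & f_{\mathrm{comp}}(x_c, x_f) \ge 25\ \mathrm{MPa},\\
& f_{\mathrm{flex}}(x_c, x_f) \ge 4.31\ \mathrm{MPa},\\
& P(x_c, x_f) \ge 21.66\%,\qquad x_c,\, x_f \ge 0,
\end{aligned}
$$

where \(x_c\) and \(x_f\) denote the coarse and fine aggregate shares; the reported optimum corresponds to \(x_c = 40.21\%\) and \(x_f = 49.79\%\).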
A multi-center ring trial for the identification of anaerobic bacteria using MALDI-TOF MS.
Veloo, A C M; Jean-Pierre, H; Justesen, U S; Morris, T; Urban, E; Wybo, I; Shah, H N; Friedrich, A W; Morris, T; Shah, H N; Jean-Pierre, H; Justesen, U S; Nagy, E; Urban, E; Kostrzewa, M; Veloo, A; Friedrich, A W
2017-12-01
Inter-laboratory reproducibility of Matrix-Assisted Laser Desorption/Ionization Time-of-Flight Mass Spectrometry (MALDI-TOF MS) identification of anaerobic bacteria has not been shown before. Therefore, ten anonymized anaerobic strains were sent to seven participating laboratories, an initiative of the European Network for the Rapid Identification of Anaerobes (ENRIA). On arrival, the strains were cultured and identified using MALDI-TOF MS. The derived spectra were compared with two different Biotyper MALDI-TOF MS databases, db5627 and db6903. The results obtained using db5627 show reasonable variation between the different laboratories. However, when a more optimized database is used, the variation is less pronounced. In this study we show that an optimized database not only results in a higher number of strains that can be identified using MALDI-TOF MS, but also corrects for differences in performance between laboratories. Copyright © 2017 Elsevier Ltd. All rights reserved.
NASA Technical Reports Server (NTRS)
Todd, P. W.
1985-01-01
The objectives of the red blood cell experiments were to provide a visual check on the electrophoretic process, especially electroosmotic flow in space, as well as to provide test separations of non-degradable standard particles for comparison with the separations of the three viable cell types studied on the Apollo-Soyuz Test Project. Determination of the maximum concentrations of cells that can be separated by column electrophoresis was a significant goal. Two of the eight columns were available for red cell experiments, so two concentrations of human and rabbit RBC mixtures were used. The objectives of another experiment were to evaluate the reproducibility of microgravity electrophoretic separation of living kidney cells, to separate cells with high viability despite two freeze-thaw cycles, and to optimize the physical conditions of cell separation. Owing to the uncertain heterogeneity of the starting material, the experimental design does not assess resolution in microgravity, but improved separability was sought in comparison to density-gradient electrophoresis or continuous-flow electrophoresis. Efforts were made to increase cell yield and cell viability and to assess reproducibility directly.
Rapid determination of tartaric acid in wines.
Bastos, Sandra S T; Tafulo, Paula A R; Queirós, Raquel B; Matos, Cristina D; Sales, M Goreti F
2009-08-01
A flow-spectrophotometric method is proposed for the routine determination of tartaric acid in wines. The reaction between tartaric acid and vanadate in acetic acid media is carried out under flowing conditions and the resulting colored complex is monitored at 475 nm. The stability of the complex and the corresponding formation constant are presented. The effect of wavelength and pH was evaluated by batch experiments. The selected conditions were transposed to a flow-injection analytical system. Optimization of several flow parameters, such as reactor lengths, flow rate and injection volume, was carried out. Using the optimized conditions, a linear behavior was observed up to 1000 microg mL(-1) tartaric acid, with a molar extinction coefficient of 450 L mg(-1) cm(-1) and ±1% repeatability. Sample throughput was 25 samples per hour. The flow-spectrophotometric method was satisfactorily applied to the quantification of tartaric acid in wines from different sources. Its accuracy was confirmed by statistical comparison to the conventional Rebelein procedure and to a certified analytical method carried out in a routine laboratory.
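For context, the linear working range reported above rests on the standard Beer-Lambert relation (a textbook result, not specific to this paper):

$$A_{475} = \varepsilon\, b\, c \quad\Longrightarrow\quad c = \frac{A_{475}}{\varepsilon\, b},$$

where \(A_{475}\) is the absorbance of the tartaric acid-vanadate complex at 475 nm, \(b\) is the optical path length, \(c\) is the tartaric acid concentration, and \(\varepsilon\) is the extinction coefficient reported above.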
A new magnet design for future Kibble balances
NASA Astrophysics Data System (ADS)
Li, Shisong; Stock, Michael; Schlamminger, Stephan
2018-06-01
We propose a new permanent magnet system for Kibble balance experiments, which combines advantages of the magnet designs invented by the National Physical Laboratory (NPL) and by the Bureau International des Poids et Mesures (BIPM). The goal of the proposed magnet system is to minimize the coil-current effect and to optimize the shielding at the same time. In the proposed design, a permanent magnet system with two gaps, each housing a coil, is employed to minimize the coil current effect, by reducing the linear coil-current dependence reported for the single air gap design by at least one order of magnitude. Both air gaps of the magnet are completely surrounded by high-permeability material, and hence the coils are shielded from outside magnetic fields and no magnetic field leaks outside of the magnet system. An example of the new magnet system is given and the analysis shows that the magnetic field in the air gap can be optimized to meet the requirement to be used in Kibble balances.
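For context, the Kibble balance principle that this magnet system serves combines a weighing phase and a moving phase (textbook relations, not taken from this paper):

$$mg = B l I \ \ \text{(weighing)},\qquad U = B l v \ \ \text{(moving)}\quad\Longrightarrow\quad m g v = U I,$$

so the uniformity and stability of the gap flux density \(B\), which the proposed two-gap, fully shielded design aims to optimize, enter both phases through the same geometric factor \(Bl\).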
Salsamendi, Jason; Pereira, Keith; Baker, Reginald; Bhatia, Shivank S; Narayanan, Govindarajan
2015-10-01
Transplant renal artery stenosis (TRAS) is a vascular complication seen increasingly often because of the growing number of renal transplantations. Early diagnosis and management are essential to optimize graft function. Currently, endovascular treatment of TRAS using angioplasty and/or stenting is considered the treatment of choice, with the advantage that it does not preclude subsequent surgical correction. Treatment of TRAS with stents, particularly in tortuous transplant renal anatomy, presents a unique challenge to the interventional radiologist. In this study, we present three cases from our practice highlighting the use of a balloon-expandable Multi-Link RX Ultra coronary stent system (Abbott Laboratories, Abbott Park, Illinois, USA) for treating high-grade focal stenosis along very tortuous renal arterial segments. The cobalt-chromium alloy stent scaffold provides excellent radial force, whereas the flexible stent design conforms to the vessel course, allowing for optimal stent alignment.
Attached cultivation for improving the biomass productivity of Spirulina platensis.
Zhang, Lanlan; Chen, Lin; Wang, Junfeng; Chen, Yu; Gao, Xin; Zhang, Zhaohui; Liu, Tianzhong
2015-04-01
Improving the cultivation efficiency of the microalga Spirulina platensis would increase its potential use as a food source and as an effective alternative for CO2 fixation. The present work attempted to establish an attached cultivation technique for S. platensis. Laboratory experiments were first carried out to investigate the optimal conditions for attached cultivation. The optimal conditions were found to be: an initial inoculum density of 25 g m(-2) using electrostatic flocking cloth as substrata, a light intensity lower than 200 μmol m(-2) s(-1), and CO2-enriched air flow (0.5%) at a superficial aeration rate of 0.0056 m s(-1) in a NaHCO3-free Zarrouk medium. An outdoor bench-scale attached cultivation bioreactor was built and a 10-day culture of S. platensis was carried out with daily harvesting. A high footprint areal biomass productivity of 60 g m(-2) d(-1) was obtained. The nutritional quality of S. platensis grown with attached cultivation is identical to that with conventional liquid cultivation. Copyright © 2015 Elsevier Ltd. All rights reserved.
NEET-AMM Final Technical Report on Laser Direct Manufacturing (LDM) for Nuclear Power Components
DOE Office of Scientific and Technical Information (OSTI.GOV)
Anderson, Scott; Baca, Georgina; O'Connor, Michael
2015-12-31
This final technical report summarizes the program progress and technical accomplishments of the Laser Direct Manufacturing (LDM) for Nuclear Power Components project. A series of experiments varying build process parameters (scan speed and laser power) was conducted at the outset to establish the optimal build conditions for each of the alloys. Fabrication was completed in collaboration with the Quad City Manufacturing Laboratory (QCML). The density of all sample specimens was measured and compared to literature values. Optimal build process conditions giving fabricated part densities close to literature values were chosen for making mechanical test coupons. Test coupons whose principal axis lies in the x-y plane (perpendicular to the build direction) and along the z axis (parallel to the build direction) were built and tested as part of the experimental build matrix to understand the impact of the anisotropic nature of the process. Investigations are described for 316L SS; Inconel 600, 718, and 800; and oxide dispersion strengthened 316L SS (yttria) alloys.
NASA Astrophysics Data System (ADS)
Poperechnikova, O. Yu; Filippov, L. O.; Shumskaya, E. N.; Filippova, I. V.
2017-07-01
The demand for high-grade iron ore concentrates is a major issue due to the depletion of rich iron-bearing ores and high competitiveness in the iron ore market. Iron ore producers are forced to upgrade their flowsheets to decrease the silica content in the pellets. Different types of ore have different mineral compositions and texture-structural features, which require different mineral processing methods and technologies. The paper presents a comparative study of the cationic and anionic flotation routes to process a fine-grained oxidized iron ore. Modified carboxymethyl cellulose was found to be the most efficient depressant in reverse cationic flotation. The results of flotation optimization of hematite ores, using a matrix of a second-order central rotatable uniform design, allowed the collector concentration, impeller rotation speed, and air flow rate to be defined as the main flotation parameters impacting iron ore concentrate quality and iron recovery in a laboratory flotation machine. These parameters were selected as the independent variables in the experiments.
Reference earth orbital research and applications investigations (blue book). Volume 3: Physics
NASA Technical Reports Server (NTRS)
1971-01-01
The definition of physics experiments to be conducted aboard the space station is presented. The four functional program elements are: (1) space physics research laboratory, (2) plasma physics and environmental perturbation laboratory, (3) cosmic ray physics laboratory, and (4) physics and chemistry laboratory. The experiments to be conducted by each facility are defined and the crew member requirements to accomplish the experiments are presented.
A novel microseeding method for the crystallization of membrane proteins in lipidic cubic phase.
Kolek, Stefan Andrew; Bräuning, Bastian; Stewart, Patrick Douglas Shaw
2016-04-01
Random microseed matrix screening (rMMS), in which seed crystals are added to random crystallization screens, is an important breakthrough in soluble protein crystallization that increases the number of crystallization hits that are available for optimization. This greatly increases the number of soluble protein structures generated every year by typical structural biology laboratories. Inspired by this success, rMMS has been adapted to the crystallization of membrane proteins, making LCP seed stock by scaling up LCP crystallization conditions without changing the physical and chemical parameters that are critical for crystallization. Seed crystals are grown directly in LCP and, as with conventional rMMS, a seeding experiment is combined with an additive experiment. The new method was used with the bacterial integral membrane protein OmpF, and it was found that it increased the number of crystallization hits by almost an order of magnitude: without microseeding one new hit was found, whereas with LCP-rMMS eight new hits were found. It is anticipated that this new method will lead to better diffracting crystals of membrane proteins. A method of generating seed gradients, which allows the LCP seed stock to be diluted and the number of crystals in each LCP bolus to be reduced, if required for optimization, is also demonstrated.
Kimmel, Stacy A.; Roberts, Robert F.; Ziegler, Gregory R.
1998-01-01
The optimal fermentation temperature, pH, and Bacto-casitone (Difco Laboratories, Detroit, Mich.) concentration for production of exopolysaccharide by Lactobacillus delbrueckii subsp. bulgaricus RR in a semidefined medium were determined by using response surface methods. The design consisted of 20 experiments, 15 unique combinations, and five replications. All fermentations were conducted in a fermentor with a 2.5-liter working volume and were terminated when 90% of the glucose in the medium had been consumed. The population of L. delbrueckii subsp. bulgaricus RR and exopolysaccharide content were measured at the end of each fermentation. The optimum temperature, pH, and Bacto-casitone concentration for exopolysaccharide production were 38°C, 5, and 30 g/liter, respectively, with a predicted yield of 295 mg of exopolysaccharide/liter. The actual yield under these conditions was 354 mg of exopolysaccharide/liter, which was within the 95% confidence interval (217 to 374 mg of exopolysaccharide/liter). An additional experiment conducted under optimum conditions showed that exopolysaccharide production was growth associated, with a specific production at the endpoint of 101.4 mg/g of dry cells. Finally, to obtain material for further characterization, a 100-liter fermentation was conducted under optimum conditions. Twenty-nine grams of exopolysaccharide was isolated from centrifuged, ultrafiltered fermentation broth by ethanol precipitation. PMID:9464404
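Response surface designs of this kind typically fit a second-order polynomial in the coded factors; a generic form (the fitted coefficients are not given in the abstract) is

$$y = \beta_0 + \sum_{i=1}^{3}\beta_i x_i + \sum_{i=1}^{3}\beta_{ii} x_i^2 + \sum_{i<j}\beta_{ij} x_i x_j + \varepsilon,$$

where \(y\) is the exopolysaccharide yield and \(x_1, x_2, x_3\) are the coded temperature, pH, and Bacto-casitone concentration; the stationary point of the fitted surface corresponds to the reported optimum of 38°C, pH 5, and 30 g/liter.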
Optimization and Modification of the SeaQuest Trigger Efficiency Program
NASA Astrophysics Data System (ADS)
White, Nattapat
2017-09-01
The primary purpose of E906/SeaQuest is to examine the quark and antiquark distributions within the nucleon. This experiment uses the 120 GeV proton beam from the Fermi National Accelerator Laboratory Main Injector to collide with one of several fixed targets. From the collision, a pair of muons produced by the Drell-Yan process directly probes the nucleon sea antiquarks. The SeaQuest spectrometer consists of two focusing magnets, several detectors, and multiple planes of scintillating hodoscopes that help track and analyze the properties of particles. Hodoscope hits are compared to predetermined hit combinations that would result from a pair of muons that originated in the target. Understanding the trigger efficiency is part of the path to determining the probability of Drell-Yan muon pair production in the experiment. Over the years of data taking, the trigger efficiency varied as individual scintillator detection efficiencies changed. To accurately determine how the trigger efficiency varied over time, the trigger efficiency program needed to be upgraded to include the effects of inefficiencies in the 284 individual channels in the hodoscope systems. The optimization, modification, and results of the upgraded trigger efficiency program will be presented. Supported by U.S. D.O.E. Medium Energy Nuclear Physics under Grant DE-FG02-03ER41243.
Application of Titration-Based Screening for the Rapid Pilot Testing of High-Throughput Assays.
Zhang, Ji-Hu; Kang, Zhao B; Ardayfio, Ophelia; Ho, Pei-i; Smith, Thomas; Wallace, Iain; Bowes, Scott; Hill, W Adam; Auld, Douglas S
2014-06-01
Pilot testing of an assay intended for high-throughput screening (HTS) with small compound sets is a necessary but often time-consuming step in the validation of an assay protocol. When the initial testing concentration is less than optimal, this can involve iterative testing at different concentrations to further evaluate the pilot outcome, which can be even more time-consuming. Quantitative HTS (qHTS) enables flexible and rapid collection of assay performance statistics, hits at different concentrations, and concentration-response curves in a single experiment. Here we describe the qHTS process for pilot testing in which eight-point concentration-response curves are produced using an interplate asymmetric dilution protocol: the first four concentrations represent the range of typical HTS screening concentrations and the last four concentrations are added for robust curve fitting to determine potency/efficacy values. We also describe how these data can be analyzed to predict the frequency of false positives, false negatives, hit rates, and confirmation rates for the HTS process as a function of screening concentration. By taking into account the compound pharmacology, this pilot-testing paradigm enables rapid assessment of assay performance and selection of the optimal concentration for the large-scale HTS in one experiment. © 2013 Society for Laboratory Automation and Screening.
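Concentration-response data of the kind collected in qHTS are commonly summarized with a four-parameter Hill (logistic) fit. The sketch below is a generic illustration using SciPy, not the screening platform's own analysis pipeline, and the eight data points are invented.

```python
import numpy as np
from scipy.optimize import curve_fit

# Generic four-parameter Hill fit for an eight-point concentration-response series.
# The concentrations and responses below are invented for illustration only.
def hill(conc, bottom, top, ac50, slope):
    return bottom + (top - bottom) / (1.0 + (ac50 / conc) ** slope)

conc = np.array([0.5, 1.6, 5.0, 15.8, 50.0, 158.0, 500.0, 1580.0])   # nM, 8 points
resp = np.array([2.0, 4.0, 9.0, 22.0, 48.0, 75.0, 92.0, 98.0])       # % activity

params, _ = curve_fit(
    hill, conc, resp, p0=[0.0, 100.0, 50.0, 1.0],
    bounds=([-np.inf, -np.inf, 1e-6, 0.1], [np.inf, np.inf, 1e6, 10.0]),
)
bottom, top, ac50, slope = params
print(f"AC50 ~ {ac50:.1f} nM, Hill slope ~ {slope:.2f}, efficacy ~ {top:.0f}%")
```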
A Two-Week Guided Inquiry Protein Separation and Detection Experiment for Undergraduate Biochemistry
ERIC Educational Resources Information Center
Carolan, James P.; Nolta, Kathleen V.
2016-01-01
A laboratory experiment for teaching protein separation and detection in an undergraduate biochemistry laboratory course is described. This experiment, performed in two, 4 h laboratory periods, incorporates guided inquiry principles to introduce students to the concepts behind and difficulties of protein purification. After using size-exclusion…
ERIC Educational Resources Information Center
Simon, Nicole A.
2013-01-01
Virtual laboratory experiments using interactive computer simulations are not being employed as viable alternatives to laboratory science curriculum at extensive enough rates within higher education. Rote traditional lab experiments are currently the norm and are not addressing inquiry, Critical Thinking, and cognition throughout the laboratory…
ERIC Educational Resources Information Center
Lawrie, Gwendolyn Angela; Grøndahl, Lisbeth; Boman, Simon; Andrews, Trish
2016-01-01
Recent examples of high-impact teaching practices in the undergraduate chemistry laboratory that include course-based undergraduate research experiences and inquiry-based experiments require new approaches to assessing individual student learning outcomes. Instructors require tools and strategies that can provide them with insight into individual…
ERIC Educational Resources Information Center
Rowe, Laura
2017-01-01
An introductory bioinformatics laboratory experiment focused on protein analysis has been developed that is suitable for undergraduate students in introductory biochemistry courses. The laboratory experiment is designed to be potentially used as a "stand-alone" activity in which students are introduced to basic bioinformatics tools and…
An Example of a Laboratory Teaching Experience in a Professional Year (Plan B) Program
ERIC Educational Resources Information Center
Miller, P. J.; And Others
1978-01-01
A laboratory teaching experience (L.T.E.) was designed to focus on three teaching behaviors. It was recognized that a behavioral approach to teaching simplified its complexity by isolating specific teaching behaviors. Discusses the development and evaluation of the laboratory teaching experience. (Author/RK)
ERIC Educational Resources Information Center
Pursell, Christopher J.; Chandler, Bert; Bushey, Michelle M.
2004-01-01
Capillary electrophoresis is gradually working its way into the undergraduate laboratory curriculum. Typically, experiments utilizing this newer technology have been introduced into analytical or instrumental courses. The authors of this article have introduced an experiment into the introductory laboratory that utilizes capillary electrophoresis…
An Undergraduate Laboratory Experiment in Bioinorganic Chemistry: Ligation States of Myoglobin
ERIC Educational Resources Information Center
Bailey, James A.
2011-01-01
Although there are numerous inorganic model systems that are readily presented as undergraduate laboratory experiments in bioinorganic chemistry, there are few examples that explore the inorganic chemistry of actual biological molecules. We present a laboratory experiment using the oxygen-binding protein myoglobin that can be easily incorporated…
Consumer-Oriented Laboratory Activities: A Manual for Secondary Science Students.
ERIC Educational Resources Information Center
Anderson, Jacqueline; McDuffie, Thomas E., Jr.
This document provides a laboratory manual for use by secondary level students in performing consumer-oriented laboratory experiments. Each experiment includes an introductory question outlining the purpose of the investigation, a detailed discussion, detailed procedures, questions to be answered upon completing the experiment, and information for…
Lewis, Russell L; Seal, Erin L; Lorts, Aimee R; Stewart, Amanda L
2017-11-01
The undergraduate biochemistry laboratory curriculum is designed to provide students with experience in protein isolation and purification protocols as well as various data analysis techniques, which enhance the biochemistry lecture course and give students a broad range of tools upon which to build in graduate level laboratories or once they begin their careers. One of the most common biochemistry protein purification experiments is the isolation and characterization of cytochrome c. Students across the country purify cytochrome c, lysozyme, or some other well-known protein to learn these common purification techniques. What this series of experiments lacks is the use of sophisticated instrumentation that is rarely available to undergraduate students. To give students a broader background in biochemical spectroscopy techniques, a new circular dichroism (CD) laboratory experiment was introduced into the biochemistry laboratory curriculum. This CD experiment provides students with a means of conceptualizing the secondary structure of their purified protein, and assessments indicate that students' understanding of the technique increased significantly. Students conducted this experiment with ease and in a short time frame, so this laboratory is conducive to merging with other data analysis techniques within a single laboratory period. © 2017 by The International Union of Biochemistry and Molecular Biology, 45(6):515-520, 2017. © 2017 The International Union of Biochemistry and Molecular Biology.
Interactive virtual optical laboratories
NASA Astrophysics Data System (ADS)
Liu, Xuan; Yang, Yi
2017-08-01
Laboratory experiences are essential for optics education. However, college students have limited access to advanced optical equipment that is generally expensive and complicated. Hence there is a need for innovative solutions to expose students to advanced optics laboratories. Here we describe a novel approach, the interactive virtual optical laboratory (IVOL), that allows an unlimited number of students to participate in the lab session remotely through the internet, to improve laboratory education in photonics. Although students are not physically conducting the experiment, IVOL is designed to engage students by actively involving them in the decision-making process throughout the experiment.
ERIC Educational Resources Information Center
Southam, Daniel C.; Shand, Bradley; Buntine, Mark A.; Kable, Scott H.; Read, Justin R.; Morris, Jonathan C.
2013-01-01
An assessment of the acylation of ferrocene laboratory exercise across three successive years resulted in a significant fluctuation in student perception of the experiment. This perception was measured by collecting student responses to an instrument immediately after the experiment, which includes Likert and open-ended responses from the student.…
NASA Astrophysics Data System (ADS)
Javidi, Giti
2005-07-01
This study was designed to investigate an alternative to the use of traditional physical laboratory activities in a communication systems course. Specifically, this study examined whether, as an alternative, computer simulation is as effective as physical laboratory activities in teaching college-level electronics engineering education students about the concepts of signal transmission, modulation and demodulation. Eighty undergraduate engineering students participated in the study, which was conducted at a southeastern four-year university. The students were randomly assigned to two groups. The groups were compared on understanding the concepts, remembering the concepts, completion time of the lab experiments, and perception of the laboratory experiments. The physical group's (n = 40) treatment was to conduct laboratory experiments in a physical laboratory. The students in this group used equipment in a controlled electronics laboratory. The simulation group's (n = 40) treatment was to conduct similar experiments in a PC laboratory. The students in this group used a simulation program in a controlled PC lab. At the completion of the treatment, scores on a validated conceptual test were collected once after the treatment and again three weeks after the treatment. Attitude surveys and a qualitative study were administered at the completion of the treatment. The findings revealed significant differences, in favor of the simulation group, between the two groups on both the conceptual post-test and the follow-up test. The findings also revealed a significant correlation between the simulation group's attitude toward the simulation program and their post-test scores. Moreover, there was a significant difference between the two groups in their attitude toward their laboratory experience, in favor of the simulation group. In addition, there was a significant difference between the two groups in their lab completion time, in favor of the simulation group. At the same time, the qualitative research uncovered several issues not explored by the quantitative research. It was concluded that incorporating the recommendations acquired from the qualitative research, especially elements of hardware experience to avoid a lack of hands-on skills, into the laboratory pedagogy should help improve students' experience regardless of the environment in which the laboratory is conducted.
NASA Astrophysics Data System (ADS)
Berger, Spencer Granett
This dissertation explores student perceptions of the instructional chemistry laboratory and the approaches students take when learning in the laboratory environment. To measure student perceptions of the chemistry laboratory, a survey instrument was developed. 413 students responded to the survey during the Fall 2011 semester. Students' perception of the usefulness of the laboratory in helping them learn chemistry in high school was related to several factors regarding their experiences in high school chemistry. Students' perception of the usefulness of the laboratory in helping them learn chemistry in college was also measured. Reasons students provided for the usefulness of the laboratory were categorized. To characterize approaches to learning in the laboratory, students were interviewed midway through the semester (N=18). The interviews were used to create a framework describing learning approaches that students use in the laboratory environment. Students were categorized into three levels: students who view the laboratory as a requirement, students who believe that the laboratory augments their understanding, and students who view the laboratory as an important part of science. These categories describe the types of strategies students used when conducting experiments. To further explore the relationship between students' perception of the laboratory and their approaches to learning, two case studies are described. These case studies involve interviews at the beginning and end of the semester. In the interviews, students reflect on what they have learned in the laboratory and describe their perceptions of the laboratory environment. In order to encourage students to adopt higher-level approaches to learning in the laboratory, a metacognitive intervention was created. The intervention involved supplementary questions that students would answer while completing laboratory experiments. The questions were designed to encourage students to think critically about the laboratory procedures. In order to test the effects of the intervention, an experimental group (N=87) completed these supplementary questions during two laboratory experiments while a control group (N=84) performed the same experiments without these additional questions. The effects of the intervention on laboratory exam performance were measured. Students in the experimental group had a higher average on the laboratory exam than students in the control group.
OGO-6 gas-surface energy transfer experiment
NASA Technical Reports Server (NTRS)
Mckeown, D.; Dummer, R. S.; Bowyer, J. M., Jr.; Corbin, W. E., Jr.
1973-01-01
The kinetic energy flux of the upper atmosphere was analyzed using OGO-6 data. Energy transfer between 10 microwatts/sq cm and 0.1 W/sq cm was measured by short-term frequency changes of temperature-sensitive quartz crystals used in the energy transfer probe. The condition of the surfaces was continuously monitored by a quartz crystal microbalance to determine the effect surface contamination had on energy accommodation. Results are given on the computer analysis and laboratory tests performed to optimize the operation of the energy transfer probe. Data are also given on the bombardment of OGO-6 surfaces by high energy particles. The thermoelectrically-cooled quartz crystal microbalance is described in terms of its development and applications.
Supercomputers for engineering analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Goudreau, G.L.; Benson, D.J.; Hallquist, J.O.
1986-07-01
The Cray-1 and Cray X-MP/48 experience in engineering computations at the Lawrence Livermore National Laboratory is surveyed. The fully vectorized explicit DYNA and implicit NIKE finite element codes are discussed with respect to solid and structural mechanics. The main efficiencies for production analyses are currently obtained by simple CFT compiler exploitation of pipeline architecture for inner do-loop optimization. Current development of outer-loop multitasking is also discussed. Applications emphasis will be on 3D examples spanning earth penetrator loads analysis, target lethality assessment, and crashworthiness. The use of a vectorized large deformation shell element in both DYNA and NIKE has substantially expanded 3D nonlinear capability. 25 refs., 7 figs.
Optimizing multi-dimensional high throughput screening using zebrafish
Truong, Lisa; Bugel, Sean M.; Chlebowski, Anna; Usenko, Crystal Y.; Simonich, Michael T.; Massey Simonich, Staci L.; Tanguay, Robert L.
2016-01-01
The use of zebrafish for high throughput screening (HTS) for chemical bioactivity assessments is becoming routine in the fields of drug discovery and toxicology. Here we report current recommendations from our experiences in zebrafish HTS. We compared the effects of different high throughput chemical delivery methods on nominal water concentration, chemical sorption to multi-well polystyrene plates, transcription responses, and resulting whole animal responses. We demonstrate that digital dispensing consistently yields higher data quality and reproducibility compared to standard plastic tip-based liquid handling. Additionally, we illustrate the challenges in using this sensitive model for chemical assessment when test chemicals have trace impurities. Adaptation of these better practices for zebrafish HTS should increase reproducibility across laboratories. PMID:27453428
Andonian, G.; Barber, S.; O’Shea, F. H.; ...
2017-02-03
We show that temporal pulse tailoring of charged-particle beams is essential to optimize efficiency in collinear wakefield acceleration schemes. In this Letter, we demonstrate a novel phase space manipulation method that employs a beam wakefield interaction in a dielectric structure, followed by bunch compression in a permanent magnet chicane, to longitudinally tailor the pulse shape of an electron beam. This compact, passive approach was used to generate a nearly linearly ramped current profile in a relativistic electron beam experiment carried out at the Brookhaven National Laboratory Accelerator Test Facility. Here, we report on these experimental results, including beam and wakefield diagnostics and pulse profile reconstruction techniques.
ERIC Educational Resources Information Center
Dunnett, K.; Bartlett, P. A.
2018-01-01
It was planned to introduce online pre-laboratory session activities to a first-year undergraduate physics laboratory course to encourage a minimum level of student preparation for experiments outside the laboratory environment. A group of 16- and 17-year-old laboratory work-experience students were tasked to define and design a pre-laboratory…
Chemical Remediation of Nickel(II) Waste: A Laboratory Experiment for General Chemistry Students
ERIC Educational Resources Information Center
Corcoran, K. Blake; Rood, Brian E.; Trogden, Bridget G.
2011-01-01
This project involved developing a method to remediate large quantities of aqueous waste from a general chemistry laboratory experiment. Aqueous Ni(II) waste from a general chemistry laboratory experiment was converted into solid nickel hydroxide hydrate with a substantial decrease in waste volume. The remediation method was developed for a…
A Laboratory Experiment on the Statistical Theory of Nuclear Reactions
ERIC Educational Resources Information Center
Loveland, Walter
1971-01-01
Describes an undergraduate laboratory experiment on the statistical theory of nuclear reactions. The experiment involves measuring the relative cross sections for formation of a nucleus in its metastable excited state and its ground state by applying gamma-ray spectroscopy to an irradiated sample. Involves 3-4 hours of laboratory time plus…
ERIC Educational Resources Information Center
Spell, Rachelle M.; Guinan, Judith A.; Miller, Kristen R.; Beck, Christopher W.
2014-01-01
Incorporating authentic research experiences in introductory biology laboratory classes would greatly expand the number of students exposed to the excitement of discovery and the rigor of the scientific process. However, the essential components of an authentic research experience and the barriers to their implementation in laboratory classes are…
ERIC Educational Resources Information Center
Whitaker, Ragnhild D.; Truhlar, Laura M.; Yksel, Deniz; Walt, David R.; Williams, Mark D.
2010-01-01
The development and implementation of a research-based organic chemistry laboratory experiment is presented. The experiment was designed to simulate a scientific research environment, involve students in critical thinking, and develop the student's ability to analyze and present research-based data. In this experiment, a laboratory class…
A Laboratory Experiment for Rapid Determination of the Stability of Vitamin C
ERIC Educational Resources Information Center
Adem, Seid M.; Lueng, Sam H.; Elles, Lisa M. Sharpe; Shaver, Lee Alan
2016-01-01
Experiments in laboratory manuals intended for general, organic, and biological (GOB) chemistry laboratories include few opportunities for students to engage in instrumental methods of analysis. Many of these students seek careers in modern health-related fields where experience in spectroscopic techniques would be beneficial. A simple, rapid,…
NASA Astrophysics Data System (ADS)
Barrie, Simon C.; Bucat, Robert B.; Buntine, Mark A.; Burke da Silva, Karen; Crisp, Geoffrey T.; George, Adrian V.; Jamie, Ian M.; Kable, Scott H.; Lim, Kieran F.; Pyke, Simon M.; Read, Justin R.; Sharma, Manjula D.; Yeung, Alexandra
2015-07-01
Student experience surveys have become increasingly popular to probe various aspects of processes and outcomes in higher education, such as measuring student perceptions of the learning environment and identifying aspects that could be improved. This paper reports on a particular survey for evaluating individual experiments that has been developed over some 15 years as part of a large national Australian study pertaining to undergraduate laboratories, Advancing Science by Enhancing Learning in the Laboratory. This paper reports on the development of the survey instrument and the evaluation of the survey using student responses to experiments from different institutions in Australia, New Zealand and the USA. A total of 3153 student responses have been analysed using factor analysis. Three factors, motivation, assessment and resources, have been identified as contributing to improved student attitudes to laboratory activities. A central focus of the survey is to provide feedback to practitioners to iteratively improve experiments. Implications for practitioners and researchers are also discussed.
Testing Optimal Foraging Theory Using Bird Predation on Goldenrod Galls
ERIC Educational Resources Information Center
Yahnke, Christopher J.
2006-01-01
All animals must make choices regarding what foods to eat, where to eat, and how much time to spend feeding. Optimal foraging theory explains these behaviors in terms of costs and benefits. This laboratory exercise focuses on optimal foraging theory by investigating the winter feeding behavior of birds on the goldenrod gall fly by comparing…
ERIC Educational Resources Information Center
Schmidt-McCormack, Jennifer A.; Muniz, Marc N.; Keuter, Ellie C.; Shaw, Scott K.; Cole, Renée S.
2017-01-01
Well-designed laboratories can help students master content and science practices by successfully completing the laboratory experiments. Upper-division chemistry laboratory courses often present special challenges for instruction due to the instrument intensive nature of the experiments. To address these challenges, particularly those associated…
A 13-Week Research-Based Biochemistry Laboratory Curriculum
ERIC Educational Resources Information Center
Lefurgy, Scott T.; Mundorff, Emily C.
2017-01-01
Here, we present a 13-week research-based biochemistry laboratory curriculum designed to provide the students with the experience of engaging in original research while introducing foundational biochemistry laboratory techniques. The laboratory experience has been developed around the directed evolution of an enzyme chosen by the instructor, with…
Inducing Mutations in "Paramecium": An Inquiry-Based Approach
ERIC Educational Resources Information Center
Elwess, Nancy L.; Latourelle, Sandra L.
2004-01-01
A major challenge in teaching any college level general genetics course including a laboratory component is having the students actively understand the research part of an experiment as well as develop the necessary laboratory skills. This laboratory experience furthers the students' knowledge of genetics while improving their laboratory skills.…
A Spectrophotometric Assay Optimizing Conditions for Pepsin Activity.
ERIC Educational Resources Information Center
Harding, Ethelynda E.; Kimsey, R. Scott
1998-01-01
Describes a laboratory protocol optimizing the conditions for the assay of pepsin activity using the Coomassie Blue dye-binding assay of protein concentration. The dye binds through strong, noncovalent interactions to basic and aromatic amino acid residues. (DDR)
Optimization of Asphalt Mixture Design for the Louisiana ALF Test Sections
DOT National Transportation Integrated Search
2018-05-01
This research presents an extensive study on the design and characterization of asphalt mixtures used in road pavements. Both mixture volumetrics and physical properties obtained from several laboratory tests were considered in optimizing the mixture...
The optimization of the inverted occulter of the solar orbiter/METIS coronagraph/spectrometer
NASA Astrophysics Data System (ADS)
Landini, F.; Vives, S.; Romoli, M.; Guillon, C.; Pancrazzi, M.; Escolle, C.; Focardi, M.; Fineschi, S.; Antonucci, E.; Nicolini, G.; Naletto, G.; Nicolosi, P.; Spadaro, D.
2017-11-01
The coronagraph/spectrometer METIS (Multi Element Telescope for Imaging and Spectroscopy), selected to fly aboard the Solar Orbiter ESA/NASA mission, is conceived to perform imaging (in visible, UV and EUV) and spectroscopy (in EUV) of the solar corona. It is an integrated instrument suite located on a single optical bench and sharing a unique aperture on the satellite heat shield. Like every coronagraph, METIS is highly demanding in terms of stray light suppression. In order to meet the strict thermal requirements of Solar Orbiter, the METIS optical design has been optimized by moving the entrance pupil to the level of the external occulter on the spacecraft thermal shield, thus reducing the size of the external aperture. The scheme is based on an inverted external occulter (IEO). The IEO consists of a circular aperture on the Solar Orbiter thermal shield. A spherical mirror rejects the disk light back through the IEO. The experience built on all the previous space coronagraphs forces designers to dedicate particular attention to occulter optimization. Two breadboards were manufactured to perform occulter optimization measurements: BOA (Breadboard of the Occulting Assembly) and ANACONDA (AN Alternative COnfiguration for the Occulting Native Design Assembly). A preliminary measurement campaign has been carried out at the Laboratoire d'Astrophysique de Marseille. In this paper we describe the BOA and ANACONDA designs, the laboratory set-up, and the preliminary results.
NASA Astrophysics Data System (ADS)
Schill, Janna Marie
Professional socialization is a process that individuals experience as members of a profession and consists of the knowledge, attitudes, and experiences that influence and shape their professional identity. The process of professional socialization has not been studied in the clinical laboratory science profession. Clinical laboratory science is an allied health profession that faces a workforce shortage caused by a decrease in new graduates, decreased retention of qualified professionals, and increased retirements. Other allied health professions such as nursing, athletic training, and pharmacy have studied professional socialization as a way to identify factors that may influence the retention of early career professionals. This mixed-methods study, which quantitatively used Hall's Professionalism Scale (1968) in addition to qualitative focus group interviews, sought to identify the professional attitudes and behaviors, sense of belonging, and professional socialization of early career clinical laboratory scientists. Early career clinical laboratory scientists were divided into two groups based upon the amount of work experience they had: new clinical laboratory science graduates had less than one year of work experience, and novice clinical laboratory scientists had between one and three years of work experience. This study found that early career clinical laboratory scientists have established professional identities and view themselves as members of the clinical laboratory science field within four proposed stages of professional socialization consisting of pre-arrival, encounter, adaptation, and commitment. New CLS graduates and novice clinical laboratory scientists were found to be at different stages of the professional socialization process. New CLS graduates, who had less than one year of work experience, were found to be in the encounter stage. Novice clinical laboratory scientists, with one to three years of work experience, were found to be in the adaptation stage. In order for early career clinical laboratory scientists to successfully transition from student to committed professional, increased support from more experienced colleagues needs to be provided for this group of laboratory professionals. This study provided an initial examination of the professional socialization process in the CLS profession and adds to existing professional socialization studies in allied health.
Optimizing Mass Spectrometry Analyses: A Tailored Review on the Utility of Design of Experiments
Hecht, Elizabeth S.; Oberg, Ann L.; Muddiman, David
2016-01-01
Mass spectrometry (MS) has emerged as a tool that can analyze nearly all classes of molecules, with its scope rapidly expanding in the areas of post-translational modifications, MS instrumentation, and many others. Yet integration of novel analyte preparatory and purification methods with existing or novel mass spectrometers can introduce new challenges for MS sensitivity. The mechanisms that govern detection by MS are particularly complex and interdependent, including ionization efficiency, ion suppression, and transmission. Performance of both off-line and MS methods can be optimized separately or, when appropriate, simultaneously through statistical designs, broadly referred to as “design of experiments” (DOE). The following review provides a tutorial-like guide into the selection of DOE for MS experiments, the practices for modeling and optimization of response variables, and the available software tools that support DOE implementation in any laboratory. This review comes three years after the latest DOE review (Hibbert DB 2012), which provided a comprehensive overview on the types of designs available and their statistical construction. Since that time, new classes of DOE, such as the definitive screening design, have emerged and new calls have been made for mass spectrometrists to adopt the practice. Rather than exhaustively cover all possible designs, we have highlighted the three most practical DOE classes available to mass spectrometrists. This review further differentiates itself by providing expert recommendations for experimental setup and defining DOE entirely in the context of three case studies that highlight the utility of different designs to achieve different goals. A step-by-step tutorial is also provided. PMID:26951559
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yamada, Y.; Kawase, Y.
2006-07-01
In order to examine the optimal design and operating parameters, the kinetics of the microbiological reaction and oxygen consumption in composting of waste activated sludge were quantitatively examined. A series of experiments was conducted to determine the optimal operating parameters for aerobic composting of waste activated sludge obtained from the Kawagoe City Wastewater Treatment Plant (Saitama, Japan) using 4 and 20 L laboratory-scale bioreactors. Aeration rate, composition of the compost mixture and height of the compost pile were investigated as the main design and operating parameters. The optimal aerobic composting of waste activated sludge was found at an aeration rate of 2.0 L/min/kg (initial composting mixture dry weight). A compost pile up to 0.5 m high could be operated effectively. A simple model for composting of waste activated sludge in a composting reactor was developed by assuming that the solid phase of the compost mixture is well mixed and that the kinetics of the microbiological reaction are represented by a Monod-type equation. The model predictions could fit the experimental data for decomposition of waste activated sludge with an average deviation of 2.14%. Oxygen consumption during composting was also examined using a simplified model in which the oxygen consumption was represented by a Monod-type equation and the axial distribution of oxygen concentration in the composting pile was described by a plug-flow model. The predictions could satisfactorily simulate the experimental results for the average maximum oxygen consumption rate during aerobic composting with an average deviation of 7.4%.
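To make the modeling approach concrete, the sketch below integrates a simplified Monod-type substrate-biomass system for a well-mixed compost phase in Python. All parameter values, variable names, and initial conditions are illustrative assumptions for the sketch, not values reported in the study.

```python
# Minimal sketch of a Monod-type decomposition model for a well-mixed compost phase.
# All parameter values are illustrative assumptions, not values from the study.
import numpy as np
from scipy.integrate import solve_ivp

mu_max = 0.05   # 1/h, assumed maximum specific growth rate
Ks     = 15.0   # g/kg, assumed half-saturation constant
Y      = 0.4    # g biomass / g substrate, assumed yield
kd     = 0.005  # 1/h, assumed decay coefficient

def monod_composting(t, y):
    S, X = y                      # degradable substrate and active biomass (g/kg dry mix)
    mu = mu_max * S / (Ks + S)    # Monod-type specific growth rate
    dS = -(mu / Y) * X            # substrate consumed for growth
    dX = (mu - kd) * X            # net biomass change
    return [dS, dX]

sol = solve_ivp(monod_composting, (0.0, 500.0), [120.0, 2.0], dense_output=True)
t = np.linspace(0.0, 500.0, 6)
print(np.round(sol.sol(t), 2))    # substrate and biomass trajectories over time
```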
NASA Technical Reports Server (NTRS)
Vogl, J. L.
1973-01-01
Current work aimed at identifying the active magnetospheric experiments that can be performed from the Space Shuttle, and at designing a laboratory to carry out these experiments, is described. The laboratory, known as the PPEPL (Plasma Physics and Environmental Perturbation Laboratory), consists of a 35-ft pallet of instruments connected to a 25-ft pressurized control module. The systems deployed from the pallet are two 50-m booms, two subsatellites, a high-power transmitter, a multipurpose accelerator, a set of deployable canisters, and a gimbaled instrument platform. Missions are planned to last seven days, during which two scientists will carry out experiments from within the pressurized module. The types of experiments to be performed are outlined.
Study of Fluid Experiment System (FES)/CAST/Holographic Ground System (HGS)
NASA Technical Reports Server (NTRS)
Workman, Gary L.; Cummings, Rick; Jones, Brian
1992-01-01
Holographic and schlieren optical techniques for studying concentration gradients in solidification processes have been used by several investigators over the years. The HGS facility at MSFC has been a primary resource in researching this capability. Consequently, scientific personnel have been able to utilize these techniques in both ground-based research and in space experiments. An important event in the scientific utilization of the HGS facilities was the TGS crystal growth and casting and solidification technology (CAST) experiments that were flown on the International Microgravity Laboratory (IML) mission in March of this year. The preparation and processing of these space observations are the primary experiments reported in this work. This project provides ground-based studies to optimize the holographic techniques used to acquire information about the crystal growth processes flown on IML. Since the ground-based studies will be compared with the space-based experimental results, it is necessary to conduct sufficient ground-based studies to best determine how the experiment worked in space. The current capabilities in computer-based systems for image processing and numerical computation have certainly assisted in those efforts. As anticipated, this study has shown that these advanced computing capabilities are helpful in the data analysis of such experiments.
Growth of InSb and InI Crystals on Earth and in Microgravity
NASA Technical Reports Server (NTRS)
Ostrogorsky, A. G.; Churilov, A.; Volz, M. P.; Riabov, V.; Van den Berg, L.
2015-01-01
During the past 40 years, dozens of semiconductor crystal growth experiments have been conducted in space laboratories. The subsequent analysis of the space-grown crystals revealed (i) that weak convection existed in virtually all melt-growth experiments, (ii) that de-wetting significantly reduced the level of stress-induced defects, and (iii) that particularly encouraging results were obtained in vapor-growth experiments. In 2002, following a decade of ground-based research in growing doped Ge and GaSb crystals, a series of crystal growth experiments was performed at the ISS, within the SUBSA (Solidification Using a Baffle in Sealed Ampoules) investigation. Te- and Zn-doped InSb crystals were grown from the melt. The specially designed furnace provided a side view of the melt and precise seeding measurement of the growth rate. At present, under sponsorship of CASIS (Center for the Advancement of Science in Space, www.iss-casis.org), we are conducting ground-based experiments with indium mono-iodide (InI) in preparation for the "SUBSA II" ISS investigation, planned for 2017. The experiments include: i) Horizontal Bridgman (HB) growth and ii) Vapor Transport (VT) growth. Finite element modeling will also be conducted to optimize the design of the flight ampoules for vapor and melt growth.
Integrated design, execution, and analysis of arrayed and pooled CRISPR genome-editing experiments.
Canver, Matthew C; Haeussler, Maximilian; Bauer, Daniel E; Orkin, Stuart H; Sanjana, Neville E; Shalem, Ophir; Yuan, Guo-Cheng; Zhang, Feng; Concordet, Jean-Paul; Pinello, Luca
2018-05-01
CRISPR (clustered regularly interspaced short palindromic repeats) genome-editing experiments offer enormous potential for the evaluation of genomic loci using arrayed single guide RNAs (sgRNAs) or pooled sgRNA libraries. Numerous computational tools are available to help design sgRNAs with optimal on-target efficiency and minimal off-target potential. In addition, computational tools have been developed to analyze deep-sequencing data resulting from genome-editing experiments. However, these tools are typically developed in isolation and oftentimes are not readily translatable into laboratory-based experiments. Here, we present a protocol that describes in detail both the computational and benchtop implementation of an arrayed and/or pooled CRISPR genome-editing experiment. This protocol provides instructions for sgRNA design with CRISPOR (computational tool for the design, evaluation, and cloning of sgRNA sequences), experimental implementation, and analysis of the resulting high-throughput sequencing data with CRISPResso (computational tool for analysis of genome-editing outcomes from deep-sequencing data). This protocol allows for design and execution of arrayed and pooled CRISPR experiments in 4-5 weeks by non-experts, as well as computational data analysis that can be performed in 1-2 d by both computational and noncomputational biologists alike using web-based and/or command-line versions.
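The protocol itself relies on CRISPOR for sgRNA design and CRISPResso for deep-sequencing analysis; as a hedged, stand-alone illustration of the very first design step (enumerating 20-nt SpCas9 protospacers adjacent to an NGG PAM), the Python sketch below scans both strands of a target sequence. It does not reproduce CRISPOR's on/off-target scoring, and the target sequence is hypothetical.

```python
# Hedged sketch: enumerate candidate SpCas9 protospacers (20 nt followed by an NGG PAM)
# on both strands of a target sequence. This illustrates only the enumeration step;
# on/off-target scoring as performed by CRISPOR is not reproduced here.
import re

def revcomp(seq: str) -> str:
    return seq.translate(str.maketrans("ACGT", "TGCA"))[::-1]

def find_protospacers(seq: str):
    """Yield (strand, start, protospacer, pam) for every 20-mer followed by NGG."""
    seq = seq.upper()
    for strand, s in (("+", seq), ("-", revcomp(seq))):
        for m in re.finditer(r"(?=([ACGT]{20})([ACGT]GG))", s):
            yield strand, m.start(), m.group(1), m.group(2)

# Hypothetical target sequence for illustration only.
target = "ATGCTGACCGGTTCAGGCTAAGGCTTACGGATCCGGAAGGTTCCAGGAGGTAACGTTGCA"
for strand, pos, spacer, pam in find_protospacers(target):
    print(strand, pos, spacer, pam)
```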
NASA Astrophysics Data System (ADS)
Silverstone, S.; Nelson, M.; Alling, A.; Allen, J. P.
During the years 2002 and 2003, three closed system experiments were carried out in the "Laboratory Biosphere" facility located in Santa Fe, New Mexico. The program involved experimentation with "Hoyt" soybeans (experiment #1), USU Apogee wheat (experiment #2) and TU-82-155 sweet potato (experiment #3) using a 5.37 m² soil planting bed which was 30 cm deep. The soil texture, 40% clay, 31% sand and 28% silt (a clay loam), was collected from an organic farm in New Mexico to avoid chemical residues. Soil management practices involved minimal tillage, mulching, returning crop residues to the soil after each experiment and increasing soil biota by introducing worms, soil bacteria and mycorrhizal fungi. The high pH of the original soil appeared to be a factor affecting the first two experiments. Hence, between experiments #2 and #3, the top 15 cm of the soil was amended using a mix of peat moss, green sand, humates and pumice to improve soil texture, lower soil pH and increase nutrient availability. This lowered the initial pH from 8.0 to 6.7 at the start of experiment #3. At the end of the experiment, the pH was 7.6. Soil nitrogen and phosphorus were adequate, but some chlorosis was evident in the first two experiments. Aphid infestation was the only crop pest problem during the three experiments and was handled by introducing Hippodamia convergens. Experimentation showed there were environmental differences even in this 1200 cubic foot ecological system facility, such as temperature and humidity gradients caused by ventilation and airflow patterns, which resulted in variations in plant growth and yield. Additional humidifiers were added to counteract low humidity and helped optimize conditions for the sweet potato experiment. The experience and information gained from these experiments are being applied to the future design of the Mars On Earth® facility (Silverstone et al., Development and research program for a soil-based bioregenerative agriculture system to feed a four person crew at a Mars base, Advances in Space Research 31(1) (2003) 69-75; Allen and Alling, The design approach for Mars On Earth®, a biospheric closed system testing facility for long-term space habitation, American Institute of Aeronautics and Astronautics Inc., IAC-02-IAA.8.2.02, 2002).
Wain, Karen E; Riggs, Erin; Hanson, Karen; Savage, Melissa; Riethmaier, Darlene; Muirhead, Andrea; Mitchell, Elyse; Packard, Bethanny Smith; Faucett, W Andrew
2012-10-01
The International Standards for Cytogenomic Arrays (ISCA) Consortium is a worldwide collaborative effort dedicated to optimizing patient care by improving the quality of chromosomal microarray testing. The primary effort of the ISCA Consortium has been the development of a database of copy number variants (CNVs) identified during the course of clinical microarray testing. This database is a powerful resource for clinicians, laboratories, and researchers, and can be utilized for a variety of applications, such as facilitating standardized interpretations of certain CNVs across laboratories or providing phenotypic information for counseling purposes when published data is sparse. A recognized limitation to the clinical utility of this database, however, is the quality of clinical information available for each patient. Clinical genetic counselors are uniquely suited to facilitate the communication of this information to the laboratory by virtue of their existing clinical responsibilities, case management skills, and appreciation of the evolving nature of scientific knowledge. We intend to highlight the critical role that genetic counselors play in ensuring optimal patient care through contributing to the clinical utility of the ISCA Consortium's database, as well as the quality of individual patient microarray reports provided by contributing laboratories. Current tools, paper and electronic forms, created to maximize this collaboration are shared. In addition to making a professional commitment to providing complete clinical information, genetic counselors are invited to become ISCA members and to become involved in the discussions and initiatives within the Consortium.
ERIC Educational Resources Information Center
Goldman, Corey A., Ed.
The focus of the Association for Biology Laboratory Education (ABLE) is to improve the undergraduate biology laboratory experience by promoting the development and dissemination of interesting, innovative, and reliable laboratory exercises. This proceedings volume includes 13 papers: "Non-Radioactive DNA Hybridization Experiments for the…
Sun, Jennifer; Rudstam, Lars S.; Boscarino, Brent T.; Walsh, Maureen G.; Lantry, Brian F.
2013-01-01
Hemimysis anomala is a warm-water mysid that invaded the Great Lakes region in 2006 and has since rapidly spread throughout the basin. We conducted three laboratory experiments to better define the temperature preference, tolerance limits, and temperature effects on feeding rates of juvenile Hemimysis, using individuals acclimated to the mid (16 °C) and upper (22 °C) preferred temperature values previously reported for the species. For temperature preference, we fit a two-parameter Gaussian (μ, σ) function to the experimental data, and found that the peak values (μ, interpreted as the preferred temperature) were 22.0 °C (SE 0.25) for mysids acclimated to 16 °C and 21.9 °C (SE 0.38) for mysids acclimated to 22 °C, with σ-values of 2.6 and 2.5 °C, respectively. No mysids were observed at temperatures below 10 or above 28 °C in these preference experiments. In short-term tolerance experiments for temperatures between 4 and 32 °C, all mysids died within 8 h at 30.2 °C (16 °C acclimated) and at 31.8 °C (22 °C acclimated). No lower lethal limit was found. Feeding rates increased with temperature from an average of 4 Bosmina eaten per hour at 5 °C to 19 Bosmina eaten per hour at 27 °C. The results of our experiments indicate an optimal temperature for Hemimysis between 21 and 27 °C, which corresponds with temperatures during periods of high population growth in the field. These results contribute to a better understanding of this species' biological response to temperature that will help guide field studies and inform bioenergetics modeling.
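A two-parameter Gaussian fit of the kind described above can be sketched with scipy's curve_fit; the temperature bins and proportions below are synthetic placeholders rather than the study's measurements.

```python
# Hedged sketch: fit a two-parameter Gaussian (mu, sigma) to temperature-preference data.
# The "observed" proportions below are synthetic placeholders, not the experimental data.
import numpy as np
from scipy.optimize import curve_fit

def gaussian_pdf(temp, mu, sigma):
    return np.exp(-0.5 * ((temp - mu) / sigma) ** 2) / (sigma * np.sqrt(2.0 * np.pi))

temps = np.arange(10.0, 29.0, 2.0)                       # temperature bins (deg C)
observed = gaussian_pdf(temps, 22.0, 2.6)                # idealized proportions...
observed += np.random.default_rng(0).normal(0, 0.005, temps.size)  # ...with noise

(mu_hat, sigma_hat), cov = curve_fit(gaussian_pdf, temps, observed, p0=[20.0, 3.0])
se = np.sqrt(np.diag(cov))
print(f"preferred temperature = {mu_hat:.1f} C (SE {se[0]:.2f}), sigma = {sigma_hat:.1f} C")
```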
Process Optimization Assessment: Fort Leonard Wood, MO and Fort Carson, CO
2003-11-01
US Army Corps of Engineers, Engineer Research and Development Center. Process Optimization Assessment: Fort Leonard Wood, MO and Fort Carson, CO. Mike C.J. Lin and John Vavrin, Construction Engineering Research Laboratory, PO Box 9005... This work performed a Process Optimization Assessment (POA) on behalf of Fort Leonard Wood, MO and Fort Carson, CO to identify process, energy, and…
ERIC Educational Resources Information Center
Abdel-Salam, Tarek; Kauffman, Paul J.; Crossman, Gary
2006-01-01
Educators question whether performing a laboratory experiment as an observer (non-hands-on), such as conducted in a distance education context, can be as effective a learning tool as personally performing the experiment in a laboratory environment. The present paper investigates this issue by comparing the performance of distance education…
ERIC Educational Resources Information Center
Cacciatore, Kristen L.; Sevian, Hannah
2009-01-01
Many institutions are responding to current research about how students learn science by transforming their general chemistry laboratory curricula to be inquiry-oriented. We present a comparison study of student performance after completing either a traditional or an inquiry stoichiometry experiment. This single laboratory experience was the only…
NASA Astrophysics Data System (ADS)
Huerta, N. J.; Fahrman, B.; Rod, K. A.; Fernandez, C. A.; Crandall, D.; Moore, J.
2017-12-01
Laboratory experiments provide a robust method to analyze well integrity. Experiments are relatively cheap, controlled, and repeatable. However, simplifying assumptions, apparatus limitations, and scaling are ubiquitous obstacles for translating results from the bench to the field. We focus on advancing the correlation between laboratory results and field conditions by characterizing how failure varies with specimen geometry using two experimental approaches. The first approach is designed to measure the shear bond strength between steel and cement in a down-scaled (< 3" diameter) well geometry. We use several cylindrical casing-cement-casing geometries that either mimic the scaling ratios found in the field or maximize the amount of metal and cement in the sample. We subject the samples to thermal shock cycles to simulate damage to the interfaces from operations. The bond was then measured via a push-out test. We found that not only expected parameters (e.g., curing time) but also the scaling of the geometry played a role in shear-bond strength. The second approach is designed to observe failure of the well system due to pressure applied on the inside of a lab-scale (1.5" diameter) cylindrical casing-cement-rock geometry. The loading apparatus and sample are housed within an industrial X-ray CT scanner capable of imaging the system while under pressure. Radial tension cracks were observed in the cement after an applied internal pressure of 3000 psi and propagated through the cement and into the rock as pressure was increased. Based on our current suite of tests, we find that the relationship between sample diameters and thicknesses is an important consideration when observing the strength and failure of well systems. The test results contribute to our knowledge of well system failure, to the evaluation and optimization of new cements, and to the applicability of using scaled-down tests as a proxy for understanding field-scale conditions.
NASA Astrophysics Data System (ADS)
Dunnett, K.; Bartlett, P. A.
2018-01-01
It was planned to introduce online pre-laboratory session activities to a first-year undergraduate physics laboratory course to encourage a minimum level of student preparation for experiments outside the laboratory environment. A group of 16- and 17-year-old laboratory work-experience students was tasked with defining and designing a pre-laboratory activity based on experiments that they had been undertaking. This informed the structure, content and aims of the activities introduced to a first-year physics undergraduate laboratory course, with a particular focus on practising data handling. An implementation study showed how students could try to optimise for high grades, rather than gain efficiency-enhancing experience, if careful controls were not put in place by assessors. However, the work demonstrated that pre-university and first-year physics students can take an active role in developing scaffolding activities that can help to improve the performance of those who follow in their footsteps.
ERIC Educational Resources Information Center
Aydogdu, Cemil
2017-01-01
Chemistry lessons should be supported with experiments so that the content is understood effectively. To maintain a safe laboratory environment and to prevent laboratory accidents, the properties of chemical substances and the working principles for their use should be learned. The aim of the present study was to analyze the effect of experiments which depend on…
Lab experiments are a major source of knowledge in the social sciences.
Falk, Armin; Heckman, James J
2009-10-23
Laboratory experiments are a widely used methodology for advancing causal knowledge in the physical and life sciences. With the exception of psychology, the adoption of laboratory experiments has been much slower in the social sciences, although during the past two decades the use of lab experiments has accelerated. Nonetheless, there remains considerable resistance among social scientists who argue that lab experiments lack "realism" and generalizability. In this article, we discuss the advantages and limitations of laboratory social science experiments by comparing them to research based on nonexperimental data and to field experiments. We argue that many recent objections against lab experiments are misguided and that even more lab experiments should be conducted.
Description of the Spacecraft Control Laboratory Experiment (SCOLE) facility
NASA Technical Reports Server (NTRS)
Williams, Jeffrey P.; Rallo, Rosemary A.
1987-01-01
A laboratory facility for the study of control laws for large flexible spacecraft is described. The facility fulfills the requirements of the Spacecraft Control Laboratory Experiment (SCOLE) design challenge for a laboratory experiment, which will allow slew maneuvers and pointing operations. The structural apparatus is described in detail sufficient for modelling purposes. The sensor and actuator types and characteristics are described so that identification and control algorithms may be designed. The control implementation computer and real-time subroutines are also described.
Biotechnology Laboratory Methods.
ERIC Educational Resources Information Center
Davis, Robert H.; Kompala, Dhinakar S.
1989-01-01
Describes a course entitled "Biotechnology Laboratory" which introduces a variety of laboratory methods associated with biotechnology. Describes the history, content, and seven experiments of the course. The seven experiments are selected from microbiology and molecular biology, kinetics and fermentation, and downstream…
HOMER Economic Models - US Navy
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bush, Jason William; Myers, Kurt Steven
This letter report has been prepared by Idaho National Laboratory for US Navy NAVFAC EXWC to support testing of pre-commercial SIREN (Simulated Integration of Renewable Energy Networks) computer software models. In the logistics mode, SIREN software simulates the combination of renewable power sources (solar arrays, wind turbines, and energy storage systems) in supplying an electrical demand. NAVFAC EXWC will create SIREN software logistics models of existing or planned renewable energy projects at five Navy locations (San Nicolas Island, AUTEC, New London, & China Lake), and INL will deliver additional HOMER computer models for comparative analysis. In the transient mode, SIREN simulates the short time-scale variation of electrical parameters when a power outage or other destabilizing event occurs. In the HOMER model, a variety of inputs are entered, such as location coordinates, generators, PV arrays, wind turbines, batteries, converters, grid costs/usage, solar resources, wind resources, temperatures, fuels, and electric loads. HOMER's optimization and sensitivity analysis algorithms then evaluate the economic and technical feasibility of these technology options and account for variations in technology costs, electric load, and energy resource availability. The Navy can then use HOMER's optimization and sensitivity results to compare to those of the SIREN model. The U.S. Department of Energy (DOE) Idaho National Laboratory (INL) possesses unique expertise and experience in the software, hardware, and systems design for the integration of renewable energy into the electrical grid. NAVFAC EXWC will draw upon this expertise to complete mission requirements.
Parker, Jacqui A; Barroso, Filipa; Stanworth, Simon J; Spiby, Helen; Hopewell, Sally; Doree, Carolyn J; Renfrew, Mary J; Allard, Shubha
2012-06-24
Anaemia, in particular due to iron deficiency, is common in pregnancy with associated negative outcomes for mother and infant. However, there is evidence of significant variation in management. The objectives of this review of systematic reviews were to analyse and summarise the evidence base, identify gaps in the evidence and develop a research agenda for this important component of maternity care. Multiple databases were searched, including MEDLINE, EMBASE and The Cochrane Library. All systematic reviews relating to interventions to prevent and treat anaemia in the antenatal and postnatal period were eligible. Two reviewers independently assessed data inclusion, extraction and quality of methodology. 27 reviews were included, all reporting on the prevention and treatment of anaemia in the antenatal (n = 24) and postnatal periods (n = 3). Using AMSTAR as the assessment tool for methodological quality, only 12 of the 27 were rated as high quality reviews. The greatest number of reviews covered antenatal nutritional supplementation for the prevention of anaemia (n = 19). Iron supplementation was the most extensively researched, but with ongoing uncertainty about optimal dose and regimen. Few identified reviews addressed anaemia management post-partum or correlations between laboratory and clinical outcomes, and no reviews reported on clinical symptoms of anaemia. The review highlights evidence gaps including the management of anaemia in the postnatal period, screening for anaemia, and optimal interventions for treatment. Research priorities include developing standardised approaches to reporting of laboratory outcomes, and information on clinical outcomes relevant to the experiences of pregnant women.
Assay optimization: a statistical design of experiments approach.
Altekar, Maneesha; Homon, Carol A; Kashem, Mohammed A; Mason, Steven W; Nelson, Richard M; Patnaude, Lori A; Yingling, Jeffrey; Taylor, Paul B
2007-03-01
With the transition from manual to robotic HTS in the last several years, assay optimization has become a significant bottleneck. Recent advances in robotic liquid handling have made it feasible to reduce assay optimization timelines with the application of statistically designed experiments. When implemented, they can efficiently optimize assays by rapidly identifying significant factors, complex interactions, and nonlinear responses. This article focuses on the use of statistically designed experiments in assay optimization.
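As a minimal, hedged illustration of a statistically designed experiment applied to assay optimization, the sketch below constructs a two-level full factorial design for three hypothetical assay factors and estimates their main effects by least squares; the factor names and the synthetic response are assumptions made for the example.

```python
# Hedged sketch: a 2^3 full factorial design for three assay factors and a
# main-effects fit by least squares. Factor names and the synthetic response
# are illustrative assumptions, not data from any real assay.
import itertools
import numpy as np

factors = ["enzyme_conc", "incubation_time", "DMSO_pct"]       # hypothetical factors
design = np.array(list(itertools.product([-1, 1], repeat=3)))  # coded low/high levels

rng = np.random.default_rng(1)
# Synthetic response: strong effect of factor 0, weaker effects of factors 1 and 2.
response = (100 + 20 * design[:, 0] + 5 * design[:, 1] - 2 * design[:, 2]
            + rng.normal(0, 1.0, len(design)))

X = np.column_stack([np.ones(len(design)), design])            # intercept + main effects
coef, *_ = np.linalg.lstsq(X, response, rcond=None)

for name, c in zip(["intercept"] + factors, coef):
    print(f"{name:16s} estimated effect per coded unit: {c:6.2f}")
```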
Carlyle, Harriet F; Tellam, John H; Parker, Karen E
2004-01-01
An attempt has been made to estimate quantitatively cation concentration changes as estuary water invades a Triassic Sandstone aquifer in northwest England. Cation exchange capacities and selectivity coefficients for Na(+), K(+), Ca(2+), and Mg(2+) were measured in the laboratory using standard techniques. Selectivity coefficients were also determined using a method involving optimized back-calculation from flushing experiments, thus permitting better representation of field conditions; in all cases, the Gaines-Thomas/constant cation exchange capacity (CEC) model was found to be a reasonable, though not perfect, first description. The exchange parameters interpreted from the laboratory experiments were used in a one-dimensional reactive transport mixing cell model, and predictions compared with field pumping well data (Cl and hardness spanning a period of around 40 years, and full major ion analyses in approximately 1980). The concentration patterns predicted using Gaines-Thomas exchange with calcite equilibrium were similar to the observed patterns, but the concentrations of the divalent ions were significantly overestimated, as were 1980 sulphate concentrations, and 1980 alkalinity concentrations were underestimated. Including representation of sulphate reduction in the estuarine alluvium failed to replicate 1980 HCO(3) and pH values. However, by including partial CO(2) degassing following sulphate reduction, a process for which there is 34S and 18O evidence from a previous study, a good match for SO(4), HCO(3), and pH was attained. Using this modified estuary water and averaged values from the laboratory ion exchange parameter determinations, good predictions for the field cation data were obtained. It is concluded that the Gaines-Thomas/constant exchange capacity model with averaged parameter values can be used successfully in ion exchange predictions in this aquifer at a regional scale and over extended time scales, despite the numerous assumptions inherent in the approach; this has also been found to be the case in the few other published studies of regional ion exchanging flow.
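For readers unfamiliar with the Gaines-Thomas convention referred to above, the short sketch below computes a binary Na+/Ca2+ selectivity coefficient from equivalent fractions on the exchanger and solution concentrations. The numbers are illustrative only, and solution activities are approximated by molar concentrations, which is cruder than the treatment in the study.

```python
# Hedged sketch: Gaines-Thomas selectivity coefficient for Na+/Ca2+ exchange,
#   Ca2+ + 2 NaX  <=>  CaX2 + 2 Na+ ,   K_GT = (E_Ca * [Na+]^2) / (E_Na^2 * [Ca2+])
# Equivalent fractions and concentrations below are illustrative values only, and
# solution activities are approximated by molar concentrations.

def gaines_thomas_na_ca(E_na, E_ca, na_mol_per_l, ca_mol_per_l):
    """Selectivity coefficient from exchanger equivalent fractions and solution concs."""
    return (E_ca * na_mol_per_l ** 2) / (E_na ** 2 * ca_mol_per_l)

# Example: exchanger 30% Na / 70% Ca (by equivalents), estuarine-type water.
print(gaines_thomas_na_ca(E_na=0.30, E_ca=0.70, na_mol_per_l=0.2, ca_mol_per_l=0.005))
```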
SCOUT: a small vacuum chamber for nano-wire grid polarizer tests in the ultraviolet band
NASA Astrophysics Data System (ADS)
Landini, F.; Pancrazzi, M.; Totaro, M.; Pennelli, G.; Romoli, M.
2012-01-01
Within the Section of Astronomy of the Department of Physics and Astronomy of the University of Firenze (Italy), the XUVLab laboratory has been active since 1998, dedicated to technological development, mainly UV oriented. The technological research is focused both on electronics and on optics. Our latest effort is dedicated to the development of innovative wire-grid polarizers optimized to work in transmission at 121.6 nm. The manufacturing of such optical devices requires advanced technological expertise and suitable experimental facilities. First, nanotechnology capability is necessary in order to build many tiny parallel conductive lines, separated by tens of nanometers, over areas wide enough to be macroscopically exploitable in an optical laboratory. Moreover, the characterization of such an advanced optical device has to be performed in vacuum, since air is absorptive at 121.6 nm. A dedicated small vacuum chamber, SCOUT (Small Chamber for Optical UV Tests), was developed within our laboratory in order to perform practical and fast measurements. SCOUT hosts an optical bench and is equipped with several opening flanges, in order to be as flexible as possible. The flexibility that has been reached with SCOUT allows us to use the chamber beyond the goals for which it was conceived. It can be used for any compact (within 1 m) optical experiment that investigates the UV band of the spectrum.
Walking the bridge: Nursing students' learning in clinical skill laboratories.
Ewertsson, Mona; Allvin, Renée; Holmström, Inger K; Blomberg, Karin
2015-07-01
Despite an increasing focus on simulation as a learning strategy in nursing education, there is limited evidence on the transfer of simulated skills into clinical practice. Therefore, it is important to increase knowledge of how clinical skills laboratories (CSL) can optimize students' learning for the development of professional knowledge and skills necessary for quality nursing practice and for patient safety. Thus, the aim was to describe nursing students' experiences of learning in the CSL as a preparation for their clinical practice. Interviews with 16 students were analysed with content analysis. An overall theme was identified - walking the bridge - in which the CSL formed a bridge between the university and clinical settings, allowing students to integrate theory and practice and develop a reflective stance. The theme was based on the categories: conditions for learning, strategies for learning, tension between learning in the skills laboratory and clinical settings, and development of professional and personal competence. The CSL prepared the students for clinical practice, but a negative tension between learning in the CSL and clinical settings was experienced. However, this tension may create reflection. This provides a new perspective that can be used as a pedagogical approach to create opportunities for students to develop their critical thinking. Copyright © 2015 Elsevier Ltd. All rights reserved.
GMOtrack: generator of cost-effective GMO testing strategies.
Novak, Petra Krau; Gruden, Kristina; Morisset, Dany; Lavrac, Nada; Stebih, Dejan; Rotter, Ana; Zel, Jana
2009-01-01
Commercialization of numerous genetically modified organisms (GMOs) has already been approved worldwide, and several additional GMOs are in the approval process. Many countries have adopted legislation to deal with GMO-related issues such as food safety, environmental concerns, and consumers' right of choice, making GMO traceability a necessity. The growing extent of GMO testing makes it important to study optimal GMO detection and identification strategies. This paper formally defines the problem of routine laboratory-level GMO tracking as a cost optimization problem, thus proposing a shift from "the same strategy for all samples" to "sample-centered GMO testing strategies." An algorithm (GMOtrack) for finding optimal two-phase (screening-identification) testing strategies is proposed. The advantages of cost optimization with increasing GMO presence on the market are demonstrated, showing that optimization approaches to analytic GMO traceability can result in major cost reductions. The optimal testing strategies are laboratory-dependent, as the costs depend on prior probabilities of local GMO presence, which are exemplified on food and feed samples. The proposed GMOtrack approach, publicly available under the terms of the General Public License, can be extended to other domains where complex testing is involved, such as safety and quality assurance in the food supply chain.
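The underlying cost-optimization idea can be illustrated with a toy expected-cost comparison between direct identification and a two-phase screening-identification strategy. The prior probabilities, assay costs, and screening coverage below are invented for the example and do not correspond to GMOtrack's actual inputs or cost model.

```python
# Hedged toy model of two-phase (screening -> identification) GMO testing cost.
# Priors, costs, and screening coverage are invented for illustration and do not
# correspond to the GMOtrack algorithm's actual inputs or cost model.
prior = {"GMO_A": 0.30, "GMO_B": 0.10, "GMO_C": 0.02}   # P(GMO present in a routine sample)
id_cost = 40.0                                          # cost of one identification assay
screen_cost = 25.0                                      # cost of one screening assay
screens = {"element_X": ["GMO_A", "GMO_B"],             # screening element -> GMOs it detects
           "element_Y": ["GMO_C"]}

def prob_any_present(gmos):
    """P(at least one of the listed GMOs is present), assuming independence."""
    p_none = 1.0
    for g in gmos:
        p_none *= 1.0 - prior[g]
    return 1.0 - p_none

def expected_cost_direct():
    """Run an identification assay for every GMO on every sample, no screening phase."""
    return id_cost * len(prior)

def expected_cost_two_phase():
    """Screening assays always run; identifications only follow a positive screen."""
    total = screen_cost * len(screens)
    for covered in screens.values():
        total += prob_any_present(covered) * id_cost * len(covered)
    return total

print("direct identification:", expected_cost_direct())
print("screen then identify: ", round(expected_cost_two_phase(), 2))
```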
NASA Astrophysics Data System (ADS)
Swanson, Ryan David
The advection-dispersion equation (ADE) fails to describe non-Fickian solute transport breakthrough curves (BTCs) in saturated porous media in both laboratory and field experiments, necessitating the use of other models. The dual-domain mass transfer (DDMT) model partitions the total porosity into mobile and less-mobile domains with an exchange of mass between the two domains, and this model can reproduce better fits to BTCs in many systems than ADE-based models. However, direct experimental estimation of DDMT model parameters remains elusive, and model parameters are often calculated a posteriori by an optimization procedure. Here, we investigate the use of geophysical tools (direct-current resistivity, nuclear magnetic resonance, and complex conductivity) to estimate these model parameters directly. We use two different samples of the zeolite clinoptilolite, a material shown to demonstrate solute mass transfer due to a significant internal porosity, and provide the first evidence that direct-current electrical methods can track solute movement into and out of a less-mobile pore space in controlled laboratory experiments. We quantify the effects of assuming single-rate DDMT for multirate mass transfer systems. We analyze pore structures using material characterization methods (mercury porosimetry, scanning electron microscopy, and X-ray computed tomography), and compare these observations to geophysical measurements. Nuclear magnetic resonance in conjunction with direct-current resistivity measurements can constrain mobile and less-mobile porosities, but complex conductivity may have little value in relation to mass transfer despite the hypothesis that mass transfer and complex conductivity length scales are related. Finally, we conduct a geoelectrically monitored tracer test at the Macrodispersion Experiment (MADE) site in Columbus, MS. We relate hydraulic and electrical conductivity measurements to generate a 3D hydraulic conductivity field, and compare it to hydraulic conductivity fields estimated through ordinary kriging and sequential Gaussian simulation. Time-lapse electrical measurements are used to verify or dismiss aspects of breakthrough curves for different hydraulic conductivity fields. Our results quantify the potential for geophysical measurements to constrain single-rate DDMT parameters, show site-specific relations between hydraulic and electrical conductivity, and track solute exchange into and out of less-mobile domains.
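As a hedged, zero-dimensional sketch of the single-rate DDMT concept (a flushed mobile domain exchanging mass with a less-mobile domain at a first-order rate), the code below integrates the two coupled mass balances. Porosities, the mass-transfer coefficient, and the flushing rate are invented placeholders, not parameters estimated in the dissertation.

```python
# Hedged sketch: single-rate dual-domain mass transfer (DDMT) in a single flushed cell.
#   theta_m  dCm/dt  = (q/L) (C_in - Cm) - omega (Cm - Cim)
#   theta_im dCim/dt = omega (Cm - Cim)
# All parameter values are illustrative placeholders, not estimates from the dissertation.
import numpy as np
from scipy.integrate import solve_ivp

theta_m, theta_im = 0.25, 0.15   # mobile and less-mobile porosities (assumed)
omega = 0.05                     # first-order mass-transfer rate coefficient, 1/h (assumed)
q_over_L = 0.5                   # flushing rate per unit length, 1/h (assumed)
C_in = 0.0                       # tracer-free flushing water after a tracer pulse

def ddmt(t, y):
    Cm, Cim = y
    dCm = (q_over_L * (C_in - Cm) - omega * (Cm - Cim)) / theta_m
    dCim = omega * (Cm - Cim) / theta_im
    return [dCm, dCim]

sol = solve_ivp(ddmt, (0.0, 100.0), [1.0, 1.0], t_eval=np.linspace(0.0, 100.0, 5))
print(np.round(sol.y, 4))   # late-time tailing of Cm reflects back-diffusion from Cim
```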
Lociciro, S; Esseiva, P; Hayoz, P; Dujourdy, L; Besacier, F; Margot, P
2008-05-20
Harmonisation and optimization of analytical and statistical methodologies were carried out between two forensic laboratories (Lausanne, Switzerland and Lyon, France) in order to provide drug intelligence for cross-border cocaine seizures. Part I dealt with the optimization of the analytical method and its robustness. This second part investigates statistical methodologies that provide reliable comparison of cocaine seizures analysed on two different gas chromatographs interfaced with flame ionisation detectors (GC-FIDs) in two distinct laboratories. Sixty-six statistical combinations (ten data pre-treatments followed by six different distance measurements and correlation coefficients) were applied. One pre-treatment (N+S: the area of each peak is divided by its standard deviation calculated from the whole data set) followed by the cosine or Pearson correlation coefficient was found to be the best statistical compromise for optimal discrimination of linked and non-linked samples. Centralisation of the analyses in a single laboratory is therefore no longer a required condition for comparing samples seized in different countries. This allows collaboration, but also jurisdictional control over data.
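The retained statistical combination can be sketched in a few lines: the "N+S" pre-treatment (each peak area divided by that peak's standard deviation over the whole data set) followed by Pearson and cosine scores between two profiles. The GC-FID peak areas below are synthetic.

```python
# Hedged sketch of the retained comparison workflow: "N+S" pre-treatment (each peak area
# divided by that peak's standard deviation over the whole data set), then Pearson and
# cosine similarity between two seizure profiles. Peak areas below are synthetic.
import numpy as np

dataset = np.array([                       # rows = seizures, columns = GC-FID peak areas
    [1200.0, 300.0, 55.0, 910.0],
    [1150.0, 320.0, 60.0, 880.0],
    [ 400.0, 950.0, 10.0, 120.0],
])

pretreated = dataset / dataset.std(axis=0, ddof=1)     # "N+S" pre-treatment

def cosine(a, b):
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def pearson(a, b):
    return float(np.corrcoef(a, b)[0, 1])

print("linked pair:     cosine", round(cosine(pretreated[0], pretreated[1]), 3),
      "pearson", round(pearson(pretreated[0], pretreated[1]), 3))
print("non-linked pair: cosine", round(cosine(pretreated[0], pretreated[2]), 3),
      "pearson", round(pearson(pretreated[0], pretreated[2]), 3))
```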
Adamatzky, Andrew I
2014-01-01
The cellular slime mould Physarum polycephalum is a monstrously large single cell visible to the unaided eye. The slime mould explores space in parallel, is guided by gradients of chemoattractants, and propagates toward sources of nutrients along nearly shortest paths. The slime mould is a living prototype of amorphous biological computers and robotic devices capable of solving a range of tasks of graph optimization and computational geometry. When presented with a distribution of nutrients, the slime mould spans the sources of nutrients with a network of protoplasmic tubes. This protoplasmic network matches a network of major transport routes of a country when the configuration of major urban areas is represented by nutrients. A transport route connecting two cities should ideally be a shortest path, and this is usually the case in computer simulations and laboratory experiments with flat substrates. What searching strategies does the slime mould adopt when exploring 3-D terrains? How are optimal transport routes approximated by protoplasmic tubes? Do the routes built by the slime mould on 3-D terrain match real-world transport routes? To answer these questions, we conducted pioneering laboratory experiments with Nylon terrains of the USA and Germany. We used the slime mould to approximate Route 20, the longest road in the USA, and Autobahn 7, the longest national motorway in Europe. We found that the slime mould builds longer transport routes on 3-D terrains compared to flat substrates, yet sufficiently approximates the man-made transport routes studied. We demonstrate that nutrients placed in destination sites affect the performance of the slime mould, and show how the mould navigates around elevations. In cellular automaton models of the slime mould, we have shown that variability of the protoplasmic routes might depend on the physiological state of the slime mould. The results presented will contribute toward the development of novel algorithms for sensorial fusion, information processing, and decision making, and will provide inspiration in the design of bioinspired amorphous robotic devices.
Flagging threshold optimization for manual blood smear review in primary care laboratory.
Bihl, Pierre-Adrien
2018-04-01
Manual blood smear review is required when an anomaly detected by the automated hematologic analyzer triggers a flag. Our aim in this study is to optimize these flagging thresholds for manual slide review in order to limit workload while ensuring clinical care, with no additional false negatives. The flagging causes of 4,373 samples, run on an ADVIA 2120i, were investigated by manual slide review. A set of six user adjustments is proposed. By implementing all of the recommendations that we made, the false-positive rate falls from 81.8% to 58.6%, while the positive predictive value (PPV) increases from 18.2% to 23.7%. Hence, the use of such optimized thresholds enables us to maximize efficiency without compromising clinical care, but each laboratory should establish its own criteria to take local distinctive features into consideration.
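The reported gains can be recomputed from simple review counts; the sketch below applies standard definitions of PPV and the share of flags that turn out to be false positives to hypothetical counts. The counts and the exact metric conventions are assumptions for illustration and may differ in detail from those used in the study.

```python
# Hedged sketch: evaluating a flagging threshold from review outcomes. Counts are
# hypothetical, and the metric conventions here (PPV and the share of flags that are
# false positives) may differ in detail from the ones used in the study.
def flagging_metrics(true_pos, false_pos):
    """true_pos: flagged smears with a confirmed anomaly; false_pos: flagged, nothing found."""
    flagged = true_pos + false_pos
    ppv = true_pos / flagged                 # positive predictive value of the flag
    false_flag_share = false_pos / flagged   # manual reviews triggered without benefit
    return ppv, false_flag_share

for label, tp, fp in [("default thresholds", 180, 810), ("adjusted thresholds", 190, 610)]:
    ppv, ffs = flagging_metrics(tp, fp)
    print(f"{label}: PPV = {ppv:.1%}, false-positive flags = {ffs:.1%}")
```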
Sarzotti-Kelsoe, Marcella; Bailer, Robert T; Turk, Ellen; Lin, Chen-li; Bilska, Miroslawa; Greene, Kelli M.; Gao, Hongmei; Todd, Christopher A.; Ozaki, Daniel A.; Seaman, Michael S.; Mascola, John R.; Montefiori, David C.
2014-01-01
The TZM-bl assay measures antibody-mediated neutralization of HIV-1 as a function of reductions in HIV-1 Tat-regulated firefly luciferase (Luc) reporter gene expression after a single round of infection with Env-pseudotyped viruses. This assay has become the main endpoint neutralization assay used for the assessment of preclinical and clinical trial samples by a growing number of laboratories worldwide. Here we present the results of the formal optimization and validation of the TZM-bl assay, performed in compliance with Good Clinical Laboratory Practice (GCLP) guidelines. The assay was evaluated for specificity, accuracy, precision, limits of detection and quantitation, linearity, range and robustness. The validated manual TZM-bl assay was also adapted, optimized and qualified to an automated 384-well format. PMID:24291345
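As a hedged illustration of the assay readout (Tat-driven luciferase signal decreasing as antibody concentration increases), the sketch below converts relative light units (RLU) into percent neutralization and interpolates a 50% inhibitory dilution (ID50). The RLU values and the simple log-linear interpolation are illustrative assumptions, not the validated analysis procedure described in the paper.

```python
# Hedged sketch: percent neutralization from TZM-bl luciferase readings and a simple
# log-linear ID50 interpolation. RLU values are invented, and this interpolation is an
# illustration, not the validated analysis procedure described in the paper.
import numpy as np

dilutions = np.array([20, 60, 180, 540, 1620], dtype=float)      # reciprocal serum dilutions
rlu_sample = np.array([1500, 4200, 9800, 16500, 19000], dtype=float)
rlu_virus_ctrl = 20000.0    # virus + cells, no antibody
rlu_cell_ctrl = 500.0       # cells only (background)

neut = 100.0 * (1.0 - (rlu_sample - rlu_cell_ctrl) / (rlu_virus_ctrl - rlu_cell_ctrl))

# Interpolate the reciprocal dilution giving 50% neutralization on a log10 scale.
# np.interp needs increasing x, so interpolate log10(dilution) as a function of neutralization.
order = np.argsort(neut)
log_id50 = np.interp(50.0, neut[order], np.log10(dilutions)[order])
print("percent neutralization:", np.round(neut, 1))
print("ID50 (reciprocal dilution) ~", round(10 ** log_id50, 1))
```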
Professor Created On-line Biology Laboratory Course
NASA Technical Reports Server (NTRS)
Bowman, Arthur W.
2010-01-01
This paper will share the creation, implementation, and modification of an online college-level general biology laboratory course offered for non-science majors as a part of a General Education Curriculum. The ability of professors to develop quality online laboratories will address a growing need in higher education as more institutions combine course sections and look for suitable alternative course delivery formats due to declining departmental budgets requiring reductions in staffing, equipment, and supplies. Also, there is an equal or greater need for more professors to develop the ability to create online laboratory experiences, because many of the currently available online laboratory course packages from publishers do not always adequately parallel on-campus laboratory courses, or are not as well aligned with the companion lecture sections. From a variety of scientific simulation and animation web sites, professors can easily identify material that closely fits the specific needs of their courses, instructional environment, and the students that they serve. All too often, on-campus laboratory courses in the sciences provide what are termed confirmation experiences that do NOT allow students to experience science as it would be carried out by scientists. Creatively developed online laboratory experiences can often provide the type of authentic investigative experiences that are not possible on campus due to the time constraints of a typical two-hour, once-per-week-meeting laboratory course. In addition, online laboratory courses can address issues related to the need for students to more easily complete missing laboratory assignments, and to have opportunities to extend introductory exercises into more advanced undertakings where a greater sense of scientific discovery can be experienced. Professors are strongly encouraged to begin creating online laboratory exercises for their courses, and to consider issues regarding assessment, copyrights, and intellectual property concerns.
Experiential learning in control systems laboratories and engineering project management
NASA Astrophysics Data System (ADS)
Reck, Rebecca Marie
Experiential learning is a process by which a student creates knowledge through the insights gained from an experience. Kolb's model of experiential learning is a cycle of four modes: (1) concrete experience, (2) reflective observation, (3) abstract conceptualization, and (4) active experimentation. His model is used in each of the three studies presented in this dissertation. Laboratories are a popular way to apply the experiential learning modes in STEM courses. Laboratory kits allow students to take home laboratory equipment to complete experiments on their own time. Although students like laboratory kits, no previous studies had compared student learning outcomes on assignments completed with laboratory kits versus existing laboratory equipment. In this study, we examined the similarities and differences between the experiences of students who used a portable laboratory kit and students who used the traditional equipment. During the 2014-2015 academic year, we conducted a quasi-experiment to compare students' achievement of learning outcomes and their experiences in the instructional laboratory for an introductory control systems course. Half of the laboratory sections in each semester used the existing equipment, while the other sections used a new kit. We collected both quantitative data and qualitative data. We did not identify any major differences in the student experience based on the equipment they used. Course objectives, like research objectives and product requirements, help provide clarity and direction for faculty and students. Unfortunately, course and laboratory objectives are not always clearly stated. Without a clear set of objectives, it can be hard to design a learning experience and determine whether students are achieving the intended outcomes of the course or laboratory. In this study, I identified a common set of laboratory objectives, concepts, and components of a laboratory apparatus for undergraduate control systems laboratories. During the summer of 2015, a panel of 40 control systems faculty members from a variety of institutions completed a multi-round Delphi survey in order to bring them toward consensus on the common aspects of their laboratories. The following winter, 45 additional faculty members and practitioners from the control systems community completed a follow-up survey to gather feedback on the results of the Delphi survey. During the Delphi study, the panelists identified 15 laboratory objectives, 26 concepts, and 15 components that were common in their laboratories. Then, in both the Delphi survey and the follow-up survey, each participant rated the importance of each of these items. While the average ratings differed slightly between the two groups, the order of each set of items was compared with two different tests and the order was found to be similar. Some of the common and important learning objectives include connecting theory to what is implemented and observed in the laboratory, designing controllers, and modeling and simulating systems. The most common component in both groups was MathWorks software. Some of the common concepts include block diagrams, stability, and PID control. Defining common aspects of undergraduate control systems laboratories enables common development, detailed comparisons, and simplified adaptation of equipment and experiments between campuses and programs. Throughout an undergraduate program in engineering, there are multiple opportunities for hands-on laboratory experiences that are related to course content.
However, a similarly immersive experience for project management graduate students is harder to incorporate for all students in a course at once. This study explores an experiential learning opportunity for graduate students in engineering management or project management programs. The project management students enroll in a project management course. Undergraduate students interested in working on a project with a real customer enroll in a different projects course. Two students from the project management course function as project managers and lead a team of undergraduate students in the second course through a project. I studied how closely the project management experience in these courses aligns with engineering project management in industry. In the spring of 2015, I enrolled in the project management course at a large Midwestern university. I used analytic autoethnography to compare my experiences in the course with my experiences as a project engineer at a large aerospace company. I found that the experience in the course provided an authentic and comprehensive opportunity to practice most of the skills listed in the Project Management Book of Knowledge (an industry standard) as necessary for project managers. Some components of the course that made it successful: I was the project manager for the whole term, I worked with a real client, and the team defined and delivered the project before the end of the semester.
NASA Astrophysics Data System (ADS)
Ghatty, Sundara L.
Over the past decade, there has been a dramatic rise in online delivery of higher education in the United States. Recent developments in web technology and access to the internet have led to a vast increase in online courses. For people who work during the day and whose complicated lives prevent them from taking courses on campus, online courses are the only alternative by which they may achieve their goals in education. Laboratory courses are major requirements for college and university students who want to pursue degree and certification programs in science, yet there is a lack of laboratory components in online physics courses. The present study addressed the effectiveness of a virtual science laboratory in physics instruction in terms of learning outcomes, attitudes, and self-efficacy of students at a Historically Black University. The study included fifty-eight students (36 male and 22 female) of different science majors who were enrolled in a general physics laboratory course. They were divided into virtual and traditional groups. Three experiments were selected from the syllabus. The traditional group performed one experiment in a traditional laboratory, while the virtual group performed the same experiment in a virtual laboratory. For the second experiment, the use of laboratories by the two groups was exchanged. The Learner's Assessment Test (LAT), Attitudes Toward Physics Laboratories (ATPL), and Self-Efficacy Survey (SES) instruments were used. Additionally, quantitative methods such as an independent t-test, a paired t-test, and correlation statistics were used to analyze the data. The results of the first experiment indicated that learning outcomes were higher in the virtual laboratory than in the traditional laboratory, whereas overall there was no significant difference in learning outcomes between the two types of lab instruction. However, significant self-efficacy gains were observed. Students expressed positive attitudes in terms of liking as well as interest in performing experiments in virtual laboratories. No gender differences were observed in learning outcomes or self-efficacy. The results of the study indicated that virtual laboratories may substitute for traditional laboratories to some extent, and may play a vital role in online science courses.
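The group comparisons described above rest on standard t-tests; purely to illustrate the analysis type, the hedged sketch below runs an independent-samples t-test on synthetic assessment scores rather than the study's data.

```python
# Hedged sketch: independent-samples t-test comparing assessment scores of a virtual-lab
# group and a traditional-lab group. Scores are synthetic and purely illustrative.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
virtual_scores = rng.normal(loc=78.0, scale=8.0, size=29)      # hypothetical LAT scores
traditional_scores = rng.normal(loc=74.0, scale=8.0, size=29)

t_stat, p_value = stats.ttest_ind(virtual_scores, traditional_scores, equal_var=False)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
```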
Multiparameter optimization of mammography: an update
NASA Astrophysics Data System (ADS)
Jafroudi, Hamid; Muntz, E. P.; Jennings, Robert J.
1994-05-01
Previously in this forum we have reported the application of multiparameter optimization techniques to the design of a minimum-dose mammography system. The approach used a reference system to define the physical imaging performance required and the dose against which the dose of the optimized system should be compared. During the course of implementing the resulting design in hardware suitable for laboratory testing, the state of the art in mammographic imaging changed, so that the original reference system, which did not have a grid, was no longer appropriate. A reference system with a grid was selected in response to this change, and at the same time the optimization procedure was modified to make it more general and to facilitate study of the optimized design under a variety of conditions. We report the changes in the procedure, and the results obtained using the revised procedure and the up-to-date reference system. Our results, which are supported by laboratory measurements, indicate that the optimized design can image small objects as well as the reference system does, using only about 30% of the dose required by the reference system. Hardware meeting the specification produced by the optimization procedure and suitable for clinical use is currently under evaluation in the Diagnostic Radiology Department at the Clinical Center, NIH.
NASA Technical Reports Server (NTRS)
1972-01-01
The selection and definition of candidate experiments and the associated experiment instrumentation requirements are described. Information is presented that addresses the following study objectives: (1) determine specific research and technology needs in the comm/nav field through a survey of the scientific/technical community; (2) develop manned low earth orbit space screening criteria and compile lists of potential candidate experiments; (3) in Blue Book format, define and describe selected candidate experiments in sufficient detail to develop laboratory configuration designs and layouts; and (4) develop experiment time phasing criteria and recommend a payload for sortie can/early laboratory missions.
Operational plans for life science payloads - From experiment selection through postflight reporting
NASA Technical Reports Server (NTRS)
Mccollum, G. W.; Nelson, W. G.; Wells, G. W.
1976-01-01
Key features of operational plans developed in a study of the Space Shuttle era life science payloads program are presented. The data describe the overall acquisition, staging, and integration of payload elements, as well as program implementation methods and mission support requirements. Five configurations were selected as representative payloads: (a) carry-on laboratories - medical emphasis experiments, (b) mini-laboratories - medical/biology experiments, (c) seven-day dedicated laboratories - medical/biology experiments, (d) 30-day dedicated laboratories - Regenerative Life Support Evaluation (RLSE) with selected life science experiments, and (e) Biomedical Experiments Scientific Satellite (BESS) - extended duration primate (Type I) and small vertebrate (Type II) missions. The recommended operational methods described in the paper are compared to the fundamental data developed in the life science Spacelab Mission Simulation (SMS) test series. Areas assessed include crew training, experiment development and integration, testing, data dissemination, organization interfaces, and principal investigator working relationships.
NASA Technical Reports Server (NTRS)
1972-01-01
This study was undertaken to develop conceptual designs for a manned, space shuttle sortie mission laboratory capable of supporting a wide variety of experiments in conjunction with communications and navigation research. This space laboratory would be one in which man may effectively increase experiment efficiency through observation, modification, setup, calibration, and limited maintenance steps. In addition, man may monitor experiment progress and perform preliminary data evaluation to verify proper equipment functioning, and may terminate or redirect experiments to obtain the most desirable end results. The flexibility and unique capabilities of man as an experimenter in such a laboratory will add greatly to the simplification of space experiments and provide the basis for commonality in many of the supporting subsystems, thus reaping the benefits of reusability and reduced experiment costs. For Vol. 4, see N73-19268.
NASA Technical Reports Server (NTRS)
Salama, Farid; Tan, Xiaofeng; Cami, Jan; Biennier, Ludovic; Remy, Jerome
2006-01-01
Polycyclic Aromatic Hydrocarbons (PAHs) are an important and ubiquitous component of carbon-bearing materials in space. A long-standing and major challenge for laboratory astrophysics has been to measure the spectra of large carbon molecules in laboratory environments that mimic, in a realistic way, the physical conditions associated with the interstellar emission and absorption regions [1]. This objective has been identified as one of the critical Laboratory Astrophysics objectives to optimize the data return from space missions [2]. An extensive laboratory program has been developed to assess the properties of PAHs in such environments and to describe how they influence the radiation and energy balance in space. We present and discuss the gas-phase electronic absorption spectra of neutral and ionized PAHs measured in the UV-Visible-NIR range in astrophysically relevant environments and discuss the implications for astrophysics [1]. The harsh physical conditions of the interstellar medium, characterized by low temperature, an absence of collisions, and strong VUV radiation fields, have been simulated in the laboratory by associating a pulsed cavity ringdown spectrometer (CRDS) with a supersonic slit jet seeded with PAHs and an ionizing, Penning-type, electronic discharge. We have measured for the first time the spectra of a series of neutral [3,4] and ionized [5,6] interstellar PAH analogs in the laboratory. An effort has also been made to quantify the mechanisms of ion and carbon nanoparticle production in the free jet expansion and to model our simulation of the diffuse interstellar medium in the laboratory [7]. These experiments provide unique information on the spectra of free, large carbon-containing molecules and ions in the gas phase. We are now, for the first time, in a position to directly compare laboratory spectral data on free, cold PAH ions and nano-sized carbon particles with astronomical observations in the UV-NIR range (interstellar UV extinction, DIBs in the NUV-NIR range). This new phase offers tremendous opportunities for the data analysis of current and upcoming space missions geared toward the detection of large aromatic systems, i.e., the "new frontier space missions" (Spitzer, HST, COS, JWST, SOFIA, ...).
Low Cost Space Experiments. Study Report
1991-12-06
Air Force Phillips Laboratory with Johns Hopkins University Applied Physics Laboratory. The goals of ALTAIR... Corporate author or publisher: Phillips Laboratory/SXL, Kirtland AFB, NM 87117-6008. Publication date: Dec 06, 1991. Pages: 176.
ERIC Educational Resources Information Center
Chaytor, Jennifer L.; Al Mughalaq, Mohammad; Butler, Hailee
2017-01-01
Online prelaboratory videos and quizzes were prepared for all experiments in CHEM 231, Organic Chemistry I Laboratory. It was anticipated that watching the videos would help students be better prepared for the laboratory, decrease their anxiety surrounding the laboratory, and increase their understanding of the theories and concepts presented.…
Radon and material radiopurity assessment for the NEXT double beta decay experiment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cebrián, S.; Dafni, T.; González-Díaz, D.
The "Neutrino Experiment with a Xenon TPC" (NEXT), intended to investigate the neutrinoless double beta decay using a high-pressure xenon gas TPC filled with Xe enriched in ¹³⁶Xe at the Canfranc Underground Laboratory in Spain, requires ultra-low background conditions demanding an exhaustive control of material radiopurity and environmental radon levels. An extensive material screening process has been underway for several years, based mainly on gamma-ray spectroscopy using ultra-low background germanium detectors in Canfranc but also on mass spectrometry techniques like GDMS and ICPMS. Components from the shielding, pressure vessel, electroluminescence and high voltage elements, and energy and tracking readout planes have been analyzed, helping in the final design of the experiment and in the construction of the background model. The latest measurements carried out will be presented and their implications for NEXT will be discussed. The commissioning of the NEW detector, as a first step towards NEXT, has started in Canfranc; in-situ measurements of airborne radon levels were taken there to optimize the system for radon mitigation and will be shown too.
Development of a mechanism for nitrate photochemistry in snow.
Bock, Josué; Jacobi, Hans-Werner
2010-02-04
A reaction mechanism to reproduce photochemical processes in snow is reported. We developed a box model to represent snow chemistry. Constrained by laboratory experiments carried out with artificial snow, we first deduced a reaction mechanism for N-containing species comprising 13 reactions. An optimization tool was developed to systematically adjust unknown photolysis rates of nitrate and nitrite (NO2-) and transfer rates of nitrogen oxides from the snow to the gas phase, resulting in an optimum fit with respect to the experimental data. Further experiments with natural snow samples are presented, indicating that NO2- concentrations were much lower than in the artificial snow experiments. These observations were used to extend the reaction mechanism into a more general scheme including hydrogen peroxide (H2O2) and formaldehyde (HCHO) chemistry, leading to a set of 18 reactions. The simulations indicate the importance of H2O2 and HCHO as either a source or a sink of hydroxyl radicals in the snow photochemistry mechanism. The addition of H2O2 and HCHO to the mechanism allows the reproduction of the observed low NO2- concentrations.
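The rate-fitting step described above can be sketched as follows; the toy two-species scheme, rate values, and synthetic data are illustrative assumptions, not the paper's 13- or 18-reaction mechanism:

```python
# Toy sketch: fit unknown first-order rates of a nitrate -> nitrite -> (gas) chain
# to concentration-time data, as in the optimization described above.
import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import least_squares

t_obs = np.linspace(0, 6, 13)  # hours (synthetic sampling times)

def model(t, y, j_no3, k_transfer):
    no3, no2 = y
    return [-j_no3 * no3, j_no3 * no3 - k_transfer * no2]

def simulate(params):
    sol = solve_ivp(model, (0, 6), [10.0, 0.0], t_eval=t_obs, args=tuple(params))
    return sol.y

true = (0.30, 0.80)  # "unknown" photolysis and transfer rates used to make data
obs = simulate(true) + np.random.default_rng(1).normal(0, 0.05, (2, t_obs.size))

def residuals(params):
    return (simulate(params) - obs).ravel()

fit = least_squares(residuals, x0=[0.1, 0.1], bounds=(0, np.inf))
print("fitted rates:", fit.x)  # should recover roughly 0.30 and 0.80
```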
Multi-frame X-ray Phase Contrast Imaging (MPCI) for Dynamic Experiments
NASA Astrophysics Data System (ADS)
Iverson, Adam; Carlson, Carl; Sanchez, Nathaniel; Jensen, Brian
2017-06-01
Recent advances in coupling synchrotron X-ray diagnostics to dynamic experiments are providing new information about the response of materials at extremes. For example, propagation based X-ray Phase Contrast Imaging (PCI) which is sensitive to differences in density has been successfully used to study a wide range of phenomena, e.g. jet-formation, compression of additive manufactured (AM) materials, and detonator dynamics. In this talk, we describe the current multi-frame X-ray phase contrast imaging (MPCI) system which allows up to eight frames per experiment, remote optimization, and an improved optical design that increases optical efficiency and accommodates dual-magnification during a dynamic event. Data will be presented that used the dual-magnification feature to obtain multiple images of an exploding foil initiator. In addition, results from static testing will be presented that used a multiple scintillator configuration required to extend the density retrieval to multi-constituent, or heterogeneous systems. The continued development of this diagnostic is fundamentally important to capabilities at the APS including IMPULSE and the Dynamic Compression Sector (DCS), and will benefit future facilities such as MaRIE at Los Alamos National Laboratory.
1992-06-01
The first United States Microgravity Laboratory (USML-1) was one of NASA's science and technology programs that provided scientists an opportunity to research various scientific investigations in a weightless environment inside the Spacelab module. It also provided demonstrations of new equipment to help prepare for advanced microgravity research and processing aboard the Space Station. The USML-1 flew in orbit for extended periods, providing greater opportunities for research in materials science, fluid dynamics, biotechnology (crystal growth), and combustion science. This photograph shows astronaut Ken Bowersox conducting the Astroculture experiment in the middeck of the orbiter Columbia. This experiment was to evaluate and find effective ways to supply nutrient solutions for optimizing plant growth and avoid releasing solutions into the crew quarters in microgravity. Since fluids behave differently in microgravity, plant watering systems that operate well on Earth do not function effectively in space. Plants can reduce the costs of providing food, oxygen, and pure water as well as lower the costs of removing carbon dioxide in human space habitats. The Astroculture experiment flew aboard the STS-50 mission in June 1992 and was managed by the Marshall Space Flight Center.
Magnetically launched flyer plate technique for probing electrical conductivity of compressed copper
NASA Astrophysics Data System (ADS)
Cochrane, K. R.; Lemke, R. W.; Riford, Z.; Carpenter, J. H.
2016-03-01
The electrical conductivity of materials under extremes of temperature and pressure is of crucial importance for a wide variety of phenomena, including planetary modeling, inertial confinement fusion, and pulsed power based dynamic materials experiments. There is a dearth of experimental techniques and data for highly compressed materials, even at known states such as along the principal isentrope and Hugoniot, where many pulsed power experiments occur. We present a method for developing, calibrating, and validating material conductivity models as used in magnetohydrodynamic (MHD) simulations. The difficulty in calibrating a conductivity model is in knowing where the model should be modified. Our method isolates those regions that will have an impact. It also quantitatively prioritizes which regions will have the most beneficial impact. Finally, it tracks the quantitative improvements to the conductivity model during each incremental adjustment. In this paper, we use an experiment on the Sandia National Laboratories Z machine to isentropically launch multiple flyer plates and, with the MHD code ALEGRA and the optimization code DAKOTA, calibrate the conductivity such that we match an experimental figure of merit to within +/-1%.
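The calibration loop described above can be sketched schematically; the run_mhd_simulation function below is a hypothetical stand-in for an expensive ALEGRA run, and the bounded scalar search only illustrates matching a figure of merit to within 1%, not the actual DAKOTA workflow:

```python
# Schematic of the calibration loop described above: adjust a conductivity-model
# parameter until a simulated figure of merit matches experiment within 1%.
import numpy as np
from scipy.optimize import minimize_scalar

FOM_EXPERIMENT = 1.00  # measured figure of merit (normalized, illustrative)

def run_mhd_simulation(conductivity_scale):
    # Placeholder for an expensive MHD run; here a smooth toy response curve.
    return 0.85 + 0.30 * np.tanh(conductivity_scale - 0.5)

def mismatch(scale):
    return abs(run_mhd_simulation(scale) - FOM_EXPERIMENT) / FOM_EXPERIMENT

result = minimize_scalar(mismatch, bounds=(0.0, 2.0), method="bounded")
print(f"scale={result.x:.3f}, relative mismatch={result.fun:.4f}")  # target < 0.01
```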
Characterizing Background Events in Neutron Transmutation Doped Thermistors for CUORE-0
NASA Astrophysics Data System (ADS)
Dutta, Suryabrata; Cuore Collaboration
2017-09-01
The Cryogenic Underground Observatory for Rare Events (CUORE) is a ton-scale neutrinoless double-beta decay experiment operating at the Laboratori Nazionali del Gran Sasso (LNGS). The experiment is comprised of 988 TeO2 bolometric crystals arranged into 19 towers and operated at a temperature of 15 mK. A neutron-transmutation-doped (NTD) Ge thermistor measures the thermal response from particles incident on the crystals. However, bulk and surface contamination of the NTD thermistors themselves produce distorted thermal responses inside the thermistor volume. Although these pulses are efficiently removed from the double-beta decay analysis by pulse shape cuts, they can be used to extract information about thermistor contamination. I will present a multifaceted approach to characterize these events, in which I implement an improved hot-electron thermal model, Geant4 Monte Carlo simulations of background events, and data from a previous experiment, CUORE-0, reprocessed with a new optimal filter. Using this approach, rates and energy deposition from contamination inside the NTD thermistors are measured, giving us better understanding of a CUORE background source.
Development of modular scalable pulsed power systems for high power magnetized plasma experiments
NASA Astrophysics Data System (ADS)
Bean, I. A.; Weber, T. E.; Adams, C. S.; Henderson, B. R.; Klim, A. J.
2017-10-01
New pulsed power switches and trigger drivers are being developed in order to explore higher energy regimes in the Magnetic Shock Experiment (MSX) at Los Alamos National Laboratory. To achieve the required plasma velocities, high-power (approx. 100 kV, 100s of kA), high charge transfer (approx. 1 C), low-jitter (few ns) gas switches are needed. A study has been conducted on the effects of various electrode geometries and materials, dielectric media, and triggering strategies, resulting in the design of a low-inductance annular field-distortion switch, optimized for use with dry air at 90 psig and triggered by a low-jitter, rapid rise-time solid-state Linear Transformer Driver. The switch geometry and electrical characteristics are designed to be compatible with Scyllac-style capacitors and are intended to be deployed in modular configurations. The scalable nature of this approach will enable the rapid design and implementation of a wide variety of high-power magnetized plasma experiments. This work is supported by the U.S. Department of Energy, National Nuclear Security Administration. Approved for unlimited release, LA-UR-17-2578.
Footstep Planning on Uneven Terrain with Mixed-Integer Convex Optimization
2014-08-01
Massachusetts Institute of Technology, Computer Science and Artificial Intelligence Laboratory, Cambridge, MA 02139 ... the MIT Energy Initiative, MIT CSAIL, and the DARPA Robotics Challenge. Robin Deits is with the Computer Science and Artificial Intelligence Laboratory
Gustafson, A-L; Stedman, D B; Ball, J; Hillegass, J M; Flood, A; Zhang, C X; Panzica-Kelly, J; Cao, J; Coburn, A; Enright, B P; Tornesi, M B; Hetheridge, M; Augustine-Rauch, K A
2012-04-01
This report provides a progress update of a consortium effort to develop a harmonized zebrafish developmental toxicity assay. Twenty non-proprietary compounds (10 animal teratogens and 10 animal non-teratogens) were evaluated blinded in 4 laboratories. Zebrafish embryos from pond-derived and cultivated strain wild types were exposed to the test compounds for 5 days and subsequently evaluated for lethality and morphological changes. Each of the testing laboratories achieved similar overall concordance to the animal data (60-70%). Subsequent optimization procedures to improve the overall concordance focused on compound formulation and test concentration adjustments, chorion permeation and number of replicates. These optimized procedures were integrated into a revised protocol and all compounds were retested in one lab using embryos from pond-derived zebrafish and achieved 85% total concordance. To further assess assay performance, a study of additional compounds is currently in progress at two laboratories using embryos from pond-derived and cultivated-strain wild type zebrafish. Copyright © 2011 Elsevier Inc. All rights reserved.
2013-09-01
Optimization of the Nonradiative Lifetime of Molecular-Beam-Epitaxy (MBE)-Grown Undoped GaAs/AlGaAs Double Heterostructures (DH) by P... Army Research Laboratory, Adelphi, MD 20783-1197. ARL-TR-6660, September 2013. Report type: Final; dates covered: FY2013.
Seeing the World Through "Pink-Colored Glasses": The Link Between Optimism and Pink.
Kalay-Shahin, Lior; Cohen, Allon; Lemberg, Rachel; Harary, Gil; Lobel, Thalma E
2016-12-01
This study investigated optimism, which is considered a personality trait, from the grounded cognition perspective. Three experiments were conducted to investigate the association between pink and optimism. In Experiment 1A, 22 undergraduates (10 females; M age = 23.68) were asked to classify words as optimistic or pessimistic as fast as possible. Half the words were presented in pink and half in black. Experiment 1B (N = 24; 14 females; M age = 22.82) was identical to 1A except for the color of the words (black and light blue instead of pink), to rule out the possible influence of brightness. Experiment 2 exposed 144 participants (74 females; M age = 25.18) to pink or yellow and then measured their optimism level. The findings for Experiments 1A and 1B indicated an association between pink and optimism regardless of brightness. Experiment 2 found that mere exposure to pink increased optimism levels for females. These results contribute to the dynamic view of personality, current views on optimism, and the growing literature on grounded cognition. © 2015 Wiley Periodicals, Inc.
Increasing power generation in horizontal axis wind turbines using optimized flow control
NASA Astrophysics Data System (ADS)
Cooney, John A., Jr.
In order to effectively realize future goals for wind energy, the efficiency of wind turbines must increase beyond existing technology. One direct method for achieving increased efficiency is improving the individual power generation characteristics of horizontal axis wind turbines. The potential for additional improvement by traditional approaches, however, is diminishing rapidly. As a result, a research program was undertaken to assess the potential of using distributed flow control to increase power generation. The overall objective was the development of validated aerodynamic simulations and flow control approaches to improve wind turbine power generation characteristics. BEM analysis was conducted for a general set of wind turbine models encompassing past-, current-, and next-generation designs. This analysis indicated that rotor lift control applied in Region II of the turbine power curve would produce a notable increase in annual power generated. This was achieved by optimizing induction factors along the rotor blade for maximum power generation. In order to demonstrate this approach and other advanced concepts, the University of Notre Dame established the Laboratory for Enhanced Wind Energy Design (eWiND). This initiative includes a fully instrumented meteorological tower and two pitch-controlled wind turbines. The wind turbines are representative in design and operation of larger multi-megawatt turbines, but of a scale that allows rotors to be easily instrumented and replaced to explore new design concepts. Baseline data detailing typical site conditions and turbine operation are presented. To realize optimized performance, lift control systems were designed and evaluated in CFD simulations coupled with shape optimization tools. These were integrated into a systematic design methodology involving BEM simulations, CFD simulations and shape optimization, and selected experimental validation. To refine and illustrate the proposed design methodology, a complete design cycle was performed for the turbine model incorporated in the wind energy lab. Enhanced power generation was obtained through passive trailing edge shaping aimed at reaching lift and lift-to-drag goals predicted to optimize performance. These targets were determined by BEM analysis to improve power generation characteristics and annual energy production (AEP) for the wind turbine. A preliminary design was validated in wind tunnel experiments on a 2D rotor section in preparation for testing in the full atmospheric environment of the eWiND Laboratory. These tests were performed for the full-scale geometry and atmospheric conditions. After additional improvements to the shape optimization tools, a series of trailing-edge additions was designed to optimize power generation. The trailing-edge additions were predicted to increase the AEP by up to 4.2% at the White Field site. The pieces were rapid-prototyped and installed on the wind turbine in March 2014. Field tests are ongoing.
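A minimal illustration of the induction-factor optimization mentioned above, using ideal actuator-disk (Betz) theory rather than the study's full BEM model:

```python
# Ideal actuator-disk illustration of induction-factor optimization:
# the power coefficient Cp(a) = 4a(1-a)^2 peaks at a = 1/3 (Betz limit 16/27).
from scipy.optimize import minimize_scalar

def power_coefficient(a):
    return 4.0 * a * (1.0 - a) ** 2

res = minimize_scalar(lambda a: -power_coefficient(a), bounds=(0.0, 0.5), method="bounded")
print(f"optimal a = {res.x:.4f}, Cp = {power_coefficient(res.x):.4f}")  # ~0.3333, ~0.5926
```

A full BEM optimization adjusts the local induction factors section by section along the blade, but the single-disk result above captures why Region II lift control targets a specific induction distribution.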
Computational experiments in the optimal slewing of flexible structures
NASA Technical Reports Server (NTRS)
Baker, T. E.; Polak, Lucian Elijah
1989-01-01
Numerical experiments on the problem of moving a flexible beam are discussed. An optimal control problem is formulated and transcribed into a form which can be solved using semi-infinite optimization techniques. All experiments were carried out on a SUN 3 microcomputer.
Harvey, Ronald W.; Kinner, Nancy E.; MacDonald, Dan; Metge, David W.; Bunn, Amoret
1993-01-01
The effect of physical variability upon the relative transport behavior of microbial-sized microspheres, indigenous bacteria, and bromide was examined in field and flow-through column studies for a layered, but relatively well sorted, sandy glaciofluvial aquifer. These investigations involved repacked, sieved, and undisturbed aquifer sediments. In the field, peak abundance of labeled bacteria traveling laterally with groundwater flow 6 m downgradient from point of injection was coincident with the retarded peak of carboxylated microspheres (retardation factor, RF = 1.7) at the 8.8 m depth, but preceded the bromide peak and the retarded microsphere peak (RF = 1.5) at the 9.0 m depth. At the 9.5 m depth, the bacterial peak was coincident with both the bromide and the microsphere peaks. Although sorption appeared to be a predominant mechanism responsible for immobilization of microbial-sized microspheres in the aquifer, straining appeared to be primarily responsible for their removal in 0.6-m-long columns of repacked, unsieved aquifer sediments. The manner in which the columns were packed also affected optimal size for microsphere transport, which in one experiment was near the size of the small (∼2 μm) groundwater protozoa (flagellates). These data suggest that variability in aquifer sediment structure can be important in interpretation of both small-scale field and laboratory experiments examining microbial transport behavior.
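One common operational definition of the retardation factor quoted above is the ratio of peak arrival times of the colloid and the conservative tracer; the sketch below applies it to synthetic breakthrough curves, not the field data:

```python
# Sketch: estimate a retardation factor from breakthrough curves, taking
# RF as the ratio of peak arrival times (colloid vs. conservative tracer).
import numpy as np

t = np.linspace(0, 100, 1001)  # hours

def gaussian_pulse(t, t_peak, width):
    return np.exp(-((t - t_peak) / width) ** 2)

c_bromide = gaussian_pulse(t, 40.0, 8.0)         # conservative tracer
c_spheres = 0.3 * gaussian_pulse(t, 68.0, 10.0)  # attenuated, delayed colloid

rf = t[np.argmax(c_spheres)] / t[np.argmax(c_bromide)]
print(f"RF = {rf:.2f}")  # ~1.7 for these synthetic curves
```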
NASA Astrophysics Data System (ADS)
Noh, Seong Jin; Lee, Seungsoo; An, Hyunuk; Kawaike, Kenji; Nakagawa, Hajime
2016-11-01
An urban flood is an integrated phenomenon that is affected by various uncertainty sources such as input forcing, model parameters, complex geometry, and exchanges of flow among different domains in surfaces and subsurfaces. Despite considerable advances in urban flood modeling techniques, limited knowledge is currently available with regard to the impact of dynamic interaction among different flow domains on urban floods. In this paper, an ensemble method for urban flood modeling is presented to consider the parameter uncertainty of interaction models among a manhole, a sewer pipe, and surface flow. Laboratory-scale experiments on urban flood and inundation are performed under various flow conditions to investigate the parameter uncertainty of interaction models. The results show that ensemble simulation using interaction models based on weir and orifice formulas reproduces experimental data with high accuracy and detects the identifiability of model parameters. Among interaction-related parameters, the parameters of the sewer-manhole interaction show lower uncertainty than those of the sewer-surface interaction. Experimental data obtained under unsteady-state conditions are more informative than those obtained under steady-state conditions to assess the parameter uncertainty of interaction models. Although the optimal parameters vary according to the flow conditions, the difference is marginal. Simulation results also confirm the capability of the interaction models and the potential of the ensemble-based approaches to facilitate urban flood simulation.
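The weir- and orifice-type exchange formulas referred to above are sketched below in their standard textbook forms; the discharge coefficients and geometry are illustrative assumptions, not the calibrated values of the study:

```python
# Minimal sketch of weir- and orifice-type manhole exchange formulas
# (standard textbook forms; coefficients here are illustrative assumptions).
import math

G = 9.81  # m/s^2

def weir_discharge(head_m, crest_length_m, cd=0.6):
    """Free-weir exchange when surface water spills into a manhole."""
    if head_m <= 0.0:
        return 0.0
    return (2.0 / 3.0) * cd * crest_length_m * math.sqrt(2.0 * G) * head_m ** 1.5

def orifice_discharge(head_diff_m, opening_area_m2, cd=0.6):
    """Orifice-type exchange when the manhole opening is fully submerged."""
    sign = 1.0 if head_diff_m >= 0.0 else -1.0  # positive: surface -> sewer
    return sign * cd * opening_area_m2 * math.sqrt(2.0 * G * abs(head_diff_m))

print(weir_discharge(0.05, 1.0), orifice_discharge(0.20, 0.25))
```

In an ensemble setting like the one described, the coefficients cd (and the switch between the two regimes) are the uncertain parameters varied across ensemble members.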
Collaborative development for setup, execution, sharing and analytics of complex NMR experiments.
Irvine, Alistair G; Slynko, Vadim; Nikolaev, Yaroslav; Senthamarai, Russell R P; Pervushin, Konstantin
2014-02-01
Factory settings of NMR pulse sequences are rarely ideal for every scenario in which they are utilised. The optimisation of NMR experiments has for many years been performed locally, with implementations often specific to an individual spectrometer. Furthermore, these optimised experiments are normally retained solely for the use of an individual laboratory, spectrometer or even single user. Here we introduce a web-based service that provides a database for the deposition, annotation and optimisation of NMR experiments. The application uses a Wiki environment to enable the collaborative development of pulse sequences. It also provides a flexible mechanism to automatically generate NMR experiments from deposited sequences. Multidimensional NMR experiments of proteins and other macromolecules consume significant resources, in terms of both spectrometer time and the effort required to analyse the results. Systematic analysis of simulated experiments can enable optimal allocation of NMR resources for structural analysis of proteins. Our web-based application (http://nmrplus.org) provides all the necessary information, including auxiliaries (waveforms, decoupling sequences, etc.), for the analysis of experiments by accurate numerical simulation of multidimensional NMR experiments. The online database of NMR experiments, together with a systematic evaluation of their sensitivity, provides a framework for selection of the most efficient pulse sequences. The development of such a framework provides a basis for the collaborative optimisation of pulse sequences by the NMR community, with the benefits of this collective effort being available to the whole community. Copyright © 2013 Elsevier Inc. All rights reserved.
Rezaei, Nastaran; Karimi, Javad; Hosseini, Mojtaba; Goldani, Morteza; Campos-Herrera, Raquel
2015-03-01
The greenhouse whitefly Trialeurodes vaporariorum (Hemiptera: Aleyrodidae) is a polyphagous pest in greenhouse crops. The efficacy of two entomopathogenic nematodes (EPN), Steinernema feltiae and Heterorhabditis bacteriophora, as biological control agents against T. vaporariorum was evaluated using two model crops typical of vegetable greenhouse production: cucumber and pepper. Laboratory tests evaluated adults and second nymphal instars for pest susceptibility to different EPN species at different concentrations of infective juveniles (IJ; 0, 25, 50, 100, 150, 200, and 250 IJ per cm²); subsequent greenhouse trials against second nymphal instars on cucumber and pepper plants evaluated more natural conditions. Concentrations were applied in combination with Triton X-100 (0.1% v/v), an adjuvant for increasing nematode activity. In laboratory studies, both life stages were susceptible to infection by the two nematode species, but S. feltiae recorded a lower LC50 than H. bacteriophora for both insect stages. Similarly, in greenhouse experiments, S. feltiae required lower concentrations of IJ than H. bacteriophora to reach the same mortality in nymphs. In greenhouse trials, a significant difference was observed in the triple interaction among nematode species × concentration × plant. Furthermore, the highest mortality rate of the second nymphal instars of T. vaporariorum was obtained from the application of S. feltiae concentrated to 250 IJ/cm² on cucumber (49 ± 1.23%). The general mortality caused by nematodes was significantly higher in cucumber than in pepper. These promising results support further investigation into the optimization of the best EPN species/concentration in combination with insecticides or adjuvants to reach a profitable control of this greenhouse pest.
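An LC50 such as the ones compared above is typically estimated by fitting a dose-response curve; the sketch below fits a logistic curve to synthetic mortality data, not the reported nematode results:

```python
# Sketch: estimate an LC50 by fitting a logistic dose-response curve to
# mortality vs. log-concentration (synthetic numbers, not the study's data).
import numpy as np
from scipy.optimize import curve_fit

conc = np.array([25, 50, 100, 150, 200, 250], dtype=float)  # IJ per cm^2
mortality = np.array([0.10, 0.22, 0.45, 0.60, 0.72, 0.80])  # fraction killed

def logistic(log_c, log_lc50, slope):
    return 1.0 / (1.0 + np.exp(-slope * (log_c - log_lc50)))

params, _ = curve_fit(logistic, np.log10(conc), mortality, p0=[2.0, 2.0])
print(f"LC50 ~ {10 ** params[0]:.0f} IJ/cm^2")
```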
Search for life on Mars in surface samples: Lessons from the 1999 Marsokhod rover field experiment
Newsom, Horton E.; Bishop, J.L.; Cockell, C.; Roush, T.L.; Johnson, J. R.
2001-01-01
The Marsokhod 1999 field experiment in the Mojave Desert included a simulation of a rover-based sample selection mission. As part of this mission, a test was made of strategies and analytical techniques for identifying past or present life in environments expected to be present on Mars. A combination of visual clues from high-resolution images and the detection of an important biomolecule (chlorophyll) with visible/near-infrared (NIR) spectroscopy led to the successful identification of a rock with evidence of cryptoendolithic organisms. The sample was identified in high-resolution images (3 times the resolution of the Imager for Mars Pathfinder camera) on the basis of a green tinge and textural information suggesting the presence of a thin, partially missing exfoliating layer revealing the organisms. The presence of chlorophyll bands in similar samples was observed in visible/NIR spectra of samples in the field and later confirmed in the laboratory using the same spectrometer. Raman spectroscopy in the laboratory, simulating a remote measurement technique, also detected evidence of carotenoids in samples from the same area. Laboratory analysis confirmed that the subsurface layer of the rock is inhabited by a community of coccoid Chroococcidiopsis cyanobacteria. The identification of minerals in the field, including carbonates and serpentine, that are associated with aqueous processes was also demonstrated using the visible/NIR spectrometer. Other lessons learned that are applicable to future rover missions include the benefits of web-based programs for target selection and for daily mission planning and the need for involvement of the science team in optimizing image compression schemes based on the retention of visual signature characteristics. Copyright 2000 by the American Geophysical Union.
Conducting interactive experiments online.
Arechar, Antonio A; Gächter, Simon; Molleman, Lucas
2018-01-01
Online labor markets provide new opportunities for behavioral research, but conducting economic experiments online raises important methodological challenges. This particularly holds for interactive designs. In this paper, we provide a methodological discussion of the similarities and differences between interactive experiments conducted in the laboratory and online. To this end, we conduct a repeated public goods experiment with and without punishment using samples from the laboratory and the online platform Amazon Mechanical Turk. We chose to replicate this experiment because it is long and logistically complex. It therefore provides a good case study for discussing the methodological and practical challenges of online interactive experimentation. We find that basic behavioral patterns of cooperation and punishment in the laboratory are replicable online. The most important challenge of online interactive experiments is participant dropout. We discuss measures for reducing dropout and show that, for our case study, dropouts are exogenous to the experiment. We conclude that data quality for interactive experiments via the Internet is adequate and reliable, making online interactive experimentation a potentially valuable complement to laboratory studies.
ERIC Educational Resources Information Center
Lee, Shan-Hu; Mukherjee, Souptik; Brewer, Brittany; Ryan, Raphael; Yu, Huan; Gangoda, Mahinda
2013-01-01
An undergraduate laboratory experiment is described to measure Henry's law constants of organic compounds using a bubble column and gas chromatography flame ionization detector (GC-FID). This experiment is designed for upper-division undergraduate laboratory courses and can be implemented in conjunction with physical chemistry, analytical…
CSI flight experiment projects of the Naval Research Laboratory
NASA Technical Reports Server (NTRS)
Fisher, Shalom
1993-01-01
The Naval Research Laboratory (NRL) is involved in an active program of CSI flight experiments. The first CSI flight experiment of the Naval Research Laboratory, the Low Power Atmospheric Compensation Experiment (LACE) dynamics experiment, has successfully measured vibrations of an orbiting satellite with a ground-based laser radar. The observations, made on January 7, 8 and 10, 1991, represent the first ever measurements of this type. In the tests, a narrowband heterodyne CO2 laser radar, operating at a wavelength of 10.6 microns, detected vibration induced differential-Doppler signatures of the LACE satellite. Power spectral densities of forced oscillations and modal frequencies and damping rates of free-damped vibrations were obtained and compared with finite element structural models of the LACE system. Another manifested flight experiment is the Advanced Controls Technology Experiment (ACTEX) designed to demonstrate active and passive damping with piezo-electric (PZT) sensors and actuators. This experiment was developed under the management of the Air Force Phillips Laboratory with integration of the experiment at NRL. It is to ride as a secondary, or 'piggyback,' experiment on a future Navy satellite.
Fluid Flow Experiment for Undergraduate Laboratory.
ERIC Educational Resources Information Center
Vilimpochapornkul, Viroj; Obot, Nsima T.
1986-01-01
The undergraduate fluid mechanics laboratory at Clarkson University consists of three experiments: mixing; drag measurements; and fluid flow and pressure drop measurements. The latter experiment is described, considering equipment needed, procedures used, and typical results obtained. (JN)
A Kinetic Experiment for the Biochemistry Laboratory.
ERIC Educational Resources Information Center
Palmer, Richard E.
1986-01-01
Discusses the use of specific reactions of metabolic pathways to make measurements in the laboratory. Describes an adaptation of an experiment used in undergraduate biochemistry laboratories involving the induction of an enzyme in E. coli, as well as its partial purification and characterization. (TW)
Immobilized alpha-Galactosidase in the Biochemistry Laboratory
ERIC Educational Resources Information Center
Mulimani, V. H.; Dhananjay, K.
2007-01-01
This laboratory experiment was designed to demonstrate the application of immobilized α-galactosidase in the food industry to hydrolyze raffinose-family oligosaccharides in soymilk. The laboratory experiment was conducted for postgraduate students of biochemistry and developed for graduate and undergraduate students of biochemistry, biotechnology,…
EFFICIENCY OPTIMIZATION CONTROL OF AC INDUCTION MOTORS: INITIAL LABORATORY RESULTS
The report discusses the development of a fuzzy logic, energy-optimizing controller to improve the efficiency of motor/drive combinations that operate at varying loads and speeds. This energy optimizer is complemented by a sensorless speed controller that maintains motor shaft re...
Optimization of the Neutrino Factory, revisited
NASA Astrophysics Data System (ADS)
Agarwalla, Sanjib K.; Huber, Patrick; Tang, Jian; Winter, Walter
2011-01-01
We perform the baseline and energy optimization of the Neutrino Factory including the latest simulation results on the magnetized iron detector (MIND). We also consider the impact of τ decays, generated by νμ → ντ or νe → ντ appearance, on the mass hierarchy, CP violation, and θ13 discovery reaches, which we find to be negligible for the considered detector. For the baseline-energy optimization for small sin²2θ13, we qualitatively recover the results with earlier simulations of the MIND detector. We find optimal baselines of about 2500 km to 5000 km for the CP violation measurement, where now values of Eμ as low as about 12 GeV may be possible. However, for large sin²2θ13, we demonstrate that the lower threshold and the backgrounds reconstructed at lower energies allow in fact for muon energies as low as 5 GeV at considerably shorter baselines, such as FNAL-Homestake. This implies that with the latest MIND analysis, low- and high-energy versions of the Neutrino Factory are just two different versions of the same experiment optimized for different parts of the parameter space. Apart from a green-field study of the updated detector performance, we discuss specific implementations for the two-baseline Neutrino Factory, where the considered detector sites are taken to be currently discussed underground laboratories. We find that reasonable setups can be found for the Neutrino Factory source in Asia, Europe, and North America, and that a triangular-shaped storage ring is possible in all cases based on geometrical arguments only.
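For orientation, the leading-order vacuum appearance probability behind the sin²2θ13 and baseline/energy dependence discussed above can be written as a short sketch; matter effects and CP-violating terms, which the actual optimization includes, are omitted here:

```python
# Illustrative leading-order (vacuum) appearance probability used to reason
# about baseline L and muon/neutrino energy E; matter and CP terms omitted.
import math

def p_appearance(L_km, E_GeV, sin2_2th13=0.01, sin2_th23=0.5, dm31_sq=2.5e-3):
    # 1.267 converts Delta m^2 [eV^2] * L [km] / E [GeV] into radians.
    delta = 1.267 * dm31_sq * L_km / E_GeV
    return sin2_th23 * sin2_2th13 * math.sin(delta) ** 2

print(p_appearance(4000.0, 25.0), p_appearance(7500.0, 25.0))
```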
DOT National Transportation Integrated Search
2010-04-15
Wet pavement friction is known to be one of the most important roadway safety parameters. In this research, frictional properties of flexible (asphalt) pavements were investigated. As a part of this study, a laboratory device to polish asphalt sp...
Irredundant Sequential Machines Via Optimal Logic Synthesis
1989-10-01
1989. Irredundant Sequential Machines Via Optimal Logic Synthesis. Srinivas Devadas, Hi-Keung Tony Ma, A. Richard Newton, and Alberto Sangiovanni-Vincentelli. ... Agency under contract N00014-87-K-0825, and a grant from AT&T Bell Laboratories. Author information: Devadas, Department of Electrical Engineering ...
PLUTONIUM PROCESSING OPTIMIZATION IN SUPPORT OF THE MOX FUEL PROGRAM
DOE Office of Scientific and Technical Information (OSTI.GOV)
GRAY, DEVIN W.; COSTA, DAVID A.
2007-02-02
After Los Alamos National Laboratory (LANL) personnel completed polishing 125 kg of plutonium as highly purified PuO2 from surplus nuclear weapons, Duke COGEMA Stone & Webster (DCS) required, as the next process stage, the validation and optimization of all phases of the plutonium polishing flow sheet. Personnel will develop the optimized parameters for use in the upcoming 330 kg production mission.
Optimization of integrated impeller mixer via radiotracer experiments.
Othman, N; Kamarudin, S K; Takriff, M S; Rosli, M I; Engku Chik, E M F; Adnan, M A K
2014-01-01
Radiotracer experiments were carried out in order to determine the mean residence time (MRT) as well as the percentage of dead zone, V dead (%), in an integrated mixer consisting of a Rushton turbine and a pitched blade turbine (PBT). Conventionally, optimization was performed by varying one factor at a time while holding the others constant (OFAT), which leads to an enormous number of experiments. Thus, in this study, a 4-factor, 3-level Taguchi L9 orthogonal array was introduced to obtain an accurate optimization of mixing efficiency with a minimal number of experiments. This paper describes the optimal conditions of four process parameters, namely, impeller speed, impeller clearance, type of impeller, and sampling time, in obtaining MRT and V dead (%) using radiotracer experiments. The optimum conditions for the experiments were 100 rpm impeller speed, 50 mm impeller clearance, Type A mixer, and 900 s sampling time.
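The MRT and dead-zone estimates described above follow directly from the tracer residence time distribution; the sketch below uses a synthetic tracer curve and illustrative tank parameters, not the reported experiment:

```python
# Sketch of how MRT and percent dead volume are obtained from a radiotracer
# residence time distribution (RTD); tracer curve and tank values are illustrative.
import numpy as np

t = np.linspace(0, 900, 901)          # s, sampling window (uniform spacing)
c = (t / 150.0) * np.exp(-t / 150.0)  # synthetic tracer response at the outlet

mrt = np.sum(t * c) / np.sum(c)       # mean residence time (dt cancels on a uniform grid)
tau = 0.05 / 1.25e-4                  # V / Q = theoretical residence time (s), assumed values
v_dead_pct = max(0.0, 1.0 - mrt / tau) * 100.0  # common dead-zone estimate

print(f"MRT = {mrt:.0f} s, V_dead = {v_dead_pct:.1f} %")
```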
2005-06-01
AIR FORCE RESEARCH LABORATORY SPACE VEHICLES INTEGRATED EXPERIMENTS DIVISION OFFICE SPACE AT KIRTLAND AIR FORCE ... Kirtland Air Force Base (KAFB). The office building would house the Air Force Research Laboratory Space Vehicles Integrated Experiments Division ... Air Force Research Laboratory, Space Vehicles Directorate, 3550 Aberdeen Ave. SE, Kirtland
A GC-MS Analysis of an S[subscript N]2 Reaction for the Organic Laboratory
ERIC Educational Resources Information Center
Clennan, Malgorzata M.; Clennan, Edward L.
2005-01-01
The S[subscript N]2 reaction of 1-bromohexane and 1-bromobutane with potassium acetate is introduced to address the shortage of suitable laboratory experiments in the organic laboratory. The experiment offers a review of some common laboratory techniques, including the use of infrared spectroscopy to identify functional groups, the use of GC-MS…
ERIC Educational Resources Information Center
Gerczei, Timea
2017-01-01
A laboratory sequence is described that is suitable for upper-level biochemistry or molecular biology laboratories that combines project-based and traditional laboratory experiments. In the project-based sequence, the individual laboratory experiments are thematically linked and aim to show how a bacterial antibiotic sensing noncoding RNA (the…
NASA Astrophysics Data System (ADS)
Pence, Laura E.; Workman, Harry J.; Riecke, Pauline
2003-03-01
Two separate experiences with students whose disabilities significantly limited the number of laboratory activities they could accomplish independently have given us a general experience base for determining successful strategies for accommodating students facing these situations. For a student who had substantially limited physical mobility and for a student who had no visual ability, employing a student laboratory assistant allowed the students with disabilities to have a productive and positive laboratory experience. One of the priorities in these situations should be to avoid depersonalizing the student with a disability. Interactions with the instructor and with other students should focus on the disabled student rather than on the student laboratory assistant who may be carrying out specific tasks. One of the most crucial aspects of a successful project is the selection of a laboratory assistant who has excellent interpersonal skills and who will add his or her creativity to that of the student with a disability to meet unforeseen challenges. Other considerations are discussed, such as the importance of advance notification that a disabled student has enrolled in a course as well as factors that should contribute to choosing an optimum laboratory station for each situation.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Manungu Kiveni, Joseph
2012-12-01
This dissertation describes the results of a WIMP search using CDMS II data sets accumulated at the Soudan Underground Laboratory in Minnesota. Results from the original analysis of these data were published in 2009; two events were observed in the signal region with an expected leakage of 0.9 events. Further investigation revealed an issue with the ionization-pulse reconstruction algorithm, leading to a software upgrade and a subsequent reanalysis of the data. As part of the reanalysis, I performed an advanced discrimination technique to better distinguish (potential) signal events from backgrounds using a 5-dimensional chi-square method. This data analysis technique combines the event information recorded for each WIMP-search event to derive a background discrimination parameter capable of reducing the expected background to less than one event, while maintaining high efficiency for signal events. Furthermore, optimizing the cut positions of this 5-dimensional chi-square parameter for the 14 viable germanium detectors yields an improved expected sensitivity to WIMP interactions relative to previous CDMS results. This dissertation describes my improved (and optimized) discrimination technique and the results obtained from a blind application to the reanalyzed CDMS II WIMP-search data.
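A multivariate chi-square discriminator of the kind described above can be sketched as a Mahalanobis-style distance computed over several event quantities; the five quantities and all data below are synthetic placeholders, not CDMS II variables:

```python
# Sketch of a multivariate chi-square discriminator: score each event by its
# distance from the mean of a calibration (signal-like) population over
# several event quantities, then cut on that score.
import numpy as np

rng = np.random.default_rng(42)
calib = rng.normal(0.0, 1.0, size=(5000, 5))  # calibration events (signal-like)
events = rng.normal(0.5, 1.2, size=(100, 5))  # events to classify

mu = calib.mean(axis=0)
cov_inv = np.linalg.inv(np.cov(calib, rowvar=False))

diff = events - mu
chi2 = np.einsum("ij,jk,ik->i", diff, cov_inv, diff)  # one chi-square per event

accepted = events[chi2 < np.quantile(chi2, 0.1)]      # keep the most signal-like events
print(chi2[:5], accepted.shape)
```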
Agarabi, Cyrus D; Schiel, John E; Lute, Scott C; Chavez, Brittany K; Boyne, Michael T; Brorson, Kurt A; Khan, Mansoora; Read, Erik K
2015-06-01
Consistent high-quality antibody yield is a key goal for cell culture bioprocessing. This endpoint is typically achieved in commercial settings through product and process engineering of bioreactor parameters during development. When the process is complex and not optimized, small changes in composition and control may yield a finished product of less desirable quality. Therefore, changes proposed to currently validated processes usually require justification and are reported to the US FDA for approval. Recently, design-of-experiments-based approaches have been explored to rapidly and efficiently achieve this goal of optimized yield with a better understanding of product and process variables that affect a product's critical quality attributes. Here, we present a laboratory-scale model culture where we apply a Plackett-Burman screening design to parallel cultures to study the main effects of 11 process variables. This exercise allowed us to determine the relative importance of these variables and identify the most important factors to be further optimized in order to control both desirable and undesirable glycan profiles. We found engineering changes relating to culture temperature and nonessential amino acid supplementation significantly impacted glycan profiles associated with fucosylation, β-galactosylation, and sialylation. All of these are important for monoclonal antibody product quality. © 2015 Wiley Periodicals, Inc. and the American Pharmacists Association.
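A Plackett-Burman screening analysis like the one described above can be sketched as follows; the 12-run design is the standard construction for 11 two-level factors, while the response values and factor assignments are synthetic:

```python
# Sketch of Plackett-Burman screening: build the standard 12-run design for
# 11 two-level factors and estimate main effects from a synthetic response.
import numpy as np

gen = np.array([+1, +1, -1, +1, +1, +1, -1, -1, -1, +1, -1])  # PB N=12 generator row
design = np.array([np.roll(gen, i) for i in range(11)])        # 11 cyclic shifts
design = np.vstack([design, -np.ones(11, dtype=int)])          # final all-minus run -> 12 x 11

rng = np.random.default_rng(7)
# Synthetic response: pretend factors 0 (e.g. temperature) and 3 dominate.
response = 100 + 8 * design[:, 0] - 5 * design[:, 3] + rng.normal(0, 1, 12)

main_effects = np.array([
    response[design[:, j] == 1].mean() - response[design[:, j] == -1].mean()
    for j in range(11)
])
print(np.round(main_effects, 2))  # large |effect| flags factors worth optimizing further
```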
Comparing field investigations with laboratory models to predict landfill leachate emissions
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fellner, Johann; Doeberl, Gernot; Allgaier, Gerhard
2009-06-15
Investigations into laboratory reactors and landfills are used for simulating and predicting emissions from municipal solid waste landfills. We examined water flow and solute transport through the same waste body for different volumetric scales (laboratory experiment: 0.08 m³, landfill: 80,000 m³), and assessed the differences in water flow and leachate emissions of chloride, total organic carbon, and Kjeldahl nitrogen. The results indicate that, due to preferential pathways, the flow of water in field-scale landfills is less uniform than in laboratory reactors. Based on tracer experiments, it can be discerned that in laboratory-scale experiments around 40% of pore water participates in advective solute transport, whereas this fraction amounts to less than 0.2% in the investigated full-scale landfill. Consequences of the difference in water flow and moisture distribution are: (1) leachate emissions from full-scale landfills decrease faster than predicted by laboratory experiments, and (2) the stock of materials remaining in the landfill body, and thus the long-term emission potential, is likely to be underestimated by laboratory landfill simulations.
Laboratory space physics: Investigating the physics of space plasmas in the laboratory
NASA Astrophysics Data System (ADS)
Howes, Gregory G.
2018-05-01
Laboratory experiments provide a valuable complement to explore the fundamental physics of space plasmas without the limitations inherent to spacecraft measurements. Specifically, experiments overcome the restriction that spacecraft measurements are made at only one (or a few) points in space, enable greater control of the plasma conditions and applied perturbations, can be reproducible, and are orders of magnitude less expensive than launching spacecraft. Here, I highlight key open questions about the physics of space plasmas and identify the aspects of these problems that can potentially be tackled in laboratory experiments. Several past successes in laboratory space physics provide concrete examples of how complementary experiments can contribute to our understanding of physical processes at play in the solar corona, solar wind, planetary magnetospheres, and the outer boundary of the heliosphere. I present developments on the horizon of laboratory space physics, identifying velocity space as a key new frontier, highlighting new and enhanced experimental facilities, and showcasing anticipated developments to produce improved diagnostics and innovative analysis methods. A strategy for future laboratory space physics investigations will be outlined, with explicit connections to specific fundamental plasma phenomena of interest.
Undergraduate Organic Chemistry Laboratory Safety
NASA Astrophysics Data System (ADS)
Luckenbaugh, Raymond W.
1996-11-01
Each organic chemistry student should become familiar with the educational and governmental laboratory safety requirements. One method for teaching laboratory safety is to assign each student to locate safety resources for a specific class laboratory experiment. The student should obtain toxicity and hazard information for all chemicals used or produced during the assigned experiment. For example, what is the LD50 or LC50 for each chemical? Are there any specific hazards for these chemicals: carcinogen, mutagen, teratogen, neurotoxin, chronic toxin, corrosive, flammable, or explosive agent? The school's "Chemical Hygiene Plan", "Prudent Practices for Handling Hazardous Chemicals in the Laboratory" (National Academy Press), and "Laboratory Standards, Part 1910 - Occupational Safety and Health Standards" (Fed. Register 1/31/90, 55, 3227-3335) should be reviewed for laboratory safety requirements for the assigned experiment. For example, what are the procedures for safe handling of vacuum systems, if a vacuum distillation is used in the assigned experiment? The literature survey must be submitted to the laboratory instructor one week prior to the laboratory session for review and approval. The student should then give a short presentation to the class on the chemicals' toxicity and hazards and describe the safety precautions that must be followed. This procedure gives the student first-hand knowledge of how to find and evaluate information to meet laboratory safety requirements.
Lean management and medical laboratory: application in transfusionnal immuno-hematology.
Thibert, Jean-Baptiste; Le Vacon, Françoise; Danic, Bruno
2017-10-01
Despite their common use in industrial settings, only a few studies describe lean management methods in the medical laboratory. These tools were evaluated in a blood donor analysis laboratory, particularly in the immuno-hematology sector. The aim was to optimize the organization and maintain team cohesion and strong staff involvement in a restructuring context. The tools used and the results obtained are presented in this study.
Laboratory diagnosis of von Willebrand's disease.
Rick, M E
1994-12-01
The diagnosis of von Willebrand's disease is becoming complex as more is understood about the disease. Clinical information and laboratory data are necessary for the diagnosis because of the overlap of normal and abnormal laboratory values. A complete evaluation including von Willebrand factor multimers, ristocetin-induced platelet aggregation, factor VIII activity level, and a template bleeding time is necessary to correctly classify the patient so that optimal treatment may be given.
Modelling landscape evolution at the flume scale
NASA Astrophysics Data System (ADS)
Cheraghi, Mohsen; Rinaldo, Andrea; Sander, Graham C.; Barry, D. Andrew
2017-04-01
The ability of a large-scale Landscape Evolution Model (LEM) to simulate the soil surface morphological evolution observed in a laboratory flume (1 m × 2 m surface area) was investigated. The soil surface was initially smooth and was subjected to heterogeneous rainfall in an experiment designed to avoid rill formation. Low-cohesion fine sand was placed in the flume, while the slope and relief height were 5% and 20 cm, respectively. Non-uniform rainfall with an average intensity of 85 mm h⁻¹ and a standard deviation of 26% was applied to the sediment surface for 16 h. We hypothesized that the complex overland water flow can be represented by a drainage discharge network, which was calculated from the micro-morphology and the rainfall distribution. Measurements included high-resolution Digital Elevation Models captured at intervals during the experiment. The calibrated LEM captured the migration of the main flow path from the low-precipitation area into the high-precipitation area. Furthermore, both model and experiment showed a steep transition zone in soil elevation that moved upstream during the experiment. We conclude that the LEM is applicable under non-uniform rainfall and in the absence of surface incisions, thereby extending its applicability beyond that shown in previous applications. Keywords: Numerical simulation, Flume experiment, Particle Swarm Optimization, Sediment transport, River network evolution model.
Development of a Portable Motor Learning Laboratory (PoMLab).
Takiyama, Ken; Shinya, Masahiro
2016-01-01
Most motor learning experiments have been conducted in a laboratory setting. In this type of setting, a huge and expensive manipulandum is frequently used, requiring a large budget and wide open space. Subjects also need to travel to the laboratory, which is a burden for them. This burden is particularly severe for patients with neurological disorders. Here, we describe the development of a novel application based on Unity3D and smart devices, e.g., smartphones or tablet devices, that can be used to conduct motor learning experiments at any time and in any place, without requiring a large budget and wide open space and without the burden of travel on subjects. We refer to our application as POrtable Motor learning LABoratory, or PoMLab. PoMLab is a multiplatform application that is available and sharable for free. We investigated whether PoMLab could be an alternative to the laboratory setting using a visuomotor rotation paradigm that causes sensory prediction error, enabling the investigation of how subjects minimize the error. In the first experiment, subjects could adapt to a constant visuomotor rotation that was abruptly applied at a specific trial. The learning curve for the first experiment could be modeled well using a state space model, a mathematical model that describes the motor learning process. In the second experiment, subjects could adapt to a visuomotor rotation that increased gradually with each trial. The subjects adapted to the gradually increasing visuomotor rotation without being aware of it. These experimental results have been reported for conventional experiments conducted in a laboratory setting, and our PoMLab application could reproduce these results. PoMLab can thus be considered an alternative to the laboratory setting. We also conducted follow-up experiments in university physical education classes. A state space model that was fit to the data obtained in the laboratory experiments could predict the learning curves obtained in the follow-up experiments. Further, we investigated the influence of vibration function, weight, and screen size on learning curves. Finally, we compared the learning curves obtained in the PoMLab experiments to those obtained in the conventional reaching experiments. The results of the in-class experiments show that PoMLab can be used to conduct motor learning experiments at any time and place. PMID:27348223
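As a hedged illustration of the trial-by-trial state space model referenced above (not the authors' fitted model), the sketch below simulates adaptation to an abrupt and to a gradually increasing rotation; the retention factor A and learning rate B are assumed values.

```python
# Minimal single-state state space model of trial-by-trial adaptation.
# x: internal estimate of the rotation; e: sensory prediction error.
# A (retention) and B (learning rate) are assumed illustrative values,
# not parameters fitted to PoMLab data.
import numpy as np

def simulate_adaptation(rotation, A=0.98, B=0.15):
    x = np.zeros(len(rotation) + 1)
    for n, r in enumerate(rotation):
        e = r - x[n]                  # error experienced on trial n
        x[n + 1] = A * x[n] + B * e   # retention plus error-driven update
    return x[1:]

trials = 100
abrupt = np.full(trials, 30.0)            # 30 deg rotation applied abruptly
gradual = np.linspace(0.0, 30.0, trials)  # rotation grows each trial

print("final adaptation (abrupt): %.1f deg" % simulate_adaptation(abrupt)[-1])
print("final adaptation (gradual): %.1f deg" % simulate_adaptation(gradual)[-1])
```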
The assessment of data sources for influenza virologic surveillance in New York State.
Escuyer, Kay L; Waters, Christine L; Gowie, Donna L; Maxted, Angie M; Farrell, Gregory M; Fuschino, Meghan E; St George, Kirsten
2017-03-01
Following the 2013 USA release of the Influenza Virologic Surveillance Right Size Roadmap, the New York State Department of Health (NYSDOH) embarked on an evaluation of data sources for influenza virologic surveillance. The objective was to assess NYS data sources, in addition to data generated by the state public health laboratory (PHL), that could enhance influenza surveillance at the state and national levels. Potential sources of laboratory test data for influenza were analyzed for quantity and quality. Computer models, designed to assess sample sizes and the confidence of data for statistical representation of influenza activity, were used to compare PHL test data to results from clinical and commercial laboratories, reported between June 8, 2013 and May 31, 2014. Sample sizes tested for influenza at the state PHL were sufficient for situational awareness surveillance with optimal confidence levels only during peak weeks of the influenza season. Influenza data pooled from NYS PHLs and clinical laboratories generated optimal confidence levels for situational awareness throughout the influenza season. For novel influenza virus detection in NYS, combined real-time (rt) RT-PCR data from state and regional PHLs achieved ≥85% confidence during peak influenza activity, and ≥95% confidence for most of the low season and all of the off-season. In NYS, combined data from clinical, commercial, and public health laboratories generated optimal influenza surveillance for situational awareness throughout the season. Statistical confidence for novel virus detection, which is reliant on only PHL data, was achieved for most of the year. © 2016 The Authors. Influenza and Other Respiratory Viruses Published by John Wiley & Sons Ltd.
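As a hedged illustration of the kind of sample-size reasoning behind "right-size" virologic surveillance (not necessarily the Roadmap calculator itself), the sketch below uses the standard binomial detection formula to relate specimen counts to detection confidence; the prevalence and confidence values are assumptions.

```python
# Hedged sketch of a rare-event detection calculation of the kind used in
# "right-size" virologic surveillance planning; this is the standard
# binomial approximation, not necessarily the Roadmap calculator itself.
import math

def detection_confidence(n_tested, prevalence):
    """Probability that >=1 positive is found among n_tested specimens."""
    return 1.0 - (1.0 - prevalence) ** n_tested

def required_sample_size(prevalence, confidence=0.95):
    """Specimens needed to detect a virus circulating at the given prevalence."""
    return math.ceil(math.log(1.0 - confidence) / math.log(1.0 - prevalence))

# Example: a novel virus assumed to make up 1 in 500 influenza-positive specimens.
print(required_sample_size(prevalence=1 / 500, confidence=0.95))   # ~1497
print(detection_confidence(n_tested=900, prevalence=1 / 500))      # ~0.83
```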
The student perspective of high school laboratory experiences
NASA Astrophysics Data System (ADS)
Lambert, R. Mitch
High school science laboratory experiences are an accepted teaching practice across the nation despite a lack of research evidence to support them. The purpose of this study was to examine the perspective of students (stakeholders who are often ignored) on these experiences. Insight into the students' perspective was explored progressively using a grounded theory methodology. Field observations of science classrooms led to an open-ended survey of high school science students, garnering 665 responses. Twelve student interviews then focused on the data and questions evolving from the survey. The student perspective on laboratory experiences revealed varied information based on individual experience. Concurrent analysis of the data revealed that although most students like (348/665) or sometimes like (270/665) these experiences, some consistent factors yielded negative experiences and prompted suggestions for improvement. The category of responses that emerged as the core idea focused on student understanding of the experience. Students desire to understand the why do, the how to, and the what it means of laboratory experiences. Lacking any one of these, the experience loses educational value for them. This single recurring theme crossed the boundaries of age, level in school, gender, and even the student view of lab experiences as positive or negative. This study suggests reflection on the current laboratory activities in which science teachers engage their students. Is the activity appropriate (as opposed to being merely a favorite), does it encourage learning, does it fit, does it operate at the appropriate level of inquiry, and finally what can science teachers do to integrate these activities into the classroom curriculum more effectively? Simply stated, what can teachers do so that students understand what to do, what's the point, and how that point fits into what they are learning outside the laboratory?
NASA Technical Reports Server (NTRS)
Jackson, J. K.; Yakut, M. M.
1976-01-01
An all-important first step in the development of the Spacelab Life Science Laboratory is the design of the Biological Specimen Holding Facility (BSHF) which will provide accommodation for living specimens for life science research in orbit. As a useful tool in the understanding of physiological and biomedical changes produced in the weightless environment, the BSHF will enable biomedical researchers to conduct in-orbit investigations utilizing techniques that may be impossible to perform on human subjects. The results of a comprehensive study for defining the BSHF, description of its experiment support capabilities, and the planning required for its development are presented. Conceptual designs of the facility, its subsystems and interfaces with the Orbiter and Spacelab are included. Environmental control, life support and data management systems are provided. Interface and support equipment required for specimen transfer, surgical research, and food, water and waste storage is defined. New and optimized concepts are presented for waste collection, feces and urine separation and sampling, environmental control, feeding and watering, lighting, data management and other support subsystems.
A Movable Combined Water Treatment Facility for Rainwater Harvesting
NASA Astrophysics Data System (ADS)
Zhang, L.; Liao, L.
2003-12-01
Alarming water shortages and increasing water scarcity worldwide have led to growing interest in alternative water sources. Rainwater harvesting is one such source that is receiving more and more attention. There is substantial potential to generalize and extend rainwater harvesting systems as an alternative water supply. This is especially important for arid and semi-arid regions, where water shortages block further social and economic development. Earlier laboratory experiments and field studies showed that harvested rainwater requires different degrees of treatment in order to meet the WHO drinking water standards. The main focus of this study is to ascertain the quality of stored rainwater for drinking purposes, with emphasis on water disinfection and pollutant removal. A movable, low-cost, fully functional small-scale treatment facility is proposed and tested under simulated field conditions. A number of actual and potentially hazardous pollutants were identified in the collected water samples through laboratory testing. The corresponding water purification procedures and fresh-keeping methods are discussed. The final design of this movable facility needs further examination to achieve optimal combined treatment efficiency.
NASA Astrophysics Data System (ADS)
Sibillano, T.; de Caro, L.; Altamura, D.; Siliqi, D.; Ramella, M.; Boccafoschi, F.; Ciasca, G.; Campi, G.; Tirinato, L.; di Fabrizio, E.; Giannini, C.
2014-11-01
The paper shows how a table-top, superbright microfocus laboratory X-ray source and an innovative data-restoring algorithm, used in combination, allow the supramolecular structure of soft matter to be analyzed by means of ex situ Small Angle X-ray Scattering (SAXS) experiments. The proposed theoretical approach aims to restore diffraction features from SAXS profiles collected from weakly scattering biomaterials or soft tissues, and therefore to deal with extremely noisy SAXS profiles/maps. As biological test cases we inspected: i) residues of exosome drops from a healthy epithelial colon cell line and from colorectal cancer cells; ii) collagen/human elastin artificial scaffolds developed for vascular tissue engineering applications; iii) apoferritin protein in solution. Our results show how this combination can provide morphological/structural nanoscale information to characterize new artificial biomaterials, to gain insight into the transition between healthy and pathological tissues during the progression of a disease, or to morphologically characterize nanoscale proteins, based on SAXS data collected in a room-sized laboratory.
Perturbations and gradients as fundamental tests for modeling the soil carbon cycle
NASA Astrophysics Data System (ADS)
Bond-Lamberty, B. P.; Bailey, V. L.; Becker, K.; Fansler, S.; Hinkle, C.; Liu, C.
2013-12-01
An important step in matching process-level knowledge to larger-scale measurements and model results is to challenge those models with site-specific perturbations and/or changing environmental conditions. Here we subject modified versions of an ecosystem process model to two stringent tests: replicating a long-term climate change dryland experiment (Rattlesnake Mountain) and partitioning the carbon fluxes of a soil drainage gradient in the northern Everglades (Disney Wilderness Preserve). For both sites, on-site measurements were supplemented by laboratory incubations of soil columns. We used a parameter-space search algorithm to optimize, within observational limits, the model's influential inputs, so that the spun-up carbon stocks and fluxes matched observed values. Modeled carbon fluxes (net primary production and net ecosystem exchange) agreed with measured values, within observational error limits, but the model's partitioning of soil fluxes (autotrophic versus heterotrophic) did not match laboratory measurements from either site. Accounting for site heterogeneity at DWP, modeled carbon exchange was reasonably consistent with values from eddy covariance. We discuss the implications of this work for ecosystem- to global-scale modeling of ecosystems in a changing climate.
Optimized star sensors laboratory calibration method using a regularization neural network.
Zhang, Chengfen; Niu, Yanxiong; Zhang, Hao; Lu, Jiazhen
2018-02-10
High-precision ground calibration is essential to ensure the performance of star sensors. However, the complex distortion and multi-error coupling have brought great difficulties to traditional calibration methods, especially for large field of view (FOV) star sensors. Although increasing the complexity of models is an effective way to improve the calibration accuracy, it significantly increases the demand for calibration data. In order to achieve high-precision calibration of star sensors with large FOV, a novel laboratory calibration method based on a regularization neural network is proposed. A multi-layer structure neural network is designed to represent the mapping of the star vector and the corresponding star point coordinate directly. To ensure the generalization performance of the network, regularization strategies are incorporated into the net structure and the training algorithm. Simulation and experiment results demonstrate that the proposed method can achieve high precision with less calibration data and without any other priori information. Compared with traditional methods, the calibration error of the star sensor decreased by about 30%. The proposed method can satisfy the precision requirement for large FOV star sensors.
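As a hedged sketch of the general approach described above (not the authors' network, data, or distortion model), the example below trains a small L2-regularized multilayer perceptron to map unit star vectors to detector coordinates on synthetic pinhole-plus-distortion data; the focal length, distortion coefficient, and layer sizes are assumptions.

```python
# Hedged sketch: a small L2-regularized neural network that maps unit star
# vectors to detector coordinates, in the spirit of the method described
# above. The pinhole-plus-distortion data generator, focal length, and
# network size are illustrative assumptions, not the paper's values.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(1)

def synthetic_star_points(n, f=50.0, k1=0.05):
    # Random star directions within a wide field of view (~ +/-20 deg).
    ang = rng.uniform(-0.35, 0.35, size=(n, 2))
    v = np.column_stack([np.tan(ang[:, 0]), np.tan(ang[:, 1]), np.ones(n)])
    v /= np.linalg.norm(v, axis=1, keepdims=True)          # unit star vectors
    x, y = f * v[:, 0] / v[:, 2], f * v[:, 1] / v[:, 2]    # pinhole projection
    r2 = (x**2 + y**2) / f**2
    return v, np.column_stack([x * (1 + k1 * r2), y * (1 + k1 * r2)])

V_train, P_train = synthetic_star_points(3000)
V_test, P_test = synthetic_star_points(500)

# alpha is the L2 regularization strength that limits over-fitting when
# calibration data are scarce.
net = MLPRegressor(hidden_layer_sizes=(32, 32), alpha=1e-3,
                   max_iter=5000, random_state=0)
net.fit(V_train, P_train)

residual = np.linalg.norm(net.predict(V_test) - P_test, axis=1)
print("mean residual (detector units): %.4f" % residual.mean())
```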
ICF target 2D modeling using Monte Carlo SNB electron thermal transport in DRACO
NASA Astrophysics Data System (ADS)
Chenhall, Jeffrey; Cao, Duc; Moses, Gregory
2016-10-01
The iSNB (implicit Schurtz-Nicolai-Busquet) multigroup diffusion electron thermal transport method is adapted into a Monte Carlo (MC) transport method to better model angular and long mean free path non-local effects. The MC model was first implemented in the 1D LILAC code to verify consistency with the iSNB model. Implementation of the MC SNB model in the 2D DRACO code enables higher fidelity non-local thermal transport modeling in 2D implosions such as polar drive experiments on NIF. The final step is to optimize the MC model by hybridizing it with a MC version of the iSNB diffusion method. The hybrid method will combine the efficiency of a diffusion method in intermediate mean free path regions with the accuracy of a transport method in long mean free path regions, allowing for improved computational efficiency while maintaining accuracy. Work to date on the method will be presented. This work was supported by Sandia National Laboratories and the Univ. of Rochester Laboratory for Laser Energetics.
Management and care of African dormice (Graphiurus kelleni).
Kastenmayer, Robin J; Moak, Hannah B; Jeffress, Erin J; Elkins, William R
2010-03-01
African dormice (Graphiurus spp.) are small nocturnal rodents that currently are uncommon in laboratory settings. Their use may increase as they have recently been shown to develop an infection with monkeypox virus and may prove to be a valuable animal model for infectious disease research. Because African dormice are not commercially available, an extensive breeding colony is required to produce the animals needed for research use. Husbandry modifications that increased the production of offspring were the use of a high-protein diet, increased cage enrichment, and decreased animal density. To optimize consumption of a high-protein diet, we tested the palatability of several high-protein foods in a series of preference trials. Dormice preferred wax worm larva, cottage cheese, roasted soy nuts, and canned chicken. Issues related to medical management of Graphiurus kelleni include potential complications from traumatic injury. The development of a program for the husbandry and care of African dormice at our institution typifies the experiences of many laboratory animal facilities that are asked to support the development of animal models using novel species.
Zhao, Jin Hui; Chen, Wei; Zhao, Yaqian; Liu, Cuiyun; Liu, Ranbin
2015-01-01
The occurrence of carbon-bacteria complexes in activated carbon-filtered water has posed a public health problem regarding the biological safety of drinking water. The application of a combined process of ultraviolet radiation and nanostructured titanium dioxide (UV/TiO2) photocatalysis for the disinfection of carbon-bacteria complexes was assessed in this study. Results showed that a 1.07-log (lg) disinfection rate can be achieved using a UV dose of 20 mJ cm-2, while the optimal UV intensity was 0.01 mW cm-2. Particle sizes ≥8 μm decreased the disinfection efficiency, whereas variation in particle number in activated carbon-filtered water did not significantly affect the disinfection efficiency. The photoreactivation ratio was reduced from 12.07% to 1.69% when the UV dose was increased from 5 mJ cm-2 to 20 mJ cm-2. Laboratory and on-site pilot-scale experiments have demonstrated that UV/TiO2 photocatalytic disinfection technology is capable of controlling the risk posed by carbon-bacteria complexes and securing drinking water safety.
The Nova Upgrade Facility for ICF ignition and gain
NASA Astrophysics Data System (ADS)
Lowdermilk, W. H.; Campbell, E. M.; Hunt, J. T.; Murray, J. R.; Storm, E.; Tobin, M. T.; Trenholme, J. B.
1992-01-01
Research on Inertial Confinement Fusion (ICF) is motivated by its potential defense and civilian applications, including ultimately the generation of electric power. The U.S. ICF Program was reviewed recently by the National Academy of Sciences (NAS) and the Fusion Policy Advisory Committee (FPAC). Both committees issued final reports in 1991 which recommended that first priority in the ICF program be placed on demonstrating fusion ignition and modest gain (G less than 10). The U.S. Department of Energy and Lawrence Livermore National Laboratory (LLNL) have proposed an upgrade of the existing Nova Laser Facility at LLNL to accomplish these goals. Both the NAS and FPAC have endorsed the upgrade of Nova as the optimal path to achieving ignition and gain. Results from Nova Upgrade experiments will be used to define requirements for driver and target technology both for future high-yield military applications, such as the Laboratory Microfusion Facility (LMF) proposed by the Department of Energy, and for high-gain energy applications leading to an ICF engineering test facility. The central role that the Nova Upgrade would play in the national ICF strategy, and the modifications it would require, are described.
Linshiz, Gregory; Goldberg, Alex; Konry, Tania; Hillson, Nathan J
2012-01-01
Synthetic biology is a nascent field that emerged in earnest only around the turn of the millennium. It aims to engineer new biological systems and impart new biological functionality, often through genetic modifications. The design and construction of new biological systems is a complex, multistep process, requiring multidisciplinary collaborative efforts from "fusion" scientists who have formal training in computer science or engineering, as well as hands-on biological expertise. The public has high expectations for synthetic biology and eagerly anticipates the development of solutions to the major challenges facing humanity. This article discusses laboratory practices and the conduct of research in synthetic biology. It argues that the fusion science approach, which integrates biology with computer science and engineering best practices, including standardization, process optimization, computer-aided design and laboratory automation, miniaturization, and systematic management, will increase the predictability and reproducibility of experiments and lead to breakthroughs in the construction of new biological systems. The article also discusses several successful fusion projects, including the development of software tools for DNA construction design automation, recursive DNA construction, and the development of integrated microfluidics systems.
ERIC Educational Resources Information Center
Ozog, J. Z.; Morrison, J. A.
1983-01-01
Presents information, laboratory procedures, and results of an undergraduate experiment in which activity coefficients for a two-component liquid-vapor system are determined. Working in pairs, students can perform the experiment with 10 solutions in a given three-hour laboratory period. (Author/JN)
Optimized Non-Obstructive Particle Damping (NOPD) Treatment for Composite Honeycomb Structures
NASA Technical Reports Server (NTRS)
Panossian, H.
2008-01-01
Non-Obstructive Particle Damping (NOPD) technology is a passive vibration damping approach whereby metallic or non-metallic particles in spherical or irregular shapes, of heavy or light consistency, and even liquid particles are placed inside cavities or attached to structures by an appropriate means at strategic locations, to absorb vibration energy. The objective of the work described herein is the development of a design optimization procedure and discussion of test results for such a NOPD treatment on honeycomb (HC) composite structures, based on finite element modeling (FEM) analyses, optimization, and tests. Modeling and predictions were performed and tests were carried out to correlate the test data with the FEM. The optimization procedure consisted of defining a global objective function and using finite difference methods to determine the optimal values of the design variables through quadratic programming. The optimization process was carried out by targeting the highest dynamic displacements of several vibration modes of the structure and finding an optimal treatment configuration that would minimize them. An optimal design was thus derived and laboratory tests were conducted to evaluate its performance under different vibration environments. Three honeycomb composite beams with Nomex core and aluminum face sheets were tested in the laboratory: one empty (untreated), one uniformly treated with NOPD, and one optimally treated with NOPD according to the analytically predicted optimal design configuration. It is shown that the beam with the optimal treatment has the lowest response amplitude. Results are presented from modal vibration tests and FEM predictions of the modal characteristics of honeycomb beams with no treatment, 50% uniform treatment, and the optimal NOPD treatment design configuration, together with verification against test data.
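As a hedged illustration of the treatment-placement optimization idea (not the paper's FEM-based procedure), the sketch below minimizes a toy weighted sum of peak modal responses over per-cavity fill fractions subject to an added-mass budget; the response model and every number in it are assumptions.

```python
# Hedged sketch of the kind of treatment-placement optimization described
# above: choose per-cavity particle fill fractions that minimize a weighted
# sum of peak modal responses subject to an added-mass budget. The toy
# response model and all numbers are assumptions, not the paper's FEM.
import numpy as np
from scipy.optimize import minimize

n_cavities, n_modes = 10, 3
rng = np.random.default_rng(2)
# Hypothetical modal displacement amplitude of each cavity in each mode.
phi = np.abs(rng.normal(size=(n_modes, n_cavities)))

def peak_response(fill):
    # Toy model: damping added to a mode scales with fill weighted by the
    # local modal displacement; response ~ 1 / (base damping + added damping).
    added = phi @ fill
    return 1.0 / (0.02 + 0.1 * added)

def objective(fill):
    return peak_response(fill).sum()

mass_budget = 2.0                                   # total fill allowed
cons = [{"type": "ineq", "fun": lambda f: mass_budget - f.sum()}]
res = minimize(objective, x0=np.full(n_cavities, 0.1),
               bounds=[(0.0, 1.0)] * n_cavities,
               constraints=cons, method="SLSQP")

print("optimal fill fractions:", np.round(res.x, 2))
print("objective (treated) vs untreated:", round(res.fun, 3),
      round(objective(np.zeros(n_cavities)), 3))
```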
Barriers to Implementation of Optimal Laboratory Biosafety Practices in Pakistan
Shafaq, Humaira; Hasan, Rumina; Qureshi, Shahida M.; Dojki, Maqboola; Hughes, Molly A.; Zaidi, Anita K. M.; Khan, Erum
2016-01-01
The primary goal of biosafety education is to ensure safe practices among workers in biomedical laboratories. Despite several educational workshops by the Pakistan Biological Safety Association (PBSA), compliance with safe practices among laboratory workers remains low. To determine barriers to implementation of recommended biosafety practices among biomedical laboratory workers in Pakistan, we conducted a questionnaire-based survey of participants attending 2 workshops focusing on biosafety practices in Karachi and Lahore in February 2015. Questionnaires were developed by modifying the BARRIERS scale, in which respondents are required to rate barriers on a 1-4 scale. Nineteen of the original 29 barriers were included and subcategorized into 4 groups: awareness, material quality, presentation, and workplace barriers. Workshops were attended by 64 participants. Barriers rated as moderate to great by at least 50% of respondents included: lack of time to read biosafety guidelines (workplace subscale), lack of staff authorization to change/improve practice (workplace subscale), no career or self-improvement advantages to the staff for implementing optimal practices (workplace subscale), and unclear practice implications (presentation subscale). A lack of recognition for employees' rights and benefits in the workplace was found to be a predominant reason for a lack of compliance. Based on the perceived barriers, substantial improvements in the work environment, worker facilitation, and enablement are needed to achieve improved or optimal biosafety practices in Pakistan. PMID:27400192
NASA Astrophysics Data System (ADS)
Akchurin, Georgy G.; Garif, Akchurin G.; Maksimova, Irina L.; Skaptsov, Alexander A.; Terentyuk, Georgy S.; Khlebtsov, Boris N.; Khlebtsov, Nikolai G.; Tuchin, Valery V.
2010-02-01
We describe applications of silica (core)/gold (shell) nanoparticles and ICG dye to photothermal treatment of phantoms, biotissue, and spontaneous tumors of cats and dogs. The laser irradiation parameters were optimized by preliminary experiments with laboratory rats. The three-dimensional dynamics of temperature fields in tissue and solution samples were measured with a thermal imaging system. It is shown that the temperature in the volume region of nanoparticle localization can substantially exceed the surface temperature recorded by the thermal imaging system. We have demonstrated effective optical destruction of cancer cells by local injection of plasmon-resonant gold nanoshells and ICG dye followed by continuous wave (CW) diode laser irradiation at a wavelength of 808 nm.
Studies of the intermediate and deep circulation in the western equatorial Atlantic
NASA Technical Reports Server (NTRS)
Desaubies, Yves; Frankignoul, C.; Merle, Jacques
1991-01-01
This proposal concerns the preparation and design of an experiment, the objective of which is to improve our knowledge of the intermediate and deep circulation in the western equatorial Atlantic Ocean. We shall focus on the description of the western boundary currents and their crossing of the equator, and on the estimation of their mass and heat fluxes and their seasonal and interannual variations. We will use satellite altimetric data, tomographic measurements, and in situ observations (current measurements, hydrology, and floats). We propose a feasibility study and the definition of a strategy based on a high-resolution Geophysical Fluid Dynamics Laboratory (GFDL) numerical model to define which in situ measurements are necessary to optimally complete the altimetric observations.
Optimizing multi-dimensional high throughput screening using zebrafish.
Truong, Lisa; Bugel, Sean M; Chlebowski, Anna; Usenko, Crystal Y; Simonich, Michael T; Simonich, Staci L Massey; Tanguay, Robert L
2016-10-01
The use of zebrafish for high throughput screening (HTS) for chemical bioactivity assessments is becoming routine in the fields of drug discovery and toxicology. Here we report current recommendations from our experiences in zebrafish HTS. We compared the effects of different high throughput chemical delivery methods on nominal water concentration, chemical sorption to multi-well polystyrene plates, transcription responses, and resulting whole animal responses. We demonstrate that digital dispensing consistently yields higher data quality and reproducibility compared to standard plastic tip-based liquid handling. Additionally, we illustrate the challenges in using this sensitive model for chemical assessment when test chemicals have trace impurities. Adaptation of these better practices for zebrafish HTS should increase reproducibility across laboratories. Copyright © 2016 Elsevier Inc. All rights reserved.
Organism support for life sciences spacelab experiments
NASA Technical Reports Server (NTRS)
Drake, G. L.; Heppner, D. B.
1976-01-01
This paper presents an overview of the U.S. life sciences laboratory concepts envisioned for the Shuttle/Spacelab era. The basic development approach is to provide a general laboratory facility supplemented by specific experiment hardware as required. The laboratory concepts range from small carry-on laboratories to fully dedicated laboratories in the Spacelab pressurized module. The laboratories will encompass a broad spectrum of research in biology and biomedicine requiring a variety of research organisms. The environmental control and life support of these organisms is a very important aspect of the success of the space research missions. Engineering prototype organism habitats have been designed and fabricated to be compatible with the Spacelab environment and the experiment requirements. These first-generation habitat designs and their subsystems have supported plants, cells/tissues, invertebrates, and small vertebrates in limited evaluation tests. Special handling and transport equipment required for the ground movement of the experiment organisms at the launch/landing site have been built and tested using these initial habitat prototypes.
NASA Technical Reports Server (NTRS)
Greco, R. V.; Eaton, L. R.; Wilkinson, H. C.
1974-01-01
The work accomplished from January 1974 to October 1974 for the Zero-Gravity Atmospheric Cloud Physics Laboratory is summarized. The definition and development of an atmospheric cloud physics laboratory and the selection and delineation of candidate experiments that require the unique environment of zero gravity or near zero gravity are reported. The experiment program and the laboratory concept for a Spacelab payload to perform cloud microphysics research are defined. This multimission laboratory is planned to be available to the entire scientific community to utilize in furthering the basic understanding of cloud microphysical processes and phenomena, thereby contributing to improved weather prediction and ultimately to beneficial weather control and modification.
Sampling is the act of selecting items from a specified population in order to estimate the parameters of that population (e.g., selecting soil samples to characterize the properties at an environmental site). Sampling occurs at various levels and times throughout an environmenta...
When "Less is More": The Optimal Design of Language Laboratory Hardware.
ERIC Educational Resources Information Center
Kershaw, Gary; Boyd, Gary
1980-01-01
The results of a process of designing, building, and "de-bugging" two replacement language laboratory hardware systems at Concordia University (Montreal) are described. Because commercially available systems did not meet specifications within budgetary constraints, the systems were built by the university technical department. The systems replaced…
Large-N Over the Source Physics Experiment (SPE) Phase I and Phase II Test Beds
NASA Astrophysics Data System (ADS)
Snelson, C. M.; Carmichael, J. D.; Mellors, R. J.; Abbott, R. E.
2014-12-01
One of the current challenges in the field of monitoring and verification is source discrimination of low-yield nuclear explosions from background seismicity, both natural and anthropogenic. Work is underway at the Nevada National Security Site to conduct a series of chemical explosion experiments using a multi-institutional, multi-disciplinary approach. The goal of this series of experiments, called the Source Physics Experiments (SPE), is to refine the understanding of the effect of earth structures on source phenomenology and energy partitioning in the source region, the transition of seismic energy from the near field to the far field, and the development of S waves observed in the far field. To fully explore these problems, the SPE series includes tests in both hard and soft rock geologic environments. The project comprises a number of activities, which range from characterizing the shallow subsurface to acquiring new explosion data from both the near field (<100 m) and the far field (>100 m). SPE includes a series of planned explosions (with different yields and depths of burial), which are conducted in the same hole and monitored by a diverse set of sensors recording characteristics of the explosions, ground shock, and seismo-acoustic energy propagation. This presentation focuses on imaging the full 3D wavefield over hard rock and soft rock test beds using a large number of seismic sensors. This overview presents statistical analyses of the optimal sensor layout required to estimate wavefield discriminants and the planned deployment for the upcoming experiments. This work was conducted under Contract No. DE-AC52-06NA25946 with the U.S. Department of Energy. Sandia National Laboratories is a multi-program laboratory managed and operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. Department of Energy's National Nuclear Security Administration under contract DE-AC04-94AL85000.
High fidelity kinetic modeling of magnetic reconnection in laboratory plasma
NASA Astrophysics Data System (ADS)
Stanier, A.; Daughton, W. S.
2017-12-01
Over the past decade, a great deal of progress has been made towards understanding the physics of magnetic reconnection in weakly collisional regimes of relevance both to fusion devices and to space and astrophysical plasmas. However, there remain some outstanding unsolved problems in reconnection physics, such as the generation and influence of plasmoids (flux ropes) within reconnection layers, the development of magnetic turbulence, the role of current-driven and streaming instabilities, and the influence of electron pressure anisotropy on the layer structure. Due to the importance of these questions, new laboratory reconnection experiments are being built to allow controlled and reproducible study of such questions with the simultaneous acquisition of high time resolution measurements at a large number of spatial points. These experiments include the FLARE facility at Princeton University and the T-REX experiment at the University of Wisconsin. To guide and interpret these new experiments, and to extrapolate the results to space applications, new investments in kinetic modeling tools are required. We have recently developed a cylindrical version of the VPIC Particle-In-Cell code with the capability to perform first-principles kinetic simulations that approach experimental device size with more realistic geometry and drive coils. This cylindrical version inherits much of the optimization work that has been done recently for next-generation many-core architectures with wider vector registers, and achieves conservation properties comparable to those of the Cartesian code. Namely, it features exact discrete charge conservation and a so-called "energy-conserving" scheme in which the energy is conserved in the limit of continuous time, i.e., without contribution from the spatial discretization (Lewis, 1970). We will present initial results of modeling magnetic reconnection in the experiments mentioned above. Since the VPIC code is open source (https://github.com/losalamos/vpic), this new cylindrical version will also be freely available to the community.
Lo, Sheng-Ying; Baird, Geoffrey S; Greene, Dina N
2015-12-07
Proper utilization of resources is an important operational objective for clinical laboratories. To reduce unnecessary manual interventions on automated instruments, we conducted a workflow analysis that optimized dilution parameters and reporting of abnormally high chemistry results for the Beckman AU series of chemistry analyzers while maintaining clinically acceptable reportable ranges. Workflow analysis for the Beckman AU680/5812 and DxC800 chemistry analyzers was performed using historical data. Clinical reportable ranges for 53 chemistry analytes were evaluated. Optimized dilution parameters and upper limit of reportable ranges for the AU680/5812 instruments were derived and validated to meet these reportable ranges. The number of specimens that required manual dilutions before and after optimization was determined for both the AU680/5812 and DxC800, with the DxC800 serving as the reference instrument. Retrospective data analysis revealed that 7700 specimens required manual dilutions on the DxC over a 2-y period. Using our optimized AU-specific dilution and reporting parameters, the data-driven simulation analysis showed a 61% reduction in manual dilutions. For the specimens that required manual dilutions on the AU680/5812, we developed standardized dilution procedures to further streamline workflow. We provide a data-driven, practical outline for clinical laboratories to efficiently optimize their use of automated chemistry analyzers. The outcomes can be used to assist laboratories wishing to improve their existing procedures or to facilitate transitioning into a new line of instrumentation, regardless of the instrument model or manufacturer. Copyright © 2015 Elsevier B.V. All rights reserved.
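As a hedged illustration of the kind of data-driven simulation described above (not the authors' AU-specific parameters), the sketch below estimates how many historical results would still exceed the reportable range, and thus need manual dilution, under two auto-dilution settings; the analyte, limits, and synthetic history are assumptions.

```python
# Hedged sketch: given a historical distribution of results for a
# hypothetical analyte, estimate how many specimens would exceed the
# analyzer's upper measuring limit (and so need manual dilution) under the
# current versus an optimized on-board auto-dilution setting. All limits
# and the synthetic history are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(3)
# Synthetic 2-year history of results for a hypothetical analyte (arbitrary units).
historical = rng.lognormal(mean=4.8, sigma=0.8, size=50_000)

def manual_dilutions(results, analytic_limit, auto_dilution_factor):
    # Results above the analytic limit times the on-board dilution factor
    # still require a manual (off-line) dilution.
    effective_upper = analytic_limit * auto_dilution_factor
    return int((results > effective_upper).sum())

current = manual_dilutions(historical, analytic_limit=500, auto_dilution_factor=1)
optimized = manual_dilutions(historical, analytic_limit=500, auto_dilution_factor=2)

print("manual dilutions, current setting:  ", current)
print("manual dilutions, optimized setting:", optimized)
print("reduction: %.0f%%" % (100 * (1 - optimized / current)))
```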
ERIC Educational Resources Information Center
Zabzdyr, Jennifer L.; Lillard, Sheri J.
2001-01-01
Introduces a laboratory experiment for determining blood alcohol content using a combination of instrumental analysis and forensic science. Teaches the importance of careful laboratory technique and that experiments are conducted for a reason. Includes the procedure of the experiment. (Contains 17 references.) (YDS)
A Simple Photochemical Experiment for the Advanced Laboratory.
ERIC Educational Resources Information Center
Rosenfeld, Stuart M.
1986-01-01
Describes an experiment to provide students with: (1) an introduction to photochemical techniques and theory; (2) an experience with semimicro techniques; (3) an application of carbon-13 nuclear magnetic resonance; and (4) a laboratory with some qualities of a genuine experiment. These criteria are met in the photooxidation of 9,…
International Co-Operation in Control Engineering Education Using Online Experiments
ERIC Educational Resources Information Center
Henry, Jim; Schaedel, Herbert M.
2005-01-01
This paper describes the international co-operation experience in teaching control engineering with laboratories being conducted remotely by students via the Internet. This paper describes how the students ran the experiments and their personal experiences with the laboratory. A tool for process identification and controller tuning based on…
NASA Astrophysics Data System (ADS)
Simon, Nicole A.
Virtual laboratory experiments using interactive computer simulations are not being employed as viable alternatives to the laboratory science curriculum at extensive enough rates within higher education. Rote traditional lab experiments are currently the norm and do not address inquiry, Critical Thinking, and cognition throughout the laboratory experience or link with educational technologies (Pyatt & Sims, 2007, 2011; Trundle & Bell, 2010). A causal-comparative quantitative study was conducted with 150 learners enrolled at a two-year community college to determine the effects of simulation laboratory experiments on Higher-Order Learning, Critical Thinking Skills, and Cognitive Load. The treatment population used simulated experiments, while the non-treatment sections performed traditional expository experiments. A comparison was made using the Revised Two-Factor Study Process survey, the Motivated Strategies for Learning Questionnaire, and the Scientific Attitude Inventory survey, using a repeated-measures ANOVA test for treatment or non-treatment. A main effect of simulated laboratory experiments was found for both Higher-Order Learning [F(1, 148) = 30.32, p = 0.00, η² = 0.12] and Critical Thinking Skills [F(1, 148) = 14.64, p = 0.00, η² = 0.17], such that simulations showed greater increases than traditional experiments. Post-lab treatment-group self-reports indicated increased marginal means (+4.86) in Higher-Order Learning and Critical Thinking Skills, compared to the non-treatment group (+4.71). Simulations also improved the scientific skills and mastery of basic scientific subject matter. It is recommended that additional research recognize that learners' Critical Thinking Skills change due to different instructional methodologies that occur throughout a semester.
NASA Astrophysics Data System (ADS)
Utama, D. N.; Ani, N.; Iqbal, M. M.
2018-03-01
Optimization is the process of finding the parameter or parameters that deliver an optimal value of an objective function. The search for an optimal generic model for optimization is a computer science problem that has been pursued by numerous researchers. A generic model is a model that can be operated to solve a wide variety of optimization problems. The generic optimization model was constructed using an object-oriented method. Two types of optimization method, simulated annealing and hill climbing, were implemented in constructing the model and then compared to find the better one. The results show that both methods produced the same objective-function value and that the hill-climbing-based model required the shorter running time.
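As a hedged illustration of the two methods compared in the study (not the authors' object-oriented generic model), the sketch below applies hill climbing and simulated annealing to a toy one-dimensional objective; the objective function, step sizes, and cooling schedule are assumptions.

```python
# Hedged illustration of the two methods compared above, applied to a toy
# one-dimensional objective; the objective function, neighborhood size, and
# cooling schedule are assumptions for demonstration, not the paper's model.
import math
import random

random.seed(0)

def objective(x):
    # Multimodal test function to be maximized.
    return math.sin(3 * x) + 0.5 * math.cos(7 * x) - 0.05 * x * x

def hill_climbing(x, steps=2000, step_size=0.05):
    best = objective(x)
    for _ in range(steps):
        cand = x + random.uniform(-step_size, step_size)
        val = objective(cand)
        if val > best:                       # accept only improvements
            x, best = cand, val
    return x, best

def simulated_annealing(x, steps=2000, step_size=0.5, t0=1.0):
    cur = objective(x)
    best_x, best = x, cur
    for i in range(steps):
        temp = t0 * (1 - i / steps) + 1e-9   # linear cooling schedule
        cand = x + random.uniform(-step_size, step_size)
        val = objective(cand)
        # Accept worse moves with a temperature-dependent probability.
        if val > cur or random.random() < math.exp((val - cur) / temp):
            x, cur = cand, val
            if cur > best:
                best_x, best = x, cur
    return best_x, best

print("hill climbing:       x=%.3f f=%.3f" % hill_climbing(x=2.0))
print("simulated annealing: x=%.3f f=%.3f" % simulated_annealing(x=2.0))
```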
Alladio, Eugenio; Biosa, Giulia; Seganti, Fabrizio; Di Corcia, Daniele; Salomone, Alberto; Vincenti, Marco; Baumgartner, Markus R
2018-05-11
The quantitative determination of ethyl glucuronide (EtG) in hair samples is consistently used throughout the world to assess chronic excessive alcohol consumption. For administrative and legal purposes, the analytical results are compared with cut-off values recognized by regulatory authorities and scientific societies. However, it has recently been recognized that the analytical results depend on the hair sample pretreatment procedures, including the crumbling and extraction conditions. A systematic evaluation of the EtG extraction conditions from pulverized scalp hair was conducted by design of experiments (DoE), considering the extraction time, temperature, pH, and solvent composition as potential influencing factors. It was concluded that an overnight extraction at 60°C with pure water at neutral pH represents the most effective conditions to achieve high extraction yields. The absence of differential degradation of the internal standard (isotopically labeled EtG) under such conditions was confirmed, and the overall analytical method was validated according to SWGTOX and ISO 17025 criteria. Twenty real hair samples with different EtG content were analyzed with three commonly accepted procedures: (a) hair manually cut into snippets and extracted at room temperature; (b) pulverized hair extracted at room temperature; (c) hair treated with the optimized method. Average increments of EtG concentration of around 69% (from a to c) and 29% (from b to c) were recorded. In light of these results, the authors urge the scientific community to undertake an inter-laboratory study with the aim of defining in more detail the optimal hair EtG detection method and verifying the corresponding cut-off level for legal enforcement. This article is protected by copyright. All rights reserved.
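As a hedged sketch of how a central composite design of the kind used above can be laid out for four factors (not the validated protocol or its actual levels), the example below builds a face-centred CCD in coded units and maps it to assumed real-unit ranges.

```python
# Hedged sketch of a four-factor central composite design (CCD) of the kind
# used above (time, temperature, pH, solvent fraction). Factor ranges and
# the number of center points are illustrative assumptions, not the
# validated extraction protocol.
import itertools
import numpy as np

factors = {                # assumed low / high levels in real units
    "time_h":       (2, 16),
    "temp_C":       (25, 60),
    "pH":           (5, 9),
    "organic_frac": (0.0, 0.5),
}
k = len(factors)
alpha = 1.0                # face-centred design keeps axial runs within the ranges

factorial = np.array(list(itertools.product([-1, 1], repeat=k)), dtype=float)
axial = np.vstack([a * row for a in (-alpha, alpha) for row in np.eye(k)])
center = np.zeros((4, k))  # four replicated center points
coded = np.vstack([factorial, axial, center])

lows = np.array([lo for lo, hi in factors.values()])
highs = np.array([hi for lo, hi in factors.values()])
real = lows + (coded + 1) / 2 * (highs - lows)   # map coded [-1, 1] to real units

print("total runs:", len(real))
print("first run:", dict(zip(factors, np.round(real[0], 2))))
```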
Scheinker, Alexander; Huang, Xiaobiao; Wu, Juhao
2017-02-20
Here, we report on a beam-based experiment performed at the SPEAR3 storage ring of the Stanford Synchrotron Radiation Lightsource at the SLAC National Accelerator Laboratory, in which a model-independent extremum-seeking optimization algorithm was utilized to minimize betatron oscillations in the presence of a time-varying kicker magnetic field, by automatically tuning the pulsewidth, voltage, and delay of two other kicker magnets, and the current of two skew quadrupole magnets, simultaneously, in order to optimize injection kick matching. Adaptive tuning was performed on eight parameters simultaneously. The scheme was able to continuously maintain the match of a five-magnet lattice while the field strength of a kicker magnet was continuously varied at a rate much higher (±6% sinusoidal voltage change over 1.5 h) than typically experienced in operation. Lastly, the ability to quickly tune or compensate for time variation of coupled components, as demonstrated here, is very important for the more general, more difficult problem of global accelerator tuning to quickly switch between various experimental setups.
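As a hedged, single-parameter illustration of extremum-seeking feedback in general (not the eight-parameter accelerator controller reported above), the sketch below dithers a setting sinusoidally, demodulates the measured cost, and descends the estimated gradient while the optimum slowly drifts; the cost function and gains are assumptions.

```python
# Hedged sketch of single-parameter extremum-seeking feedback: a sinusoidal
# dither is added to the setting, the measured cost is demodulated against
# the dither, and the setting drifts down the estimated gradient even while
# the optimum slowly moves. All gains and the cost model are assumptions.
import math

def cost(setting, t):
    # Hypothetical time-varying cost: the minimum drifts slowly with t.
    optimum = 1.0 + 0.3 * math.sin(0.002 * t)
    return (setting - optimum) ** 2

def extremum_seeking(n_steps=5000, dt=1.0, omega=0.8, a=0.05, gain=0.4):
    theta = 0.0                                  # current best estimate
    for i in range(n_steps):
        t = i * dt
        dither = a * math.sin(omega * t)
        J = cost(theta + dither, t)              # measured cost with dither
        grad_est = J * math.sin(omega * t)       # demodulation ~ gradient
        theta -= gain * grad_est * dt            # descend estimated gradient
    return theta

final = extremum_seeking()
print("final setting: %.3f (tracking a drifting optimum near 1.0)" % final)
```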
Figueroa-Torres, Gonzalo M; Pittman, Jon K; Theodoropoulos, Constantinos
2017-10-01
Microalgal starch and lipids, carbon-based storage molecules, are useful as potential biofuel feedstocks. In this work, cultivation strategies maximising starch and lipid formation were established by developing a multi-parameter kinetic model describing microalgal growth as well as starch and lipid formation, in conjunction with laboratory-scale experiments. Growth dynamics are driven by nitrogen-limited mixotrophic conditions, known to increase cellular starch and lipid contents whilst enhancing biomass growth. Model parameters were computed by fitting model outputs to a range of experimental datasets from batch cultures of Chlamydomonas reinhardtii. Predictive capabilities of the model were established against different experimental data. The model was subsequently used to compute optimal nutrient-based cultivation strategies in terms of initial nitrogen and carbon concentrations. Model-based optimal strategies yielded a significant increase of 261% for starch (0.065 gC L-1) and 66% for lipid (0.08 gC L-1) production compared to base-case conditions (0.018 gC L-1 starch, 0.048 gC L-1 lipids). Copyright © 2017 Elsevier Ltd. All rights reserved.
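As a hedged, generic stand-in for the multi-parameter kinetic model developed in the study (not its actual equations or fitted parameters), the sketch below couples Monod-type nitrogen-limited growth with a storage pool that accumulates once nitrogen is depleted; all rate constants are assumptions.

```python
# Hedged sketch of a simple nitrogen-limited growth model with a storage
# (starch/lipid) pool that accumulates once external nitrogen is depleted.
# This is a generic Monod-type toy model for illustration, not the
# multi-parameter model developed in the study; all rates are assumed.
import numpy as np
from scipy.integrate import solve_ivp

mu_max, K_N, Y_XN = 1.2, 0.02, 10.0   # 1/d, gN/L, gX per gN (assumed)
k_s, K_I = 0.05, 0.01                 # storage rate (1/d), N-inhibition constant (gN/L)

def rhs(t, y):
    X, N, S = y                        # biomass, nitrogen, storage pool (g/L)
    N = max(N, 0.0)                    # guard against tiny negative overshoot
    mu = mu_max * N / (K_N + N)        # Monod growth on nitrogen
    dX = mu * X
    dN = -dX / Y_XN
    dS = k_s * X * K_I / (K_I + N)     # storage forms once N becomes scarce
    return [dX, dN, dS]

sol = solve_ivp(rhs, (0, 10), [0.05, 0.1, 0.0])
X, N, S = sol.y[:, -1]
print("final biomass %.2f g/L, residual N %.3f g/L, storage %.3f g/L" % (X, N, S))
```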
Bhatt, Jwalant K; Ghevariya, Chirag M; Dudhagara, Dushyant R; Rajpara, Rahul K; Dave, Bharti P
2014-11-01
For the first time, Cochliobolus lunatus strain CHR4D, a marine-derived ascomycete fungus isolated from the historically crude-oil-polluted shoreline of the Alang-Sosiya ship-breaking yard on the Bhavnagar coast, Gujarat, is reported to show rapid and enhanced biodegradation of chrysene, a four-ringed high-molecular-weight (HMW) polycyclic aromatic hydrocarbon (PAH). Mineral Salt Broth (MSB) components such as ammonium tartrate and glucose, along with chrysene, pH, and trace metal solution, were successfully optimized by Response Surface Methodology (RSM) using a central composite design (CCD). A validated, two-step optimization protocol yielded a substantial 93.10% chrysene degradation on the 4th day, against 56.37% degradation on the 14th day without optimization. The results show a 1.65-fold increase in chrysene degradation and a 1.40-fold increase in biomass, with a considerable reduction in time. Based on the successful laboratory experiments, C. lunatus strain CHR4D can be predicted to be a potential candidate for mycoremediation of HMW PAH-impacted environments.
Evaluation of on-line pulse control for vibration suppression in flexible spacecraft
NASA Technical Reports Server (NTRS)
Masri, Sami F.
1987-01-01
A numerical simulation was performed, by means of a large-scale finite element code capable of handling large deformations and/or nonlinear behavior, to investigate the suitability of the nonlinear pulse-control algorithm to suppress the vibrations induced in the Spacecraft Control Laboratory Experiment (SCOLE) components under realistic maneuvers. Among the topics investigated were the effects of various control parameters on the efficiency and robustness of the vibration control algorithm. Advanced nonlinear control techniques were applied to an idealized model of some of the SCOLE components to develop an efficient algorithm to determine the optimal locations of point actuators, considering the hardware on the SCOLE project as distributed in nature. The control was obtained from a quadratic optimization criterion, given in terms of the state variables of the distributed system. An experimental investigation was performed on a model flexible structure resembling the essential features of the SCOLE components, and electrodynamic and electrohydraulic actuators were used to investigate the applicability of the control algorithm with such devices in addition to mass-ejection pulse generators using compressed air.
Operating environmental laboratories--an overview of analysis equipment procurement and management.
Pandya, G H; Shinde, V M; Kanade, G S; Kondawar, V K
2003-10-01
Management of equipment in an environmental laboratory requires planning involving assessment of the workload on each piece of equipment, establishment of criteria and specifications for the purchase of equipment, creation of infrastructure for installation and testing of the equipment, optimization of analysis conditions, development of preventive maintenance procedures, and establishment of in-house repair facilities. The paper reports the results of such an analysis carried out for operating environmental laboratories associated with R&D work, serving as a government laboratory, or attached to an industry for analysing industrial emissions.
Paulovich, Amanda G.; Billheimer, Dean; Ham, Amy-Joan L.; Vega-Montoto, Lorenzo; Rudnick, Paul A.; Tabb, David L.; Wang, Pei; Blackman, Ronald K.; Bunk, David M.; Cardasis, Helene L.; Clauser, Karl R.; Kinsinger, Christopher R.; Schilling, Birgit; Tegeler, Tony J.; Variyath, Asokan Mulayath; Wang, Mu; Whiteaker, Jeffrey R.; Zimmerman, Lisa J.; Fenyo, David; Carr, Steven A.; Fisher, Susan J.; Gibson, Bradford W.; Mesri, Mehdi; Neubert, Thomas A.; Regnier, Fred E.; Rodriguez, Henry; Spiegelman, Cliff; Stein, Stephen E.; Tempst, Paul; Liebler, Daniel C.
2010-01-01
Optimal performance of LC-MS/MS platforms is critical to generating high quality proteomics data. Although individual laboratories have developed quality control samples, there is no widely available performance standard of biological complexity (and associated reference data sets) for benchmarking of platform performance for analysis of complex biological proteomes across different laboratories in the community. Individual preparations of the yeast Saccharomyces cerevisiae proteome have been used extensively by laboratories in the proteomics community to characterize LC-MS platform performance. The yeast proteome is uniquely attractive as a performance standard because it is the most extensively characterized complex biological proteome and the only one associated with several large scale studies estimating the abundance of all detectable proteins. In this study, we describe a standard operating protocol for large scale production of the yeast performance standard and offer aliquots to the community through the National Institute of Standards and Technology where the yeast proteome is under development as a certified reference material to meet the long term needs of the community. Using a series of metrics that characterize LC-MS performance, we provide a reference data set demonstrating typical performance of commonly used ion trap instrument platforms in expert laboratories; the results provide a basis for laboratories to benchmark their own performance, to improve upon current methods, and to evaluate new technologies. Additionally, we demonstrate how the yeast reference, spiked with human proteins, can be used to benchmark the power of proteomics platforms for detection of differentially expressed proteins at different levels of concentration in a complex matrix, thereby providing a metric to evaluate and minimize preanalytical and analytical variation in comparative proteomics experiments. PMID:19858499
Overview of DOE Oil and Gas Field Laboratory Projects
NASA Astrophysics Data System (ADS)
Bromhal, G.; Ciferno, J.; Covatch, G.; Folio, E.; Melchert, E.; Ogunsola, O.; Renk, J., III; Vagnetti, R.
2017-12-01
America's abundant unconventional oil and natural gas (UOG) resources are critical components of our nation's energy portfolio. These resources need to be prudently developed to derive maximum benefits. In spite of the long history of hydraulic fracturing, the optimal number of fracturing stages during multi-stage fracture stimulation in horizontal wells is not known. In addition, there is a pressing need for a comprehensive understanding of ways to improve the recovery of shale gas with little or no impact on the environment. Research that seeks to expand our view of effective and environmentally sustainable ways to develop our nation's oil and natural gas resources can be done in the laboratory or at a computer, but some experiments must be performed in a field setting. The Department of Energy (DOE) Field Lab Observatory projects are designed to address those research questions that must be studied in the field. DOE is developing a suite of "field laboratory" test sites to carry out collaborative research that will help find ways of improving the recovery of energy resources as much as possible, with as little environmental impact as possible, from "unconventional" formations, such as shale and other low-permeability rock formations. Currently there are three field laboratories in various stages of development and operation. Work is ongoing at two of the sites: the Hydraulic Fracturing Test Site (HFTS) in the Permian Basin and the Marcellus Shale Energy and Environmental Lab (MSEEL) project in the Marcellus Shale Play. Agreement on the third site, the Utica Shale Energy and Environmental Lab (USEEL) project in the Utica Shale Play, was only recently finalized. Other field site opportunities may be forthcoming. This presentation will give an overview of the three field laboratory projects.
When teams shift among processes: insights from simulation and optimization.
Kennedy, Deanna M; McComb, Sara A
2014-09-01
This article introduces process shifts to study the temporal interplay among transition and action processes espoused in the recurring phase model proposed by Marks, Mathieu, and Zacarro (2001). Process shifts are those points in time when teams complete a focal process and change to another process. By using team communication patterns to measure process shifts, this research explores (a) when teams shift among different transition processes and initiate action processes and (b) the potential of different interventions, such as communication directives, to manipulate process shift timing and order and, ultimately, team performance. Virtual experiments are employed to compare data from observed laboratory teams not receiving interventions, simulated teams receiving interventions, and optimal simulated teams generated using genetic algorithm procedures. Our results offer insights about the potential for different interventions to affect team performance. Moreover, certain interventions may promote discussions about key issues (e.g., tactical strategies) and facilitate shifting among transition processes in a manner that emulates optimal simulated teams' communication patterns. Thus, we contribute to theory regarding team processes in 2 important ways. First, we present process shifts as a way to explore the timing of when teams shift from transition to action processes. Second, we use virtual experimentation to identify those interventions with the greatest potential to affect performance by changing when teams shift among processes. Additionally, we employ computational methods including neural networks, simulation, and optimization, thereby demonstrating their applicability in conducting team research. PsycINFO Database Record (c) 2014 APA, all rights reserved.
Inexpensive Audio Activities: Earbud-based Sound Experiments
NASA Astrophysics Data System (ADS)
Allen, Joshua; Boucher, Alex; Meggison, Dean; Hruby, Kate; Vesenka, James
2016-11-01
Inexpensive alternatives to a number of classic introductory physics sound laboratories are presented including interference phenomena, resonance conditions, and frequency shifts. These can be created using earbuds, economical supplies such as Giant Pixie Stix® wrappers, and free software available for PCs and mobile devices. We describe two interference laboratories (beat frequency and two-speaker interference) and two resonance laboratories (quarter- and half-wavelength). Lastly, a Doppler laboratory using rotating earbuds is explained. The audio signal captured by all experiments is analyzed on free spectral analysis software and many of the experiments incorporate the unifying theme of measuring the speed of sound in air.
An Optimized Combined Wave and Current Bottom Boundary Layer Model for Arbitrary Bed Roughness
2017-06-30
Engineer Research and Development Center (ERDC), Coastal and Hydraulics Laboratory (CHL), Coastal Inlets Research Program, report ERDC/CHL TR-17-11, June 2017.
A system-level cost-of-energy wind farm layout optimization with landowner modeling
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chen, Le; MacDonald, Erin
This work applies an enhanced levelized wind farm cost model, including landowner remittance fees, to determine optimal turbine placements under three landowner participation scenarios and two land-plot shapes. Instead of assuming a continuous piece of land is available for the wind farm construction, as in most layout optimizations, the problem formulation represents landowner participation scenarios as a binary string variable, along with the number of turbines. The cost parameters and model are a combination of models from the National Renewable Energy Laboratory (NREL), Lawrence Berkeley National Laboratory, and Windustry. The system-level cost-of-energy (COE) optimization model is also tested under two land-plot shapes: equally-sized square land plots and unequal rectangle land plots. The optimal COE results are compared to actual COE data and found to be realistic. The results show that landowner remittances account for approximately 10% of farm operating costs across all cases. Irregular land-plot shapes are easily handled by the model. We find that larger land plots do not necessarily receive higher remittance fees. The model can help site developers identify the most crucial land plots for project success and the optimal positions of turbines, with realistic estimates of costs and profitability. (C) 2013 Elsevier Ltd. All rights reserved.
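As a hedged toy version of the formulation described above (not the NREL/LBNL/Windustry cost model or its parameter values), the sketch below treats landowner participation as a binary string, adds a remittance fee per participating plot, and enumerates the scenarios to find the lowest levelized cost of energy; every cost figure and the crude wake term are assumptions.

```python
# Hedged toy levelized cost-of-energy (COE) calculation with landowner
# participation encoded as a binary string. All cost figures, the farm size,
# and the crude wake term are assumptions, not the paper's parameters.
from itertools import product

N_PLOTS = 4
N_TURBINES = 12                   # farm size fixed by the project (assumed)
CAPEX_PER_TURBINE = 3.5e6         # $ (assumed)
OM_PER_TURBINE = 1.0e5            # $/yr (assumed)
REMIT_PER_PLOT = 2.0e4            # $/yr landowner remittance fee (assumed)
AEP_PER_TURBINE = 7.0e6           # kWh/yr before wake losses (assumed)
FCR = 0.08                        # fixed charge rate

def coe(participation):
    n_plots = sum(participation)
    if n_plots == 0:
        return float("inf")
    # Crude wake term: squeezing the same turbines onto fewer plots
    # increases density and therefore wake losses.
    wake_loss = 0.03 * (N_TURBINES / n_plots - N_TURBINES / N_PLOTS)
    aep = N_TURBINES * AEP_PER_TURBINE * (1 - wake_loss)
    annual_cost = (FCR * CAPEX_PER_TURBINE + OM_PER_TURBINE) * N_TURBINES \
                  + REMIT_PER_PLOT * n_plots
    return annual_cost / aep      # $/kWh

scenarios = list(product([0, 1], repeat=N_PLOTS))   # binary participation strings
best = min(scenarios, key=coe)
print("best participation string:", best, "COE: %.4f $/kWh" % coe(best))
```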
NASA Astrophysics Data System (ADS)
Sawicki, Jean-Paul; Saint-Eve, Frédéric; Petit, Pierre; Aillerie, Michel
2017-02-01
This paper presents the results of experiments aimed at verifying a formula for computing the duty cycle of the pulse-width-modulation control of a DC-DC converter designed and built in the laboratory. This converter, called a Magnetically Coupled Boost (MCB), is sized to step up the voltage of a single photovoltaic module to supply grid inverters directly. The duty-cycle formula is checked first by identifying an internal parameter, the auto-transformer ratio, and then by verifying the stability of the operating point on the photovoltaic-module side. Consideration of the nature of the source and of the load connected to the converter suggests additional experiments to decide whether the auto-transformer ratio can be used as a fixed value or must instead be adapted. The effects of load variations on converter behavior and the impact of possible shading of the photovoltaic module are also discussed, with the aim of designing robust control laws that, in the case of parallel association, compensate for unwanted effects of output-voltage coupling.
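For context, the sketch below works through the textbook continuous-conduction boost relation D = 1 − V_in/V_out. The paper's MCB formula additionally involves the experimentally identified auto-transformer ratio and is not reproduced here, so this is only an assumed baseline illustrating why high step-up topologies are attractive; the example voltages are invented.

```python
# Textbook ideal boost-converter relation (continuous conduction, lossless):
#     V_out / V_in = 1 / (1 - D)   =>   D = 1 - V_in / V_out
# The paper's Magnetically Coupled Boost formula also involves the
# auto-transformer ratio; it is not reproduced here.

def ideal_boost_duty_cycle(v_in, v_out):
    """Duty cycle needed to step v_in up to v_out with an ideal boost stage."""
    if not 0.0 < v_in < v_out:
        raise ValueError("boost operation requires 0 < v_in < v_out")
    return 1.0 - v_in / v_out

if __name__ == "__main__":
    # Example: a ~35 V photovoltaic module feeding a 400 V inverter DC bus
    # would need D ≈ 0.91 from a single ideal boost stage, which motivates
    # high step-up topologies such as the MCB.
    print(round(ideal_boost_duty_cycle(35.0, 400.0), 3))
```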
H-dibaryon search via Ξ- capture on the deuteron
NASA Astrophysics Data System (ADS)
Merrill, F.; Iijima, T.; Koran, P.; Barnes, P. D.; Bassalleck, B.; Berdoz, A. R.; Bürger, T.; Burger, M.; Chrien, R. E.; Davis, C. A.; Diebold, G. E.; En'yo, H.; Fischer, H.; Franklin, G. B.; Franz, J.; Gan, L.; Gill, D. R.; Imai, K.; Kondo, Y.; Landry, M.; Lee, L.; Lowe, J.; Magahiz, R.; Masaike, A.; McCrady, R.; Meyer, C. A.; Nelson, J. M.; Okada, K.; Page, S. A.; Paschke, K.; Pile, P. H.; Quinn, B. P.; Ramsay, W. D.; Rössle, E.; Rusek, A.; Sawafta, R.; Schmitt, H.; Schumacher, R. A.; Stearns, R. L.; Stotzer, R. W.; Sukaton, I. R.; Sum, V.; Sutter, R.; Szymanski, J. J.; Takeutchi, F.; van Oers, W. T.; Yamamoto, K.; Zeps, V. J.; Zybert, R.
2001-03-01
A search for the H dibaryon has been conducted at the Brookhaven National Laboratory AGS, using a 1.8 GeV/c K- beam. Ξ- hyperons were produced in a liquid-hydrogen target via the reaction K- + p --> K+ + Ξ-. The hyperons were slowed in degraders, and those most likely to stop in an adjacent liquid-deuterium target were tagged by silicon detectors. Monoenergetic neutrons were sought as the signature for H formation in (Ξ-,d)atom --> H + n. The experiment was designed for optimal sensitivity to a loosely bound H, complementing recent (K-,K+) measurements on nuclear targets. In addition, the experiment's sensitivity was independent of the lifetime and decay modes of the H. No statistically significant evidence for H formation was seen. Upper limits on the branching ratio for H formation in the above reaction have been set in a mass range extending from slightly above the ΛΛ threshold to ~100 MeV of binding and are compared with a corresponding theoretical prediction.
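To make the monoenergetic-neutron signature concrete, the sketch below evaluates standard two-body decay kinematics for an (Ξ-,d) atom at rest going to H + n, where the neutron kinetic energy is fixed once an H mass is assumed. The particle masses are approximate PDG values, the exotic-atom binding energy is neglected, and the loop over binding energies is purely illustrative; this is not the experiment's calibration.

```python
# Two-body kinematics for (Xi- d)_atom at rest -> H + n.
# Masses in MeV/c^2 (approximate PDG values); exotic-atom binding neglected.
M_XI, M_D, M_N, M_LAMBDA = 1321.71, 1875.61, 939.57, 1115.68

def neutron_kinetic_energy(m_H):
    """Neutron kinetic energy (MeV) for a given H mass in the two-body decay."""
    M = M_XI + M_D                          # initial (Xi- d) system mass, at rest
    e_n = (M**2 + M_N**2 - m_H**2) / (2.0 * M)   # total neutron energy
    return e_n - M_N

if __name__ == "__main__":
    threshold = 2.0 * M_LAMBDA              # Lambda-Lambda mass threshold
    for binding in (5.0, 50.0, 100.0):      # MeV of H binding below threshold
        m_H = threshold - binding
        print(f"B_H = {binding:5.1f} MeV  ->  T_n ≈ {neutron_kinetic_energy(m_H):6.1f} MeV")
```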
Chiavazzo, Eliodoro; Isaia, Marco; Mammola, Stefano; Lepore, Emiliano; Ventola, Luigi; Asinari, Pietro; Pugno, Nicola Maria
2015-01-01
For spiders, choosing a suitable area in which to lay eggs has direct consequences for Darwinian fitness. Despite its importance, the factors underlying this key decision are generally poorly understood. Here, we designed a multidisciplinary study based on both in-field data and laboratory experiments, focusing on the European cave spider Meta menardi (Araneae, Tetragnathidae) and aiming to understand the selective forces that drive the female's choice of depositional area. Our in-field data analysis demonstrated a major role of air velocity and distance from the cave entrance within a particular cave in driving the female's choice. This was interpreted using a model based on the Entropy Generation Minimization (EGM) method, without invoking best-fit parameters and supported by independent laboratory experiments, demonstrating that the female chooses the depositional area according to the minimal level of thermo-fluid-dynamic irreversibility. This methodology may pave the way to a novel approach for understanding the evolutionary strategies of other living organisms. PMID:25556697
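To indicate what "thermo-fluid-dynamic irreversibility" quantifies, the sketch below evaluates the generic Bejan expression for the local volumetric entropy-generation rate in a convective flow. It is not the authors' cave-airflow model, and the air properties, temperature gradient, and shear rate are placeholder values chosen only for illustration.

```python
# Generic local entropy-generation rate for convective heat transfer
# (Bejan's expression), shown only to illustrate the EGM idea:
#     s_gen''' = k * |grad T|^2 / T^2  +  (mu / T) * Phi
# where Phi is the viscous dissipation; for a simple shear flow
# Phi ≈ (du/dy)^2.

def local_entropy_generation(k, mu, T, grad_T, shear_rate):
    """Volumetric entropy-generation rate [W/(m^3 K)] for a simple shear flow."""
    thermal_term = k * grad_T**2 / T**2
    viscous_term = (mu / T) * shear_rate**2
    return thermal_term + viscous_term

if __name__ == "__main__":
    # Placeholder cave-like values: air at ~283 K, a gentle draught
    print(local_entropy_generation(k=0.025, mu=1.8e-5, T=283.0,
                                   grad_T=2.0, shear_rate=0.5))
```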