Maeda, Jin; Suzuki, Tatsuya; Takayama, Kozo
2012-12-01
A large-scale design space was constructed using a Bayesian estimation method with a small-scale design of experiments (DoE) and small sets of large-scale manufacturing data, without enforcing a large-scale DoE. The small-scale DoE was conducted using various Froude numbers (X1) and blending times (X2) in the lubricant blending process for theophylline tablets. The response surfaces, the design space, and their reliability for the compression rate of the powder mixture (Y1), tablet hardness (Y2), and dissolution rate (Y3) on a small scale were calculated using multivariate spline interpolation, a bootstrap resampling technique, and self-organizing map clustering. A constant Froude number was applied as the scale-up rule. Three experiments under an optimal condition and two experiments under other conditions were performed on a large scale. The small-scale response surfaces were corrected to the large scale by Bayesian estimation using the large-scale results. Large-scale experiments under three additional sets of conditions showed that the corrected design space was more reliable than the small-scale one, even though there was some discrepancy in pharmaceutical quality between the manufacturing scales. This approach is useful for setting up a design space in pharmaceutical development when a DoE cannot be performed at a large commercial manufacturing scale.
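The core idea of correcting a small-scale response surface with a handful of large-scale runs can be sketched as a Bayesian update. This is a minimal illustration only, not the paper's actual model: it assumes a simple additive scale-to-scale offset with a conjugate normal prior, and all names, priors, and numbers are invented.

```python
import numpy as np

def bayesian_offset_update(small_scale_pred, large_scale_obs,
                           prior_mean=0.0, prior_var=1.0, obs_var=0.25):
    """Posterior mean/variance of an assumed additive offset between scales.

    Conjugate normal-normal update: the prior on the offset is
    N(prior_mean, prior_var); each large-scale residual is observed with
    variance obs_var.
    """
    residuals = np.asarray(large_scale_obs) - np.asarray(small_scale_pred)
    n = residuals.size
    post_var = 1.0 / (1.0 / prior_var + n / obs_var)
    post_mean = post_var * (prior_mean / prior_var + residuals.sum() / obs_var)
    return post_mean, post_var

# Hypothetical example: small-scale model predicts tablet hardness at the
# three large-scale runs; observations come back systematically higher.
pred = [50.0, 55.0, 60.0]
obs = [52.0, 57.5, 61.5]
mean, var = bayesian_offset_update(pred, obs)
corrected = np.array(pred) + mean  # shifted (corrected) response surface values
```

The posterior variance shrinks with each large-scale observation, which is the sense in which a few large-scale runs can "correct" a surface built from many small-scale ones.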
Designing for Scale: Reflections on Rolling Out Reading Improvement in Kenya and Liberia.
Gove, Amber; Korda Poole, Medina; Piper, Benjamin
2017-03-01
Since 2008, the Ministries of Education in Liberia and Kenya have undertaken transitions from small-scale pilot programs to improve reading outcomes among primary learners to the large-scale implementation of reading interventions. The effects of the pilots on learning outcomes were significant, but questions remained regarding whether such large gains could be sustained at scale. In this article, the authors dissect the Liberian and Kenyan experiences with implementing large-scale reading programs, documenting the critical components and conditions of the program designs that affected the likelihood of successfully transitioning from pilot to scale. They also review the design, deployment, and effectiveness of each pilot program and the scale, design, duration, enabling conditions, and initial effectiveness results of the scaled programs in each country. The implications of these results for the design of both pilot and large-scale reading programs are discussed in light of the experiences of both the Liberian and Kenyan programs. © 2017 Wiley Periodicals, Inc.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Finsterle, S.; Moridis, G.J.; Pruess, K.
1994-01-01
The emplacement of liquids under controlled viscosity conditions is investigated by means of numerical simulations. Design calculations are performed for a laboratory experiment on a decimeter scale and a field experiment on a meter scale. The purpose of the laboratory experiment is to study the behavior of multiple grout plumes injected into a porous medium. The calculations for the field trial aim at designing a grout injection test from a vertical well in order to create a grout plume of significant extent in the subsurface.
A large-scale forest fragmentation experiment: the Stability of Altered Forest Ecosystems Project.
Ewers, Robert M; Didham, Raphael K; Fahrig, Lenore; Ferraz, Gonçalo; Hector, Andy; Holt, Robert D; Kapos, Valerie; Reynolds, Glen; Sinun, Waidi; Snaddon, Jake L; Turner, Edgar C
2011-11-27
Opportunities to conduct large-scale field experiments are rare, but provide a unique opportunity to reveal the complex processes that operate within natural ecosystems. Here, we review the design of existing, large-scale forest fragmentation experiments. Based on this review, we develop a design for the Stability of Altered Forest Ecosystems (SAFE) Project, a new forest fragmentation experiment to be located in the lowland tropical forests of Borneo (Sabah, Malaysia). The SAFE Project represents an advance on existing experiments in that it: (i) allows discrimination of the effects of landscape-level forest cover from patch-level processes; (ii) is designed to facilitate the unification of a wide range of data types on ecological patterns and processes that operate over a wide range of spatial scales; (iii) has greater replication than existing experiments; (iv) incorporates an experimental manipulation of riparian corridors; and (v) embeds the experimentally fragmented landscape within a wider gradient of land-use intensity than do existing projects. The SAFE Project represents an opportunity for ecologists across disciplines to participate in a large initiative designed to generate a broad understanding of the ecological impacts of tropical forest modification.
NASA Astrophysics Data System (ADS)
Alberts, Samantha J.
The investigation of microgravity fluid dynamics emerged out of necessity with the advent of space exploration. In particular, capillary research took a leap forward in the 1960s with regard to liquid settling and interfacial dynamics. Due to inherent temperature variations in large spacecraft liquid systems, such as fuel tanks, forces develop on gas-liquid interfaces that induce thermocapillary flows. To date, thermocapillary flows have been studied in small, idealized research geometries, usually under terrestrial conditions. The 1 to 3 m lengths of current and future large tanks and hardware are designed from prior hardware rather than research, which leaves spaceflight systems designers without the technological tools to effectively create safe and efficient designs. This thesis focused on the design and feasibility of a large length-scale thermocapillary flow experiment, which utilizes temperature variations to drive a flow. The design of a helical channel geometry ranging from 1 to 2.5 m in length permits a large length-scale thermocapillary flow experiment to fit in a seemingly small International Space Station (ISS) facility such as the Fluids Integrated Rack (FIR). An initial investigation determined that the proposed experiment produces measurable data while adhering to the FIR facility limitations. The computational portion of this thesis focused on the investigation of functional geometries of fuel tanks and depots using Surface Evolver. This work outlines the design of a large length-scale thermocapillary flow experiment for the ISS FIR. The results from this work improve the understanding of thermocapillary flows and thus the technological tools for predicting heat and mass transfer in large length-scale thermocapillary flows. Without the tools to understand the thermocapillary flows in these systems, engineers are forced to design larger, heavier vehicles to assure safety and mission success.
U.S. perspective on technology demonstration experiments for adaptive structures
NASA Technical Reports Server (NTRS)
Aswani, Mohan; Wada, Ben K.; Garba, John A.
1991-01-01
Evaluation of design concepts for adaptive structures is being performed in support of several focused research programs. These include programs such as Precision Segmented Reflector (PSR), Control Structure Interaction (CSI), and the Advanced Space Structures Technology Research Experiment (ASTREX). Although not specifically designed for adaptive structure technology validation, relevant experiments can be performed using the Passive and Active Control of Space Structures (PACOSS) testbed, the Space Integrated Controls Experiment (SPICE), the CSI Evolutionary Model (CEM), and the Dynamic Scale Model Test (DSMT) Hybrid Scale. In addition to the ground test experiments, several space flight experiments have been planned, including a reduced gravity experiment aboard the KC-135 aircraft, shuttle middeck experiments, and the Inexpensive Flight Experiment (INFLEX).
Scaling and design of landslide and debris-flow experiments
Iverson, Richard M.
2015-01-01
Scaling plays a crucial role in designing experiments aimed at understanding the behavior of landslides, debris flows, and other geomorphic phenomena involving grain-fluid mixtures. Scaling can be addressed by using dimensional analysis or – more rigorously – by normalizing differential equations that describe the evolving dynamics of the system. Both of these approaches show that, relative to full-scale natural events, miniaturized landslides and debris flows exhibit disproportionately large effects of viscous shear resistance and cohesion as well as disproportionately small effects of excess pore-fluid pressure that is generated by debris dilation or contraction. This behavioral divergence grows in proportion to H^3, where H is the thickness of a moving mass. Therefore, to maximize geomorphological relevance, experiments with wet landslides and debris flows must be conducted at the largest feasible scales. Another important consideration is that, unlike stream flows, landslides and debris flows accelerate from statically balanced initial states. Thus, no characteristic macroscopic velocity exists to guide experiment scaling and design. On the other hand, macroscopic gravity-driven motion of landslides and debris flows evolves over a characteristic time scale (L/g)^(1/2), where g is the magnitude of gravitational acceleration and L is the characteristic length of the moving mass. Grain-scale stress generation within the mass occurs on a shorter time scale, H/(gL)^(1/2), which is inversely proportional to the depth-averaged material shear rate. A separation of these two time scales exists if the criterion H/L << 1 is satisfied, as is commonly the case. This time scale separation indicates that steady-state experiments can be used to study some details of landslide and debris-flow behavior but cannot be used to study macroscopic landslide or debris-flow dynamics.
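The two time scales quoted in the abstract combine directly into the stated criterion; with H the thickness and L the length of the moving mass, the ratio of the grain-scale to the macroscopic time scale reduces to H/L:

```latex
T_{\mathrm{macro}} = \sqrt{L/g}, \qquad
T_{\mathrm{grain}} = \frac{H}{\sqrt{gL}}, \qquad
\frac{T_{\mathrm{grain}}}{T_{\mathrm{macro}}}
  = \frac{H}{\sqrt{gL}}\,\sqrt{\frac{g}{L}}
  = \frac{H}{L} \ll 1 .
```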
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rhinefrank, Kenneth E.; Lenee-Bluhm, Pukha; Prudell, Joseph H.
The most prudent path to a full-scale design, build and deployment of a wave energy conversion (WEC) system involves establishment of validated numerical models using physical experiments in a methodical scaling program. This Project provides essential additional rounds of wave tank testing at 1:33 scale and ocean/bay testing at a 1:7 scale, necessary to validate numerical modeling that is essential to a utility-scale WEC design and associated certification.
A community-based, interdisciplinary rehabilitation engineering course.
Lundy, Mary; Aceros, Juan
2016-08-01
A novel, community-based course was created through collaboration between the School of Engineering and the Physical Therapy program at the University of North Florida. This course offers a hands-on, interdisciplinary training experience for undergraduate engineering students through team-based design projects where engineering students are partnered with physical therapy students. Students learn the process of design, fabrication and testing of low-tech and high-tech rehabilitation technology for children with disabilities, and are exposed to a clinical experience under the guidance of licensed therapists. This course was taught in two consecutive years and pre-test/post-test data evaluating the impact of this interprofessional education experience on the students is presented using the Public Service Motivation Scale, Civic Actions Scale, Civic Attitudes Scale, and the Interprofessional Socialization and Valuing Scale.
Design, construction, and evaluation of a 1:8 scale model binaural manikin.
Robinson, Philip; Xiang, Ning
2013-03-01
Many experiments in architectural acoustics require presenting listeners with simulations of different rooms to compare. Acoustic scale modeling is a feasible means to create accurate simulations of many rooms at reasonable cost. A critical component in a scale model room simulation is a receiver that properly emulates a human receiver. For this purpose, a scale model artificial head has been constructed and tested. This paper presents the design and construction methods used, proper equalization procedures, and measurements of its response. A headphone listening experiment examining sound externalization with various reflection conditions is presented that demonstrates its use for psycho-acoustic testing.
Dissociative Experiences, Creative Imagination, and Artistic Production in Students of Fine Arts
ERIC Educational Resources Information Center
Perez-Fabello, Maria Jose; Campos, Alfredo
2011-01-01
The current research was designed to assess the influence of dissociative experiences and creative imagination on the artistic production of Fine Arts students of the University of Vigo (Spain). The sample consisted of 81 students who were administered the Creative Imagination Scale and The Dissociative Experiences Scale. To measure artistic…
Thermal Destruction Of CB Contaminants Bound On Building ...
Symposium Paper An experimental and theoretical program has been initiated by the U.S. EPA to investigate issues of chemical/biological agent destruction in incineration systems when the agent in question is bound on common porous building interior materials. This program includes 3-dimensional computational fluid dynamics modeling with matrix-bound agent destruction kinetics, bench-scale experiments to determine agent destruction kinetics while bound on various matrices, and pilot-scale experiments to scale-up the bench-scale experiments to a more practical scale. Finally, model predictions are made to predict agent destruction and combustion conditions in two full-scale incineration systems that are typical of modern combustor design.
Discrete choice experiments of pharmacy services: a systematic review.
Vass, Caroline; Gray, Ewan; Payne, Katherine
2016-06-01
Background: Two previous systematic reviews have summarised the application of discrete choice experiments to value preferences for pharmacy services. These reviews identified a total of twelve studies and described how discrete choice experiments have been used to value pharmacy services, but did not describe or discuss the methods used in their design or analysis. Aims: (1) To update the most recent systematic review and critically appraise current discrete choice experiments of pharmacy services in line with published reporting criteria; and (2) to provide an overview of key methodological developments in the design and analysis of discrete choice experiments. Methods: The review used a comprehensive strategy to identify eligible studies (published between 1990 and 2015) by searching electronic databases for key terms related to discrete choice and best-worst scaling (BWS) experiments. All healthcare choice experiments were then hand-searched for key terms relating to pharmacy. Data were extracted using a published checklist. Results: A total of 17 discrete choice experiments eliciting preferences for pharmacy services were identified for inclusion in the review. No BWS studies were identified. The studies elicited preferences from a variety of populations (pharmacists, patients, students) for a range of pharmacy services. Most studies were from a United Kingdom setting, although examples from Europe, Australia and North America were also identified. Discrete choice experiments for pharmacy services tended to include more attributes than non-pharmacy choice experiments. Few studies reported the use of qualitative research methods in the design and interpretation of the experiments (n = 9) or the use of new methods of analysis to identify and quantify preference and scale heterogeneity (n = 4). No studies reported the use of Bayesian methods in their experimental design.
Conclusion: Incorporating more sophisticated methods in the design of pharmacy-related discrete choice experiments could help researchers produce more efficient experiments that are better suited to valuing complex pharmacy services. Pharmacy-related discrete choice experiments could also benefit from more sophisticated analytical techniques, such as investigations into scale and preference heterogeneity. Employing these methods for both design and analysis could extend the usefulness of discrete choice experiments to inform health and pharmacy policy.
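The workhorse analysis behind most discrete choice experiments is the conditional (multinomial) logit model, in which choice probabilities are a softmax over attribute-weighted utilities. The sketch below is purely illustrative: the attributes, their values, and the preference weights are invented, not drawn from any reviewed study.

```python
import numpy as np

def choice_probabilities(attributes, beta):
    """Conditional logit choice probabilities.

    attributes: (n_alternatives, n_attributes) matrix of levels.
    beta: part-worth utility weights, one per attribute.
    """
    v = np.asarray(attributes, dtype=float) @ np.asarray(beta, dtype=float)
    expv = np.exp(v - v.max())  # subtract max for numerical stability
    return expv / expv.sum()

# Two hypothetical pharmacy-service alternatives described by
# [waiting_time_min, cost, pharmacist_consultation (0/1)].
alts = [[10, 5.0, 1],
        [30, 2.0, 0]]
beta = [-0.05, -0.2, 0.8]  # assumed weights: dislike waiting/cost, value consult
p = choice_probabilities(alts, beta)
```

In a real study the weights would be estimated by maximum likelihood from respondents' observed choices rather than assumed.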
Measuring the Experience and Perception of Suffering
ERIC Educational Resources Information Center
Schulz, Richard; Monin, Joan K.; Czaja, Sara J.; Lingler, Jennifer H.; Beach, Scott R.; Martire, Lynn M.; Dodds, Angela; Hebert, Randy S.; Zdaniuk, Bozena; Cook, Thomas B.
2010-01-01
Purpose: Assess psychometric properties of scales developed to assess experience and perception of physical, psychological, and existential suffering in older individuals. Design and Methods: Scales were administered to 3 populations of older persons and/or their family caregivers: individuals with Alzheimer's disease (AD) and their family…
Yoshino, Hiroyuki; Hara, Yuko; Dohi, Masafumi; Yamashita, Kazunari; Hakomori, Tadashi; Kimura, Shin-Ichiro; Iwao, Yasunori; Itai, Shigeru
2018-04-01
Scale-up approaches for film coating process have been established for each type of film coating equipment from thermodynamic and mechanical analyses for several decades. The objective of the present study was to establish a versatile scale-up approach for film coating process applicable to commercial production that is based on critical quality attribute (CQA) using the Quality by Design (QbD) approach and is independent of the equipment used. Experiments on a pilot scale using the Design of Experiment (DoE) approach were performed to find a suitable CQA from surface roughness, contact angle, color difference, and coating film properties by terahertz spectroscopy. Surface roughness was determined to be a suitable CQA from a quantitative appearance evaluation. When surface roughness was fixed as the CQA, the water content of the film-coated tablets was determined to be the critical material attribute (CMA), a parameter that does not depend on scale or equipment. Finally, to verify the scale-up approach determined from the pilot scale, experiments on a commercial scale were performed. The good correlation between the surface roughness (CQA) and the water content (CMA) identified at the pilot scale was also retained at the commercial scale, indicating that our proposed method should be useful as a scale-up approach for film coating process.
Fava, Joseph L.; Rosen, Rochelle K.; Vargas, Sara; Shaw, Julia G.; Kojic, E. Milu; Kiser, Patrick F.; Friend, David R.; Katz, David F.
2014-01-01
The effectiveness of any biomedical prevention technology relies on both biological efficacy and behavioral adherence. Microbicide trials have been hampered by low adherence, limiting the ability to draw meaningful conclusions about product effectiveness. Central to this problem may be an inadequate conceptualization of how product properties themselves impact user experience and adherence. Our goal is to expand the current microbicide development framework to include product “perceptibility,” the objective measurement of user sensory perceptions (i.e., sensations) and experiences of formulation performance during use. For vaginal gels, a set of biophysical properties, including rheological properties and measures of spreading and retention, may critically impact user experiences. Project LINK sought to characterize the user experience in this regard, and to validate measures of user sensory perceptions and experiences (USPEs) using four prototype topical vaginal gel formulations designed for pericoital use. Perceptibility scales captured a range of USPEs during the product application process (five scales), ambulation after product insertion (six scales), and during sexual activity (eight scales). Comparative statistical analyses provided empirical support for hypothesized relationships between gel properties, spreading performance, and the user experience. Project LINK provides preliminary evidence for the utility of evaluating USPEs, introducing a paradigm shift in the field of microbicide formulation design. We propose that these user sensory perceptions and experiences initiate cognitive processes in users resulting in product choice and willingness-to-use. By understanding the impact of USPEs on that process, formulation development can optimize both drug delivery and adherence. PMID:24180360
Detector Development for the MARE Neutrino Experiment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Galeazzi, M.; Bogorin, D.; Molina, R.
2009-12-16
The MARE experiment is designed to measure the mass of the neutrino with sub-eV sensitivity by measuring the beta decay of ^187Re with cryogenic microcalorimeters. A preliminary analysis shows that, to achieve the necessary statistics, between 10,000 and 50,000 detectors are likely necessary. We have fabricated and characterized iridium transition edge sensors with high reproducibility and uniformity for such a large-scale experiment. We have also started a full-scale simulation of the experimental setup for MARE, including thermalization in the absorber, detector response, and optimum filter analysis, to understand the issues related to reaching a sub-eV sensitivity and to optimize the design of the MARE experiment. We present our characterization of the Ir devices, including reproducibility, uniformity, and sensitivity, and we discuss the implementation and capabilities of our full-scale simulation.
Flume experimentation and simulation of bedrock channel processes
NASA Astrophysics Data System (ADS)
Thompson, Douglas; Wohl, Ellen
Flume experiments can provide cost-effective, physically manageable miniature representations of complex bedrock channels. The inherent change in scale in such experiments requires a corresponding change in the scale of the forces represented in the flume system. Three modeling approaches have been developed that either ignore the scaling effects, utilize the change in scaled forces, or assume similarity of process between scales. An understanding of the nonlinear influence of a change in scale on all the forces involved is important to correctly analyze model results. Similarly, proper design and operation of flume experiments requires knowledge of the fundamental components of flume systems. Entrance and exit regions of the flume are used to provide good experimental conditions in the measurement region, where data are collected. To ensure reproducibility, large-scale turbulence must be removed at the head of the flume, and velocity profiles must become fully developed in the entrance region. Water-surface slope and flow acceleration effects from downstream water-depth control must also be isolated in the exit region. Statistical design and the development of representative channel substrate also influence model results in these systems. With proper experimental design, flumes may be used to investigate bedrock channel hydraulics, sediment-transport relations, and morphologic evolution. In particular, researchers have successfully used flume experiments to demonstrate the importance of turbulence and substrate characteristics in bedrock channel evolution. Turbulence often operates in a self-perpetuating fashion, can erode bedrock walls even with clear water, and can increase the mobility of sediment particles. Bedrock substrate influences channel evolution by offering varying resistance to erosion, controlling the location or type of incision, and modifying the local influence of turbulence.
An increased use of scaled flume models may help to clarify the remaining uncertainties involving turbulence, channel substrate, and bedrock channel evolution.
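The "change in the scale of the forces" mentioned above is usually handled via a similarity criterion. The sketch below assumes Froude-number similarity, the standard rule for free-surface hydraulic models (an assumption here; the abstract does not name a specific criterion), under which velocity and time scale as the square root of the length ratio and discharge as its 5/2 power. All prototype numbers are invented.

```python
import math

def froude_scale(prototype, lam):
    """Scale prototype quantities to a flume model at length ratio lam = Lm/Lp.

    Under Froude similarity (Fr = U / sqrt(g*h) equal at both scales):
      velocity ~ lam**0.5, time ~ lam**0.5, discharge ~ lam**2.5.
    """
    return {
        "length_m": prototype["length_m"] * lam,
        "velocity_m_s": prototype["velocity_m_s"] * math.sqrt(lam),
        "time_s": prototype["time_s"] * math.sqrt(lam),
        "discharge_m3_s": prototype["discharge_m3_s"] * lam ** 2.5,
    }

# Hypothetical bedrock channel reach scaled to a 1:25 flume model.
river = {"length_m": 100.0, "velocity_m_s": 2.0,
         "time_s": 60.0, "discharge_m3_s": 40.0}
model = froude_scale(river, lam=1 / 25)
```

The steep exponent on discharge is one concrete reason miniaturization distorts some processes far more than others, as the scaling discussions above emphasize.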
NASA Astrophysics Data System (ADS)
Ji, H.; Bhattacharjee, A.; Prager, S.; Daughton, W. S.; Bale, S. D.; Carter, T. A.; Crocker, N.; Drake, J. F.; Egedal, J.; Sarff, J.; Wallace, J.; Chen, Y.; Cutler, R.; Fox, W. R., II; Heitzenroeder, P.; Kalish, M.; Jara-Almonte, J.; Myers, C. E.; Ren, Y.; Yamada, M.; Yoo, J.
2015-12-01
The FLARE device (flare.pppl.gov) is a new intermediate-scale plasma experiment under construction at Princeton to study magnetic reconnection in regimes directly relevant to space, solar and astrophysical plasmas. The existing small-scale experiments have been focusing on the single X-line reconnection process either with small effective sizes or at low Lundquist numbers, both of which are typically very large in natural plasmas. The configuration of the FLARE device is designed to provide experimental access to the new regimes involving multiple X-lines, as guided by a reconnection "phase diagram" [Ji & Daughton, PoP (2011)]. Most of the major components of the FLARE device have been designed and are under construction. The device will be assembled and installed in 2016, followed by commissioning and operation in 2017. The planned research on FLARE as a user facility will be discussed on topics including the multiple-scale nature of magnetic reconnection from global fluid scales to ion and electron kinetic scales. Results from scoping simulations based on particle and fluid codes and possible comparative research with space measurements will be presented.
NASA Astrophysics Data System (ADS)
Shi, Ao; Lu, Bo; Yang, Dangguo; Wang, Xiansheng; Wu, Junqiang; Zhou, Fangqi
2018-05-01
Coupling between aero-acoustic noise and structural vibration under high-speed open-cavity flow-induced oscillation can cause severe random vibration of the structure, and even fatigue failure, which threatens flight safety. Vibro-acoustic experiments on scaled-down models are an effective means to clarify the effects of high-intensity cavity noise on structural vibration. Therefore, for vibro-acoustic experiments on cavities in wind tunnels, taking a typical elastic cavity as the research object, dimensional analysis and the finite element method were adopted to establish similitude relations for the structural inherent characteristics and dynamics of a distorted model, and the proposed similitude relations were verified by experiments and numerical simulation. The research shows that, based on the analysis of the scaled-down model, the established similitude relations can accurately reproduce the structural dynamic characteristics of the actual model, which provides theoretical guidance for the structural design and vibro-acoustic experiments of scaled-down elastic cavity models.
NASA Astrophysics Data System (ADS)
Andreeva, J.; Dzhunov, I.; Karavakis, E.; Kokoszkiewicz, L.; Nowotka, M.; Saiz, P.; Tuckett, D.
2012-12-01
Improvements in web browser performance and web standards compliance, as well as the availability of comprehensive JavaScript libraries, provide an opportunity to develop functionally rich yet intuitive web applications that allow users to access, render and analyse data in novel ways. However, the development of such large-scale JavaScript web applications presents new challenges, in particular with regard to code sustainability and team-based work. We present an approach that meets the challenges of large-scale JavaScript web application design and development, including client-side model-view-controller architecture, design patterns, and JavaScript libraries. Furthermore, we show how the approach leads naturally to the encapsulation of the data source as a web API, allowing applications to be easily ported to new data sources. The Experiment Dashboard framework is used for the development of applications for monitoring the distributed computing activities of virtual organisations on the Worldwide LHC Computing Grid. We demonstrate the benefits of the approach for large-scale JavaScript web applications in this context by examining the design of several Experiment Dashboard applications for data processing, data transfer and site status monitoring, and by showing how they have been ported for different virtual organisations and technologies.
Binary optical filters for scale invariant pattern recognition
NASA Technical Reports Server (NTRS)
Reid, Max B.; Downie, John D.; Hine, Butler P.
1992-01-01
Binary synthetic discriminant function (BSDF) optical filters which are invariant to scale changes in the target object of more than 50 percent are demonstrated in simulation and experiment. Efficient databases of scale invariant BSDF filters can be designed which discriminate between two very similar objects at any view scaled over a factor of 2 or more. The BSDF technique has considerable advantages over other methods for achieving scale invariant object recognition, as it also allows determination of the object's scale. In addition to scale, the technique can be used to design recognition systems invariant to other geometric distortions.
Takase, Miyuki; Imai, Takiko; Uemura, Chizuru
2016-06-01
This paper examines the psychometric properties of the Learning Experience Scale. A survey method was used to collect data from a total of 502 nurses. Data were analyzed by factor analysis and the known-groups technique to examine the construct validity of the scale. In addition, internal consistency was evaluated by Cronbach's alpha, and stability was examined by test-retest correlation. Factor analysis showed that the Learning Experience Scale consisted of five factors: learning from practice, others, training, feedback, and reflection. The scale also had the power to discriminate between nurses with high and low levels of nursing competence. The internal consistency and the stability of the scale were also acceptable. The Learning Experience Scale is a valid and reliable instrument, and helps organizations to effectively design learning interventions for nurses. © 2015 Wiley Publishing Asia Pty Ltd.
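The internal-consistency statistic reported above, Cronbach's alpha, is straightforward to reproduce. A minimal sketch with made-up item scores (the Learning Experience Scale's actual data are of course not shown in the abstract):

```python
import numpy as np

def cronbach_alpha(scores):
    """scores: respondents x items. Classical alpha:
    alpha = k/(k-1) * (1 - sum of item variances / variance of total score)."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]
    item_vars = scores.var(axis=0, ddof=1).sum()   # per-item sample variances
    total_var = scores.sum(axis=1).var(ddof=1)     # variance of summed scores
    return (k / (k - 1)) * (1.0 - item_vars / total_var)

# Hypothetical 5 respondents x 3 items (1-5 Likert responses).
ratings = [[3, 4, 3],
           [4, 5, 4],
           [2, 3, 2],
           [5, 5, 4],
           [1, 2, 2]]
alpha = cronbach_alpha(ratings)  # high (~0.97) for these strongly correlated items
```

Higher alpha indicates that items vary together, which is what "acceptable internal consistency" in the abstract refers to.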
Micro-Bubble Experiments at the Van de Graaff Accelerator
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sun, Z. J.; Wardle, Kent E.; Quigley, K. J.
In order to test and verify the experimental designs at the linear accelerator (LINAC), several micro-scale bubble ("micro-bubble") experiments were conducted with the 3-MeV Van de Graaff (VDG) electron accelerator. The experimental setups included a square quartz tube, sodium bisulfate solution with different concentrations, cooling coils, a gas chromatography (GC) system, raster magnets, and two high-resolution cameras that were controlled by a LabVIEW program. Different beam currents were applied in the VDG irradiation. Bubble generation (radiolysis), thermal expansion, thermal convection, and radiation damage were observed in the experiments. Photographs, videos, and gas formation (O2 + H2) data were collected. The micro-bubble experiments at VDG indicate that the design of the full-scale bubble experiments at the LINAC is reasonable.
Similarity Rules for Scaling Solar Sail Systems
NASA Technical Reports Server (NTRS)
Canfield, Stephen L.; Peddieson, John; Garbe, Gregory
2010-01-01
Future science missions will require solar sails on the order of 200 square meters (or larger). However, ground demonstrations and flight demonstrations must be conducted at significantly smaller sizes, due to limitations of ground-based facilities and cost and availability of flight opportunities. For this reason, the ability to understand the process of scalability, as it applies to solar sail system models and test data, is crucial to the advancement of this technology. This paper will approach the problem of scaling in solar sail models by developing a set of scaling laws or similarity criteria that will provide constraints in the sail design process. These scaling laws establish functional relationships between design parameters of a prototype and model sail that are created at different geometric sizes. This work is applied to a specific solar sail configuration and results in three (four) similarity criteria for static (dynamic) sail models. Further, it is demonstrated that even in the context of unique sail material requirements and gravitational load of earth-bound experiments, it is possible to develop appropriate scaled sail experiments. In the longer term, these scaling laws can be used in the design of scaled experimental tests for solar sails and in analyzing the results from such tests.
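The abstract does not state the similarity criteria themselves, but the mechanics of such scaling laws can be sketched: a dimensionless group formed from design parameters must take the same value for prototype and model. The group below (membrane load over stiffness, Pi = pL/(Eh)) and all numbers are purely illustrative assumptions, not the paper's criteria:

```python
def pi_group(load, length, modulus, thickness):
    """Illustrative dimensionless group: Pi = p*L / (E*h)."""
    return load * length / (modulus * thickness)

def similar(proto, model, rel_tol=1e-9):
    """A model is similar to the prototype (w.r.t. this group)
    when the two Pi values agree."""
    p = pi_group(**proto)
    return abs(p - pi_group(**model)) <= rel_tol * abs(p)

# Hypothetical prototype sail and a 10x geometrically smaller model:
# with the same material (modulus) and load, preserving Pi forces the
# membrane thickness to shrink by the same factor of 10.
prototype = dict(load=9.1e-6, length=200.0, modulus=3.5e9, thickness=7.6e-6)
scaled = dict(load=9.1e-6, length=20.0, modulus=3.5e9, thickness=0.76e-6)
ok = similar(prototype, scaled)
```

This is exactly the kind of constraint the paper derives: the scaling law dictates which model parameters may be chosen freely and which are fixed by similarity.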
The Variance of Intraclass Correlations in Three- and Four-Level Models
ERIC Educational Resources Information Center
Hedges, Larry V.; Hedberg, E. C.; Kuyper, Arend M.
2012-01-01
Intraclass correlations are used to summarize the variance decomposition in populations with multilevel hierarchical structure. There has recently been considerable interest in estimating intraclass correlations from surveys or designed experiments to provide design parameters for planning future large-scale randomized experiments. The large…
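The intraclass correlation at the heart of these planning calculations is, for a balanced two-level design, estimable from one-way ANOVA mean squares. A self-contained sketch with toy data (the paper concerns three- and four-level models and the *variance* of such estimates, which goes beyond this fragment):

```python
import numpy as np

def icc_anova(groups):
    """ANOVA estimator of the intraclass correlation for a balanced
    two-level design: rho = (MSB - MSW) / (MSB + (n-1)*MSW),
    with k groups of n observations each."""
    groups = np.asarray(groups, dtype=float)
    k, n = groups.shape
    grand = groups.mean()
    means = groups.mean(axis=1)
    msb = n * ((means - grand) ** 2).sum() / (k - 1)          # between-group MS
    msw = ((groups - means[:, None]) ** 2).sum() / (k * (n - 1))  # within-group MS
    return (msb - msw) / (msb + (n - 1) * msw)

# Toy data: 3 clusters (e.g. schools), 2 observations each.
rho = icc_anova([[1.0, 2.0],
                 [3.0, 4.0],
                 [5.0, 6.0]])   # strong between-cluster separation -> rho near 1
```

In power analyses for cluster-randomized experiments, rho determines the design effect, which is why estimates of it (and their uncertainty) are the "design parameters" the abstract mentions.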
Bench-scale microcosm experiments were designed to provide a better understanding of the potential for Hg methylation in sediments from an aquatic environment. Experiments were conducted to examine the function of sulfate concentration, lactate concentration, the presence/absenc...
ERIC Educational Resources Information Center
Konstantopoulos, Spyros
2013-01-01
Large-scale experiments that involve nested structures may assign treatment conditions either to subgroups such as classrooms or to individuals such as students within subgroups. Key aspects of the design of such experiments include knowledge of the variance structure in higher levels and the sample sizes necessary to reach sufficient power to…
Coal desulfurization by low temperature chlorinolysis, phase 1
NASA Technical Reports Server (NTRS)
Kalvinskas, J. J.; Hsu, G. C.; Ernest, J. B.; Andress, D. F.; Feller, D. R.
1977-01-01
The reported activity covers laboratory scale experiments on twelve bituminous, sub-bituminous and lignite coals, and preliminary design and specifications for bench-scale and mini-pilot plant equipment.
ERIC Educational Resources Information Center
Caprara, Gian Vittorio; Alessandri, Guido; Eisenberg, Nancy; Kupfer, A.; Steca, Patrizia; Caprara, Maria Giovanna; Yamaguchi, Susumu; Fukuzawa, Ai; Abela, John
2012-01-01
Five studies document the validity of a new 8-item scale designed to measure "positivity," defined as the tendency to view life and experiences with a positive outlook. In the first study (N = 372), the psychometric properties of Positivity Scale (P Scale) were examined in accordance with classical test theory using a large number of…
NASA Astrophysics Data System (ADS)
Amann, Florian; Gischig, Valentin; Evans, Keith; Doetsch, Joseph; Jalali, Reza; Valley, Benoît; Krietsch, Hannes; Dutler, Nathan; Villiger, Linus; Brixel, Bernard; Klepikova, Maria; Kittilä, Anniina; Madonna, Claudio; Wiemer, Stefan; Saar, Martin O.; Loew, Simon; Driesner, Thomas; Maurer, Hansruedi; Giardini, Domenico
2018-02-01
In this contribution, we present a review of scientific research results that address seismo-hydromechanically coupled processes relevant for the development of a sustainable heat exchanger in low-permeability crystalline rock and introduce the design of the In situ Stimulation and Circulation (ISC) experiment at the Grimsel Test Site dedicated to studying such processes under controlled conditions. The review shows that research on reservoir stimulation for deep geothermal energy exploitation has been largely based on laboratory observations, large-scale projects and numerical models. Observations of full-scale reservoir stimulations have yielded important results. However, limited access to the reservoir and limited control over the experimental conditions during deep reservoir stimulations are insufficient to resolve the details of the hydromechanical processes that would enhance process understanding in a way that aids future stimulation design. Small-scale laboratory experiments provide fundamental insights into various processes relevant for enhanced geothermal energy, but suffer from (1) difficulties and uncertainties in upscaling the results to the field scale and (2) relatively homogeneous material and stress conditions that lead to an oversimplified fracture flow and/or hydraulic fracture propagation behavior that is not representative of a heterogeneous reservoir. Thus, there is a need for intermediate-scale hydraulic stimulation experiments with high experimental control that bridge the various scales and for which access to the target rock mass with a comprehensive monitoring system is possible. The ISC experiment is designed to address open research questions in a naturally fractured and faulted crystalline rock mass at the Grimsel Test Site (Switzerland). Two hydraulic injection phases were executed to enhance the permeability of the rock mass.
During the injection phases the rock mass deformation across fractures and within intact rock, the pore pressure distribution and propagation, and the microseismic response were monitored at a high spatial and temporal resolution.
Direct Down-scale Experiments of Concentration Column Designs for SHINE Process
DOE Office of Scientific and Technical Information (OSTI.GOV)
Youker, Amanda J.; Stepinski, Dominique C.; Vandegrift, George F.
Argonne is assisting SHINE Medical Technologies in their efforts to become a domestic Mo-99 producer. The SHINE accelerator-driven process uses a uranyl-sulfate target solution for the production of fission-product Mo-99. Argonne has developed a molybdenum recovery and purification process for this target solution. The process includes an initial Mo recovery column followed by a concentration column to reduce the product volume from 15-25 L to < 1 L prior to entry into the LEU Modified Cintichem (LMC) process for purification [1]. This report discusses direct down-scale experiments of the plant-scale concentration column design, where the effects of loading velocity and temperature were investigated.
Flexible twist for pitch control in a high altitude long endurance aircraft with nonlinear response
NASA Astrophysics Data System (ADS)
Bond, Vanessa L.
Information dominance is the key motivator for employing high-altitude long-endurance (HALE) aircraft to provide continuous coverage in the theaters of operation. A joined-wing configuration of such a craft gives the advantage of a platform for higher resolution sensors. Design challenges emerge with the structural flexibility that arises from a long-endurance aircraft design. The goal of this research was to demonstrate that scaling the nonlinear response of a full-scale finite element model was possible if the model was aeroelastically and "nonlinearly" scaled. The research within this dissertation showed that using the first three modes and the first buckling modes was not sufficient for proper scaling. In addition to analytical scaling, several experiments were accomplished to understand and overcome design challenges of HALE aircraft. One such challenge is combated by eliminating pitch control surfaces and replacing them with an aft-wing twist concept. This design option was physically realized through wind tunnel measurement of forces, moments and pressures on a subscale experimental model. This design and experiment demonstrated that pitch control with aft-wing twist is feasible. Another challenge is predicting the nonlinear response of long-endurance aircraft. This was addressed by experimental validation of modeling nonlinear response on a subscale experimental model. It is important to be able to scale nonlinear behavior in this type of craft due to its highly flexible nature. The validation accomplished during this experiment on a subscale model will reduce technical risk for full-scale development of such pioneering craft. It is also important to experimentally reproduce the air loads following the wing as it deforms. Nonlinearities can be attributed to these follower forces that might otherwise be overlooked. This was found to be a significant influence in HALE aircraft to include the case study of the FEM and experimental models herein.
Nuclear Power Plant Mechanical Component Flooding Fragility Experiments Status
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pope, C. L.; Savage, B.; Johnson, B.
This report describes progress on Nuclear Power Plant mechanical component flooding fragility experiments and supporting research. The progress includes execution of full scale fragility experiments using hollow-core doors, design of improvements to the Portal Evaluation Tank, equipment procurement and initial installation of PET improvements, designation of experiments exploiting the improved PET capabilities, fragility mathematical model development, Smoothed Particle Hydrodynamic simulations, wave impact simulation device research, and pipe rupture mechanics research.
Geotechnical Centrifuge Experiments to Evaluate Piping in Foundation Soils
2014-05-01
verifiable results. These tests were successful in design, construction, and execution of a realistic simulation of internal erosion leading to failure... possible "scale effects," "modeling of models" testing protocol should be included in the test program. Also, the model design should minimize the scale... recommendations for improving the centrifuge tests include the following: • Design improved system for reservoir control to provide definitive and
ERIC Educational Resources Information Center
Hedberg, E. C.; Hedges, Larry V.
2014-01-01
Randomized experiments are often considered the strongest designs to study the impact of educational interventions. Perhaps the most prevalent class of designs used in large scale education experiments is the cluster randomized design in which entire schools are assigned to treatments. In cluster randomized trials (CRTs) that assign schools to…
Designing an External Evaluation of a Large-Scale Software Development Project.
ERIC Educational Resources Information Center
Collis, Betty; Moonen, Jef
This paper describes the design and implementation of the evaluation of the POCO Project, a large-scale national software project in the Netherlands which incorporates the perspective of an evaluator throughout the entire span of the project, and uses the experiences gained from it to suggest an evaluation procedure that could be applied to other…
The Design of PSB-VVER Experiments Relevant to Accident Management
NASA Astrophysics Data System (ADS)
Nevo, Alessandro Del; D'Auria, Francesco; Mazzini, Marino; Bykov, Michael; Elkin, Ilya V.; Suslov, Alexander
Experimental programs carried out in integral test facilities are relevant for validating the best-estimate thermal-hydraulic codes [1], which are used for accident analyses, design of accident management procedures, licensing of nuclear power plants, etc. The validation process, in fact, is based on well-designed experiments. It consists of the comparison of the measured and calculated parameters and the determination of whether a computer code has an adequate capability in predicting the major phenomena expected to occur in the course of transients and/or accidents. The University of Pisa was responsible for the numerical design of the 12 experiments executed in the PSB-VVER facility [2], operated at the Electrogorsk Research and Engineering Center (Russia), in the framework of the TACIS 2.03/97 Contract 3.03.03 Part A, EC financed [3]. The paper describes the methodology adopted at the University of Pisa, starting from the scenarios foreseen in the final test matrix until the execution of the experiments. This process considers three key topics: a) the scaling issue and the simulation, with unavoidable distortions, of the expected performance of the reference nuclear power plants; b) the code assessment process, involving the identification of phenomena challenging the code models; c) the features of the concerned integral test facility (scaling limitations, control logics, data acquisition system, instrumentation, etc.). The activities performed in this respect are discussed, and emphasis is also given to the relevance of the thermal losses to the environment. This issue particularly affects small-scale facilities and has relevance for the scaling approach related to the power and volume of the facility.
ASTP fluid transfer measurement experiment. [using breadboard model
NASA Technical Reports Server (NTRS)
Fogal, G. L.
1974-01-01
The ASTP fluid transfer measurement experiment flight system design concept was verified by the demonstration and test of a breadboard model. In addition to the breadboard effort, a conceptual design of the corresponding flight system was generated and a full scale mockup fabricated. A preliminary CEI specification for the flight system was also prepared.
Small-Scale Design Experiments as Working Space for Larger Mobile Communication Challenges
ERIC Educational Resources Information Center
Lowe, Sarah; Stuedahl, Dagny
2014-01-01
In this paper, a design experiment using Instagram as a cultural probe is submitted as a method for analyzing the challenges that arise when considering the implementation of social media within a distributed communication space. It outlines how small, iterative investigations can reveal deeper research questions relevant to the education of…
Aerodynamic Simulation of the MARINTEK Braceless Semisubmersible Wave Tank Tests
NASA Astrophysics Data System (ADS)
Stewart, Gordon; Muskulus, Michael
2016-09-01
Model scale experiments of floating offshore wind turbines are important both for platform design in industry and for numerical model validation in the research community. An important consideration in the wave tank testing of offshore wind turbines is scaling effects, especially the tension between accurate scaling of the hydrodynamic and the aerodynamic forces. The recent MARINTEK braceless semisubmersible wave tank experiment utilizes a novel aerodynamic force actuator to decouple the scaling of the aerodynamic forces. This actuator consists of an array of motors that pull on cables to provide aerodynamic forces that are calculated by a blade-element momentum code in real time as the experiment is conducted. This type of system has the advantage of supplying realistically scaled aerodynamic forces that include dynamic forces from platform motion, but does not provide the insights into the accuracy of the aerodynamic models that an actual model-scale rotor could provide. The modeling of this system presents an interesting challenge, as there are two ways to simulate the aerodynamics: either by using the turbulent wind fields as inputs to the aerodynamic model of the design code, or by bypassing the aerodynamic model and using the forces applied to the experimental turbine as direct inputs to the simulation. This paper investigates the best practices of modeling this type of novel aerodynamic actuator using a modified wind turbine simulation tool, and demonstrates that bypassing the dynamic aerodynamics solver of design codes can lead to erroneous results.
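The key idea of the force actuator, computing the aerodynamic load from the instantaneous relative wind as the platform moves, can be illustrated with a quasi-steady actuator-disc thrust in place of the real blade-element momentum code. All numbers and the fixed thrust coefficient are hypothetical simplifications:

```python
def quasi_steady_thrust(wind_speed, platform_velocity,
                        rho=1.225, rotor_area=100.0, ct=0.8):
    """Thrust from the relative wind seen by the rotor:
    T = 0.5 * rho * A * Ct * (U - x_dot)**2.
    Surge into the wind (negative x_dot) raises the relative
    speed and therefore the load the actuator must apply."""
    u_rel = wind_speed - platform_velocity
    return 0.5 * rho * rotor_area * ct * u_rel ** 2

still = quasi_steady_thrust(10.0, 0.0)     # platform at rest
surging = quasi_steady_thrust(10.0, -2.0)  # platform surging upwind at 2 m/s
```

The real system replaces this one-liner with a BEM solve at each time step, but the coupling it captures is the same: platform motion feeds back into the applied aerodynamic force.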
Global Modeling, Field Campaigns, Upscaling and Ray Desjardins
NASA Technical Reports Server (NTRS)
Sellers, P. J.; Hall, F. G.
2012-01-01
In the early 1980's, it became apparent that land surface radiation and energy budgets were unrealistically represented in Global Circulation Models (GCMs). Shortly thereafter, it became clear that the land carbon budget was also poorly represented in Earth System Models (ESMs). A number of scientific communities, including GCM/ESM modelers, micrometeorologists, satellite data specialists and plant physiologists, came together to design field experiments that could be used to develop and validate the contemporary prototype land surface models. These experiments were designed to measure land surface fluxes of radiation, heat, water vapor and CO2 using a network of flux towers and other plot-scale techniques, coincident with satellite measurements of related state variables. The interdisciplinary teams involved in these experiments quickly became aware of the scale gap between plot-scale measurements (approx. 10-100 m), satellite measurements (100 m - 10 km), and GCM grid areas (10-200 km). At the time, there was no established flux measurement capability to bridge these scale gaps. Then, a Canadian science team led by Ray Desjardins started to actively participate in the design and execution of the experiments, with airborne eddy correlation providing the radically innovative bridge across the scale gaps. In a succession of brilliantly executed field campaigns followed up by convincing scientific analyses, they demonstrated that airborne eddy correlation allied with satellite data was the most powerful upscaling tool available to the community. The rest is history: the realism and credibility of weather and climate models have improved enormously over the last 25 years, with immense benefits to the public and policymakers.
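The flux measurement technique at the centre of this account, eddy correlation (eddy covariance), reduces to the mean product of the fluctuating parts of vertical wind speed and the transported scalar. A minimal sketch with a synthetic record:

```python
import numpy as np

def eddy_flux(w, c):
    """Eddy-covariance flux: F = mean(w' * c'), where primes denote
    deviations from the record mean (w: vertical wind, c: scalar
    concentration, e.g. CO2 or water vapor)."""
    w = np.asarray(w, dtype=float)
    c = np.asarray(c, dtype=float)
    return float(np.mean((w - w.mean()) * (c - c.mean())))

# Synthetic record: updrafts (w > 0) systematically carry higher
# concentration, so the covariance (an upward flux) is positive.
w = np.array([1.0, -1.0, 1.0, -1.0])
c = np.array([2.0, 0.0, 2.0, 0.0])
flux = eddy_flux(w, c)   # -> 1.0 for this record
```

Mounting the same computation on aircraft-borne sensors is what let the Desjardins team integrate fluxes along flight lines and so bridge the plot-to-grid scale gap.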
Aerodynamic design of the National Rotor Testbed.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kelley, Christopher Lee
2015-10-01
A new wind turbine blade has been designed for the National Rotor Testbed (NRT) project and for future experiments at the Scaled Wind Farm Technology (SWiFT) facility with a specific focus on scaled wakes. This report shows the aerodynamic design of new blades that can produce a wake that has similitude to utility-scale blades despite the difference in size and location in the atmospheric boundary layer. The dimensionless quantities circulation, induction, thrust coefficient, and tip-speed ratio were kept equal between rotor scales in region 2 of operation. The new NRT design matched the aerodynamic quantities of the most common wind turbine in the United States, the GE 1.5sle turbine with 37c model blades. The NRT blade design is presented along with its performance subject to the winds at SWiFT. The design requirements determined by the SWiFT experimental test campaign are shown to be met.
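Holding the tip-speed ratio equal across rotor scales, as the report describes, fixes the model rotor speed once the wind speed and radius are chosen. A short sketch of that matching; the numbers are hypothetical stand-ins, not the NRT or GE 1.5sle values:

```python
def tip_speed_ratio(omega, radius, wind_speed):
    """Tip-speed ratio: lambda = Omega * R / U (Omega in rad/s)."""
    return omega * radius / wind_speed

def matched_rotor_speed(lambda_target, radius, wind_speed):
    """Rotor speed that reproduces a target tip-speed ratio at the
    model scale: Omega = lambda * U / R."""
    return lambda_target * wind_speed / radius

# Hypothetical full-scale rotor: R = 38.5 m, Omega = 1.45 rad/s, U = 8 m/s.
lam_full = tip_speed_ratio(1.45, 38.5, 8.0)
# A smaller rotor (R = 13.5 m) in the same wind must spin faster
# to keep lambda, and hence the induction and thrust coefficient, equal.
omega_model = matched_rotor_speed(lam_full, 13.5, 8.0)
lam_model = tip_speed_ratio(omega_model, 13.5, 8.0)
```

Matching lambda (and thereby induction and thrust coefficient in region 2) is what gives the scaled wake its similitude to the utility-scale wake.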
Multi Length Scale Finite Element Design Framework for Advanced Woven Fabrics
NASA Astrophysics Data System (ADS)
Erol, Galip Ozan
Woven fabrics are integral parts of many engineering applications spanning from personal protective garments to surgical scaffolds. They provide a wide range of opportunities in designing advanced structures because of their high tenacity, flexibility, high strength-to-weight ratios and versatility. These advantages result from their inherent multi scale nature where the filaments are bundled together to create yarns while the yarns are arranged into different weave architectures. Their highly versatile nature opens up potential for a wide range of mechanical properties which can be adjusted based on the application. While woven fabrics are viable options for design of various engineering systems, being able to understand the underlying mechanisms of the deformation and associated highly nonlinear mechanical response is important and necessary. However, the multiscale nature and relationships between these scales make the design process involving woven fabrics a challenging task. The objective of this work is to develop a multiscale numerical design framework using experimentally validated mesoscopic and macroscopic length scale approaches by identifying important deformation mechanisms and recognizing the nonlinear mechanical response of woven fabrics. This framework is exercised by developing mesoscopic length scale constitutive models to investigate plain weave fabric response under a wide range of loading conditions. A hyperelastic transversely isotropic yarn material model with transverse material nonlinearity is developed for woven yarns (commonly used in personal protection garments). The material properties/parameters are determined through an inverse method where unit cell finite element simulations are coupled with experiments. The developed yarn material model is validated by simulating full scale uniaxial tensile, bias extension and indentation experiments, and comparing to experimentally observed mechanical response and deformation mechanisms. 
Moreover, mesoscopic unit cell finite elements are coupled with a design-of-experiments method to systematically identify the important yarn material properties for the macroscale response of various weave architectures. To demonstrate the macroscopic length scale approach, two new material models for woven fabrics were developed. The Planar Material Model (PMM) utilizes two important deformation mechanisms in woven fabrics: (1) yarn elongation, and (2) relative yarn rotation due to shear loads. The yarns' uniaxial tensile response is modeled with a nonlinear spring using constitutive relations while a nonlinear rotational spring is implemented to define fabric's shear stiffness. The second material model, Sawtooth Material Model (SMM) adopts the sawtooth geometry while recognizing the biaxial nature of woven fabrics by implementing the interactions between the yarns. Material properties/parameters required by both PMM and SMM can be directly determined from standard experiments. Both macroscopic material models are implemented within an explicit finite element code and validated by comparing to the experiments. Then, the developed macroscopic material models are compared under various loading conditions to determine their accuracy. Finally, the numerical models developed in the mesoscopic and macroscopic length scales are linked thus demonstrating the new systematic design framework involving linked mesoscopic and macroscopic length scale modeling approaches. The approach is demonstrated with both Planar and Sawtooth Material Models and the simulation results are verified by comparing the results obtained from meso and macro models.
The KATRIN experiment is designed to make a direct measurement of the neutrino mass, scaled up by an order of magnitude in size, precision and tritium source intensity from previous experiments. Visit the experiment home page for more information.
ERIC Educational Resources Information Center
Tharayil, Davis Porinchu
2012-01-01
As the existing scales to measure loneliness are almost all Western and there is no single scale developed cross-culturally for this purpose, this study is designed to develop a reliable and valid scale to measure the experience of loneliness of individuals from individualistic or collectivistic cultures. There are three samples for this study…
High-speed inlet research program and supporting analysis
NASA Technical Reports Server (NTRS)
Coltrin, Robert E.
1990-01-01
The technology challenges faced by the high speed inlet designer are discussed by describing the considerations that went into the design of the Mach 5 research inlet. It is shown that the emerging three dimensional viscous computational fluid dynamics (CFD) flow codes, together with small scale experiments, can be used to guide larger scale full inlet systems research. Then, in turn, the results of the large scale research, if properly instrumented, can be used to validate or at least to calibrate the CFD codes.
NASA Technical Reports Server (NTRS)
Avizienis, A.; Gunningberg, P.; Kelly, J. P. J.; Strigini, L.; Traverse, P. J.; Tso, K. S.; Voges, U.
1986-01-01
To establish a long-term research facility for experimental investigations of design diversity as a means of achieving fault-tolerant systems, a distributed testbed for multiple-version software was designed. It is part of a local network, which utilizes the Locus distributed operating system to operate a set of 20 VAX 11/750 computers. It is used in experiments to measure the efficacy of design diversity and to investigate reliability increases under large-scale, controlled experimental conditions.
The POLARBEAR Experiment: Design and Characterization
NASA Astrophysics Data System (ADS)
Kermish, Zigmund David
We present the design and characterization of the POLARBEAR experiment.
Alshammasi, Hussain; Buchanan, Heather; Ashley, Paul
2018-01-01
Assessing anxiety is an important part of the assessment of a child presenting for dental treatment; however, the use of dental anxiety scales in practice is not well-documented. To introduce child dental anxiety scales, and to monitor the extent to which dentists used them; to explore the experience and views of dentists regarding anxiety assessment. A mixed-methods design was employed. A protocol for child anxiety assessment was introduced to paediatric dentists in Eastman Dental Hospital. After 6 months, 100 patient files were audited to examine compliance with the protocol. Fourteen dentists were interviewed to explore their experience and views regarding anxiety assessment. Only five patients were assessed using the scales. Thematic analysis of the dentist interviews revealed three themes: 'Clinical observations and experience: The gold standard'; 'Scales as an estimate or adjunct'; and 'Shortcomings and barriers to using scales'. The dentists in our study did not use anxiety scales, instead they rely on their own experience/judgement. Therefore, scales should be recommended as an adjunct to judgement. Brief scales are recommended as clinicians lack time and expertise in administering anxiety questionnaires. Advantages of using scales and hands-on experience could be incorporated more in undergraduate training. © 2017 BSPD, IAPD and John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
Chernicharo, C A L; Almeida, P G S; Lobato, L C S; Couto, T C; Borges, J M; Lacerda, Y S
2009-01-01
This paper discusses the main drawbacks and enhancements experienced with the design and start-up of two full-scale UASB plants in Brazil. The topics addressed are related to blockage of inlet pipes, scum accumulation, seed sludge for the start-up, corrosion and gas leakage, odour generation, and sludge management. The paper describes the main improvements achieved.
Straight scaling FFAG beam line
NASA Astrophysics Data System (ADS)
Lagrange, J.-B.; Planche, T.; Yamakawa, E.; Uesugi, T.; Ishi, Y.; Kuriyama, Y.; Qin, B.; Okabe, K.; Mori, Y.
2012-11-01
Fixed field alternating gradient (FFAG) accelerators have recently been the subject of a strong revival. They are usually designed in a circular shape; however, it would be an asset to guide particles with no overall bend in this type of accelerator. An analytical development of a straight FFAG cell that preserves zero chromaticity is presented here. A magnetic field law is thus obtained, called the "straight scaling law", and an experiment has been conducted to confirm this zero-chromatic law. A straight scaling FFAG prototype has been designed and manufactured, and the horizontal phase advances of two different energies are measured. Results are analyzed to clarify the straight scaling law.
NASA Astrophysics Data System (ADS)
Ji, H.; Bhattacharjee, A.; Goodman, A.; Prager, S.; Daughton, W.; Cutler, R.; Fox, W.; Hoffmann, F.; Kalish, M.; Kozub, T.; Jara-Almonte, J.; Myers, C.; Ren, Y.; Sloboda, P.; Yamada, M.; Yoo, J.; Bale, S. D.; Carter, T.; Dorfman, S.; Drake, J.; Egedal, J.; Sarff, J.; Wallace, J.
2017-10-01
The FLARE device (Facility for Laboratory Reconnection Experiments; flare.pppl.gov) is a new laboratory experiment under construction at Princeton with first plasmas expected in the fall of 2017, based on the design of the Magnetic Reconnection Experiment (MRX; mrx.pppl.gov) with much extended parameter ranges. Its main objective is to provide an experimental platform for the studies of magnetic reconnection and related phenomena in the multiple X-line regimes directly relevant to space, solar, astrophysical and fusion plasmas. The main diagnostic is an extensive set of magnetic probe arrays, simultaneously covering multiple scales from local electron scales (~2 mm), to intermediate ion scales (~10 cm), and global MHD scales (~1 m). Specific example space physics topics which can be studied on FLARE will be discussed.
ERIC Educational Resources Information Center
Smith, York R.; Fuchs, Alan; Meyyappan, M.
2010-01-01
Senior year chemical engineering students designed a process to produce 10 000 tonnes per annum of single wall carbon nanotubes (SWNT) and also conducted bench-top experiments to synthesize SWNTs via fluidized bed chemical vapor deposition techniques. This was an excellent pedagogical experience because it related to the type of real world design…
ERIC Educational Resources Information Center
Floyd, Randy G.; Shands, Elizabeth I.; Alfonso, Vincent C.; Phillips, Jessica F.; Autry, Beth K.; Mosteller, Jessica A.; Skinner, Mary; Irby, Sarah
2015-01-01
Adaptive behavior scales are vital in assessing children and adolescents who experience a range of disabling conditions in school settings. This article presents the results of an evaluation of the design characteristics, norming, scale characteristics, reliability and validity evidence, and bias identification studies supporting 14…
Unmanned Vehicle Material Flammability Test
NASA Technical Reports Server (NTRS)
Urban, David; Ruff, Gary A.; Fernandez-Pello, A. Carlos; T’ien, James S.; Torero, Jose L.; Cowlard, Adam; Rouvreau, Sebastian; Minster, Olivier; Toth, Balazs; Legros, Guillaume;
2013-01-01
Microgravity combustion phenomena have been an active area of research for the past three decades; however, there have been very few experiments directly studying spacecraft fire safety under low-gravity conditions. Furthermore, none of these experiments has studied sample and environment sizes typical of those expected in a spacecraft fire. All previous experiments have been limited to samples on the order of 10 cm in length and width or smaller. Terrestrial fire safety standards for all other habitable volumes on Earth (e.g., mines, buildings, airplanes, ships) are based upon testing conducted with full-scale fires. Given the large differences between fire behavior in normal and reduced gravity, this lack of an experimental database at relevant length scales forces spacecraft designers to base their designs on 1-g understanding. To address this gap, a large-scale spacecraft fire experiment has been proposed by an international team of investigators. This poster presents the objectives, status, and concept of this collaborative international project to examine spacecraft material flammability at realistic scales. The concept behind this project is to utilize an unmanned spacecraft, such as the Orbital Cygnus vehicle, after it has completed its delivery of cargo to the ISS and has begun its return journey to Earth. The experiment will consist of a flame spread test involving a meter-scale sample ignited in the pressurized volume of the spacecraft and allowed to burn to completion while measurements are made. A computer modeling effort will complement the experimental effort. Although the experiment will need to meet rigorous safety requirements to ensure the carrier vehicle does not sustain damage, the absence of a crew removes the need for strict containment of combustion products. This will facilitate the examination of fire behavior on a scale that is relevant to spacecraft fire safety and will provide unique data for fire model validation.
This will be the first opportunity to examine microgravity flame behavior at scales approximating a spacecraft fire.
Comparison of batch sorption tests, pilot studies, and modeling for estimating GAC bed life.
Scharf, Roger G; Johnston, Robert W; Semmens, Michael J; Hozalski, Raymond M
2010-02-01
Saint Paul Regional Water Services (SPRWS) in Saint Paul, MN experiences annual taste and odor episodes during the warm summer months. These episodes are attributed primarily to geosmin that is produced by cyanobacteria growing in the chain of lakes used to convey and store the source water pumped from the Mississippi River. Batch experiments, pilot-scale experiments, and model simulations were performed to determine the geosmin removal performance and bed life of a granular activated carbon (GAC) filter-sorber. Using batch adsorption isotherm parameters, the estimated bed life for the GAC filter-sorber ranged from 920 to 1241 days when challenged with a constant concentration of 100 ng/L of geosmin. The estimated bed life obtained using the AdDesignS model and the actual pilot-plant loading history was 594 days. Based on the pilot-scale GAC column data, the actual bed life (>714 days) was much longer than the simulated values because bed life was extended by biological degradation of geosmin. The continuous feeding of high concentrations of geosmin (100-400 ng/L) in the pilot-scale experiments enriched for a robust geosmin-degrading culture that was sustained when the geosmin feed was turned off for 40 days. It is unclear, however, whether a geosmin-degrading culture can be established in a full-scale filter that experiences taste and odor episodes for only 1 or 2 months per year. The results of this research indicate that care must be exercised in the design and interpretation of pilot-scale experiments and model simulations for predicting taste and odor removal in full-scale GAC filter-sorbers. Adsorption and the potential for biological degradation must be considered to estimate GAC bed life for the conditions of intermittent geosmin loading typically experienced by full-scale systems. (c) 2009 Elsevier Ltd. All rights reserved.
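The adsorption-only bed-life estimate from batch isotherm parameters described above can be sketched as a simple mass balance. A minimal sketch, assuming a Freundlich isotherm; the isotherm constants, bed mass, flow rate, and influent concentration below are hypothetical placeholders, not SPRWS or study values:

```python
# Illustrative GAC bed-life estimate from batch isotherm parameters.
# All parameter values below are hypothetical, not data from the study.

def freundlich_capacity(c_ng_per_L, K, inv_n):
    """Equilibrium capacity q (ng geosmin per mg GAC) at concentration c."""
    return K * c_ng_per_L ** inv_n

def bed_life_days(c0_ng_per_L, flow_L_per_day, gac_mass_mg, K, inv_n):
    """Ideal (adsorption-only) bed life: time to exhaust equilibrium capacity."""
    q = freundlich_capacity(c0_ng_per_L, K, inv_n)   # ng/mg at influent concentration
    total_capacity_ng = q * gac_mass_mg              # total geosmin the bed can hold
    loading_ng_per_day = c0_ng_per_L * flow_L_per_day
    return total_capacity_ng / loading_ng_per_day

# Hypothetical example: constant 100 ng/L geosmin challenge
life = bed_life_days(c0_ng_per_L=100, flow_L_per_day=1.0e6,
                     gac_mass_mg=1.5e9, K=8.0, inv_n=0.5)
```

As the study notes, an estimate of this kind ignores biodegradation, which is why the actual pilot-scale bed life exceeded the simulated values.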
Recent "Ground Testing" Experiences in the National Full-Scale Aerodynamics Complex
NASA Technical Reports Server (NTRS)
Zell, Peter; Stich, Phil; Sverdrup, Jacobs; George, M. W. (Technical Monitor)
2002-01-01
The large test sections of the National Full-scale Aerodynamics Complex (NFAC) wind tunnels provide ideal controlled wind environments for testing ground-based objects and vehicles. Though this facility was designed and provisioned primarily for aeronautical testing requirements, several experiments have been designed to utilize existing model mount structures to support "non-flying" systems. This presentation will discuss some of the ground-based testing capabilities of the facility and provide examples of ground-based tests conducted in the facility to date. It will also address some future work envisioned and solicit input from the SATA membership on ways to improve the service that NASA makes available to customers.
Ishikawa, Tomohiro; Mori, Yojiro; Hasegawa, Hiroshi; Subramaniam, Suresh; Sato, Ken-Ichi; Moriwaki, Osamu
2017-07-10
A novel compact OXC node architecture that combines WSSs and arrays of small-scale optical delivery-coupling-type switches ("DCSWs") is proposed. Unlike conventional OXC nodes, the WSSs are responsible only for dynamic path bundling ("flexible waveband"), while the small-scale optical switches route the bundled path groups. A network design algorithm that is aware of the routing scheme is also proposed, and numerical experiments show that the necessary numbers of WSSs and amplifiers can be significantly reduced. A prototype of the proposed OXC is also developed using monolithic arrayed DCSWs. Transmission experiments on the prototype verify the proposal's technical feasibility.
Innovative Water Management Technology to Reduce Environmental Impacts of Produced Water
DOE Office of Scientific and Technical Information (OSTI.GOV)
Castle, James; Rodgers, John; Alley, Bethany
2013-05-15
Clemson University with Chevron as an industry partner developed and applied treatment technology using constructed wetland systems to decrease targeted constituents in simulated and actual produced waters to achieve reuse criteria and discharge limits. Pilot-scale and demonstration constructed wetland treatment system (CWTS) experiments led to design strategies for treating a variety of constituents of concern (COCs) in produced waters including divalent metals, metalloids, oil and grease, and ammonia. Targeted biogeochemical pathways for treatment of COCs in pilot-scale CWTS experiments included divalent metal sulfide precipitation through dissimilatory sulfate reduction, metal precipitation through oxidation, reduction of selenite to insoluble elemental selenium, aerobic biodegradation of oil, nitrification of ammonia to nitrate, denitrification of nitrate to nitrogen gas, separation of oil using an oil-water separator, and sorption of ammonia to zeolite. Treatment performance results indicated that CWTSs can be designed and built to promote specific environmental and geochemical conditions in order for targeted biogeochemical pathways to operate. The demonstration system successfully achieved consistent removal extents even while inflow concentrations of COCs in the produced water differed by orders of magnitude. Design strategies used in the pilot-scale and demonstration CWTSs to promote specific conditions that can be applied to designing full-scale CWTSs include plant and soil selection, water-depth selection, addition of amendments, and hydraulic retention time (HRT). These strategies allow conditions within a CWTS to be modified to achieve ranges necessary for the preferred biogeochemical treatment pathways. In the case of renovating a produced water containing COCs that require different biogeochemical pathways for treatment, a CWTS can be designed with sequential cells that promote different conditions.
For example, the pilot-scale CWTS for post-reverse osmosis produced water was designed to promote oxidizing conditions within the first wetland cell for nitrification of ammonia, and the subsequent three cells were designed to promote reducing conditions for denitrification of nitrate. By incorporating multiple wetland cells in a CWTS, the conditions within each cell can be modified for removal of specific COCs. In addition, a CWTS designed with multiple cells allows for convenient sample collection points so that biogeochemical conditions of individual cells can be monitored and performance evaluated. Removal rate coefficients determined from the pilot-scale CWTS experiments and confirmed by the demonstration system can be used to calculate HRTs required to treat COCs in full-scale CWTSs. The calculated HRTs can then be used to determine the surface area, or footprint, of a full-size CWTS for a given inflow rate of produced water.
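The step from removal rate coefficient to HRT to footprint described above can be sketched with first-order removal kinetics, a common simplifying assumption in wetland sizing. The rate coefficient, concentrations, flow, and water depth below are hypothetical, not values from the Clemson/Chevron work:

```python
import math

def required_hrt_days(c_in, c_target, k_per_day):
    """HRT needed to reduce a constituent from c_in to c_target,
    assuming first-order removal: c(t) = c_in * exp(-k * t)."""
    return math.log(c_in / c_target) / k_per_day

def footprint_m2(flow_m3_per_day, hrt_days, water_depth_m):
    """Wetland surface area implied by the required retention volume."""
    return flow_m3_per_day * hrt_days / water_depth_m

# Hypothetical example: reduce a divalent metal from 2.0 to 0.1 mg/L
# with an assumed removal rate coefficient k = 0.5 per day
hrt = required_hrt_days(2.0, 0.1, 0.5)       # ~6 days
area = footprint_m2(500.0, hrt, 0.3)         # 500 m3/day inflow, 0.3 m depth
```

For COCs needing different biogeochemical pathways, the same calculation would be applied cell by cell in a sequential-cell design.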
NASA Cold Land Processes Experiment (CLPX 2002/03): Local scale observation site
Janet Hardy; Robert Davis; Yeohoon Koh; Don Cline; Kelly Elder; Richard Armstrong; Hans-Peter Marshall; Thomas Painter; Gilles Castres Saint-Martin; Roger DeRoo; Kamal Sarabandi; Tobias Graf; Toshio Koike; Kyle McDonald
2008-01-01
The local scale observation site (LSOS) is the smallest study site (0.8 ha) of the 2002/03 Cold Land Processes Experiment (CLPX) and is located within the Fraser mesocell study area. It was the most intensively measured site of the CLPX, and measurements here had the greatest temporal component of all CLPX sites. Measurements made at the LSOS were designed to produce a...
Apollo-Soyuz pamphlet no. 4: Gravitational field. [experimental design
NASA Technical Reports Server (NTRS)
Page, L. W.; From, T. P.
1977-01-01
Two Apollo-Soyuz experiments designed to detect gravity anomalies from spacecraft motion are described. The geodynamics experiment (MA-128) measured large-scale gravity anomalies by detecting small accelerations of Apollo in its 222-km orbit, using Doppler tracking from the ATS-6 satellite. Experiment MA-089 measured 300-km-scale anomalies on the Earth's surface by detecting minute changes in the separation between Apollo and the docking module. Topics discussed in relation to these experiments include the Doppler effect, gravimeters, and the discovery of mascons on the Moon.
Assessment of Scaled Rotors for Wind Tunnel Experiments.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Maniaci, David Charles; Kelley, Christopher Lee; Chiu, Phillip
2015-07-01
Rotor design and analysis work has been performed to support the conceptualization of a wind tunnel test focused on studying wake dynamics. This wind tunnel test would serve as part of a larger model validation campaign that is part of the Department of Energy Wind and Water Power Program's Atmosphere to electrons (A2e) initiative. The first phase of this effort was directed towards designing a functionally scaled rotor based on the same design process and target full-scale turbine used for new rotors for the DOE/SNL SWiFT site. The second phase focused on assessing the capabilities of an already available rotor, the G1, designed and built by researchers at the Technical University of München.
NASA Astrophysics Data System (ADS)
Oblath, Noah; Project 8 Collaboration
2016-09-01
We report on the design concept for Phase III of the Project 8 experiment. In the third phase of Project 8 we aim to place a limit on the neutrino mass that is similar to the current limits set by tritium beta-decay experiments, m_ν < 2 eV. From the first two phases of Project 8 we move to a novel design consisting of a 100 cm³ cylindrical volume of tritium gas instrumented with two 30-element rings of inward-facing antennas. Beam-forming techniques similar to those used in radio astronomy will be employed to search for and track electron signals in the fiducial volume. This talk will present the quantitative design concept for the phased-array receiver and illustrate how we are progressing towards the Phase IV experiment, which will have sensitivity to the neutrino mass scale allowed by the inverted mass hierarchy. This work is supported by the DOE Office of Science Early Career Research Program and the Laboratory Directed Research and Development Program at Pacific Northwest National Laboratory.
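The phased-array idea behind the antenna rings can be illustrated with a narrowband delay-and-sum beamformer: apply the conjugate of each antenna's propagation phase so that a signal from the focus point sums coherently. This is a generic radio-astronomy-style sketch; the ring radius, frequency, and source positions below are assumptions for illustration, not the actual Project 8 Phase III parameters:

```python
import numpy as np

N = 30                       # antennas in one ring (as in the design concept)
R = 0.10                     # ring radius in m (assumed)
f = 26.0e9                   # signal frequency in Hz (assumed, ~cyclotron band)
c = 3.0e8                    # speed of light, m/s
k = 2 * np.pi * f / c        # wavenumber

theta = 2 * np.pi * np.arange(N) / N
ants = np.stack([R * np.cos(theta), R * np.sin(theta)], axis=1)

def steer(focus):
    """Phase weights that co-phase a wave emitted at `focus` (conjugate of exp(-ikd))."""
    d = np.linalg.norm(ants - focus, axis=1)
    return np.exp(1j * k * d)

def beamform(focus, source):
    """Normalized summed amplitude when focused at `focus` for a source at `source`."""
    d = np.linalg.norm(ants - source, axis=1)
    signals = np.exp(-1j * k * d)          # unit-amplitude received phasors
    return np.abs(np.sum(steer(focus) * signals)) / N

src = np.array([0.01, 0.02])               # hypothetical electron position
on_target = beamform(src, src)             # coherent sum -> normalized gain of 1
off_target = beamform(np.array([-0.03, 0.0]), src)   # phases scramble, gain drops
```

Scanning the focus point over the fiducial volume is, in essence, how such a receiver searches for and tracks an electron signal.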
NASA Astrophysics Data System (ADS)
Jiao, J.; Trautz, A.; Zhang, Y.; Illangasekera, T.
2017-12-01
Subsurface flow and transport characterization under data-sparse conditions is addressed by a new and computationally efficient inverse theory that simultaneously estimates parameters, state variables, and boundary conditions. Uncertainty in static data can be accounted for, while the parameter structure can be complex due to process uncertainty. The approach has been successfully extended to inverting transient and unsaturated flows as well as to contaminant source identification under unknown initial and boundary conditions. In one example, by sampling numerical experiments simulating two-dimensional steady-state flow in which a tracer migrates, a sequential inversion scheme first estimates the flow field and permeability structure before the evolution of the tracer plume and the dispersivities are jointly estimated. Compared to traditional inversion techniques, the theory does not use forward simulations to assess model-data misfits; thus, knowledge of the difficult-to-determine site boundary condition is not required. To test the general applicability of the theory, data generated during high-precision intermediate-scale experiments (i.e., at a scale intermediate between the field and column scales) in large synthetic aquifers can be used. The design of such experiments is not trivial, as laboratory conditions have to be selected to mimic natural systems in order to provide useful data, requiring a variety of sensors and data collection strategies. This paper presents the design of such an experiment in a synthetic, multi-layered aquifer with dimensions of 242.7 × 119.3 × 7.7 cm. Different experimental scenarios that will generate data to validate the theory are presented.
Large-scale flow experiments for managing river systems
Konrad, Christopher P.; Olden, Julian D.; Lytle, David A.; Melis, Theodore S.; Schmidt, John C.; Bray, Erin N.; Freeman, Mary C.; Gido, Keith B.; Hemphill, Nina P.; Kennard, Mark J.; McMullen, Laura E.; Mims, Meryl C.; Pyron, Mark; Robinson, Christopher T.; Williams, John G.
2011-01-01
Experimental manipulations of streamflow have been used globally in recent decades to mitigate the impacts of dam operations on river systems. Rivers are challenging subjects for experimentation, because they are open systems that cannot be isolated from their social context. We identify principles to address the challenges of conducting effective large-scale flow experiments. Flow experiments have both scientific and social value when they help to resolve specific questions about the ecological action of flow with a clear nexus to water policies and decisions. Water managers must integrate new information into operating policies for large-scale experiments to be effective. Modeling and monitoring can be integrated with experiments to analyze long-term ecological responses. Experimental design should include spatially extensive observations and well-defined, repeated treatments. Large-scale flow manipulations are only a part of dam operations that affect river systems. Scientists can ensure that experimental manipulations continue to be a valuable approach for the scientifically based management of river systems.
Designing Trend-Monitoring Sounds for Helicopters: Methodological Issues and an Application
ERIC Educational Resources Information Center
Edworthy, Judy; Hellier, Elizabeth; Aldrich, Kirsteen; Loxley, Sarah
2004-01-01
This article explores methodological issues in sonification and sound design arising from the design of helicopter monitoring sounds. Six monitoring sounds (each with 5 levels) were tested for similarity and meaning with 3 different techniques: hierarchical cluster analysis, linkage analysis, and multidimensional scaling. In Experiment 1,…
What controls deposition rate in electron-beam chemical vapor deposition?
White, William B; Rykaczewski, Konrad; Fedorov, Andrei G
2006-08-25
The key physical processes governing electron-beam-assisted chemical vapor deposition are analyzed via a combination of theoretical modeling and supporting experiments. The scaling laws that define growth of the nanoscale deposits are developed and verified using carefully designed experiments of carbon deposition from methane onto a silicon substrate. The results suggest that the chamber-scale continuous transport of the precursor gas is the rate controlling process in electron-beam chemical vapor deposition.
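The conclusion that chamber-scale precursor transport controls the growth rate can be illustrated with an order-of-magnitude comparison of the two candidate rate limits: precursor supply to the surface versus electron-induced dissociation at the beam spot. All numbers below are assumed, generic values for illustration, not parameters from the paper:

```python
import math

kB = 1.380649e-23            # Boltzmann constant, J/K

def impingement_flux(p_pa, T_k, m_kg):
    """Kinetic-theory flux of gas molecules onto a surface: p / sqrt(2*pi*m*kB*T)."""
    return p_pa / math.sqrt(2.0 * math.pi * m_kg * kB * T_k)

# Methane precursor at an assumed chamber pressure of 0.1 Pa, 300 K
m_ch4 = 16.04e-3 / 6.022e23                   # kg per molecule
supply = impingement_flux(0.1, 300.0, m_ch4)  # molecules / m^2 / s

# Assumed beam and surface parameters (illustrative only)
e_flux = (1e-9 / 1.602e-19) / (math.pi * (5e-9) ** 2)  # 1 nA into a 5 nm spot
sigma = 1e-20                                 # dissociation cross-section, m^2
site_density = 1.0e19                         # adsorption sites / m^2

arrival_rate = supply / site_density          # precursor refills per site per second
dissoc_rate = sigma * e_flux                  # dissociations per adsorbed molecule per second

# When dissociation far outpaces refill, growth is limited by precursor transport
transport_limited = arrival_rate < dissoc_rate
```

With typical numbers of this kind, the electron-induced step is orders of magnitude faster than precursor resupply, which is consistent with the paper's transport-limited conclusion.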
Preparing university students to lead K-12 engineering outreach programmes: a design experiment
NASA Astrophysics Data System (ADS)
Anthony, Anika B.; Greene, Howard; Post, Paul E.; Parkhurst, Andrew; Zhan, Xi
2016-11-01
This paper describes an engineering outreach programme designed to increase the interest of under-represented youth in engineering and to disseminate pre-engineering design challenge materials to K-12 educators and volunteers. Given university students' critical role as facilitators of the outreach programme, researchers conducted a two-year design experiment to examine the programme's effectiveness at preparing university students to lead pre-engineering activities. Pre- and post-surveys incorporated items from the Student Engagement sub-scale of the Teacher Sense of Efficacy Scale. Surveys were analysed using a paired-samples t-test. Interview and open-ended survey data were analysed using discourse analysis and the constant comparative method. As a result of participation in the programme, university students reported a gain in efficacy to lead pre-engineering activities. The paper discusses programme features that supported efficacy gains and concludes with a set of design principles for developing learning environments that effectively prepare university students to facilitate pre-engineering outreach programmes.
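The paired-samples t-test used for the pre/post survey comparison can be sketched with the standard library. The pre/post efficacy ratings below are hypothetical, invented for illustration; they are not the study's data:

```python
import math
import statistics

# Hypothetical pre/post efficacy ratings for the same ten students
pre  = [5.2, 6.1, 4.8, 5.5, 6.0, 5.1, 4.9, 5.8, 5.4, 6.2]
post = [6.0, 6.8, 5.9, 6.1, 6.5, 5.9, 5.6, 6.4, 6.0, 6.9]

# Paired design: test whether the mean within-student difference is nonzero
diffs = [b - a for a, b in zip(pre, post)]
mean_gain = statistics.mean(diffs)
se = statistics.stdev(diffs) / math.sqrt(len(diffs))
t_stat = mean_gain / se                      # compare to t critical at df = n - 1
```

The pairing matters: because each student serves as their own control, between-student variability drops out of the test statistic.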
Field-scale experiments reveal persistent yield gaps in low-input and organic cropping systems
Kravchenko, Alexandra N.; Snapp, Sieglinde S.; Robertson, G. Philip
2017-01-01
Knowledge of production-system performance is largely based on observations at the experimental plot scale. Although yield gaps between plot-scale and field-scale research are widely acknowledged, their extent and persistence have not been experimentally examined in a systematic manner. At a site in southwest Michigan, we conducted a 6-y experiment to test the accuracy with which plot-scale crop-yield results can inform field-scale conclusions. We compared conventional versus alternative, that is, reduced-input and biologically based–organic, management practices for a corn–soybean–wheat rotation in a randomized complete block-design experiment, using 27 commercial-size agricultural fields. Nearby plot-scale experiments (0.02-ha to 1.0-ha plots) provided a comparison of plot versus field performance. We found that plot-scale yields well matched field-scale yields for conventional management but not for alternative systems. For all three crops, at the plot scale, reduced-input and conventional managements produced similar yields; at the field scale, reduced-input yields were lower than conventional. For soybeans at the plot scale, biological and conventional managements produced similar yields; at the field scale, biological yielded less than conventional. For corn, biological management produced lower yields than conventional in both plot- and field-scale experiments. Wheat yields appeared to be less affected by the experimental scale than corn and soybean. Conventional management was more resilient to field-scale challenges than alternative practices, which were more dependent on timely management interventions; in particular, mechanical weed control. Results underscore the need for much wider adoption of field-scale experimentation when assessing new technologies and production-system performance, especially as related to closing yield gaps in organic farming and in low-resourced systems typical of much of the developing world. PMID:28096409
1993-04-01
…wave buoy provided by SEATEX, Norway (Figure 3). The modified Mills-cross array was designed to provide spatial estimates of the variation in wave, wind… designed for SWADE to examine the wave physics at different spatial and temporal scales, and the usefulness of a nested system. Each grid is supposed to… field specification. SWADE Model: This high-resolution grid was designed to simulate the small-scale wave physics and to improve and verify the source…
Oxygen and iron production by electrolytic smelting of lunar soil
NASA Technical Reports Server (NTRS)
Colson, R. O.; Haskin, L. A.
1992-01-01
Work during the past year involved two aspects: (1) electrolysis experiments on a larger scale than done before, and (2) collaboration with Carbotek Inc. on the design of a lunar magma electrolysis cell. It was demonstrated previously that oxygen can be produced by direct electrolysis of silicate melts. Previous experiments using 50-100 mg of melt succeeded in measuring melt resistivities and oxygen production efficiencies and identified the character of the metal products. A series of experiments was completed using 1-8 grams of silicate melt in alumina and spinel containers sufficiently large that surface-tension effects between the melt and the wall are expected to have only a minor effect on the behavior of the melt in the region of the electrodes. The purpose of these experiments was to demonstrate the durability of the electrode and container materials, demonstrate the energy efficiency of the electrolysis process, further characterize the nature of the expected metal and spinel products, measure the efficiency of oxygen production and compare it to that predicted on the basis of the smaller-scale experiments, and identify any unexpected benefits or problems of the process. Four experimental designs were employed. Detailed results of these experiments are given in the appendix ('Summary of scaling-up experiments'); a general report of the results is given in terms of the implications of the experiments for container materials, cathode materials, anode materials, bubble formation and frothing of the melt, cell potential, anode-cathode distance, oxygen efficiency, and energy efficiency.
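The oxygen-production efficiency mentioned above is a Faradaic efficiency, and the ideal yield follows from Faraday's law: each O₂ molecule requires four electrons (2 O²⁻ → O₂ + 4 e⁻). A minimal sketch; the current, run time, and efficiency below are hypothetical, not values from these experiments:

```python
F = 96485.0                    # Faraday constant, C/mol
M_O2 = 32.0                    # molar mass of O2, g/mol

def oxygen_grams(current_A, time_s, efficiency):
    """Mass of O2 produced at a given Faradaic (oxygen) efficiency.

    moles of electrons = I * t * eta / F; four electrons per O2 molecule.
    """
    moles_e = current_A * time_s * efficiency / F
    return (moles_e / 4.0) * M_O2

# Hypothetical example: 1 A passed for 1 hour at 80% oxygen efficiency
mass = oxygen_grams(1.0, 3600.0, 0.80)
```

Comparing measured O₂ against this ideal yield is the standard way to quote an oxygen efficiency for an electrolysis cell.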
Charles E. Peterson; Douglas Maguire
2005-01-01
Balancing Ecosystem Values: Innovative Experiments for Sustainable Forestry is a compendium of more than 40 contributions from Asia, Europe, and North America. The theme encompasses experiments implemented at an operational scale to test ecological, social, or economic responses to silvicultural treatments designed to balance the complex set of...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Corradin, Michael; Anderson, M.; Muci, M.
This experimental study investigates the thermal hydraulic behavior and heat removal performance of a scaled Reactor Cavity Cooling System (RCCS) with air. A quarter-scale RCCS facility was designed and built based on a full-scale General Atomics (GA) RCCS design concept for the Modular High Temperature Gas Reactor (MHTGR). The GA RCCS is a passive cooling system that draws in air as the cooling fluid to remove heat radiated from the reactor pressure vessel to the air-cooled riser tubes and discharges the heated air into the atmosphere. Scaling laws were used to preserve key aspects and to maintain similarity. The scaled air RCCS facility at UW-Madison is a quarter-scale, reduced-length experiment housing six riser ducts that represent a 9.5° sector slice of the full-scale GA air RCCS concept. Radiant heaters were used to simulate the heat radiation from the reactor pressure vessel. The maximum power that can be achieved with the radiant heaters is 40 kW, with a peak heat flux of 25 kW/m². The quarter-scale RCCS was run under different heat loading cases and operated successfully. Instabilities were observed in some experiments, in which one of the two exhaust ducts experienced a flow reversal for a period of time. The data and analysis presented show that the RCCS has promising potential as a decay heat removal system during an accident scenario.
Comparison of the hedonic general Labeled Magnitude Scale with the hedonic 9-point scale.
Kalva, Jaclyn J; Sims, Charles A; Puentes, Lorenzo A; Snyder, Derek J; Bartoshuk, Linda M
2014-02-01
The hedonic 9-point scale was designed to compare palatability among different food items; however, it has also been used occasionally to compare individuals and groups. Such comparisons can be invalid because scale labels (for example, "like extremely") can denote systematically different hedonic intensities across some groups. Addressing this problem, the hedonic general Labeled Magnitude Scale (gLMS) frames affective experience in terms of the strongest imaginable liking/disliking of any kind, which can yield valid group comparisons of food palatability provided extreme hedonic experiences are unrelated to food. For each scale, 200 panelists rated affect for remembered food products (including favorite and least favorite foods) and sampled foods; they also sampled taste stimuli (quinine, sucrose, NaCl, citric acid) and rated their intensity. Finally, subjects identified experiences representing the endpoints of the hedonic gLMS. Both scales were similar in their ability to detect within-subject hedonic differences across a range of food experiences, but group comparisons favored the hedonic gLMS. With the 9-point scale, extreme labels were strongly associated with extremes in food affect. In contrast, gLMS data showed that scale extremes referenced nonfood experiences. Perceived taste intensity significantly influenced differences in food liking/disliking (for example, those experiencing the most intense tastes, called supertasters, showed more extreme liking and disliking for their favorite and least favorite foods). Scales like the hedonic gLMS are suitable for across-group comparisons of food palatability. © 2014 Institute of Food Technologists®
Photocatalytic destruction of chlorinated solvents with solar energy
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pacheco, J.; Prairie, M.; Yellowhorse, L.
1990-01-01
Sandia National Laboratories and the Solar Energy Research Institute are developing a photocatalytic process to destroy organic contaminants in water. Tests with common water pollutants are being conducted at Sandia's Solar Thermal Test Facility using a near commercial-scale, single-axis tracking parabolic trough system with glass pipe mounted at its focus. Experiments at this scale provide verification of laboratory studies and allow examination of design and operation issues at a real-life scale. The catalyst, titanium dioxide (TiO₂), is a harmless material found in paint, cosmetics, and toothpaste. Experiments were conducted to determine the effect of key process parameters on destruction rates of two chlorinated organic compounds which are common water pollutants: trichloroethylene and trichloroethane. In this paper, we summarize the engineering-scale results of these experiments and analyses. 21 refs., 8 figs.
Balasubramanian, M; Spencer, A J; Short, S D; Watkins, K; Chrisopoulos, S; Brennan, D S
2016-09-01
The integration of qualitative and quantitative approaches introduces new avenues to bridge the strengths, and address the weaknesses, of both methods. The aim was to develop measure(s) of migrant dentist experiences in Australia through a mixed methods approach. The sequential qualitative-quantitative design involved first harvesting data items from a qualitative study, followed by a national survey of migrant dentists in Australia. Statements representing unique experiences in migrant dentists' life stories were deployed in the survey questionnaire using a five-point Likert scale. Factor analysis was used to examine component factors. Eighty-two statements from 51 participants were harvested from the qualitative analysis. A total of 1,022 of 1,977 migrant dentists (response rate 54.5%) returned completed questionnaires. Factor analysis supported an initial eight-factor solution; further scale development and reliability analysis led to five scales with a final list of 38 life story experience (LSE) items. Three scales were based on home country events: health system and general lifestyle concerns (LSE1; 10 items), society and culture (LSE4; 4 items) and career development (LSE5; 4 items). Two scales included migrant experiences in Australia: appreciation towards the Australian way of life (LSE2; 13 items) and settlement concerns (LSE3; 7 items). The five life story experience scales provided the necessary conceptual clarity and empirical grounding to explore migrant dentist experiences in Australia. Being based on original migrant dentist narrations, these scales have the potential to offer in-depth insights for policy makers and support future research on dentist migration. Copyright© 2016 Dennis Barber Ltd
Bullock, Robin J; Aggarwal, Srijan; Perkins, Robert A; Schnabel, William
2017-04-01
In the event of a marine oil spill in the Arctic, government agencies, industry, and the public have a stake in the successful implementation of oil spill response. Because large spills are rare events, oil spill response techniques are often evaluated with laboratory and meso-scale experiments. The experiments must yield scalable information sufficient to understand the operability and effectiveness of a response technique under actual field conditions. Since in-situ burning augmented with surface collecting agents ("herders") is one of the few viable response options in ice infested waters, a series of oil spill response experiments were conducted in Fairbanks, Alaska, in 2014 and 2015 to evaluate the use of herders to assist in-situ burning and the role of experimental scale. This study compares burn efficiency and herder application for three experimental designs for in-situ burning of Alaska North Slope crude oil in cold, fresh waters with ∼10% ice cover. The experiments were conducted in three project-specific constructed venues with varying scales (surface areas of approximately 0.09 square meters, 9 square meters and 8100 square meters). The results from the herder assisted in-situ burn experiments performed at these three different scales showed good experimental scale correlation and no negative impact due to the presence of ice cover on burn efficiency. Experimental conclusions are predominantly associated with application of the herder material and usability for a given experiment scale to make response decisions. Copyright © 2016 Elsevier Ltd. All rights reserved.
ERIC Educational Resources Information Center
Museus, Samuel D.; Zhang, Duan; Kim, Mee Joo
2016-01-01
The purpose of the current examination was to develop a scale to measure campus environments and their impact on the experiences and outcomes of diverse student populations. The Culturally Engaging Campus Environments (CECE) Scale was designed to measure the nine elements of college environments that foster success among diverse populations.…
Kartnaller, Vinicius; Venâncio, Fabrício; F do Rosário, Francisca; Cajaiba, João
2018-04-10
To avoid gas hydrate formation during oil and gas production, companies usually employ thermodynamic inhibitors consisting of hydroxyl compounds, such as monoethylene glycol (MEG). However, these inhibitors may cause other types of fouling during production, such as inorganic salt deposits (scale). Calcium carbonate is one of the main scaling salts and is a great concern, especially for the new pre-salt wells being explored in Brazil. Hence, it is important to understand how inhibitors used to control gas hydrate formation may interact with the scale formation process. Multiple regression and design of experiments were used to mathematically model the calcium carbonate scaling process and its evolution in the presence of MEG. It was seen that MEG, although inducing precipitation by increasing the supersaturation ratio, actually works as a scale inhibitor for calcium carbonate at concentrations over 40%. This effect was not due to changes in viscosity, as suggested in the literature, but possibly to the binding of MEG to the surface of the CaCO₃ particles. The interaction of the MEG inhibition effect with the system's variables was also assessed; temperature and calcium concentration were the most relevant.
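The "design of experiments plus multiple regression" workflow described in this record can be sketched as a quadratic response-surface fit over a two-factor full factorial. The factors, levels, and response values below are invented for illustration only; they are not the paper's data.

```python
import numpy as np

def fit_response_surface(X, y):
    """Fit y ~ b0 + b1*x1 + b2*x2 + b12*x1*x2 + b11*x1^2 + b22*x2^2
    by ordinary least squares over coded factor settings."""
    x1, x2 = X[:, 0], X[:, 1]
    A = np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1**2, x2**2])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coef

# Hypothetical coded factors, e.g. x1 = MEG fraction, x2 = temperature,
# each at three levels coded to [-1, 0, +1] (a 3x3 full factorial).
levels = np.array([-1.0, 0.0, 1.0])
g1, g2 = np.meshgrid(levels, levels)
X = np.column_stack([g1.ravel(), g2.ravel()])

# Synthetic noise-free response built from a known surface, purely to
# demonstrate that the fitting step recovers the coefficients.
y = 5.0 + 2.0 * X[:, 0] - 1.0 * X[:, 1] + 0.5 * X[:, 0] * X[:, 1]

coef = fit_response_surface(X, y)
print(np.round(coef, 3))  # recovers [5., 2., -1., 0.5, 0., 0.]
```

With real (noisy) measurements, the same design matrix supports significance testing of each term, which is how interaction effects such as MEG x temperature would be judged relevant.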
Bell, M A; Fox, N A
1997-12-01
This work was designed to investigate individual differences in hands-and-knees crawling and frontal brain electrical activity with respect to object permanence performance in 76 eight-month-old infants. Four groups of infants (one prelocomotor and 3 with varying lengths of hands-and-knees crawling experience) were tested on an object permanence scale in a research design similar to that used by Kermoian and Campos (1988). In addition, baseline EEG was recorded and used as an indicator of brain development, as in the Bell and Fox (1992) longitudinal study. Individual differences in frontal and occipital EEG power and in locomotor experience were associated with performance on the object permanence task. Infants successful at A-not-B exhibited greater frontal EEG power and greater occipital EEG power than unsuccessful infants. In contrast to Kermoian and Campos (1988), who noted that long-term crawling experience was associated with higher performance on an object permanence scale, infants in this study with any amount of hands and knees crawling experience performed at a higher level on the object permanence scale than prelocomotor infants. There was no interaction among brain electrical activity, locomotor experience, and object permanence performance. These data highlight the value of electrophysiological research and the need for a brain-behavior model of object permanence performance that incorporates both electrophysiological and behavioral factors.
NASA Astrophysics Data System (ADS)
Nassar, Mohamed K.; Gurung, Deviyani; Bastani, Mehrdad; Ginn, Timothy R.; Shafei, Babak; Gomez, Michael G.; Graddy, Charles M. R.; Nelson, Doug C.; DeJong, Jason T.
2018-01-01
Design of in situ microbially induced calcite precipitation (MICP) strategies relies on a predictive capability. To date, much of the mathematical modeling of MICP has focused on small-scale experiments and/or one-dimensional flow in porous media, and successful parameterizations of models in these settings may not pertain to larger scales or to nonuniform, transient flows. Our objective in this article is to report on modeling to test our ability to predict the behavior of MICP under controlled conditions in a meter-scale tank experiment with transient nonuniform transport in a natural soil, using independently determined parameters. Flow in the tank was controlled by three wells via a complex cycle of injections/withdrawals followed by no-flow intervals. Different injection solution recipes were used in sequence for the transport characterization, biostimulation, cementation, and groundwater rinse phases of the 17-day experiment. Reaction kinetics were calibrated using separate column experiments designed with a similar sequence of phases. This allowed for a parsimonious modeling approach with zero fitting parameters for the tank experiment. These experiments and data were simulated using PHT3D, involving transient nonuniform flow, alternating low and high Damköhler reactive transport, and combined equilibrium and kinetically controlled biogeochemical reactions. The assumption that microbes mediating the reaction were exclusively sessile, with constant activity, in conjunction with the foregoing treatment of the reaction network, provided for efficient and accurate modeling of the entire process leading to nonuniform calcite precipitation. This analysis suggests that under the biostimulation conditions applied here, the assumption of a steady-state sessile biocatalyst suffices to describe the microbially mediated calcite precipitation.
An economy of scale system's mensuration of large spacecraft
NASA Technical Reports Server (NTRS)
Deryder, L. J.
1981-01-01
This study compares the systems technology and cost particulars of using multipurpose platforms versus several sizes of bus-type free-flyer spacecraft to accomplish the same space experiment missions. Computer models of these spacecraft bus designs were created to obtain data on size, weight, power, performance, and cost. To answer the question of whether large scale does produce economy, the dominant cost factors were determined and the programmatic effect on individual experiment costs was evaluated.
VISUALIZATION AND SIMULATION OF NON-AQUEOUS PHASE LIQUIDS SOLUBILIZATION IN PORE NETWORKS
The design of in-situ remediation of contaminated soils is mostly based on a description at the macroscopic scale using averaged quantities. These cannot address issues at the pore and pore-network scales. In this paper, visualization experiments and numerical simulations in ...
Guthrie, Kate Morrow; Dunsiger, Shira; Vargas, Sara E; Fava, Joseph L; Shaw, Julia G; Rosen, Rochelle K; Kiser, Patrick F; Kojic, E Milu; Friend, David R; Katz, David F
The development of pericoital (on demand) vaginal HIV prevention technologies remains a global health priority. Clinical trials to date have been challenged by nonadherence, leading to an inability to demonstrate product efficacy. The work here provides new methodology and results to begin to address this limitation. We created validated scales that allow users to characterize sensory perceptions and experiences when using vaginal gel formulations. In this study, we sought to understand the user sensory perceptions and experiences (USPEs) that characterize the preferred product experience for each participant. Two hundred four women evaluated four semisolid vaginal formulations using the USPE scales at four randomly ordered formulation evaluation visits. Women were asked to select their preferred formulation experience for HIV prevention among the four formulations evaluated. The scale scores on the Sex-associated USPE scales (e.g., Initial Penetration and Leakage) for each participant's selected formulation were used in a latent class model analysis. Four classes of preferred formulation experiences were identified. Sociodemographic and sexual history variables did not predict class membership; however, four specific scales were significantly related to class: Initial Penetration, Perceived Wetness, Messiness, and Leakage. The range of preferred user experiences represented by the scale scores creates a potential target range for product development, such that products that elicit scale scores that fall within the preferred range may be more acceptable, or tolerable, to the population under study. It is recommended that similar analyses should be conducted with other semisolid vaginal formulations, and in other cultures, to determine product property and development targets.
Design of the protoDUNE raw data management infrastructure
Fuess, S.; Illingworth, R.; Mengel, M.; ...
2017-10-01
The Deep Underground Neutrino Experiment (DUNE) will employ a set of Liquid Argon Time Projection Chambers (LArTPC) with a total mass of 40 kt as the main components of its Far Detector. In order to validate this technology and characterize the detector performance at full scale, an ambitious experimental program (called "protoDUNE") has been initiated which includes a test of the large-scale prototypes for the single-phase and dual-phase LArTPC technologies, which will run in a beam at CERN. The total raw data volume that is slated to be collected during the scheduled 3-month beam run is estimated to be in excess of 2.5 PB for each detector. This data volume will require that the protoDUNE experiment carefully design the DAQ, data handling, and data quality monitoring systems to be capable of dealing with challenges inherent in peta-scale data management while simultaneously fulfilling the requirements of disseminating the data to a worldwide collaboration and DUNE-associated computing sites. In this paper, we present our approach to solving these problems by leveraging the design, expertise, and components created for the LHC and Intensity Frontier experiments into a unified architecture that is capable of meeting the needs of protoDUNE.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kojima, S.; Yokosawa, M.; Matsuyama, M.
To study the practical application of a tritium separation process using Self-Developing Gas Chromatography (SDGC) with a Pd-Pt alloy, intermediate scale-up experiments (22 mm ID x 2 m length column) and the development of a computational simulation method have been conducted. In addition, intermediate-scale production of Pd-Pt powder has been developed for the scale-up experiments. The following results were obtained: (1) a 50-fold scale-up from 3 mm to 22 mm causes no significant impact on the SDGC process; (2) the Pd-Pt alloy powder is applicable to a large-size SDGC process; and (3) the simulation enables preparation of a conceptual design of a SDGC process for tritium separation.
Modeling of copper sorption onto GFH and design of full-scale GFH adsorbers.
Steiner, Michele; Pronk, Wouter; Boller, Markus A
2006-03-01
During rain events, copper wash-off occurring from copper roofs results in environmental hazards. In this study, columns filled with granulated ferric hydroxide (GFH) were used to treat copper-containing roof runoff. It was shown that copper could be removed to a high extent. A model was developed to describe this removal process. The model was based on the Two Region Model (TRM), extended with an additional diffusion zone. The extended model was able to describe the copper removal in long-term experiments (up to 125 days) with variable flow rates reflecting realistic runoff events. The four parameters of the model were estimated based on data gained with specific column experiments according to maximum sensitivity for each parameter. After model validation, the parameter set was used for the design of full-scale adsorbers. These full-scale adsorbers show high removal rates during extended periods of time.
Development and Validation of the Caring Loneliness Scale.
Karhe, Liisa; Kaunonen, Marja; Koivisto, Anna-Maija
2016-12-01
The Caring Loneliness Scale (CARLOS) includes 5 categories derived from earlier qualitative research. This article assesses the reliability and construct validity of a scale designed to measure patient experiences of loneliness in a professional caring relationship. Statistical analysis with 4 different sample sizes included Cronbach's alpha and exploratory factor analysis with principal axis factoring extraction. The sample size of 250 gave the most useful and comprehensible structure, but all 4 samples yielded the underlying content of loneliness experiences. The initial 5 categories were reduced to 4 factors with 24 items, with Cronbach's alpha ranging from .77 to .90. The findings support the reliability and validity of CARLOS for the assessment of Finnish breast cancer and heart surgery patients' experiences but, as with all instruments, further validation is needed.
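Several of the scale-development records above report internal consistency as Cronbach's alpha. As a minimal sketch (with a toy score matrix, not the CARLOS items), alpha can be computed directly from respondent-by-item data:

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, k_items) score matrix:
    alpha = k/(k-1) * (1 - sum(item variances) / variance(total score))."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_var = items.var(axis=0, ddof=1).sum()    # sum of per-item variances
    total_var = items.sum(axis=1).var(ddof=1)     # variance of summed scores
    return k / (k - 1) * (1 - item_var / total_var)

# Toy data: three Likert-type items that largely agree across five respondents.
scores = np.array([
    [4, 5, 4],
    [2, 2, 3],
    [5, 5, 5],
    [3, 3, 2],
    [1, 2, 1],
])
print(round(cronbach_alpha(scores), 2))  # 0.96 -- high internal consistency
```

Values in the .77-.90 range reported for the CARLOS factors would come from the same formula applied to each factor's item subset.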
Psychometric testing of an instrument to measure the experience of home.
Molony, Sheila L; McDonald, Deborah Dillon; Palmisano-Mills, Christine
2007-10-01
Research related to quality of life in long-term care has been hampered by a paucity of measurement tools sensitive to environmental interventions. The primary aim of this study was to test the psychometric properties of a new instrument, the Experience of Home (EOH) Scale, designed to measure the strength of the experience of meaningful person-environment transaction. The instrument was administered to 200 older adults in diverse dwelling types. Principal components analysis provided support for construct validity, eliciting a three-factor solution accounting for 63.18% of variance in scores. Internal consistency reliability was supported with Cronbach's alpha of .96 for the entire scale. The EOH Scale is a unique research tool to evaluate interventions to improve quality of living in residential environments.
Irradiation performance of HTGR fuel rods in HFIR experiments HRB-7 and -8
DOE Office of Scientific and Technical Information (OSTI.GOV)
Valentine, K.H.; Homan, F.J.; Long, E.L. Jr.
1977-05-01
The HRB-7 and -8 experiments were designed as a comprehensive test of mixed thorium-uranium oxide fissile particles with Th:U ratios from 0 to 8 for HTGR recycle application. In addition, fissile particles derived from Weak-Acid Resin (WAR) were tested as a potential backup type of fissile particle for HTGR recycle. These experiments were conducted at two temperatures (1250 and 1500°C) to determine the influence of operating temperature on the performance parameters studied. The minor objectives were comparison of advanced coating designs where ZrC replaced SiC in the Triso design, testing of fuel coated in laboratory-scale equipment against fuel coated in production-scale coaters, comparison of the performance of ²³³U-bearing particles with that of ²³⁵U-bearing particles, comparison of the performance of Biso coatings with Triso coatings for particles containing the same type of kernel, and testing of multijunction tungsten-rhenium thermocouples. All objectives were accomplished. As a result of these experiments, the mixed thorium-uranium oxide fissile kernel was replaced by a WAR-derived particle in the reference recycle design. A tentative decision to make this change had been reached before the HRB-7 and -8 capsules were examined, and the results of the examination confirmed the accuracy of the previous decision. Even maximum dilution (Th/U approximately equal to 8) of the mixed thorium-uranium oxide kernel was insufficient to prevent amoeba of the kernels at rates that are unacceptable in a large HTGR. Other results showed the performance of ²³³U-bearing particles to be identical to that of ²³⁵U-bearing particles, the performance of fuel coated in production-scale equipment to be at least as good as that of fuel coated in laboratory-scale coaters, the performance of ZrC coatings to be very promising, and Biso coatings to be inferior to Triso coatings relative to fission product retention.
MANUAL: BIOVENTING PRINCIPLES AND PRACTICE VOLUME II. BIOVENTING DESIGN
The results from bioventing research and development efforts and from the pilot-scale bioventing systems have been used to produce this two-volume manual. Although this design manual has been written based on extensive experience with petroleum hydrocarbons (and thus, many exampl...
ERIC Educational Resources Information Center
Chan, Cecilia K. Y.; Wong, George C. K.; Law, Ada K. H.; Zhang, T.; Au, Francis T. K.
2017-01-01
This study aimed to provide evidence-based conclusions from students concerning a capstone-design course in a civil engineering programme in Hong Kong. The evidence was generated by designing a student-experience questionnaire. The questionnaire instrument was assessed for internal consistency in four scales (curriculum and structure changes;…
Karam, Amanda L; McMillan, Catherine C; Lai, Yi-Chun; de Los Reyes, Francis L; Sederoff, Heike W; Grunden, Amy M; Ranjithan, Ranji S; Levis, James W; Ducoste, Joel J
2017-06-14
The optimal design and operation of photosynthetic bioreactors (PBRs) for microalgal cultivation is essential for improving the environmental and economic performance of microalgae-based biofuel production. Models that estimate microalgal growth under different conditions can help to optimize PBR design and operation. To be effective, the growth parameters used in these models must be accurately determined. Algal growth experiments are often constrained by the dynamic nature of the culture environment, and control systems are needed to accurately determine the kinetic parameters. The first step in setting up a controlled batch experiment is live data acquisition and monitoring. This protocol outlines a process for the assembly and operation of a bench-scale photosynthetic bioreactor that can be used to conduct microalgal growth experiments. This protocol describes how to size and assemble a flat-plate, bench-scale PBR from acrylic. It also details how to configure a PBR with continuous pH, light, and temperature monitoring using a data acquisition and control unit, analog sensors, and open-source data acquisition software.
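The continuous monitoring step this protocol describes can be sketched as a simple polling loop that collects timestamped records from analog sensors. The stub sensor callables and field names below are hypothetical stand-ins for the DAQ-driver reads a real bench-scale setup would use.

```python
import time
from dataclasses import dataclass

@dataclass
class Reading:
    """One timestamped sample from the (hypothetical) PBR sensor set."""
    t: float
    ph: float
    temp_c: float
    light_umol: float

def log_readings(read_ph, read_temp, read_light, n_samples, interval_s=0.0):
    """Poll three sensor callables and collect timestamped records --
    a stand-in for the live data-acquisition loop in the protocol."""
    log = []
    for _ in range(n_samples):
        log.append(Reading(time.time(), read_ph(), read_temp(), read_light()))
        if interval_s:
            time.sleep(interval_s)
    return log

# Stub sensors returning fixed values for demonstration; a real setup
# would wrap the data acquisition unit's driver calls here.
log = log_readings(lambda: 7.2, lambda: 25.0, lambda: 150.0, n_samples=5)
print(len(log), log[0].ph)  # 5 7.2
```

In practice each record would be appended to a CSV or database so that pH, temperature, and light histories are available when fitting growth kinetics.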
Moving bed reactor setup to study complex gas-solid reactions.
Gupta, Puneet; Velazquez-Vargas, Luis G; Valentine, Charles; Fan, Liang-Shih
2007-08-01
A bench-scale moving bed reactor setup for studying complex gas-solid reactions has been designed to obtain kinetic data for scale-up purposes. In this bench-scale reactor setup, gas and solid reactants can be contacted in a cocurrent or countercurrent manner at high temperatures. Gas and solid sampling can be performed through the reactor bed, with composition profiles determined at steady state. The reactor setup can be used to evaluate and corroborate model parameters accounting for intrinsic reaction rates in both simple and complex gas-solid reaction systems. The moving bed design allows experimentation over a variety of gas and solid compositions in a single experiment, unlike differential bed reactors where the gas composition is usually fixed. The data obtained from the reactor can also be used for direct scale-up of moving bed reactor designs.
Students' Self-Evaluation and Reflection (Part 1): "Measurement"
ERIC Educational Resources Information Center
Cambra-Fierro, Jesus; Cambra-Berdun, Jesus
2007-01-01
Purpose: The objective of the paper is the development and validation of scales to assess reflective learning. Design/methodology/approach: The research is based on a literature review plus in-classroom experience. For the scale validation process, exploratory and confirmatory analyses were conducted, following proposals made by Anderson and…
Gene Expression Analysis: Teaching Students to Do 30,000 Experiments at Once with Microarray
ERIC Educational Resources Information Center
Carvalho, Felicia I.; Johns, Christopher; Gillespie, Marc E.
2012-01-01
Genome scale experiments routinely produce large data sets that require computational analysis, yet there are few student-based labs that illustrate the design and execution of these experiments. In order for students to understand and participate in the genomic world, teaching labs must be available where students generate and analyze large data…
ERIC Educational Resources Information Center
Stoet, Gijsbert
2017-01-01
This article reviews PsyToolkit, a free web-based service designed for setting up, running, and analyzing online questionnaires and reaction-time (RT) experiments. It comes with extensive documentation, videos, lessons, and libraries of free-to-use psychological scales and RT experiments. It provides an elaborate interactive environment to use (or…
Proceedings of the Conference on the Design of Experiments (23rd)
1978-07-01
The host for the twenty-third Design of Experiments Conference was the U.S. Army Combat Development Experimentation Command, Fort Ord, California. The program included Prof. G. E. P. Box (University of Wisconsin) on time series modelling, and Dr. Churchill Eisenhart was this year's recipient of the Samuel S. Wilks Memorial Award. Among the references cited: [12] Duran, B. S. (1976). A survey of nonparametric tests for scale. Communications in Statistics, A5, 1287-... Department of Statistics, Carnegie-Mellon University.
Microwave Remote Sensing and the Cold Land Processes Field Experiment
NASA Technical Reports Server (NTRS)
Kim, Edward J.; Cline, Don; Davis, Bert; Hildebrand, Peter H. (Technical Monitor)
2001-01-01
The Cold Land Processes Field Experiment (CLPX) has been designed to advance our understanding of the terrestrial cryosphere. Developing a more complete understanding of fluxes, storage, and transformations of water and energy in cold land areas is a critical focus of the NASA Earth Science Enterprise Research Strategy, the NASA Global Water and Energy Cycle (GWEC) Initiative, the Global Energy and Water Cycle Experiment (GEWEX), and the GEWEX Americas Prediction Project (GAPP). The movement of water and energy through cold regions in turn plays a large role in ecological activity and biogeochemical cycles. Quantitative understanding of cold land processes over large areas will require synergistic advancements in 1) understanding how cold land processes, most comprehensively understood at local or hillslope scales, extend to larger scales, 2) improved representation of cold land processes in coupled and uncoupled land-surface models, and 3) a breakthrough in large-scale observation of hydrologic properties, including snow characteristics, soil moisture, the extent of frozen soils, and the transition between frozen and thawed soil conditions. The CLPX Plan has been developed through the efforts of over 60 interested scientists that have participated in the NASA Cold Land Processes Working Group (CLPWG). This group is charged with the task of assessing, planning and implementing the required background science, technology, and application infrastructure to support successful land surface hydrology remote sensing space missions. A major product of the experiment will be a comprehensive, legacy data set that will energize many aspects of cold land processes research. The CLPX will focus on developing the quantitative understanding, models, and measurements necessary to extend our local-scale understanding of water fluxes, storage, and transformations to regional and global scales. 
The experiment will particularly emphasize developing a strong synergism between process-oriented understanding, land surface models, and microwave remote sensing. The experimental design is a multi-sensor, multi-scale (1 ha to 160,000 km²) approach to providing the comprehensive data set necessary to address several experiment objectives. A description focusing on the microwave remote sensing components (ground, airborne, and spaceborne) of the experiment will be presented.
Status and Plans for the FLARE (Facility for Laboratory Reconnection Experiments) Project
NASA Astrophysics Data System (ADS)
Ji, H.; Bhattacharjee, A.; Prager, S.; Daughton, W.; Bale, S.; Carter, T.; Crocker, N.; Drake, J.; Egedal, J.; Sarff, J.; Wallace, J.; Chen, Y.; Cutler, R.; Fox, W.; Heitzenroeder, P.; Kalish, M.; Jara-Almonte, J.; Myers, C.; Ren, Y.; Yamada, M.; Yoo, J.
2015-11-01
The FLARE device (flare.pppl.gov) is a new intermediate-scale plasma experiment under construction at Princeton to study magnetic reconnection in regimes directly relevant to space, solar, astrophysical, and fusion plasmas. Existing small-scale experiments have focused on the single X-line reconnection process, either with small effective sizes or at low Lundquist numbers, both of which are typically very large in natural and fusion plasmas. The design of the FLARE device is motivated to provide experimental access to the new regimes involving multiple X-lines, as guided by a reconnection ``phase diagram'' [Ji & Daughton, PoP (2011)]. Most of the major components of the FLARE device have been designed and are under construction. The device will be assembled and installed in 2016, followed by commissioning and operation in 2017. The planned research on FLARE as a user facility will be discussed. Supported by NSF.
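The regimes in the Ji & Daughton phase diagram are organised by the Lundquist number S = L·v_A/η, where v_A is the Alfvén speed and η the magnetic diffusivity. A minimal sketch of these standard definitions follows; the numerical values are purely illustrative and are not FLARE design parameters.

```python
import math

MU0 = 4e-7 * math.pi  # vacuum permeability, H/m


def alfven_speed(b_tesla, mass_density_kg_m3):
    """Alfven speed v_A = B / sqrt(mu0 * rho)."""
    return b_tesla / math.sqrt(MU0 * mass_density_kg_m3)


def lundquist_number(length_m, b_tesla, mass_density_kg_m3, eta_m2_s):
    """Lundquist number S = L * v_A / eta, with eta the magnetic diffusivity.
    Single-X-line reconnection studies typically sit at low S; natural and
    fusion plasmas sit at very large S."""
    return length_m * alfven_speed(b_tesla, mass_density_kg_m3) / eta_m2_s


# Illustrative laboratory-like values: a 1 m current sheet, 0.1 T field,
# a light plasma at 1e-7 kg/m^3, and eta of order 10 m^2/s.
print(lundquist_number(1.0, 0.1, 1e-7, 10.0))
```

The same two inputs (S and the normalised system size) locate an experiment on the phase diagram, which is why extending both at once requires an intermediate-scale device.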
Flowpath evaluation and reconnaissance by remote field Eddy current testing (FERRET)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smoak, A.E.; Zollinger, W.T.
1993-12-31
This document describes the design and development of FERRET (Flowpath Evaluation and Reconnaissance by Remote-field Eddy current Testing). FERRET is a system for inspecting the steel pipes which carry cooling water to underground nuclear waste storage tanks. The FERRET system has been tested in a small-scale cooling pipe mock-up, an improved full-scale mock-up, and in flaw detection experiments. Early prototype designs of FERRET and the FERRET launcher (a device which inserts, moves, and retrieves probes from a piping system) as well as the field-ready design are discussed.
Ni, Lian Ting; Fehlings, Darcy; Biddiss, Elaine
2014-06-01
Virtual reality (VR)-based therapy for motor rehabilitation of children with cerebral palsy (CP) is growing in prevalence. Although mainstream active videogames typically offer children an appealing user experience, they are not designed for therapeutic relevance. Conversely, rehabilitation-specific games often struggle to provide an immersive experience that sustains interest. This study aims to design two VR-based therapy games for upper and lower limb rehabilitation and to evaluate their efficacy with a dual focus on therapeutic relevance and user experience. Three occupational therapists, three physiotherapists, and eight children (8-12 years old), with CP Level I-III on the Gross Motor Function Classification System, evaluated two games for the Microsoft® (Redmond, WA) Kinect™ for Windows and completed the System Usability Scale (SUS), Physical Activity Enjoyment Scale (PACES), and custom feedback questionnaires. Children and therapists unanimously agreed on the enjoyment and therapeutic value of the games. Median scores on the PACES were high (6.24±0.95 on the 7-point scale). Therapists considered the system to be of average usability (50th percentile on the SUS). The most prevalent usability issue was detection errors distinguishing the child's movements from the supporting therapist's. The ability to adjust difficulty settings and to focus on targeted goals (e.g., elbow/shoulder extension, weight shifting) was highly valued by therapists. Engaging both therapists and children in a user-centered design approach enabled the development of two VR-based therapy games for upper and lower limb rehabilitation that are dually (a) engaging to the child and (b) therapeutically relevant.
NASA Astrophysics Data System (ADS)
Potters, M. G.; Bombois, X.; Mansoori, M.; Hof, Paul M. J. Van den
2016-08-01
Estimation of physical parameters in dynamical systems driven by linear partial differential equations is an important problem. In this paper, we introduce the least costly experiment design framework for these systems. It enables parameter estimation with an accuracy that is specified by the experimenter prior to the identification experiment, while at the same time minimising the cost of the experiment. We show how to adapt the classical framework for these systems and take into account scaling and stability issues. We also introduce a progressive subdivision algorithm that further generalises the experiment design framework in the sense that it returns the lowest cost by finding the optimal input signal, and optimal sensor and actuator locations. Our methodology is then applied to a relevant problem in heat transfer studies: estimation of conductivity and diffusivity parameters in front-face experiments. We find good correspondence between numerical and theoretical results.
Fine-Scale Structure Design for 3D Printing
NASA Astrophysics Data System (ADS)
Panetta, Francis Julian
Modern additive fabrication technologies can manufacture shapes whose geometric complexities far exceed what existing computational design tools can analyze or optimize. At the same time, falling costs have placed these fabrication technologies within the average consumer's reach. Especially for inexpert designers, new software tools are needed to take full advantage of 3D printing technology. This thesis develops such tools and demonstrates the exciting possibilities enabled by fine-tuning objects at the small scales achievable by 3D printing. The thesis applies two high-level ideas to invent these tools: two-scale design and worst-case analysis. The two-scale design approach addresses the problem that accurately simulating--let alone optimizing--the full-resolution geometry sent to the printer requires orders of magnitude more computational power than currently available. However, we can decompose the design problem into a small-scale problem (designing tileable structures achieving a particular deformation behavior) and a macro-scale problem (deciding where to place these structures in the larger object). This separation is particularly effective, since structures for every useful behavior can be designed once, stored in a database, then reused for many different macroscale problems. Worst-case analysis refers to determining how likely an object is to fracture by studying the worst possible scenario: the forces most efficiently breaking it. This analysis is needed when the designer has insufficient knowledge or experience to predict what forces an object will undergo, or when the design is intended for use in many different scenarios unknown a priori. The thesis begins by summarizing the physics and mathematics necessary to rigorously approach these design and analysis problems. Specifically, the second chapter introduces linear elasticity and periodic homogenization. 
The third chapter presents a pipeline to design microstructures achieving a wide range of effective isotropic elastic material properties on a single-material 3D printer. It also proposes a macroscale optimization algorithm placing these microstructures to achieve deformation goals under prescribed loads. The thesis then turns to worst-case analysis, first considering the macroscale problem: given a user's design, the fourth chapter aims to determine the distribution of pressures over the surface creating the highest stress at any point in the shape. Solving this problem exactly is difficult, so we introduce two heuristics: one to focus our efforts on only regions likely to concentrate stresses and another converting the pressure optimization into an efficient linear program. Finally, the fifth chapter introduces worst-case analysis at the microscopic scale, leveraging the insight that the structure of periodic homogenization enables us to solve the problem exactly and efficiently. Then we use this worst-case analysis to guide a shape optimization, designing structures with prescribed deformation behavior that experience minimal stresses in generic use.
Photocatalytic destruction of chlorinated solvents in water with solar energy
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pacheco, J.E.; Prairie, M.R.; Yellowhorse, L.
1993-08-01
Sandia National Laboratories and the National Renewable Energy Laboratory are developing a photocatalytic process to destroy organic contaminants in water. Tests with common water pollutants have been conducted at Sandia's Solar Thermal Facility using a near-commercial-scale, single-axis tracking parabolic trough system with a glass pipe reactor mounted at its focus. Experiments at this scale provide verification of laboratory studies and allow examination of design and operation issues at a real-life scale. The catalyst, titanium dioxide (TiO2), is a harmless material found in paint, cosmetics, and toothpaste. Experiments were conducted to determine the effects of key process parameters on destruction rates of chlorinated organic compounds that are common water pollutants. This paper summarizes the engineering-scale results of these experiments and analyses.
EFEDA - European field experiment in a desertification-threatened area
NASA Technical Reports Server (NTRS)
Bolle, H.-J.; Andre, J.-C.; Arrue, J. L.; Barth, H. K.; Bessemoulin, P.; Brasa, A.; De Bruin, H. A. R.; Cruces, J.; Dugdale, G.; Engman, E. T.
1993-01-01
During June 1991 more than 30 scientific teams worked in Castilla-La Mancha, Spain, studying the energy and water transfer processes between soil, vegetation, and the atmosphere in semiarid conditions within the coordinated European research project EFEDA (European Field Experiment in Desertification-threatened Areas). Measurements were made from the microscale (e.g., measurements on single plants) up to a scale compatible with the grid size of global models. For this purpose three sites were selected 70 km apart and heavily instrumented at a scale in the order of 30 sq km. Aircraft missions, satellite data, and movable equipment were deployed to provide a bridge to the larger scale. This paper gives a description of the experimental design along with some of the preliminary results of this successful experiment.
Code of Federal Regulations, 2010 CFR
2010-01-01
... atomic weapon, designed or used to sustain nuclear fission in a self-supporting chain reaction. (g... experiments; or (ii) A liquid fuel loading; or (iii) An experimental facility in the core in excess of 16... in the isotope 235, except laboratory scale facilities designed or used for experimental or...
Code of Federal Regulations, 2011 CFR
2011-01-01
... in the isotope 235, except laboratory scale facilities designed or used for experimental or... atomic weapon, designed or used to sustain nuclear fission in a self-supporting chain reaction. (g... experiments; or (ii) A liquid fuel loading; or (iii) An experimental facility in the core in excess of 16...
Code of Federal Regulations, 2012 CFR
2012-01-01
... in the isotope 235, except laboratory scale facilities designed or used for experimental or... atomic weapon, designed or used to sustain nuclear fission in a self-supporting chain reaction. (g... experiments; or (ii) A liquid fuel loading; or (iii) An experimental facility in the core in excess of 16...
Development of a Dynamically Scaled Generic Transport Model Testbed for Flight Research Experiments
NASA Technical Reports Server (NTRS)
Jordan, Thomas; Langford, William; Belcastro, Christine; Foster, John; Shah, Gautam; Howland, Gregory; Kidd, Reggie
2004-01-01
This paper details the design and development of the Airborne Subscale Transport Aircraft Research (AirSTAR) test-bed at NASA Langley Research Center (LaRC). The aircraft is a 5.5% dynamically scaled, remotely piloted, twin-turbine, swept wing, Generic Transport Model (GTM) which will be used to provide an experimental flight test capability for research experiments pertaining to dynamics modeling and control beyond the normal flight envelope. The unique design challenges arising from the dimensional, weight, dynamic (inertial), and actuator scaling requirements necessitated by the research community are described along with the specific telemetry and control issues associated with a remotely piloted subscale research aircraft. Development of the necessary operational infrastructure, including operational and safety procedures, test site identification, and research pilots is also discussed. The GTM is a unique vehicle that provides significant research capacity due to its scaling, data gathering, and control characteristics. By combining data from this testbed with full-scale flight and accident data, wind tunnel data, and simulation results, NASA will advance and validate control upset prevention and recovery technologies for transport aircraft, thereby reducing vehicle loss-of-control accidents resulting from adverse and upset conditions.
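A "dynamically scaled" model preserves full-scale dynamic similarity, not just geometry; for free-flight models this is conventionally achieved with Froude scaling (holding V²/gL constant, since g cannot be scaled). The sketch below lists the textbook Froude ratios for a 5.5% geometric scale. It is a generic illustration, not the actual AirSTAR/GTM design data, and the `density_ratio` parameter (air density at the model's altitude versus the full-scale aircraft's) is an assumption the designer must supply.

```python
import math


def froude_scale_factors(length_scale, density_ratio=1.0):
    """Model-to-full-scale ratios under Froude (dynamic) scaling.

    The Froude number V^2 / (g * L) is held constant and g is the same
    for model and full-scale aircraft, so velocity and time scale as
    sqrt(n) while mass and inertia scale with the air-density ratio.
    """
    n = length_scale
    return {
        "length": n,
        "velocity": math.sqrt(n),         # V ~ sqrt(g * L)
        "time": math.sqrt(n),             # t = L / V
        "frequency": 1.0 / math.sqrt(n),  # inverse of time
        "mass": density_ratio * n ** 3,   # m ~ rho * L^3
        "inertia": density_ratio * n ** 5,  # I ~ rho * L^5
    }


factors = froude_scale_factors(0.055)  # a 5.5% geometric scale
print(factors["velocity"])  # model speed relative to full scale
```

With equal air densities, a 5.5% model's mass would be 0.055³ (roughly 1/6000) of the full-scale aircraft's, which is why the weight and inertia scaling requirements mentioned above are so demanding.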
NASA Astrophysics Data System (ADS)
Dalesandro, Andrew A.; Theilacker, Jay; Van Sciver, Steven
2012-06-01
Safe operation of superconducting radio frequency (SRF) cavities requires design consideration of a sudden catastrophic loss of vacuum (SCLV) adjacent to liquid helium (LHe) vessels and the hazards that follow. An experiment is discussed to test the longitudinal effects of SCLV along the beam line of a string of scaled SRF cavities. Each scaled cavity includes one segment of beam tube within a LHe vessel containing 2 K saturated LHe, and a riser pipe connecting the LHe vessel to a common gas header. At the beam tube inlet is a fast-acting solenoid valve to simulate SCLV and a high/low range orifice plate flow-meter to measure air influx to the cavity. The gas header exit also has an orifice plate flow-meter to measure helium venting the system at the relief pressure of 0.4 MPa. Each cavity is instrumented with Validyne pressure transducers and Cernox thermometers. The purpose of this experiment is to quantify the time required to spoil the beam vacuum and the effects of transient heat and mass transfer on the helium system. Heat transfer data are expected to reveal a longitudinal effect due to the geometry of the experiment. Details of the experimental design criteria and objectives are presented.
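For reference, orifice-plate flow-meters like the ones described are conventionally read with the textbook incompressible relation ṁ = C_d·A·√(2ρΔp). The sketch below uses hypothetical bore, discharge-coefficient, and gas values (not this experiment's instrumentation data), and real air-influx and helium-venting flows are compressible, so this is only the first approximation.

```python
import math


def orifice_mass_flow(cd, area_m2, density_kg_m3, dp_pa):
    """Incompressible orifice-plate estimate: m_dot = Cd * A * sqrt(2 * rho * dp).

    cd is the discharge coefficient, area_m2 the bore area, and dp_pa the
    measured differential pressure across the plate.
    """
    return cd * area_m2 * math.sqrt(2.0 * density_kg_m3 * dp_pa)


# Hypothetical illustration: a 10 mm bore, Cd = 0.6, helium gas at
# ~0.7 kg/m^3, and a 0.1 MPa differential pressure.
bore_area = math.pi * (5e-3) ** 2
print(orifice_mass_flow(0.6, bore_area, 0.7, 1.0e5))
```

In practice a standard such as ISO 5167 adds an expansibility factor for compressible gas flow; the high/low range plates mentioned above would each carry their own calibration.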
NASA Technical Reports Server (NTRS)
Wescott, E. M.; Davis, T. N.
1980-01-01
A reliable payload system and scaled down shaped charges were developed for carrying out experiments in solar-terrestrial magnetospheric physics. Four Nike-Tomahawk flights with apogees near 450 km were conducted to investigate magnetospheric electric fields, and two Taurus-Tomahawk rockets were flown in experiments on the auroral acceleration process in discrete auroras. In addition, a radial shaped charge was designed for plasma perturbation experiments.
ERIC Educational Resources Information Center
McBride, Ruari-Santiago; Schubotz, Dirk
2017-01-01
This article investigates educational experiences of transgender and gender non-conforming (TGNC) youth living in Northern Ireland (NI) through a mixed-methods research design and analytical framework of heteronormativity. It draws on large-scale survey data which, for the first time in NI, captured the experiences of 16 year olds who identify as…
Promoting Female Students' Learning Motivation towards Science by Exercising Hands-On Activities
ERIC Educational Resources Information Center
Wen-jin, Kuo; Chia-ju, Liu; Shi-an, Leou
2012-01-01
The purpose of this study is to design different hands-on science activities and investigate which activities could better promote female students' learning motivation towards science. This study conducted three types of science activities which contains nine hands-on activities, an experience scale and a learning motivation scale for data…
Further Validation of the Coach Identity Prominence Scale
ERIC Educational Resources Information Center
Pope, J. Paige; Hall, Craig R.
2014-01-01
This study was designed to examine select psychometric properties of the Coach Identity Prominence Scale (CIPS), including the reliability, factorial validity, convergent validity, discriminant validity, and predictive validity. Coaches (N = 338) who averaged 37 (SD = 12.27) years of age, had a mean of 13 (SD = 9.90) years of coaching experience,…
ERIC Educational Resources Information Center
Nimmer, Donald N.
1979-01-01
The Quality of School Life Scale (QSL) is an instrument designed to measure students' perceptions of their school experiences. Such an instrument may aid educators in evaluating students' perceptions in the three areas identified by the QSL: satisfaction, commitment to classwork, and reactions to teachers. (Author)
Investigating the Mercalli Intensity Scale through "Lived Experience"
ERIC Educational Resources Information Center
Jones, Richard
2012-01-01
The modified Mercalli (MM) intensity scale is composed of 12 increasing levels of intensity that range from imperceptible shaking to catastrophic destruction and is designated by Roman numerals I through XII. Although qualitative in nature, it can provide a more concrete model for middle and high school students striving to understand the dynamics…
Culturally-Anchored Values and University Education Experience Perception
ERIC Educational Resources Information Center
Mitsis, Ann; Foley, Patrick
2009-01-01
Purpose: The purpose of this paper is to examine whether business students' gender, age and culturally-anchored values affect their perceptions of their university course experience. Design/methodology/approach: Culturally diverse business students (n = 548) studying at an Australian university were surveyed using previously established scales.…
DOT National Transportation Integrated Search
2009-03-01
The thirteenth full-scale Accelerated Pavement Test (APT) experiment at the Civil Infrastructure Laboratory (CISL) of Kansas State University aimed to determine the response and the failure mode of thin concrete overlays. Four pavement structures...
Chen, Yibing; Ogata, Taiki; Ueyama, Tsuyoshi; Takada, Toshiyuki; Ota, Jun
2018-01-01
Machine vision is playing an increasingly important role in industrial applications, and the automated design of image recognition systems has been a subject of intense research. This study has proposed a system for automatically designing the field-of-view (FOV) of a camera, the illumination strength and the parameters in a recognition algorithm. We formulated the design problem as an optimisation problem and used an experiment based on a hierarchical algorithm to solve it. The evaluation experiments using translucent plastics objects showed that the use of the proposed system resulted in an effective solution with a wide FOV, recognition of all objects and 0.32 mm and 0.4° maximal positional and angular errors when all the RGB (red, green and blue) for illumination and R channel image for recognition were used. Though all the RGB illumination and grey scale images also provided recognition of all the objects, only a narrow FOV was selected. Moreover, full recognition was not achieved by using only G illumination and a grey-scale image. The results showed that the proposed method can automatically design the FOV, illumination and parameters in the recognition algorithm and that tuning all the RGB illumination is desirable even when single-channel or grey-scale images are used for recognition. PMID:29786665
Lunar exploration rover program developments
NASA Technical Reports Server (NTRS)
Klarer, P. R.
1994-01-01
The Robotic All Terrain Lunar Exploration Rover (RATLER) design concept began at Sandia National Laboratories in late 1991 with a series of small, proof-of-principle, working scale models. The models proved the viability of the concept for high mobility through mechanical simplicity, and eventually received internal funding at Sandia National Laboratories for full scale, proof-of-concept prototype development. Whereas the proof-of-principle models demonstrated the mechanical design's capabilities for mobility, the full scale proof-of-concept design currently under development is intended to support field operations for experiments in telerobotics, autonomous robotic operations, telerobotic field geology, and advanced man-machine interface concepts. The development program's current status is described, including an outline of the program's work over the past year, recent accomplishments, and plans for follow-on development work.
Introduction to Chemical Engineering Reactor Analysis: A Web-Based Reactor Design Game
ERIC Educational Resources Information Center
Orbey, Nese; Clay, Molly; Russell, T.W. Fraser
2014-01-01
An approach to explain chemical engineering through a Web-based interactive game design was developed and used with college freshman and junior/senior high school students. The goal of this approach was to demonstrate how to model a lab-scale experiment, and use the results to design and operate a chemical reactor. The game incorporates both…
Research on computer-aided design of modern marine power systems
NASA Astrophysics Data System (ADS)
Ding, Dongdong; Zeng, Fanming; Chen, Guojun
2004-03-01
To make the MPS (Marine Power System) design process more economical and easier, a new CAD scheme is put forward that takes full advantage of VR (Virtual Reality) and AI (Artificial Intelligence) technologies. This CAD system can shorten the design period and greatly reduce the dependence on designers' experience. Some key issues, such as the selection of hardware and software for such a system, are also discussed.
Désiré, Amélie; Paillard, Bruno; Bougaret, Joël; Baron, Michel; Couarraze, Guy
2013-02-01
Scaling-up the extrusion-spheronization process involves the separate scale-up of each of the five process steps: dry mixing, granulation, extrusion, spheronization, and drying. The aim of the study was to compare two screw extrusion systems regarding their suitability for scaling-up. Two drug substances of high- and low-solubility in water were retained at different concentrations as formulation variables. Different spheronization times were tested. The productivity of the process was followed up using the extrusion rate and yield. Pellets were characterized by their size and shape, and by their structural and mechanical properties. A response surface design of experiments was built to evaluate the influence of the different variables and their interactions on each response, and to select the type of extrusion which provides the best results in terms of product quality, the one which shows less influence on the product after scale-up ("scalability") and when the formula used changes ("robustness"), and the one which allows the possibility to adjust pellet properties with spheronization variables ("flexibility"). Axial system showed the best characteristics in terms of product quality at lab and industrial scales, the best robustness at industrial scale, and the best scalability, by comparison with radial system. Axial system thus appeared as the easiest scaled-up system. Compared to lab scale, the conclusions observed at industrial scale were the same in terms of product quality, but different for robustness and flexibility, which confirmed the importance to test the systems at industrial scale before acquiring the equipment.
NASA Astrophysics Data System (ADS)
Kim, E.; Tedesco, M.; de Roo, R.; England, A. W.; Gu, H.; Pham, H.; Boprie, D.; Graf, T.; Koike, T.; Armstrong, R.; Brodzik, M.; Hardy, J.; Cline, D.
2004-12-01
The NASA Cold Land Processes Field Experiment (CLPX-1) was designed to provide microwave remote sensing observations and ground truth for studies of snow and frozen ground remote sensing, particularly issues related to scaling. CLPX-1 was conducted in 2002 and 2003 in Colorado, USA. One of the goals of the experiment was to test the capabilities of microwave emission models at different scales. Initial forward model validation work has concentrated on the Local-Scale Observation Site (LSOS), a 0.8 ha study site consisting of open meadows separated by trees where the most detailed measurements were made of snow depth and temperature, density, and grain size profiles. Results obtained for the 3rd Intensive Observing Period (IOP3; February 2003, dry snow) suggest that a model based on Dense Medium Radiative Transfer (DMRT) theory is able to model the recorded brightness temperatures using snow parameters derived from field measurements. This paper focuses on the ability of forward DMRT modelling, combined with snowpack measurements, to reproduce the radiobrightness signatures observed by the University of Michigan's Truck-Mounted Radiometer System (TMRS) at 19 and 37 GHz during the 4th IOP (IOP4) in March, 2003. Unlike IOP3, conditions during IOP4 include both wet and dry periods, providing a valuable test of DMRT model performance. In addition, a comparison will be made for the one day of coincident observations by the University of Tokyo's Ground-Based Microwave Radiometer-7 (GBMR-7) and the TMRS. The plot-scale study in this paper establishes a baseline of DMRT performance for later studies at successively larger scales. These scaling studies will help guide the choice of future snow retrieval algorithms and the design of future Cold Lands observing systems.
Validation Results for Core-Scale Oil Shale Pyrolysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Staten, Josh; Tiwari, Pankaj
2015-03-01
This report summarizes a study of oil shale pyrolysis at various scales and the subsequent development of a model for in situ production of oil from oil shale. Oil shale from the Mahogany zone of the Green River formation was used in all experiments. Pyrolysis experiments were conducted at four scales: powdered samples (100 mesh) and core samples of 0.75”, 1”, and 2.5” diameters. The batch, semibatch, and continuous flow pyrolysis experiments were designed to study the effects of temperature (300°C to 500°C), heating rate (1°C/min to 10°C/min), pressure (ambient and 500 psig), and sample size on product formation. Comprehensive analyses were performed on reactants and products - liquid, gas, and spent shale. These experimental studies were designed to understand the relevant coupled phenomena (reaction kinetics, heat transfer, mass transfer, thermodynamics) at multiple scales. A model for oil shale pyrolysis was developed in the COMSOL multiphysics platform. A general kinetic model was integrated with important physical and chemical phenomena that occur during pyrolysis. The secondary reactions of coking and cracking in the product phase were addressed. The multiscale experimental data generated and the models developed provide an understanding of the simultaneous effects of chemical kinetics and heat and mass transfer on oil quality and yield. The comprehensive data collected in this study will help advance the move to large-scale in situ oil production from the pyrolysis of oil shale.
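The report's general kinetic model is not reproduced in the abstract. As a hedged illustration only, isothermal first-order kerogen decomposition with an Arrhenius rate constant captures the qualitative temperature dependence over the 300-500°C range; the pre-exponential factor and activation energy below are hypothetical, not values from this study.

```python
import math

R = 8.314  # universal gas constant, J/(mol*K)


def rate_constant(a_factor, e_activation, temp_k):
    """Arrhenius rate constant k = A * exp(-Ea / (R * T))."""
    return a_factor * math.exp(-e_activation / (R * temp_k))


def conversion(a_factor, e_activation, temp_k, time_s):
    """Conversion for isothermal first-order decomposition:
    X(t) = 1 - exp(-k * t)."""
    k = rate_constant(a_factor, e_activation, temp_k)
    return 1.0 - math.exp(-k * time_s)


# Hypothetical kinetic parameters, for illustration only.
A, Ea = 3.0e13, 2.2e5  # 1/s and J/mol
for t_c in (350, 400, 450):
    print(t_c, conversion(A, Ea, t_c + 273.15, 3600.0))
```

Even this toy model shows why heating rate and sample size matter at core scale: once heat transfer into a 2.5” core limits the interior temperature, the strongly exponential rate constant makes conversion spatially non-uniform.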
Diffraction-based analysis of tunnel size for a scaled external occulter testbed
NASA Astrophysics Data System (ADS)
Sirbu, Dan; Kasdin, N. Jeremy; Vanderbei, Robert J.
2016-07-01
For performance verification of an external occulter mask (also called a starshade), scaled testbeds have been developed to measure the suppression of the occulter shadow in the pupil plane and contrast in the image plane. For occulter experiments the scaling is typically performed by maintaining an equivalent Fresnel number. The original Princeton occulter testbed was oversized with respect to both input beam and shadow propagation to limit any diffraction effects due to finite testbed enclosure edges; however, to operate at realistic space-mission equivalent Fresnel numbers an extended testbed is currently under construction. With the longer propagation distances involved, diffraction effects due to the edge of the tunnel must now be considered in the experiment design. Here, we present a diffraction-based model of two separate tunnel effects. First, we consider the effect of tunnel-edge induced diffraction ringing upstream from the occulter mask. Second, we consider the diffraction effect due to clipping of the output shadow by the tunnel downstream from the occulter mask. These calculations are performed for a representative point design relevant to the new Princeton occulter experiment, but we also present an analytical relation that can be used for other propagation distances.
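Equivalent-Fresnel-number scaling pins down the lab geometry: the Fresnel number of an occulter of radius a at propagation distance z in light of wavelength λ is N = a²/(λz), so shrinking the mask sets the required tunnel length. The sketch below uses purely illustrative numbers, not the Princeton point design.

```python
def fresnel_number(radius_m, wavelength_m, distance_m):
    """Fresnel number N = a^2 / (lambda * z) for an occulter of radius a
    observed at distance z."""
    return radius_m ** 2 / (wavelength_m * distance_m)


def equivalent_lab_distance(lab_radius_m, wavelength_m, target_n):
    """Propagation distance that reproduces a target Fresnel number with a
    scaled-down mask at the same wavelength."""
    return lab_radius_m ** 2 / (wavelength_m * target_n)


# Illustrative space geometry: a 25 m starshade at 40,000 km in 600 nm light.
wavelength = 600e-9
n_space = fresnel_number(25.0, wavelength, 4.0e7)

# A 25 mm lab mask must then sit at this distance to match N.
z_lab = equivalent_lab_distance(25e-3, wavelength, n_space)
print(n_space, z_lab)  # z_lab comes out to 40 m for these numbers
```

Because the matched distance grows with the square of the mask radius, realistic mission-equivalent Fresnel numbers force tens of metres of propagation, which is exactly why tunnel-edge diffraction and shadow clipping enter the experiment design.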
NASA Technical Reports Server (NTRS)
Giere, A. C.; Fowlis, W. W.
1980-01-01
The effect on baroclinic instability of a radially variable dielectric body force, analogous to gravity, is investigated for the design of a spherical, synoptic-scale atmospheric model experiment on a Spacelab flight. Exact solutions are examined for quasi-geostrophic baroclinic instability in which the rotational Froude number is a linear function of height. Flow in a rotating rectilinear channel with a vertically variable body force, without horizontal shear of the basic state, is also discussed.
Sex differences in virtual navigation influenced by scale and navigation experience.
Padilla, Lace M; Creem-Regehr, Sarah H; Stefanucci, Jeanine K; Cashdan, Elizabeth A
2017-04-01
The Morris water maze is a spatial abilities test adapted from the animal spatial cognition literature and has been studied in the context of sex differences in humans. This is because its standard design, which manipulates proximal (close) and distal (far) cues, applies to human navigation. However, virtual Morris water mazes test navigation skills on a scale that is vastly smaller than natural human navigation. Many researchers have argued that navigating in large and small scales is fundamentally different, and small-scale navigation might not simulate natural human navigation. Other work has suggested that navigation experience could influence spatial skills. To address the question of how individual differences influence navigational abilities in differently scaled environments, we employed both a large- (146.4 m in diameter) and a traditional- (36.6 m in diameter) scaled virtual Morris water maze along with a novel measure of navigation experience (lifetime mobility). We found sex differences on the small maze in the distal cue condition only, but in both cue conditions on the large maze. Also, individual differences in navigation experience modulated navigation performance on the virtual water maze, showing that higher mobility was related to better performance with proximal cues for only females on the small maze, but for both males and females on the large maze.
Merlin: Computer-Aided Oligonucleotide Design for Large Scale Genome Engineering with MAGE.
Quintin, Michael; Ma, Natalie J; Ahmed, Samir; Bhatia, Swapnil; Lewis, Aaron; Isaacs, Farren J; Densmore, Douglas
2016-06-17
Genome engineering technologies now enable precise manipulation of organism genotype, but can be limited in scalability by their design requirements. Here we describe Merlin (http://merlincad.org), an open-source web-based tool to assist biologists in designing experiments using multiplex automated genome engineering (MAGE). Merlin provides methods to generate pools of single-stranded DNA oligonucleotides (oligos) for MAGE experiments by performing free energy calculation and BLAST scoring on a sliding window spanning the targeted site. These oligos are designed not only to improve recombination efficiency, but also to minimize off-target interactions. The application further assists experiment planning by reporting predicted allelic replacement rates after multiple MAGE cycles, and enables rapid result validation by generating primer sequences for multiplexed allele-specific colony PCR. Here we describe the Merlin oligo and primer design procedures and validate their functionality compared to OptMAGE by eliminating seven AvrII restriction sites from the Escherichia coli genome.
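The sliding-window idea can be sketched as follows. The scoring function here is a toy GC-content placeholder (Merlin's actual scorer uses free-energy calculation and BLAST), and all function names are hypothetical:

```python
# Toy sketch of sliding-window oligo selection for MAGE-style design.
# A fixed-length window slides across the region covering the targeted
# site; each candidate window is scored and the best one is kept.
# GC balance stands in for the real free-energy/BLAST scoring.

def windows(genome: str, site: int, oligo_len: int = 90):
    """Yield (start, sequence) for every oligo-length window covering `site`."""
    lo = max(0, site - oligo_len + 1)
    hi = min(len(genome) - oligo_len, site)
    for start in range(lo, hi + 1):
        yield start, genome[start:start + oligo_len]

def score(oligo: str) -> float:
    """Placeholder score: distance of GC fraction from 0.5 (lower is better)."""
    gc = sum(base in "GC" for base in oligo) / len(oligo)
    return abs(gc - 0.5)

def best_oligo(genome: str, site: int, oligo_len: int = 90):
    """Return the (start, sequence) window with the best (lowest) score."""
    return min(windows(genome, site, oligo_len), key=lambda w: score(w[1]))
```

A real implementation would replace `score` with a thermodynamic and off-target model; the window enumeration and minimization structure stays the same.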
DOE Office of Scientific and Technical Information (OSTI.GOV)
Richard R. Schultz; Paul D. Bayless; Richard W. Johnson
2010-09-01
The Oregon State University (OSU) High Temperature Test Facility (HTTF) is an integral experimental facility that will be constructed on the OSU campus in Corvallis, Oregon. The HTTF project was initiated by the U.S. Nuclear Regulatory Commission (NRC) on September 5, 2008 as Task 4 of the 5-year High Temperature Gas Reactor Cooperative Agreement via NRC Contract 04-08-138. Until August 2010, when a DOE contract was initiated to fund additional capabilities for the HTTF project, all of the funding support for the HTTF was provided by the NRC via its cooperative agreement. The U.S. Department of Energy (DOE) began its involvement with the HTTF project in late 2009 via the Next Generation Nuclear Plant (NGNP) project. Because the NRC interests in HTTF experiments centered only on the depressurized conduction cooldown (DCC) scenario, NGNP involvement focused on expanding the experimental envelope of the HTTF to include steady-state operations and also the pressurized conduction cooldown (PCC). Since DOE has incorporated the HTTF as an ingredient in the NGNP thermal-fluids validation program, several important outcomes should be noted: 1. The reference prismatic reactor design that serves as the basis for scaling the HTTF became the modular high temperature gas-cooled reactor (MHTGR). The MHTGR has also been chosen as the reference design for all of the other NGNP thermal-fluid experiments. 2. The NGNP validation matrix is being planned using the same scaling strategy that has been implemented to design the HTTF, i.e., the hierarchical two-tiered scaling methodology developed by Zuber in 1991. Using this approach, a preliminary validation matrix has been designed that integrates the HTTF experiments with the other experiments planned for the NGNP thermal-fluids verification and validation project. 3. Initial analyses showed that the inherent power capability of the OSU infrastructure, which allowed a total operational facility power capability of only 0.6 MW, is inadequate to permit steady-state operation at reasonable conditions. 4. To enable the HTTF to operate at more representative steady-state conditions, DOE recently allocated funding via a DOE subcontract to HTTF to permit an OSU infrastructure upgrade such that 2.2 MW will become available for HTTF experiments. 5. Analyses have been performed to study the relationship between the HTTF and the MHTGR via the hierarchical two-tiered scaling methodology, which has been used successfully in the past, e.g., in scaling the APEX facility to the Westinghouse AP600 plant. These analyses have focused on the relationship between key variables that will be measured in the HTTF and the counterpart variables in the MHTGR, with a focus on natural circulation (using nitrogen as a working fluid) and core heat transfer. 6. Both RELAP5-3D and computational fluid dynamics (CD-Adapco's STAR-CCM+) numerical models of the MHTGR and the HTTF have been constructed, and analyses are underway to study the relationship between the reference reactor and the HTTF. The HTTF is presently being designed. It has a ¼-scale relationship to the MHTGR in both height and diameter. Decisions have been made to design the reactor cavity cooling system (RCCS) simulation as a boundary condition for the HTTF to ensure that (a) the boundary condition is well defined and (b) the boundary condition can be modified easily to achieve the desired heat transfer sink for HTTF experimental operations.
Small-scale behavior in distorted turbulent boundary layers at low Reynolds number
NASA Technical Reports Server (NTRS)
Saddoughi, Seyed G.
1994-01-01
During the last three years we have conducted high- and low-Reynolds-number experiments, including hot-wire measurements of the velocity fluctuations, in the test-section-ceiling boundary layer of the 80- by 120-foot Full-Scale Aerodynamics Facility at NASA Ames Research Center, to test the local-isotropy predictions of Kolmogorov's universal equilibrium theory. This hypothesis, which states that at sufficiently high Reynolds numbers the small-scale structures of turbulent motions are independent of large-scale structures and mean deformations, has been used in theoretical studies of turbulence and computational methods such as large-eddy simulation; however, its range of validity in shear flows has been a subject of controversy. The present experiments were planned to enhance our understanding of the local-isotropy hypothesis. Our experiments were divided into two sets. First, measurements were taken at different Reynolds numbers in a plane boundary layer, which is a 'simple' shear flow. Second, experiments were designed to address this question: will our criteria for the existence of local isotropy hold for 'complex' nonequilibrium flows in which extra rates of mean strain are added to the basic mean shear?
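For reference, the standard quantities behind such local-isotropy tests are (textbook definitions, not results from this report):

```latex
\eta=\left(\frac{\nu^{3}}{\varepsilon}\right)^{1/4},\qquad
R_{\lambda}=\frac{u'\lambda}{\nu},\qquad
E_{22}(k_{1})=\tfrac{1}{2}\left[E_{11}(k_{1})-k_{1}\,\frac{\mathrm{d}E_{11}(k_{1})}{\mathrm{d}k_{1}}\right],
```

where \(\eta\) is the Kolmogorov length scale, \(R_{\lambda}\) the Taylor-microscale Reynolds number, and the last relation is the exact isotropic consistency condition linking the measured streamwise spectrum \(E_{11}\) to the transverse spectrum \(E_{22}\), the usual spectral check for local isotropy.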
Dykema, John A; Keith, David W; Anderson, James G; Weisenstein, Debra
2014-12-28
Although solar radiation management (SRM) through stratospheric aerosol methods has the potential to mitigate impacts of climate change, our current knowledge of stratospheric processes suggests that these methods may entail significant risks. In addition to the risks associated with current knowledge, the possibility of 'unknown unknowns' exists that could significantly alter the risk assessment relative to our current understanding. While laboratory experimentation can improve the current state of knowledge and atmospheric models can assess large-scale climate response, they cannot capture possible unknown chemistry or represent the full range of interactive atmospheric chemical physics. Small-scale, in situ experimentation under well-regulated circumstances can begin to remove some of these uncertainties. This experiment, provisionally titled the stratospheric controlled perturbation experiment, is under development and will only proceed with transparent and predominantly governmental funding and independent risk assessment. We describe the scientific and technical foundation for performing, under external oversight, small-scale experiments to quantify the risks posed by SRM to activation of halogen species and subsequent erosion of stratospheric ozone. The paper's scope includes selection of the measurement platform, relevant aspects of stratospheric meteorology, operational considerations and instrument design and engineering.
Using Quality Rating Scales for Professional Development: Experiences from the UK
ERIC Educational Resources Information Center
Mathers, Sandra; Linskey, Faye; Seddon, Judith; Sylva, Kathy
2007-01-01
The ECERS-R and ITERS-R are among two of the most widely used observational measures for describing the characteristics of early childhood education and care. This paper describes a professional development programme currently taking place in seven regions across England, designed to train local government staff in the application of the scales as…
The Work-Related Quality of Life Scale for Higher Education Employees
ERIC Educational Resources Information Center
Edwards, Julian A.; Van Laar, Darren; Easton, Simon; Kinman, Gail
2009-01-01
Previous research suggests that higher education employees experience comparatively high levels of job stress. A range of instruments, both generic and job-specific, has been used to measure stressors and strains in this occupational context. The Work-related Quality of Life (WRQoL) scale is a measure designed to capture perceptions of the working…
A large-scale natural gradient tracer experiment was conducted on Cape Cod, Massachusetts, to examine the transport and dispersion of solutes in a sand and gravel aquifer. The nonreactive tracer, bromide, and the reactive tracers, lithium and molybdate, were injected as a pulse i...
The Use of Experiments and Modeling to Evaluate ...
Symposium Paper This paper reports on a study to examine the thermal decomposition of surrogate CWAs (in this case, Malathion) in a laboratory reactor, analysis of the results using reactor design theory, and subsequent scale-up of the results to a computer simulation of a full-scale commercial hazardous waste incinerator processing ceiling tile contaminated with residual Malathion.
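Reactor-design theory of the kind applied here commonly treats thermal decomposition as first-order Arrhenius kinetics in a plug-flow approximation. A hedged sketch of that scale-up logic, with illustrative (not Malathion-specific) parameters:

```python
import math

# First-order plug-flow destruction: X = 1 - exp(-k * tau),
# with k = A * exp(-Ea / (R * T)).  Lab data fix (A, Ea); scale-up
# to the full-scale unit changes only the temperature T and the
# residence time tau.  All numbers below are illustrative only.

R = 8.314  # gas constant, J/(mol K)

def rate_constant(A: float, Ea: float, T: float) -> float:
    """Arrhenius rate constant (1/s) at absolute temperature T (K)."""
    return A * math.exp(-Ea / (R * T))

def destruction_fraction(A: float, Ea: float, T: float, tau: float) -> float:
    """Fraction destroyed after residence time tau (s) at temperature T."""
    return 1.0 - math.exp(-rate_constant(A, Ea, T) * tau)

# Illustrative lab reactor vs. full-scale incinerator conditions
A, Ea = 1.0e12, 2.4e5          # 1/s, J/mol (illustrative)
lab  = destruction_fraction(A, Ea, T=1100.0, tau=1.0)
full = destruction_fraction(A, Ea, T=1250.0, tau=2.0)
```

The hotter, longer-residence full-scale unit predicts a higher destruction fraction; the same two fitted parameters carry the lab result to the simulated commercial scale.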
Exercises in Evaluation of a Large-Scale Educational Program.
ERIC Educational Resources Information Center
Glass, Gene V.
This workbook is designed to serve as a training experience for educational evaluators at the preservice (graduate school) or inservice stages. The book comprises a series of exercises in the planning, execution, and reporting of the evaluation of a large-scale educational program, in this case Title I of the Elementary and Secondary Education Act of…
ERIC Educational Resources Information Center
Choi, Bo Young; Park, Heerak; Nam, Suk Kyung; Lee, Jayoung; Cho, Daeyeon; Lee, Sang Min
2011-01-01
The purpose of this study was to develop a Korean College Stress Inventory (KCSI), which is designed to measure Korean college students' experiences and symptoms of career stress. Even though there have been numerous scales related to career issues, few scales measure the career stress construct and its dimensions. Factor structure, internal…
Toward Active Control of Noise from Hot Supersonic Jets
2013-11-15
several laboratory- and full-scale data sets. Two different scaling scenarios are presented for the practising scientist to choose from. The first...As will be detailed below, this simple proof-of-concept experiment yielded good-quality data that reveals details about the large-scale 3D structure...the light-field. Co-PI Thurow has recently designed and assembled a plenoptic camera in his laboratory with its key attributes being its compact
Health care planning and education via gaming-simulation: a two-stage experiment.
Gagnon, J H; Greenblat, C S
1977-01-01
A two-stage process of gaming-simulation design was conducted: the first stage of design concerned national planning for hemophilia care; the second stage of design was for gaming-simulation concerning the problems of hemophilia patients and health care providers. The planning design was intended to be adaptable to large-scale planning for a variety of health care problems. The educational game was designed using data developed in designing the planning game. A broad range of policy-makers participated in the planning game.
Analytical and Experimental Verification of a Flight Article for a Mach-8 Boundary-Layer Experiment
NASA Technical Reports Server (NTRS)
Richards, W. Lance; Monaghan, Richard C.
1996-01-01
Preparations for a boundary-layer transition experiment to be conducted on a future flight mission of the air-launched Pegasus(TM) rocket are underway. The experiment requires a flight-test article called a glove to be attached to the wing of the Mach-8 first-stage booster. A three-dimensional, nonlinear finite-element analysis has been performed and significant small-scale laboratory testing has been accomplished to ensure the glove design integrity and quality of the experiment. Reliance on both the analysis and experiment activities has been instrumental in the success of the flight-article design. Results obtained from the structural analysis and laboratory testing show that all glove components are well within the allowable thermal stress and deformation requirements to satisfy the experiment objectives.
Teaching Demands and Adaptive Curriculum Management.
ERIC Educational Resources Information Center
Flinders, David
A qualitative case study was designed to explore the work experience of six high school English teachers. Teachers were observed over a five month period. No formal, systematic rating scale or observation instruments were used; the observer relied upon his experience as a high school teacher and on specialized training in curriculum theory and…
ERIC Educational Resources Information Center
Roschelle, Jeremy; Shechtman, Nicole; Tatar, Deborah; Hegedus, Stephen; Hopkins, Bill; Empson, Susan; Knudsen, Jennifer; Gallagher, Lawrence P.
2010-01-01
The authors present three studies (two randomized controlled experiments and one embedded quasi-experiment) designed to evaluate the impact of replacement units targeting student learning of advanced middle school mathematics. The studies evaluated the SimCalc approach, which integrates an interactive representational technology, paper curriculum,…
This study reports on the results of work preparing 30,000 lbs of SARM, or synthetic analytical reference matrix, a surrogate Superfund soil containing a wide range of contaminants. It also reports the results of bench-scale treatability experiments designed to simulate the EP...
Principal component analysis for designed experiments.
Konishi, Tomokazu
2015-01-01
Principal component analysis is used to summarize matrix data, such as are found in transcriptome, proteome, or metabolome studies and medical examinations, into fewer dimensions by fitting the matrix to orthogonal axes. Although this methodology is frequently used in multivariate analyses, it has disadvantages when applied to experimental data. First, the identified principal components have poor generality; since the size and directions of the components are dependent on the particular data set, the components are valid only within the data set. Second, the method is sensitive to experimental noise and bias between sample groups. It cannot reflect the experimental design that is planned to manage the noise and bias; rather, it assigns the same weight and independence to all the samples in the matrix. Third, the resulting components are often difficult to interpret. To address these issues, several options were introduced to the methodology. First, the principal axes were identified using training data sets and shared across experiments. These training data reflect the design of experiments, and their preparation allows noise to be reduced and group bias to be removed. Second, the center of the rotation was determined in accordance with the experimental design. Third, the resulting components were scaled to unify their size unit. The effects of these options were observed in microarray experiments, and showed an improvement in the separation of groups and robustness to noise. The range of scaled scores was unaffected by the number of items. Additionally, unknown samples were appropriately classified using pre-arranged axes. Furthermore, these axes well reflected the characteristics of groups in the experiments. As was observed, the scaling of the components and sharing of axes enabled comparisons of the components beyond experiments. The use of training data reduced the effects of noise and bias in the data, facilitating the physical interpretation of the principal axes.
Together, these introduced options result in improved generality and objectivity of the analytical results. The methodology has thus become more like a set of multiple regression analyses that find independent models that specify each of the axes.
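The three options above can be sketched numerically: axes fit once on a training (e.g. control) set, a design-based center, and size-unit scaling of the scores. Variable names and the specific scaling choice are my own illustration, not the paper's code:

```python
import numpy as np

# Sketch of PCA with pre-arranged axes:
# 1) principal axes fit on a *training* set and reused across experiments,
# 2) rotation centered per the experimental design (control-group mean),
# 3) scores scaled to a common size unit so components are comparable.

def fit_axes(training: np.ndarray, center: np.ndarray) -> np.ndarray:
    """Orthonormal principal axes (rows) of the centered training matrix."""
    _, _, vt = np.linalg.svd(training - center, full_matrices=False)
    return vt

def project(samples: np.ndarray, axes: np.ndarray, center: np.ndarray) -> np.ndarray:
    """Project new samples onto pre-arranged axes; scale scores to unit size."""
    scores = (samples - center) @ axes.T
    return scores / np.sqrt(samples.shape[1])  # one possible size-unit scaling

rng = np.random.default_rng(0)
control = rng.normal(0.0, 1.0, size=(20, 50))       # training/control group
center = control.mean(axis=0)                       # design-based center
axes = fit_axes(control, center)
treated = rng.normal(0.0, 1.0, size=(5, 50)) + 2.0  # shifted unknown samples
scores = project(treated, axes, center)
```

Because the axes and center come from the training set, a new experiment's samples can be projected onto the same coordinates, which is what makes scores comparable across experiments.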
ALFA: The new ALICE-FAIR software framework
NASA Astrophysics Data System (ADS)
Al-Turany, M.; Buncic, P.; Hristov, P.; Kollegger, T.; Kouzinopoulos, C.; Lebedev, A.; Lindenstruth, V.; Manafov, A.; Richter, M.; Rybalchenko, A.; Vande Vyvre, P.; Winckler, N.
2015-12-01
The commonalities between the ALICE and FAIR experiments and their computing requirements led to the development of large parts of a common software framework in an experiment-independent way. The FairRoot project has already shown the feasibility of such an approach for the FAIR experiments and extending it beyond FAIR to experiments at other facilities [1, 2]. The ALFA framework is a joint development between ALICE Online-Offline (O2) and FairRoot teams. ALFA is designed as a flexible, elastic system, which balances reliability and ease of development with performance using multi-processing and multithreading. A message-based approach has been adopted; such an approach will support the use of the software on different hardware platforms, including heterogeneous systems. Each process in ALFA assumes limited communication and reliance on other processes. Such a design will add horizontal scaling (multiple processes) to vertical scaling provided by multiple threads to meet computing and throughput demands. ALFA does not dictate any application protocols. Potentially, any content-based processor or any source can change the application protocol. The framework supports different serialization standards for data exchange between different hardware and software languages.
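A horizontally scaled, message-based pipeline of the kind described can be sketched as follows. This uses Python threads and queues purely as a self-contained illustration; ALFA itself runs separate processes with its own transport and serialization layers:

```python
import threading
import queue

# Message-passing sketch: independent workers communicate only via
# messages, so throughput scales by adding workers ("horizontal"
# scaling) on top of whatever each worker does internally.

def worker(inbox: queue.Queue, outbox: queue.Queue) -> None:
    """Consume messages until a None sentinel; no reliance on peers."""
    while True:
        msg = inbox.get()
        if msg is None:
            break
        outbox.put(msg * msg)  # stand-in for real event processing

inbox, outbox = queue.Queue(), queue.Queue()
n_workers = 4  # the horizontal-scaling knob
threads = [threading.Thread(target=worker, args=(inbox, outbox))
           for _ in range(n_workers)]
for t in threads:
    t.start()
for item in range(10):          # enqueue work
    inbox.put(item)
for _ in threads:               # one sentinel per worker
    inbox.put(None)
for t in threads:
    t.join()
results = sorted(outbox.get() for _ in range(10))
```

Because workers share nothing but the queues, changing `n_workers` changes capacity without touching the processing logic, the property the abstract attributes to ALFA's process design.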
Results of the Vapor Compression Distillation Flight Experiment (VCD-FE)
NASA Technical Reports Server (NTRS)
Hutchens, Cindy; Graves, Rex
2004-01-01
Vapor Compression Distillation (VCD) is the chosen technology for urine processing aboard the International Space Station (ISS). Key aspects of the VCD design have been verified and significant improvements made throughout the ground-based development history. However, an important element lacking from previous subsystem development efforts was flight-testing. Consequently, the demonstration and validation of the VCD technology and the investigation of subsystem performance in micro-gravity were the primary goals of the Vapor Compression Distillation Flight Experiment (VCD-FE), a flight experiment aboard the Space Shuttle Columbia during the STS-107 mission. The VCD-FE was a full-scale developmental version of the Space Station Urine Processor Assembly (UPA) and was designed to test some of the potential micro-gravity issues with the design. This paper summarizes the experiment results.
Multi-scale Material Parameter Identification Using LS-DYNA® and LS-OPT®
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stander, Nielen; Basudhar, Anirban; Basu, Ushnish
2015-09-14
Ever-tightening regulations on fuel economy, and the likely future regulation of carbon emissions, demand persistent innovation in vehicle design to reduce vehicle mass. Classical methods for computational mass reduction include sizing, shape, and topology optimization. One of the few remaining options for weight reduction can be found in materials engineering and material design optimization. Apart from considering different types of materials, by adding material diversity and composite materials, an appealing option in automotive design is to engineer steel alloys for the purpose of reducing plate thickness while retaining the strength and ductility required for durability and safety. A project to develop computational material models for advanced high strength steel is currently being executed under the auspices of the United States Automotive Materials Partnership (USAMP), funded by the US Department of Energy. Under this program, new Third Generation Advanced High Strength Steels (3GAHSS) are being designed, tested, and integrated with the remaining design variables of a benchmark vehicle finite element model. The objectives of the project are to integrate atomistic, microstructural, forming, and performance models to create an integrated computational materials engineering (ICME) toolkit for 3GAHSS. The mechanical properties of Advanced High Strength Steels (AHSS) are controlled by many factors, including phase composition and distribution in the overall microstructure; volume fraction, size, and morphology of phase constituents; and the stability of the metastable retained austenite phase. The complex phase transformation and deformation mechanisms in these steels make the well-established traditional techniques obsolete, and a multi-scale, microstructure-based modeling approach following the ICME strategy was therefore chosen in this project.
Multi-scale modeling as a major area of research and development is an outgrowth of the Comprehensive Test Ban Treaty of 1996, which banned surface testing of nuclear devices [1]. This had the effect that experimental work was reduced from large-scale tests to multi-scale experiments to provide material models with validation at different length scales. In the subsequent years industry realized that multi-scale modeling and simulation-based design were transferable to the design optimization of any structural system. Horstemeyer [1] lists a number of advantages of the use of multi-scale modeling. Among these are: the reduction of product development time by alleviating costly trial-and-error iterations, as well as the reduction of product costs through innovations in material, product, and process designs. Multi-scale modeling can reduce the number of costly large-scale experiments and can increase product quality by providing more accurate predictions. Research tends to be focused on each particular length scale, which enhances accuracy in the long term. This paper serves as an introduction to the LS-OPT and LS-DYNA methodology for multi-scale modeling. It mainly focuses on an approach to integrate material identification using material models of different length scales. As an example, a multi-scale material identification strategy, consisting of a Crystal Plasticity (CP) material model and a homogenized State Variable (SV) model, is discussed, and the parameter identification of the individual material models of different length scales is demonstrated. The paper concludes with thoughts on integrating the multi-scale methodology into the overall vehicle design.
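The parameter-identification step can be illustrated schematically: a fine-scale stress-strain curve (here synthetic, standing in for a crystal-plasticity result) calibrates a coarse homogenized model by least squares. The power-law form and all numbers are illustrative assumptions, not the paper's models:

```python
import numpy as np

# A lower-scale "virtual experiment" produces a stress-strain curve;
# the homogenized model sigma = K * eps**n is then identified from it
# by log-linear least squares.  The fine-scale curve is synthetic here.

eps = np.linspace(0.01, 0.20, 40)        # plastic strain
true_K, true_n = 900.0, 0.15             # hidden "fine-scale" parameters
rng = np.random.default_rng(1)
sigma_fine = true_K * eps**true_n * (1 + 0.01 * rng.normal(size=eps.size))

# Identify (K, n) of the coarse model from the fine-scale data:
# log(sigma) = log(K) + n * log(eps) is linear in log(eps).
n_fit, logK_fit = np.polyfit(np.log(eps), np.log(sigma_fine), 1)
K_fit = np.exp(logK_fit)
```

In an actual multi-scale workflow the optimizer would drive the fine-scale simulation itself; the calibration structure, fit a coarse model to finer-scale responses, is the same.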
Xenon Purification Research and Development for the LZ Dark Matter Experiment
NASA Astrophysics Data System (ADS)
Pech, Katherin
2013-04-01
The LZ Experiment is a next generation dark matter detector based on the current LUX detector design, with a 7-ton active volume. Although many research and development breakthroughs were achieved for the 350 kg LUX detector, the large volume scaling required for LZ presents a new set of design challenges that need to be overcome. Because the search for WIMP-like dark matter requires ultra low background experiments, the xenon target material in the LZ detector must meet purity specifications beyond what is commercially available. This challenge is two-fold. The xenon must contain extremely low amounts of electronegative impurities such as oxygen, which attenuate the charge signal. Additionally, it must also have very little of the inert isotope Kr-85, a beta-emitter that can obscure the dark matter signal in the detector volume. The purity requirements for the LUX experiment have been achieved, but the factor of 20 scaling in volume for LZ and increased demands for sensitivity mean that new research and development work must be done to increase our xenon purification capabilities. This talk will focus on the efforts being done at Case Western Reserve University to meet these strict purity requirements for the LZ Experiment.
A ``Cyber Wind Facility'' for HPC Wind Turbine Field Experiments
NASA Astrophysics Data System (ADS)
Brasseur, James; Paterson, Eric; Schmitz, Sven; Campbell, Robert; Vijayakumar, Ganesh; Lavely, Adam; Jayaraman, Balaji; Nandi, Tarak; Jha, Pankaj; Dunbar, Alex; Motta-Mena, Javier; Craven, Brent; Haupt, Sue
2013-03-01
The Penn State "Cyber Wind Facility" (CWF) is a high-fidelity multi-scale high performance computing (HPC) environment in which "cyber field experiments" are designed and "cyber data" collected from wind turbines operating within the atmospheric boundary layer (ABL) environment. Conceptually the "facility" is akin to a high-tech wind tunnel with controlled physical environment, but unlike a wind tunnel it replicates commercial-scale wind turbines operating in the field and forced by true atmospheric turbulence with controlled stability state. The CWF is created from state-of-the-art geometry and grid design, high-accuracy numerical methods, and high-resolution simulation strategies that blend unsteady RANS near the surface with high-fidelity large-eddy simulation (LES) in separated boundary layer, blade, and rotor wake regions, embedded within high-resolution LES of the ABL. CWF experiments complement physical field facility experiments that can capture wider ranges of meteorological events, but with minimal control over the environment and with very small numbers of sensors at low spatial resolution. I shall report on the first CWF experiments aimed at dynamical interactions between ABL turbulence and space-time wind turbine loadings. Supported by DOE and NSF.
A quality by design study applied to an industrial pharmaceutical fluid bed granulation.
Lourenço, Vera; Lochmann, Dirk; Reich, Gabriele; Menezes, José C; Herdling, Thorsten; Schewitz, Jens
2012-06-01
The pharmaceutical industry is encouraged within Quality by Design (QbD) to apply science-based manufacturing principles to assure quality not only of new but also of existing processes. This paper presents how QbD principles can be applied to an existing industrial pharmaceutical fluid bed granulation (FBG) process. A three-step approach is presented as follows: (1) implementation of Process Analytical Technology (PAT) monitoring tools at the industrial-scale process, combined with multivariate data analysis (MVDA) of process and PAT data, to increase process knowledge; (2) execution of scaled-down designed experiments at pilot scale, with adequate PAT monitoring tools, to investigate the process response to intended changes in Critical Process Parameters (CPPs); and finally (3) definition of a process Design Space (DS) linking CPPs to Critical Quality Attributes (CQAs), within which product quality is ensured by design, and which after scale-up can be used at the industrial process scale. The proposed approach was developed for an existing industrial process. Through the enhanced process knowledge established, a significant reduction in the variability of product CQAs, already within quality specification ranges, was achieved by a better choice of CPP values. The results of such step-wise development and implementation are described. Copyright © 2012 Elsevier B.V. All rights reserved.
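Step (3) can be sketched as fitting a response surface on the pilot DoE and marking where predictions meet specification. The CPP names, the linear model form, and all numbers below are illustrative assumptions, not values from the study:

```python
import numpy as np

# Sketch of a design-space definition: fit CQA = f(CPP1, CPP2) on pilot
# DoE data, then flag the (CPP1, CPP2) region whose predicted CQA stays
# within specification.  Factors are in coded units (-1..+1).

# Pilot-scale 2^2 factorial plus center points (illustrative)
X = np.array([[-1, -1], [-1, 1], [1, -1], [1, 1], [0, 0], [0, 0]], float)
y = np.array([3.1, 4.0, 2.2, 3.0, 3.0, 3.1])   # e.g. granule moisture (%)

# Linear response surface via least squares: y ~ b0 + b1*x1 + b2*x2
A = np.column_stack([np.ones(len(X)), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

def predict(x1: float, x2: float) -> float:
    return coef[0] + coef[1] * x1 + coef[2] * x2

# Design space: grid points where the predicted CQA meets a 2.5-3.5% spec
g = np.linspace(-1, 1, 21)
design_space = [(a, b) for a in g for b in g if 2.5 <= predict(a, b) <= 3.5]
```

Real QbD studies use richer models (interactions, curvature) and prediction intervals rather than point estimates, but the mapping from CPP settings to an acceptable CQA region is the same idea.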
Daniel C. Dey; Martin A. Spetich; Dale R. Wiegel; David L. Graney; John M. Kabrick
2009-01-01
Research on oak (Quercus L.) regeneration has generally consisted of small-scale studies of treatments designed to favor oak, including consideration of site quality and topographic effects on oak regeneration. However, these experiments have not consistently factored in broader-scale ecological differences found in the eastern United States. Oak...
Lange, Gudrun; Leonhart, Rainer; Gruber, Harald; Koch, Sabine C
2018-02-12
Creation is an important part of many interventions in creative arts therapies (art, music, dance, and drama therapy). This active part of art-making in arts therapies has not yet been closely investigated. The present study commits to this field of research using a mixed-methods design to investigate the effects of active creation on health-related psychological outcomes. In an artistic inquiry within an experimental design, N = 44 participants engaged in active art-making for eight minutes in the presence of the researcher (first author) with a choice of artistic materials: paper and colors for drawing and writing, musical instruments, space for moving or performing. Before and after the creation, participants completed a well-being, a self-efficacy and an experience of creation scale, and in addition found their own words to express the experiences during the activity. We hypothesized that the experience of empowerment, freedom, impact, and creativity (Experience of Creation Scale) mediates the positive effect of active creation on the outcomes of self-efficacy and well-being, and evaluated this assumption with a mediation analysis. Results suggest that the effect of active creation on both self-efficacy and well-being is significantly mediated by the Experience of Creation Scale. This article focuses on the quantitative side of the investigation. During the process, qualitative and quantitative results were triangulated for a more valid evaluation and jointly contribute to the emerging theory frame of embodied aesthetics.
Extraction of Uranium from Seawater: Design and Testing of a Symbiotic System
DOE Office of Scientific and Technical Information (OSTI.GOV)
Slocum, Alex
The U.S. Department of Energy in October 2014 awarded the Massachusetts Institute of Technology (MIT) a Nuclear Energy University Program grant (DE-NE0008268) to investigate the design and testing of a symbiotic system to harvest uranium from seawater. As defined in the proposal, the goals for the project are: 1. Address the design of machines for seawater uranium mining. 2. Develop design rules for a uranium harvesting system that would be integrated into an offshore wind power tower. 3. Fabricate a 1/50th size scale prototype for bench and pool testing to verify initial analysis and theory. 4. Design, build, and test a second 1/10th size scale prototype in the ocean for more comprehensive testing and validation. This report describes work done as part of DE-NE0008268 from 10/01/2014 to 11/30/2017 entitled "Extraction of Uranium from Seawater: Design and Testing of a Symbiotic System." This effort is part of the Seawater Uranium Recovery Program. The report lists the publications and presentations produced to date, introduces the project's goals, and reviews previous work toward achieving them. It then describes an algorithm developed during the project to optimize uranium adsorption by varying mechanical parameters such as immersion time and the number of adsorbent reuses, followed by a design tool developed to assess the global feasibility of symbiotic uranium harvesting systems. Additionally, the report details work done on shell enclosures for uranium adsorption. The results from the design, building, and testing of a 1/50th physical scale prototype of a highly feasible symbiotic uranium harvester are then described, along with results from flume experiments used to determine the effect of enclosure shells on the uptake of uranium by the adsorbent they enclose.
From there the report details the design of a Symbiotic Machine for Ocean uRanium Extraction (SMORE). Next, the results of the 1/10th physical scale prototype of a highly feasible symbiotic uranium harvester are presented. The report then details the design and results of an experiment to examine the hydrodynamic effects of a uranium harvester on the offshore wind turbine to which it is attached, using a 1/150th Froude-scale tow tank test. Finally, the report details the results of an initial cost analysis for the production of uranium from seawater with such a symbiotic device.
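The immersion-time trade-off the abstract alludes to (longer soaks load more uranium per deployment, but slow the overall campaign) can be illustrated with a minimal sketch. This is not the project's actual algorithm; the saturating-uptake model and every parameter value (`u_max`, `k`, `turnaround_days`) are hypothetical placeholders.

```python
import math

def uptake(t_days, u_max=4.0, k=0.05):
    """Hypothetical one-site sorption model: uranium loading (g U per kg
    adsorbent) after t_days of immersion, saturating at u_max."""
    return u_max * (1.0 - math.exp(-k * t_days))

def mean_harvest_rate(t_days, turnaround_days=5.0):
    """Campaign-average harvest rate (g U / kg / day) when each cycle is
    t_days of immersion plus a fixed recovery/redeployment turnaround."""
    return uptake(t_days) / (t_days + turnaround_days)

# Coarse 1-D search over candidate immersion times (in whole days).
best_t = max(range(1, 201), key=mean_harvest_rate)
```

Because uptake saturates while cycle time grows linearly, the average rate peaks at a finite immersion time rather than at "as long as possible", which is the kind of mechanical-parameter optimum such an algorithm would search for.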
Operation of an aquatic worm reactor suitable for sludge reduction at large scale.
Hendrickx, Tim L G; Elissen, Hellen H J; Temmink, Hardy; Buisman, Cees J N
2011-10-15
Treatment of domestic waste water results in the production of waste sludge, which requires costly further processing. A biological method to reduce the amount of waste sludge and its volume is treatment in an aquatic worm reactor. The potential of such a worm reactor with the oligochaete Lumbriculus variegatus has been shown at small scale. For scaling up purposes, a new configuration of the reactor was designed, in which the worms were positioned horizontally in the carrier material. This was tested in a continuous experiment of 8 weeks where it treated all the waste sludge from a lab-scale activated sludge process. The results showed a higher worm growth rate compared to previous experiments with the old configuration, whilst nutrient release was similar. The new configuration has a low footprint and allows for easy aeration and faeces collection, thereby making it suitable for full scale application. Copyright © 2011 Elsevier Ltd. All rights reserved.
Design Process of Flight Vehicle Structures for a Common Bulkhead and an MPCV Spacecraft Adapter
NASA Technical Reports Server (NTRS)
Aggarwal, Pravin; Hull, Patrick V.
2015-01-01
Designing and manufacturing space flight vehicle structures is a skillset that has grown considerably at NASA during the last several years. Beginning with the Ares program and continuing with the Space Launch System (SLS), in-house designs were produced for both the Upper Stage and the SLS Multipurpose Crew Vehicle (MPCV) spacecraft adapter (MSA). Specifically, critical design review (CDR) level analysis and flight production drawings were produced for the above-mentioned hardware. In particular, this in-house design work led to increased manufacturing infrastructure at both Marshall Space Flight Center (MSFC) and the Michoud Assembly Facility (MAF), improved skillsets in both analysis and design, and hands-on experience in building and testing full-scale MSA hardware. The hardware design and development processes, from initiation through CDR and finally flight, presented many challenges and produced valuable lessons. This paper builds on NASA's recent experiences in designing and fabricating flight hardware and examines the design and development processes used, as well as the challenges and lessons learned, from initial design, loads estimation, and mass constraints, to structural optimization and affordability, to release of production drawings and hardware manufacturing. While there are many documented design processes that a design engineer can follow, these unique experiences can offer insight into designing hardware in current program environments and present solutions to many of the challenges experienced by the engineering team.
The Role of Forethought and Serendipity in Designing a Successful Hydrogeological Research Site
NASA Astrophysics Data System (ADS)
Shapiro, A. M.; Hsieh, P. A.
2008-12-01
Designing and implementing a successful hydrogeologic field research observatory requires careful planning among a multidisciplinary group of research scientists. In addition, a small team of research coordinators needs to assume responsibility for smoothly integrating the multidisciplinary experimental program and promoting the explanation of results across discipline boundaries. A narrow interpretation of success at these hydrogeologic observatories can be viewed as the completion of the field-based experiments and the reporting of results for the field site under investigation. This alone is no small task, given the financial and human resources that are needed to develop and maintain field infrastructure, as well as to develop, maintain, and share data and interpretive results. Despite careful planning, however, unexpected or serendipitous results can occur. Such serendipitous results can lead to new understanding and revision of original hypotheses. To fully evaluate such serendipitous results, the field program must collect a broad range of scientifically robust data, beyond what is needed to examine the original hypotheses. In characterizing ground water flow and chemical transport in fractured crystalline rock in the Mirror Lake watershed in central New Hampshire, unexpected effects of scale were observed for hydraulic conductivity and matrix diffusion. Contrary to existing theory, hydraulic conductivity at the site did not increase with scale, whereas the effective coefficient of matrix diffusion was found to increase with scale. These results came to light only after examination of extensive data from carefully designed hydraulic and chemical transport experiments. Experiments were conducted on rock cores, individual fractures, and volumes of fractured rock over physical dimensions from meters to kilometers. Interpretation of these data yielded new insight into the effect of scale on chemical transport and hydraulic conductivity of fractured rock.
Subsequent evaluation of experiments conducted at other fractured rock sites has shown similarities in hydraulic and chemical transport responses, allowing broader conclusions to be reached concerning geologic controls on ground water flow and chemical transport in fractured rock aquifers.
Developing Large Scale Explosively Driven Flyer Experiments on Sand
NASA Astrophysics Data System (ADS)
Rehagen, Thomas; Kraus, Richard
2017-06-01
Measurements of the dynamic behavior of granular materials are of great importance to a variety of scientific and engineering applications, including planetary science, seismology, and construction and destruction. In addition, high quality data are needed to enhance our understanding of granular physics and improve the computational models used to simulate related physical processes. However, since there is a non-negligible grain size associated with these materials, experiments must be of a relatively large scale in order to capture the continuum response of the material and reduce errors associated with the finite grain size. We will present designs for explosively driven flyer experiments to make high accuracy measurements of the Hugoniot of sand (with a grain size of hundreds of microns). To achieve an accuracy of better than a few percent in density, we are developing a platform to measure the Hugoniot of samples several centimeters in thickness. We will present the target designs as well as coupled designs for the explosively launched flyer system. This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract No. DE-AC52-07NA27344.
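A Hugoniot measurement of this kind reduces to the Rankine-Hugoniot jump conditions: from the initial density and the measured shock and particle velocities, the shocked density and pressure follow directly (for a shock into material at rest with negligible initial pressure). The formulas are standard; the sand numbers below are illustrative only, not data from these experiments.

```python
def hugoniot_state(rho0, us, up):
    """Rankine-Hugoniot jump conditions for a shock into material at rest
    with negligible initial pressure.

    rho0 : initial density (kg/m^3)
    us   : shock velocity (m/s)
    up   : particle velocity behind the shock (m/s)
    Returns (density, pressure) behind the shock in SI units.
    """
    rho = rho0 * us / (us - up)   # conservation of mass across the shock
    p = rho0 * us * up            # conservation of momentum across the shock
    return rho, p

# Illustrative values: loose sand at 1600 kg/m^3 shocked to
# us = 2.0 km/s with up = 1.0 km/s.
rho, p = hugoniot_state(1600.0, 2000.0, 1000.0)  # 3200 kg/m^3, 3.2 GPa
```

The few-percent density accuracy targeted by the platform maps, through the first relation, onto the required accuracy in the measured velocities.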
Physics Criteria for a Subscale Plasma Liner Experiment
Hsu, Scott C.; Thio, Yong C. Francis
2018-02-02
Spherically imploding plasma liners, formed by merging hypersonic plasma jets, are a proposed standoff driver to compress magnetized target plasmas to fusion conditions (Hsu et al. in IEEE Trans Plasma Sci 40:1287, 2012). In this paper, the parameter space and physics criteria are identified for a subscale plasma-liner-formation experiment to provide data, e.g., on liner ram-pressure scaling and uniformity, that are relevant to the scientific issues of the full-scale plasma liners required to achieve fusion conditions. Based on these criteria, we quantitatively estimate the minimum liner kinetic energy and mass needed, which informed the design of a subscale plasma liner experiment now under development.
Simulating flow around scaled model of a hypersonic vehicle in wind tunnel
NASA Astrophysics Data System (ADS)
Markova, T. V.; Aksenov, A. A.; Zhluktov, S. V.; Savitsky, D. V.; Gavrilov, A. D.; Son, E. E.; Prokhorov, A. N.
2016-11-01
A prospective hypersonic HEXAFLY aircraft is considered in this paper. In order to obtain the aerodynamic characteristics of a new aircraft design, experiments with a scaled model have been carried out in a wind tunnel under different conditions. The runs were performed at different angles of attack, with and without hydrogen combustion in the scaled propulsion engine. However, the measured physical quantities do not provide all the information about the flowfield. Numerical simulation can complement the experimental data and reduce the number of wind tunnel experiments. Besides that, reliable CFD software can be used to calculate the aerodynamic characteristics of any possible design of the full-scale aircraft under different operating conditions. The reliability of the numerical predictions must be confirmed in a verification study of the software. This work is aimed at numerical investigation of the flowfield around and inside the scaled model of the HEXAFLY-CIAM module under wind tunnel conditions. A cold run (without combustion) was selected for this study. The calculations are performed in the FlowVision CFD software. The flow characteristics are compared against the available experimental data. The verification study carried out confirms the capability of the FlowVision CFD software to calculate the flows discussed.
NASA Technical Reports Server (NTRS)
Thomas, Randy; Stueber, Thomas J.
2013-01-01
The System Identification (SysID) Rack is a real-time hardware-in-the-loop data acquisition (DAQ) and control instrument rack that was designed and built to support inlet testing in the NASA Glenn Research Center 10- by 10-Foot Supersonic Wind Tunnel. This instrument rack is used to support experiments on the Combined-Cycle Engine Large-Scale Inlet for Mode Transition Experiment (CCE-LIMX). The CCE-LIMX is a testbed for an integrated dual flow-path inlet configuration with the two flow paths in an over-and-under arrangement such that the high-speed flow path is located below the low-speed flow path. The CCE-LIMX includes multiple actuators that are designed to redirect airflow from one flow path to the other; this action is referred to as "inlet mode transition." Multiple phases of experiments have been planned to support research that investigates inlet mode transition: inlet characterization (Phase-1) and system identification (Phase-2). The SysID Rack hardware design met the following requirements to support Phase-1 and Phase-2 experiments: safely and effectively move multiple actuators individually or synchronously; sample and save effector control and position sensor feedback signals; automate control of actuator positioning based on a mode transition schedule; sample and save pressure sensor signals; and perform DAQ and control processes operating at 2.5 kHz. This document describes the hardware components used to build the SysID Rack, including their function, specifications, and system interface. Furthermore, this document provides a SysID Rack effector signal list (signal flow); the system identification experiment setup; illustrations of a typical SysID Rack experiment; and a SysID Rack performance overview for Phase-1 and Phase-2 experiments. The SysID Rack described in this document was a useful tool for meeting the project objectives.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sanchez, Rene Gerardo; Hutchinson, Jesson D.; Mcclure, Patrick Ray
2015-08-20
The intent of the integral experiment request IER 299 (called KiloPower by NASA) is to assemble and evaluate the operational performance of a compact reactor configuration that closely resembles the flight unit to be used by NASA to execute a deep space exploration mission. The reactor design will include heat pipes coupled to Stirling engines to demonstrate how one can generate electricity when extracting energy from a "nuclear generated" heat source. This series of experiments is a larger scale follow-up to the DUFF series of experiments [1,2] that were performed using the Flat-Top assembly.
NASA Technical Reports Server (NTRS)
Fowlis, W. W. (Editor); Davis, M. H. (Editor)
1981-01-01
The atmospheric general circulation experiment (AGCE) numerical design for Spacelab flights was studied. A spherical baroclinic flow experiment which models the large-scale circulations of the Earth's atmosphere was proposed. Gravity is simulated by a radial dielectric body force. The major objective of the AGCE is to study nonlinear baroclinic wave flows in spherical geometry. Numerical models must be developed which accurately predict the basic axisymmetric states and the stability of nonlinear baroclinic wave flows. A three-dimensional, fully nonlinear numerical model of the AGCE based on the complete set of equations is required. Progress in the AGCE numerical design studies program is reported.
Hong S. He; Robert E. Keane; Louis R. Iverson
2008-01-01
Forest landscape models have become important tools for understanding large-scale and long-term landscape (spatial) processes such as climate change, fire, windthrow, seed dispersal, insect outbreak, disease propagation, forest harvest, and fuel treatment, because controlled field experiments designed to study the effects of these processes are often not possible (...
CRBR pump water test experience
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cook, M.E.; Huber, K.A.
1983-01-01
The hydraulic design features and water testing of the hydraulic scale model and prototype pump of the sodium pumps used in the primary and intermediate sodium loops of the Clinch River Breeder Reactor Plant (CRBRP) are described. Both the Hydraulic Scale Model tests and the Prototype Pump tests are described, and the results of each are discussed.
Photovoltaic balance-of-system designs and costs at PVUSA
NASA Astrophysics Data System (ADS)
Reyes, A. B.; Jennings, C.
1995-05-01
This report is one in a series of 1994-1995 PVUSA reports that document PVUSA lessons learned at demonstration sites in California and Texas. During the last 7 years (1988 to 1994), 16 PV systems ranging from 20 kW to 500 kW have been installed. Six 20-kW emerging module technology (EMT) arrays and three turnkey (i.e., vendor designed and integrated) utility-scale systems were procured and installed at PVUSA's main test site in Davis, California. PVUSA host utilities have installed a total of seven EMT arrays and utility-scale systems in their service areas. Additional systems at Davis and host utility sites are planned. One of PVUSA's key objectives is to evaluate the performance, reliability, and cost of PV balance-of-system (BOS). In the procurement stage PVUSA encouraged innovative design to improve upon present practice by reducing maintenance, improving reliability, or lowering manufacturing or construction costs. The project team worked closely with suppliers during the design stage not only to ensure designs met functional and safety specifications, but to provide suggestions for improvement. This report, intended for the photovoltaic (PV) industry and for utility project managers and engineers considering PV plant construction and ownership, documents PVUSA utility-scale system design and cost lessons learned. Complementary PVUSA topical reports document: construction and safety experience; five-year assessment of EMTs; validation of the Kerman 500-kW grid-support PV plant benefits; PVUSA instrumentation and data analysis techniques; procurement, acceptance, and rating practices for PV power plants; experience with power conditioning units and power quality.
"I'm Used to It Now": Experiences of Homophobia among Queer Youth in South African Township Schools
ERIC Educational Resources Information Center
Msibi, Thabo
2012-01-01
This paper explores how sexually marginalised black high-school students from conservative schooling contexts in KwaZulu-Natal, South Africa, experience schooling. It draws on queer theories through life narratives in presenting findings from a small-scale interventionist project designed by the author. The project involved 14 participants…
ERIC Educational Resources Information Center
Mokoena, Sello
2017-01-01
This small-scale study focused on the experiences of student teachers towards teaching practice in an open and distance learning (ODL) institution in South Africa. The sample consisted of 65 fourth year students enrolled for Bachelor of Education, specializing in secondary school teaching. The mixed-method research design consisting of…
Martin W. Ritchie; Kathleen A. Harcksen
2005-01-01
This paper describes implementation and early results of a large-scale, interdisciplinary experiment in the Goosenest Adaptive Management Area in northeastern California. The study is designed to investigate development of late-successional forest attributes in second-growth ponderosa pine stands. The experiment has four treatments replicated five times and encompasses...
Experimental Replication of an Aeroengine Combustion Instability
NASA Technical Reports Server (NTRS)
Cohen, J. M.; Hibshman, J. R.; Proscia, W.; Rosfjord, T. J.; Wake, B. E.; McVey, J. B.; Lovett, J.; Ondas, M.; DeLaat, J.; Breisacher, K.
2000-01-01
Combustion instabilities in gas turbine engines are most frequently encountered during the late phases of engine development, at which point they are difficult and expensive to fix. The ability to replicate an engine-traceable combustion instability in a laboratory-scale experiment offers the opportunity to economically diagnose the problem (to determine the root cause), and to investigate solutions to the problem, such as active control. The development and validation of active combustion instability control requires that the causal dynamic processes be reproduced in experimental test facilities which can be used as a test bed for control system evaluation. This paper discusses the process through which a laboratory-scale experiment was designed to replicate an instability observed in a developmental engine. The scaling process used physically-based analyses to preserve the relevant geometric, acoustic and thermo-fluid features. The process increases the probability that results achieved in the single-nozzle experiment will be scalable to the engine.
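The acoustic side of the scaling described above can be made concrete with the textbook longitudinal-duct-mode relation f_n = n*c/(2L) (for a duct with like terminations). The sketch below is not the paper's scaling analysis; the gas properties are generic hot-combustor placeholders, chosen only to show why geometry and gas conditions must be scaled together to keep a rig's acoustics representative of the engine.

```python
import math

def duct_mode_hz(length_m, n=1, gamma=1.4, R=287.0, T_k=1200.0):
    """Frequency of the n-th longitudinal acoustic mode of a duct with
    like terminations: f_n = n * c / (2 * L). Gamma, R, and T_k are
    illustrative hot-gas values, not engine data."""
    c = math.sqrt(gamma * R * T_k)  # speed of sound in the hot gas
    return n * c / (2.0 * length_m)

# Halving the duct length at the same gas temperature doubles the mode
# frequency, so a subscale rig must compensate (via geometry or gas
# conditions) to match the engine's unstable frequency.
f_full = duct_mode_hz(1.0)
f_half = duct_mode_hz(0.5)
```

This is the sense in which the single-nozzle experiment "preserves the relevant acoustic features": the rig is proportioned so its resonant frequencies line up with the engine instability being replicated.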
NASA Astrophysics Data System (ADS)
Huerta, N. J.; Fahrman, B.; Rod, K. A.; Fernandez, C. A.; Crandall, D.; Moore, J.
2017-12-01
Laboratory experiments provide a robust method to analyze well integrity. Experiments are relatively cheap, controlled, and repeatable. However, simplifying assumptions, apparatus limitations, and scaling are ubiquitous obstacles for translating results from the bench to the field. We focus on advancing the correlation between laboratory results and field conditions by characterizing how failure varies with specimen geometry using two experimental approaches. The first approach is designed to measure the shear bond strength between steel and cement in a down-scaled (< 3" diameter) well geometry. We use several cylindrical casing-cement-casing geometries that either mimic the scaling ratios found in the field or maximize the amount of metal and cement in the sample. We subject the samples to thermal shock cycles to simulate damage to the interfaces from operations. The bond was then measured via a push-out test. We found that not only expected parameters, such as curing time, but also the scaling of the geometry played a role in shear-bond strength. The second approach is designed to observe failure of the well system due to pressure applied on the inside of a lab-scale (1.5" diameter) cylindrical casing-cement-rock geometry. The loading apparatus and sample are housed within an industrial X-ray CT scanner capable of imaging the system while under pressure. Radial tension cracks were observed in the cement after an applied internal pressure of 3000 psi and propagated through the cement and into the rock as pressure was increased. Based on our current suite of tests, we find that the relationship between sample diameters and thicknesses is an important consideration when observing the strength and failure of well systems. The test results contribute to our knowledge of well system failure, to the evaluation and optimization of new cements, and to the applicability of using scaled-down tests as a proxy for understanding field-scale conditions.
SPE5 Sub-Scale Test Series Summary Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vandersall, Kevin S.; Reeves, Robert V.; DeHaven, Martin R.
2016-01-14
A series of two SPE5 sub-scale tests was performed to experimentally confirm that a booster system designed and evaluated in prior tests would properly initiate the PBXN-110 case charge fill. To conduct the experiments, a canister was designed to contain the nominally 50 mm diameter booster tube with an outer fill of approximately 150 mm diameter by 150 mm in length. The canisters were filled with PBXN-110 at NAWS-China Lake and shipped back to LLNL for testing in the High Explosives Applications Facility (HEAF). Piezoelectric crystal pins were placed on the outside of the booster tube before filling, and a series of piezoelectric crystal pins along with Photonic Doppler Velocimetry (PDV) probes were placed on the outer surface of the canister to measure the relative timing and magnitude of the detonation. The two piezoelectric crystal pins integral to the booster design were also utilized, along with a series of either piezoelectric crystal pins or piezoelectric polymer pads on the top of the canister or outside case that used direct contact, gaps, or different thicknesses of RTV cushions to obtain time-of-arrival data in preparation for the large-scale SPE5 test. To further quantify the margin of booster operation, the first test (SPE5SS1) was functioned with both detonators and the second test (SPE5SS2) with only one detonator. A full detonation of the material was observed in both experiments, as indicated by the pin timing and PDV signals. The piezoelectric pads were found to provide a greater measured signal magnitude when an RTV layer was present; the improved response is due to the larger measurement surface area of the pad. This report details the experiment design, canister assembly for filling, final assembly, experiment firing, the diagnostic results, and a discussion of the results.
Conceptual design of Dipole Research Experiment (DREX)
NASA Astrophysics Data System (ADS)
Xiao, Qingmei; Wang, Zhibin; Wang, Xiaogang; Xiao, Chijie; Yang, Xiaoyi; Zheng, Jinxing
2017-03-01
A new terrella-like device for laboratory simulation of inner-magnetosphere plasmas, the Dipole Research Experiment (DREX), is scheduled to be built at the Harbin Institute of Technology (HIT), China, as a major state scientific research facility for space physics studies. It is designed to provide a ground experimental platform that reproduces the inner magnetosphere, simulating the trapping, acceleration, and transport of energetic charged particles confined in a dipole magnetic field configuration. The hydromagnetic scaling relation between the laboratory plasma of the device and the geomagnetospheric plasma is applied so that DREX plasmas resemble geospace processes. Multiple plasma sources, different kinds of coils with specific functions, and advanced diagnostics are designed to be equipped in the facility for multiple functions. The motivation and design criteria for the DREX experiments, and the means applied to generate plasmas of the desired parameters in the laboratory, are also described. Supported by National Natural Science Foundation of China (Nos. 11505040, 11261140326 and 11405038), China Postdoctoral Science Foundation (Nos. 2016M591518, 2015M570283) and the Natural Scientific Research Innovation Foundation at Harbin Institute of Technology (No. 2017008).
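The hydromagnetic scaling invoked here rests on the fact that ideal-MHD dynamics look the same when times are measured in Alfven transit times, tau_A = L / v_A with v_A = B / sqrt(mu0 * rho). The sketch below illustrates that basis; all numerical parameters are round illustrative values, not DREX design figures or measured magnetospheric data.

```python
import math

MU0 = 4e-7 * math.pi  # vacuum permeability (H/m)

def alfven_time(L_m, B_t, n_m3, m_ion_kg=1.67e-27):
    """Alfven transit time L / v_A for a hydrogen plasma of number
    density n_m3 in a field B_t over a length scale L_m. Matching
    dynamics in units of this time is the basis of lab-to-space
    hydromagnetic scaling."""
    rho = n_m3 * m_ion_kg                  # mass density
    v_a = B_t / math.sqrt(MU0 * rho)       # Alfven speed
    return L_m / v_a

# Illustrative parameters only: a ~0.5 m, 0.1 T, 1e18 m^-3 lab dipole
# versus an Earth-radius-scale, 100 nT, 10 cm^-3 magnetospheric plasma.
tau_lab = alfven_time(0.5, 0.1, 1e18)
tau_space = alfven_time(6.4e6, 1e-7, 1e7)
```

Microseconds in such a device can thus stand in for seconds of geospace evolution, which is what lets a tabletop-to-room-scale dipole "resemble geospace processes".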
Large-Scale Spacecraft Fire Safety Tests
NASA Technical Reports Server (NTRS)
Urban, David; Ruff, Gary A.; Ferkul, Paul V.; Olson, Sandra; Fernandez-Pello, A. Carlos; T'ien, James S.; Torero, Jose L.; Cowlard, Adam J.; Rouvreau, Sebastien; Minster, Olivier;
2014-01-01
An international collaborative program is underway to address open issues in spacecraft fire safety. Because of limited access to long-term low-gravity conditions and the small volume generally allotted for these experiments, there have been relatively few experiments that directly study spacecraft fire safety under low-gravity conditions. Furthermore, none of these experiments have studied sample sizes and environment conditions typical of those expected in a spacecraft fire. The major constraint has been the size of the sample, with prior experiments limited to samples on the order of 10 cm in length and width or smaller. This lack of experimental data forces spacecraft designers to base their designs and safety precautions on 1-g understanding of flame spread, fire detection, and suppression. However, low-gravity combustion research has demonstrated substantial differences in flame behavior in low gravity. This, combined with the differences caused by the confined spacecraft environment, necessitates practical-scale spacecraft fire safety research to mitigate risks for future space missions. To address this issue, a large-scale spacecraft fire experiment is under development by NASA and an international team of investigators. This poster presents the objectives, status, and concept of this collaborative international project (Saffire). The project plan is to conduct fire safety experiments on three sequential flights of an unmanned ISS resupply spacecraft (the Orbital Cygnus vehicle) after they have completed their delivery of cargo to the ISS and have begun their return journeys to Earth. On two flights (Saffire-1 and Saffire-3), the experiment will consist of a flame spread test involving a meter-scale sample ignited in the pressurized volume of the spacecraft and allowed to burn to completion while measurements are made. On one of the flights (Saffire-2), nine smaller (5 x 30 cm) samples will be tested to evaluate NASA's material flammability screening tests.
The first flight (Saffire-1) is scheduled for July 2015, with the other two following at six-month intervals. A computer modeling effort will complement the experimental effort. Although the experiment will need to meet rigorous safety requirements to ensure the carrier vehicle does not sustain damage, the absence of a crew removes the need for strict containment of combustion products. This will facilitate the first examination of fire behavior on a scale that is relevant to spacecraft fire safety and will provide unique data for fire model validation.
Oilfield scales: controls on precipitation and crystal morphology of barite (barium sulphate)
NASA Astrophysics Data System (ADS)
Stark, A. I. R.; Wogelius, R. A.; Vaughan, D. J.
2003-04-01
The precipitation and subsequent build-up of barite (barium sulphate) inside extraction tubing presents a costly problem for offshore oil wells which use seawater to mobilize oil during hydrocarbon recovery. Mixing of reservoir formation water containing Ba2+ ions and seawater containing SO4(2-) ions results in barite precipitation within the reservoir, well-bore region, and piping. Great effort has been expended in designing strategies to minimize scale formation, but details of the reaction mechanism and its sensitivity to thermodynamic variables are poorly constrained. Furthermore, few detailed studies have been carried out under simulated field conditions. Hence an experimental programme was designed to study barite formation under environmentally relevant conditions with control of several system variables during the precipitation reaction. Synthetic sea-water and formation-water brines containing sodium sulphate and barium chloride, respectively, were mixed to induce BaSO4 precipitation. Experiments were carried out at high temperature (100 °C) and high pressure (500 bars) in double rocking autoclave bombs. Barite formation as a function of the addition of calcium, magnesium, and a generic phosphonate-based scale inhibitor was investigated whilst maintaining constant pH, temperature, and ionic strength (0.5159). Additional experiments were performed at ambient conditions for comparison. Data concerning nucleation, growth rates, and crystal morphology were obtained. ICP-AES data from the supernatant product solutions showed considerable variation in the quantity of barium sulphate precipitated as a function of the listed experimental variables.
For example, ESEM analysis of barium sulphate crystals showed a dramatic shift in crystal habit from the typical tabular habit produced in control experiments: experiments performed in the presence of foreign cations produced more equant crystals, while those completed in the presence of the phosphonate scale inhibitor produced precipitates with distorted anhedral shapes. Based on these preliminary results, further experiments monitoring rate and morphology as a function of Ba/Ca ratio, ionic strength, and the ion activity product for barite will also be completed.
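The driving force for the precipitation described above is conventionally expressed as a saturation index, SI = log10(IAP/Ksp). The sketch below uses the crude approximation that activities equal molal concentrations (real brine calculations need activity coefficients, which matter greatly at this ionic strength); the Ksp is a rough 25 °C literature value and the mixing concentrations are hypothetical, not data from these experiments.

```python
import math

def saturation_index(ba_molal, so4_molal, ksp=1.1e-10):
    """Saturation index SI = log10(IAP / Ksp) for barite, approximating
    the ion activity product IAP by the concentration product (activity
    coefficients taken as unity). ksp is an approximate 25 C value.
    SI > 0 means supersaturated (barite tends to precipitate)."""
    iap = ba_molal * so4_molal
    return math.log10(iap / ksp)

# Hypothetical mixing-zone example: formation-water barium (~1e-4 molal)
# meeting seawater sulphate (~0.028 molal).
si = saturation_index(1e-4, 0.028)  # strongly positive: scale forms
```

Even with these simplifications, the example shows why formation-water/seawater mixing is so aggressive: the product of typical Ba2+ and SO4(2-) concentrations exceeds barite's tiny Ksp by orders of magnitude.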
Thermal systems design and analysis for a 10 K Sorption Cryocooler flight experiment
NASA Technical Reports Server (NTRS)
Bhandari, Pradeep; Bard, Steven
1993-01-01
The design, analysis and predicted performance of the Brilliant Eyes Ten-Kelvin Sorption Cryocooler Experiment (BETSCE) is described from a thermal perspective. BETSCE is a shuttle side-wall mounted cryogenic technology demonstration experiment planned for launch in November 1994. BETSCE uses a significant amount of power (about 500 W peak) and the resultant heat must be rejected passively with radiators, as BETSCE has no access to the active cooling capability of the shuttle. It was a major challenge to design and configure the individual hardware assemblies, with their relatively large radiators, to enable them to reject their heat while satisfying numerous severe shuttle-imposed constraints. This paper is a useful case study of a small shuttle payload that needs to reject relatively high heat loads passively in a highly constrained thermal environment. The design approach described is consistent with today's era of 'faster, better, cheaper' small-scale space missions.
Accretion shocks in the laboratory: Design of an experiment to study star formation
Young, Rachel P.; Kuranz, C. C.; Drake, R. P.; ...
2017-02-13
Here, we present the design of a laboratory-astrophysics experiment to study magnetospheric accretion relevant to young, pre-main-sequence stars. Spectra of young stars show evidence of hotspots created when streams of accreting material impact the surface of the star and create shocks. The structures that form during this process are poorly understood, as the surfaces of young stars cannot be spatially resolved. Our experiment would create a scaled "accretion shock" at a major (several kJ) laser facility. The experiment drives a plasma jet (the "accretion stream") into a solid block (the "stellar surface"), in the presence of a parallel magnetic field analogous to the star's local field.
Design Against Propagating Shear Failure in Pipelines
NASA Astrophysics Data System (ADS)
Leis, B. N.; Gray, J. Malcolm
Propagating shear failure can occur in gas and certain hazardous liquid transmission pipelines, potentially leading to a large, long-burning fire and/or widespread pollution, depending on the transported product. Such consequences require that the design of the pipeline and the specification of the steel effectively preclude the chance of propagating shear failure. Because the phenomenology of such failures is complex, design against such occurrences has historically relied on full-scale demonstration experiments coupled with empirically calibrated analytical models. However, as economic drivers have pushed toward larger-diameter, higher-pressure pipelines made of tough, higher-strength grades, the design basis to ensure arrest has been severely compromised. Accordingly, for applications where the design basis becomes less certain, as has occurred increasingly as steel grade and toughness have increased, it has become necessary to place greater reliance on the use and role of full-scale testing.
Gaming the System: Culture, Process, and Perspectives Supporting a Game and App Design Curriculum
ERIC Educational Resources Information Center
Herro, Danielle
2015-01-01
Games and digital media experiences permeate the lives of youth. Researchers have argued for the participatory attributes and cognitive benefits of gaming and media production for more than a decade, relying on socio-cultural theory to bolster their claims. Only recently have large-scale efforts ensued towards moving game play and design into formal…
2009-01-01
Single-phase fluid flow in microchannels has been widely investigated (Morini, 2006; Abdelaziz et al., 2008), and it was verified that the conventional…
ACCESS: Design and Sub-System Performance
NASA Technical Reports Server (NTRS)
Kaiser, Mary Elizabeth; Morris, Matthew J.; McCandliss, Stephan R.; Rasucher, Bernard J.; Kimble, Randy A.; Kruk, Jeffrey W.; Pelton, Russell; Mott, D. Brent; Wen, Hiting; Foltz, Roger;
2012-01-01
Establishing improved spectrophotometric standards is important for a broad range of missions and is relevant to many astrophysical problems. ACCESS, "Absolute Color Calibration Experiment for Standard Stars", is a series of rocket-borne sub-orbital missions and ground-based experiments designed to enable improvements in the precision of the astrophysical flux scale through the transfer of absolute laboratory detector standards from the National Institute of Standards and Technology (NIST) to a network of stellar standards with a calibration accuracy of 1% and a spectral resolving power of 500 across the 0.35 -1.7 micrometer bandpass.
Scaling of Performance in Liquid Propellant Rocket Engine Combustors
NASA Technical Reports Server (NTRS)
Hulka, James
2008-01-01
The objectives are: a) Re-introduce the concept of scaling; b) Describe the scaling research conducted in the 1950s and early 1960s, and present some of its conclusions; c) Narrow the focus to scaling for performance of combustion devices for liquid propellant rocket engines; and d) Present some results of subscale-to-full-scale performance from historical programs. Scaling is "the ability to develop new combustion devices with predictable performance on the basis of test experience with old devices." Scaling can be used to develop combustion devices of any thrust size from devices of any other thrust size, and it is applied mostly to increase thrust. The objective is to use scaling as a development tool and to move injector design from an "art" to a "science."
Multidisciplinary Design, Analysis, and Optimization Tool Development Using a Genetic Algorithm
NASA Technical Reports Server (NTRS)
Pak, Chan-gi; Li, Wesley
2009-01-01
Multidisciplinary design, analysis, and optimization using a genetic algorithm is being developed at the National Aeronautics and Space Administration Dryden Flight Research Center (Edwards, California) to automate the analysis and design process by leveraging existing tools to enable true multidisciplinary optimization in the preliminary design stage of subsonic, transonic, supersonic, and hypersonic aircraft. This is a promising technology, but it faces many challenges in large-scale, real-world applications. This report describes current approaches, recent results, and challenges for multidisciplinary design, analysis, and optimization as demonstrated by experience with the Ikhana fire pod design.
The Cosmology Large Angular Scale Surveyor (CLASS): 38 GHz Detector Array of Bolometric Polarimeters
NASA Technical Reports Server (NTRS)
Appel, John W.; Ali, Aamir; Amiri, Mandana; Araujo, Derek; Bennett, Charles L.; Boone, Fletcher; Chan, Manwei; Cho, Hsiao-Mei; Chuss, David T.; Colazo, Felipe;
2014-01-01
The Cosmology Large Angular Scale Surveyor (CLASS) experiment aims to map the polarization of the Cosmic Microwave Background (CMB) at angular scales larger than a few degrees. Operating from Cerro Toco in the Atacama Desert of Chile, it will observe over 65% of the sky at 38, 93, 148, and 217 GHz. In this paper we discuss the design, construction, and characterization of the CLASS 38 GHz detector focal plane, the first ever Q-band bolometric polarimeter array.
The cosmology large angular scale surveyor (CLASS): 38-GHz detector array of bolometric polarimeters
NASA Astrophysics Data System (ADS)
Appel, John W.; Ali, Aamir; Amiri, Mandana; Araujo, Derek; Bennet, Charles L.; Boone, Fletcher; Chan, Manwei; Cho, Hsiao-Mei; Chuss, David T.; Colazo, Felipe; Crowe, Erik; Denis, Kevin; Dünner, Rolando; Eimer, Joseph; Essinger-Hileman, Thomas; Gothe, Dominik; Halpern, Mark; Harrington, Kathleen; Hilton, Gene; Hinshaw, Gary F.; Huang, Caroline; Irwin, Kent; Jones, Glenn; Karakula, John; Kogut, Alan J.; Larson, David; Limon, Michele; Lowry, Lindsay; Marriage, Tobias; Mehrle, Nicholas; Miller, Amber D.; Miller, Nathan; Moseley, Samuel H.; Novak, Giles; Reintsema, Carl; Rostem, Karwan; Stevenson, Thomas; Towner, Deborah; U-Yen, Kongpop; Wagner, Emily; Watts, Duncan; Wollack, Edward; Xu, Zhilei; Zeng, Lingzhen
2014-07-01
The Cosmology Large Angular Scale Surveyor (CLASS) experiment aims to map the polarization of the Cosmic Microwave Background (CMB) at angular scales larger than a few degrees. Operating from Cerro Toco in the Atacama Desert of Chile, it will observe over 65% of the sky at 38, 93, 148, and 217 GHz. In this paper we discuss the design, construction, and characterization of the CLASS 38 GHz detector focal plane, the first ever Q-band bolometric polarimeter array.
Real-time gray-scale photolithography for fabrication of continuous microstructure
NASA Astrophysics Data System (ADS)
Peng, Qinjun; Guo, Yongkang; Liu, Shijie; Cui, Zheng
2002-10-01
A novel real-time gray-scale photolithography technique for the fabrication of continuous microstructures that uses a LCD panel as a real-time gray-scale mask is presented. The principle of design of the technique is explained, and computer simulation results based on partially coherent imaging theory are given for the patterning of a microlens array and a zigzag grating. An experiment is set up, and a microlens array and a zigzag grating on panchromatic silver halide sensitized gelatin with trypsinase etching are obtained.
Scale/TSUNAMI Sensitivity Data for ICSBEP Evaluations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rearden, Bradley T; Reed, Davis Allan; Lefebvre, Robert A
2011-01-01
The Tools for Sensitivity and Uncertainty Analysis Methodology Implementation (TSUNAMI) software developed at Oak Ridge National Laboratory (ORNL) as part of the Scale code system provide unique methods for code validation, gap analysis, and experiment design. For TSUNAMI analysis, sensitivity data are generated for each application and each existing or proposed experiment used in the assessment. The validation of diverse sets of applications requires potentially thousands of data files to be maintained and organized by the user, and a growing number of these files are available through the International Handbook of Evaluated Criticality Safety Benchmark Experiments (IHECSBE) distributed through the International Criticality Safety Benchmark Evaluation Program (ICSBEP). To facilitate the use of the IHECSBE benchmarks in rigorous TSUNAMI validation and gap analysis techniques, ORNL generated SCALE/TSUNAMI sensitivity data files (SDFs) for several hundred benchmarks for distribution with the IHECSBE. For the 2010 edition of IHECSBE, the sensitivity data were generated using 238-group cross-section data based on ENDF/B-VII.0 for 494 benchmark experiments. Additionally, ORNL has developed a quality assurance procedure to guide the generation of Scale inputs and sensitivity data, as well as a graphical user interface to facilitate the use of sensitivity data in identifying experiments and applying them in validation studies.
NASA Astrophysics Data System (ADS)
Ji, H.; Bhattacharjee, A.; Prager, S.; Daughton, W. S.; Bale, S. D.; Carter, T. A.; Crocker, N.; Drake, J. F.; Egedal, J.; Sarff, J.; Wallace, J.; Belova, E.; Ellis, R.; Fox, W. R., II; Heitzenroeder, P.; Kalish, M.; Jara-Almonte, J.; Myers, C. E.; Que, W.; Ren, Y.; Titus, P.; Yamada, M.; Yoo, J.
2014-12-01
A new intermediate-scale plasma experiment, called the Facility for Laboratory Reconnection Experiments or FLARE, is under construction at Princeton as a joint project by five universities and two national labs to study magnetic reconnection in regimes directly relevant to space, solar and astrophysical plasmas. The currently existing small-scale experiments have been focusing on the single X-line reconnection process in plasmas either with small effective sizes or at low Lundquist numbers, both of which are typically very large in natural plasmas. These new regimes involve multiple X-lines as guided by a reconnection "phase diagram", in which different coupling mechanisms from the global system scale to the local dissipation scale are classified into different reconnection phases [H. Ji & W. Daughton, Phys. Plasmas 18, 111207 (2011)]. The design of the FLARE device is based on the existing Magnetic Reconnection Experiment (MRX) at Princeton (http://mrx.pppl.gov) and is to provide experimental access to the new phases involving multiple X-lines at large effective sizes and high Lundquist numbers, directly relevant to space and solar plasmas. The motivating major physics questions, the construction status, and the planned collaborative research especially with space and solar research communities will be discussed.
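The "phase diagram" framing hinges on two dimensionless numbers, the effective size and the Lundquist number S = μ0·L·v_A/η. A rough order-of-magnitude sketch, assuming a Spitzer-like resistivity with textbook coefficients; the numbers below are illustrative, not FLARE design values:

```python
import math

MU0 = 4e-7 * math.pi  # vacuum permeability, H/m

def alfven_speed(b_tesla, n_m3, m_ion_kg=1.67e-27):
    """Alfven speed v_A = B / sqrt(mu0 * n * m_i), hydrogen by default."""
    return b_tesla / math.sqrt(MU0 * n_m3 * m_ion_kg)

def lundquist_number(b_tesla, n_m3, length_m, te_ev, ln_lambda=10.0):
    """S = mu0 * L * v_A / eta, with a Spitzer-like resistivity
    eta ~ 5.2e-5 * lnLambda / Te^1.5 (ohm*m, Te in eV).
    Coefficient and lnLambda are rough textbook values (assumptions),
    so treat the result as order-of-magnitude only."""
    eta = 5.2e-5 * ln_lambda / te_ev ** 1.5
    return MU0 * length_m * alfven_speed(b_tesla, n_m3) / eta

# Small-device-like numbers (illustrative): B = 0.1 T, n = 5e19 m^-3,
# current-sheet length L = 0.2 m, Te = 10 eV
s = lundquist_number(0.1, 5e19, 0.2, 10.0)
```

Raising S and the effective size together is precisely what moves an experiment out of the single-X-line phase and into the multiple-X-line regimes the abstract describes.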
NASA Astrophysics Data System (ADS)
Ji, Hantao; Bhattacharjee, A.; Prager, S.; Daughton, W.; Bale, Stuart D.; Carter, T.; Crocker, N.; Drake, J.; Egedal, J.; Sarff, J.; Fox, W.; Jara-Almonte, J.; Myers, C.; Ren, Y.; Yamada, M.; Yoo, J.
2015-04-01
A new intermediate-scale plasma experiment, called the Facility for Laboratory Reconnection Experiments or FLARE (flare.pppl.gov), is under construction at Princeton as a joint project by five universities and two national labs to study magnetic reconnection in regimes directly relevant to heliophysical and astrophysical plasmas. The currently existing small-scale experiments have been focusing on the single X-line reconnection process in plasmas either with small effective sizes or at low Lundquist numbers, both of which are typically very large in natural plasmas. These new regimes involve multiple X-lines as guided by a reconnection "phase diagram", in which different coupling mechanisms from the global system scale to the local dissipation scale are classified into different reconnection phases [H. Ji & W. Daughton, Phys. Plasmas 18, 111207 (2011)]. The design of the FLARE device is based on the existing Magnetic Reconnection Experiment (MRX) (mrx.pppl.gov) and is to provide experimental access to the new phases involving multiple X-lines at large effective sizes and high Lundquist numbers, directly relevant to magnetospheric, solar wind, and solar coronal plasmas. After a brief summary of recent laboratory results on the topic of magnetic reconnection, the motivating major physics questions, the construction status, and the planned collaborative research especially with heliophysics communities will be discussed.
NASA Technical Reports Server (NTRS)
DeLay, Tom K.; Munafo, Paul (Technical Monitor)
2001-01-01
The AFRL USFE project is an experimental test bed for new propulsion technologies. It will utilize ambient-temperature fuel and oxidizer (kerosene and hydrogen peroxide). The system is pressure fed, not pump fed, and will utilize a helium pressurant tank to drive the system. Mr. DeLay has developed a method for cost-effectively producing a unique, large pressurant tank that is not commercially available. The pressure vessel is a layered composite structure with an electroformed metallic permeation barrier. The design/process is scalable and easily adaptable to different configurations with minimal cost in tooling development. One-third-scale tanks have already been fabricated and are scheduled for testing. The full-scale pressure vessel (50" diameter) design will be refined based on the performance of the sub-scale tank. The pressure vessels have been designed to operate at 6,000 psi; a PV/W of 1.92 million is anticipated.
A quantitative approach to evaluating caring in nursing simulation.
Eggenberger, Terry L; Keller, Kathryn B; Chase, Susan K; Payne, Linda
2012-01-01
This study was designed to test a quantitative method of measuring caring in the simulated environment. Since competency in caring is central to nursing practice, ways of including caring concepts in designing scenarios and in evaluation of performance need to be developed. Coates' Caring Efficacy scales were adapted for simulation and named the Caring Efficacy Scale-Simulation Student Version (CES-SSV) and Caring Efficacy Scale-Simulation Faculty Version (CES-SFV). A correlational study was designed to compare student self-ratings with faculty ratings on caring efficacy during an adult acute simulation experience with traditional and accelerated baccalaureate students in a nursing program grounded in caring theory. Student self-ratings were significantly correlated with objective ratings (r = 0.345, 0.356). Both the CES-SSV and the CES-SFV were found to have excellent internal consistency and significantly correlated interrater reliability. They were useful in measuring caring in the simulated learning environment.
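The student-faculty comparison above rests on a paired correlation of ratings. A minimal sketch of the Pearson coefficient used for such interrater comparisons, with made-up ratings rather than the study's data:

```python
import math

def pearson_r(x, y):
    """Pearson correlation between paired ratings, e.g. student
    self-ratings versus faculty ratings on parallel scale versions."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sx = math.sqrt(sum((v - mx) ** 2 for v in x))
    sy = math.sqrt(sum((v - my) ** 2 for v in y))
    return sum((u - mx) * (w - my) for u, w in zip(x, y)) / (sx * sy)

# Perfectly linearly related ratings correlate at r = 1.0:
r = pearson_r([1.0, 2.0, 3.0], [2.0, 4.0, 6.0])
```

Real self/observer ratings land well below 1.0, as the modest but significant r ≈ 0.35 values in the study illustrate.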
The Intelligent Control System and Experiments for an Unmanned Wave Glider.
Liao, Yulei; Wang, Leifeng; Li, Yiming; Li, Ye; Jiang, Quanquan
2016-01-01
Designing the control system of an Unmanned Wave Glider (UWG) is challenging because the vehicle is weakly maneuverable and subject to large time lags and large disturbances, which makes it difficult to establish an accurate mathematical model. Meanwhile, to complete marine environment monitoring autonomously on long time scales and over large spatial scales, the UWG places high demands on intelligence and reliability. This paper focuses on the "Ocean Rambler" UWG. First, the intelligent control system architecture is designed based on the cerebrum basic function combination zone theory and a hierarchic control method. The hardware and software design of the embedded motion control system is discussed, and a motion control system based on a four-layer rational behavior model is proposed. Then, combining with the line-of-sight (LOS) method, a self-adapting PID guidance law is proposed to compensate for the steady-state error in path following of the UWG caused by marine environment disturbances, especially current. Based on the S-surface control method, an improved S-surface heading controller is proposed to solve the heading control problem of this weakly maneuverable carrier under large disturbances. Finally, simulation experiments were carried out, and the UWG completed autonomous path following and marine environment monitoring in sea trials. The simulation experiments and sea trial results show that the proposed intelligent control system, guidance law, and controller have favorable control performance, verifying the feasibility and reliability of the designed intelligent control system.
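The guidance scheme described, LOS path following feeding a PID-type heading loop, can be sketched generically. The lookahead form, gains, and the plain integral term below are illustrative assumptions; the paper's self-adapting law is richer than this:

```python
import math

def los_heading(x, y, wp_prev, wp_next, lookahead=10.0):
    """Line-of-sight guidance: aim at a point on the path segment a
    fixed lookahead distance ahead of the vehicle's along-track
    projection (a standard LOS form, not necessarily the paper's
    exact variant)."""
    px, py = wp_prev
    qx, qy = wp_next
    seg = math.hypot(qx - px, qy - py)
    tx, ty = (qx - px) / seg, (qy - py) / seg          # unit tangent
    along = (x - px) * tx + (y - py) * ty              # along-track distance
    ax, ay = px + (along + lookahead) * tx, py + (along + lookahead) * ty
    return math.atan2(ay - y, ax - x)                  # desired heading, rad

class HeadingPID:
    """Plain PID on heading error; the paper's 'self-adapting' term
    compensates current-induced bias, which a conventional integral
    term only approximates here."""
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_err = 0.0
    def step(self, desired, actual):
        # wrap the error to [-pi, pi] so the controller turns the short way
        err = math.atan2(math.sin(desired - actual), math.cos(desired - actual))
        self.integral += err * self.dt
        deriv = (err - self.prev_err) / self.dt
        self.prev_err = err
        return self.kp * err + self.ki * self.integral + self.kd * deriv

# Vehicle 5 m to the left of a straight east-going track:
psi_d = los_heading(0.0, 5.0, (0.0, 0.0), (100.0, 0.0))
```

The negative desired heading steers the vehicle back toward the track, which is the essential behavior the sea trials exercise.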
Plasma Discharge Initiation of Explosives in Rock Blasting Application: A Case Study
NASA Astrophysics Data System (ADS)
Jae-Ou, Chae; Young-Jun, Jeong; V, M. Shmelev; A, A. Denicaev; V, M. Poutchkov; V, Ravi
2006-07-01
A plasma discharge initiation system for the explosive volumetric combustion charge was designed, investigated and developed for practical application. Laboratory scale experiments were carried out before conducting the large scale field tests. The resultant explosions gave rise to less noise, insignificant seismic vibrations and good specific explosive consumption for rock blasting. Importantly, the technique was found to be safe and environmentally friendly.
ERIC Educational Resources Information Center
Ochando-Pulido, J. M.
2017-01-01
The Chemical Engineering Department at the University of Granada has endeavored to develop a number of high-quality experiments to familiarize our students with our latest research and also with the scale-up of processes. A pilot-scale wastewater treatment plant was set up to give students a close practical view of the treatment of effluents by-produced in…
From the past to the future: Integrating work experience into the design process.
Bittencourt, João Marcos; Duarte, Francisco; Béguin, Pascal
2017-01-01
Integrating work activity issues into the design process is a broadly discussed theme in ergonomics. Participation is presented as the main means for such integration. However, late participation can limit the development of both project solutions and future work activity. This article presents the concept of construction of experience, aiming at the articulated development of future activities and project solutions. It is a non-teleological approach in which the initial concepts are transformed by the experience built up throughout the design process. The method applied was a case study of ergonomics participation during the design of a new laboratory complex for biotechnology research. Data were obtained through analysis of records from a simulation process using a Lego scale model and interviews with project participants. The simulation process allowed for developing new ways of working and generated changes in the initial design solutions, enabling workers to adopt their own strategies for conducting work more safely and efficiently in the future work system. Each project decision either opens or closes a window of opportunity for developing a future activity. Construction of experience in a non-teleological design process allows for understanding the consequences of project solutions for future work.
NASA Astrophysics Data System (ADS)
Takatsuji, Toshiyuki; Tanaka, Ken-ichi
1996-06-01
A procedure is derived by which sensory attributes can be scaled as a function of various physical and/or chemical properties of the object to be tested. This procedure consists of four successive steps: (i) design of an experiment, (ii) fabrication of specimens according to the design parameters, (iii) assessment of a sensory attribute using sensory evaluation and (iv) derivation of the relationship between the parameters and the sensory attribute. In these steps, an experimental design using orthogonal arrays, analysis of variance, and regression analyses are used strategically. When a specimen with the design parameters cannot be physically fabricated, an alternative specimen having parameters closest to the design is selected from a group of specimens which can be physically made. The influence of the deviation of actual parameters from the desired ones is also discussed. A method of confirming the validity of the regression equation is also investigated. The procedure is applied to scale the sensory sharpness of kitchen knives as a function of the edge angle and the roughness of the cutting edge.
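Step (i)'s orthogonal-array design can be illustrated with the smallest two-level array. The factor assignment and the sharpness scores below are hypothetical, purely to show how orthogonality lets each main effect be estimated independently by simple averaging:

```python
# L4(2^3) orthogonal array: 4 runs screen three two-level factors
# (e.g. edge angle, edge roughness, and one more, coded -1 / +1).
L4 = [(-1, -1, -1),
      (-1, +1, +1),
      (+1, -1, +1),
      (+1, +1, -1)]

def main_effects(runs, responses):
    """Main effect of each factor = mean response at its high level
    minus mean at its low level; orthogonality guarantees the columns
    can be averaged independently of one another."""
    k = len(runs[0])
    effects = []
    for j in range(k):
        hi = [y for r, y in zip(runs, responses) if r[j] == +1]
        lo = [y for r, y in zip(runs, responses) if r[j] == -1]
        effects.append(sum(hi) / len(hi) - sum(lo) / len(lo))
    return effects

# Hypothetical sensory sharpness scores for the four runs:
effects = main_effects(L4, [3.0, 4.0, 6.0, 7.0])
```

With these invented scores the first factor dominates, the second contributes modestly, and the third is inert, which is exactly the kind of screening the procedure's ANOVA step then formalizes.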
Diagnosis of Acceleration, Reconnection, Turbulence, and Heating
NASA Astrophysics Data System (ADS)
Dufor, Mikal T.; Jemiolo, Andrew J.; Keesee, Amy; Cassak, Paul; Tu, Weichao; Scime, Earl E.
2017-10-01
The DARTH (Diagnosis of Acceleration, Reconnection, Turbulence, and Heating) experiment is an intermediate-scale experimental facility designed to study magnetic reconnection at and below the kinetic scales of ions and electrons. The experiment will have non-perturbative diagnostics with high temporal and three-dimensional spatial resolution, giving it the capability to investigate kinetic-scale physics. Of specific scientific interest are particle acceleration, plasma heating, turbulence, and energy dissipation during reconnection. Here we will describe the magnetic field system and the two plasma guns used to create flux ropes that then merge through magnetic reconnection. We will also describe the key diagnostic systems: laser-induced fluorescence (LIF) for ion vdf measurements, a 300 GHz microwave scattering system for sub-mm-wavelength fluctuation measurements, and a Thomson scattering laser for electron vdf measurements. The vacuum chamber is designed to provide unparalleled access for these particle diagnostics. The scientific goals of DARTH are to examine particle acceleration and heating during reconnection, the role of three-dimensional instabilities, how reconnection ceases, and the role of impurities and asymmetries in reconnection. This work was supported by the O'Brien Energy Research Fund.
NASA Astrophysics Data System (ADS)
Kornev, V. A.; Askinazi, L. G.; Belokurov, A. A.; Chernyshev, F. V.; Lebedev, S. V.; Melnik, A. D.; Shabelsky, A. A.; Tukachinsky, A. S.; Zhubr, N. A.
2017-12-01
The paper presents DD neutron flux measurements in neutral beam injection (NBI) experiments aimed at the optimization of target plasma and heating beam parameters to achieve maximum neutron flux in the TUMAN-3M compact tokamak. Two ion sources of different design were used, which allowed the influences of the beam's energy and power on the neutron rate to be separated. Using the database of experiments performed with the two ion sources, an empirical scaling was derived describing the neutron rate dependence on the target plasma and heating beam parameters. Numerical modeling of the neutron rate in the NBI experiments performed using the ASTRA transport code showed good agreement with the scaling.
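Empirical scalings of this kind are typically power laws fitted in log-log space. A generic sketch of that machinery with synthetic data; the abstract gives no coefficients, so the exponent below is invented, not the TUMAN-3M scaling:

```python
import math

def fit_power_law(x, y):
    """Least-squares fit of y = A * x**b in log-log space, the usual
    form for empirical scalings such as a neutron rate versus a beam
    or plasma parameter."""
    lx = [math.log(v) for v in x]
    ly = [math.log(v) for v in y]
    n = len(x)
    mx, my = sum(lx) / n, sum(ly) / n
    b = sum((u - mx) * (v - my) for u, v in zip(lx, ly)) / \
        sum((u - mx) ** 2 for u in lx)
    a = math.exp(my - b * mx)
    return a, b

# Synthetic check: noiseless data from y = 2 * x^1.5 is recovered exactly.
xs = [1.0, 2.0, 4.0, 8.0]
a, b = fit_power_law(xs, [2.0 * v ** 1.5 for v in xs])
```

With several regressors (beam energy, beam power, density, ...), the same idea becomes a multiple linear regression on the logarithms of each parameter.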
On the role of minicomputers in structural design
NASA Technical Reports Server (NTRS)
Storaasli, O. O.
1977-01-01
Results are presented of exploratory studies on the use of a minicomputer in conjunction with large-scale computers to perform structural design tasks, including data and program management, use of interactive graphics, and computations for structural analysis and design. An assessment is made of minicomputer use for the structural model definition and checking and for interpreting results. Included are results of computational experiments demonstrating the advantages of using both a minicomputer and a large computer to solve a large aircraft structural design problem.
Inverse problems in the design, modeling and testing of engineering systems
NASA Technical Reports Server (NTRS)
Alifanov, Oleg M.
1991-01-01
Formulations, classification, areas of application, and approaches to solving different inverse problems are considered for the design of structures, modeling, and experimental data processing. Problems in the practical implementation of theoretical-experimental methods based on solving inverse problems are analyzed in order to identify mathematical models of physical processes, aid in input data preparation for design parameter optimization, help in design parameter optimization itself, and to model experiments, large-scale tests, and real tests of engineering systems.
Miskovic, Ljubisa; Alff-Tuomala, Susanne; Soh, Keng Cher; Barth, Dorothee; Salusjärvi, Laura; Pitkänen, Juha-Pekka; Ruohonen, Laura; Penttilä, Merja; Hatzimanikatis, Vassily
2017-01-01
Recent advancements in omics measurement technologies have led to an ever-increasing amount of available experimental data that necessitate systems-oriented methodologies for efficient and systematic integration of data into consistent large-scale kinetic models. These models can help us to uncover new insights into cellular physiology and also to assist in the rational design of bioreactor or fermentation processes. The Optimization and Risk Analysis of Complex Living Entities (ORACLE) framework for the construction of large-scale kinetic models can be used as guidance for formulating alternative metabolic engineering strategies. We used ORACLE in a metabolic engineering problem: improvement of the xylose uptake rate during mixed glucose-xylose consumption in a recombinant Saccharomyces cerevisiae strain. Using the data from bioreactor fermentations, we characterized network flux and concentration profiles representing possible physiological states of the analyzed strain. We then identified enzymes that could lead to improved flux through xylose transporters (XTR). For some of the identified enzymes, including hexokinase (HXK), we could not deduce if their control over XTR was positive or negative. We thus performed a follow-up experiment, and we found that HXK2 deletion improves the xylose uptake rate. The data from the performed experiments were then used to prune the kinetic models, and the predictions of the pruned population of kinetic models were in agreement with the experimental data collected on the HXK2-deficient S. cerevisiae strain. We present a design-build-test cycle composed of modeling efforts and experiments with a glucose-xylose co-utilizing recombinant S. cerevisiae and its HXK2-deficient mutant that allowed us to uncover interdependencies between upper glycolysis and the xylose uptake pathway. Through this cycle, we also obtained kinetic models with improved prediction capabilities.
The present study demonstrates the potential of integrated "modeling and experiments" systems biology approaches that can be applied for diverse applications ranging from biotechnology to drug discovery.
Development of Large-Scale Spacecraft Fire Safety Experiments
NASA Technical Reports Server (NTRS)
Ruff, Gary A.; Urban, David; Fernandez-Pello, A. Carlos; T'ien, James S.; Torero, Jose L.; Legros, Guillaume; Eigenbrod, Christian; Smirnov, Nickolay; Fujita, Osamu; Cowlard, Adam J.;
2013-01-01
The status is presented of a spacecraft fire safety research project that is under development to reduce the uncertainty and risk in the design of spacecraft fire safety systems by testing at nearly full scale in low-gravity. Future crewed missions are expected to be more complex and longer in duration than previous exploration missions outside of low-earth orbit. This will increase the challenge of ensuring a fire-safe environment for the crew throughout the mission. Based on our fundamental uncertainty of the behavior of fires in low-gravity, the need for realistic scale testing at reduced gravity has been demonstrated. To address this gap in knowledge, a project has been established under the NASA Advanced Exploration Systems Program under the Human Exploration and Operations Mission Directorate with the goal of substantially advancing our understanding of the spacecraft fire safety risk. Associated with the project is an international topical team of fire experts from other space agencies who conduct research that is integrated into the overall experiment design. The experiments are under development to be conducted in an Orbital Sciences Corporation Cygnus vehicle after it has undocked from the ISS. Although the experiment will need to meet rigorous safety requirements to ensure the carrier vehicle does not sustain damage, the absence of a crew removes the need for strict containment of combustion products. The tests will be fully automated with the data downlinked at the conclusion of the test before the Cygnus vehicle reenters the atmosphere. A computer modeling effort will complement the experimental effort. The international topical team is collaborating with the NASA team in the definition of the experiment requirements and performing supporting analysis, experimentation and technology development. The status of the overall experiment and the associated international technology development efforts are summarized.
ERIC Educational Resources Information Center
Maretzki, A.; Shimabukuro, S.
Nutrition curriculum design research was undertaken to address the issue of linkage between school food experiences and home food experiences of elementary school children in Hawaii. One hundred and forty-four parents judged the relative importance of seventeen food-related activites. The sample consisted of Asian, Caucasian, and Polynesian…
Peter A. Bisson; Shannon M. Claeson; Steven M. Wondzell; Alex D. Foster; Ashley Steel
2013-01-01
We present preliminary results from an experiment in which alternative forest buffer treatments were applied to clusters of watersheds in southwest Washington using a Before-After-Control-Impact (BACI) design. The treatments occurred on small (~2- to 9-ha) headwater catchments, and compared continuous fixed-width buffered, discontinuous patch-buffered, and...
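The core BACI estimate is a difference of differences: the change at impacted sites minus the change at control sites over the same period. A minimal sketch with invented numbers; the study's replicated-catchment analysis is of course more elaborate:

```python
from statistics import mean

def baci_effect(control_before, control_after, impact_before, impact_after):
    """Before-After-Control-Impact effect as a difference of
    differences: (impact change) minus (control change). Background
    trends common to both groups cancel out."""
    return (mean(impact_after) - mean(impact_before)) - \
           (mean(control_after) - mean(control_before))

# Hypothetical response values (e.g. a stream metric per catchment):
effect = baci_effect([10.0, 12.0], [11.0, 13.0],   # control sites: +1 trend
                     [10.0, 12.0], [15.0, 17.0])   # impact sites:  +5 change
```

Here the +1 background trend seen at the controls is subtracted from the +5 change at the treated catchments, attributing a net effect of +4 to the treatment.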
NASA Astrophysics Data System (ADS)
Solodov, A. A.; Rosenberg, M. J.; Myatt, J. F.; Shaw, J. G.; Seka, W.; Epstein, R.; Short, R. W.; Follett, R. K.; Regan, S. P.; Froula, D. H.; Radha, P. B.; Michel, P.; Chapman, T.; Hohenberger, M.
2017-10-01
Laser-plasma interaction (LPI) instabilities, such as stimulated Raman scattering (SRS) and two-plasmon decay, can be detrimental for direct-drive inertial confinement fusion because of target preheat by the high-energy electrons they generate. The radiation-hydrodynamic code DRACO was used to design planar-target experiments at the National Ignition Facility that generated plasma and interaction conditions relevant to ignition direct-drive designs (I_L ~ 10^15 W/cm^2, T_e > 3 keV, density gradient scale lengths of L_n ~ 600 μm). Laser-energy conversion efficiency to hot electrons of 0.5% to 2.5% with temperatures of 45 to 60 keV was inferred from the experiment when the laser intensity at the quarter-critical surface increased from 6 to 15 × 10^14 W/cm^2. LPI was dominated by SRS, as indicated by the measured scattered-light spectra. Simulations of SRS using the LPI code LPSE have been performed and compared with predictions of theoretical models. Implications for ignition-scale direct-drive experiments will be discussed. This material is based upon work supported by the Department of Energy National Nuclear Security Administration under Award Number DE-NA0001944.
Kinter, Elizabeth T; Prior, Thomas J; Carswell, Christopher I; Bridges, John F P
2012-01-01
While the application of conjoint analysis and discrete-choice experiments in health are now widely accepted, a healthy debate exists around competing approaches to experimental design. There remains, however, a paucity of experimental evidence comparing competing design approaches and their impact on the application of these methods in patient-centered outcomes research. Our objectives were to directly compare the choice-model parameters and predictions of an orthogonal and a D-efficient experimental design using a randomized trial (i.e., an experiment on experiments) within an application of conjoint analysis studying patient-centered outcomes among outpatients diagnosed with schizophrenia in Germany. Outpatients diagnosed with schizophrenia were surveyed and randomized to receive choice tasks developed using either an orthogonal or a D-efficient experimental design. The choice tasks elicited judgments from the respondents as to which of two patient profiles (varying across seven outcomes and process attributes) was preferable from their own perspective. The results from the two survey designs were analyzed using the multinomial logit model, and the resulting parameter estimates and their robust standard errors were compared across the two arms of the study (i.e., the orthogonal and D-efficient designs). The predictive performances of the two resulting models were also compared by computing their percentage of survey responses classified correctly, and the potential for variation in scale between the two designs of the experiments was tested statistically and explored graphically. The results of the two models were statistically identical. No difference was found using an overall chi-squared test of equality for the seven parameters (p = 0.69) or via uncorrected pairwise comparisons of the parameter estimates (p-values ranged from 0.30 to 0.98). 
The D-efficient design resulted in directionally smaller standard errors for six of the seven parameters, of which only two were statistically significant, and no differences were found in the observed D-efficiencies of their standard errors (p = 0.62). The D-efficient design resulted in poorer predictive performance, but this was not significant (p = 0.73); there was some evidence that the parameters of the D-efficient design were biased marginally towards the null. While no statistical difference in scale was detected between the two designs (p = 0.74), the D-efficient design had a higher relative scale (1.06). This could be observed when the parameters were explored graphically, as the D-efficient parameters were lower. Our results indicate that orthogonal and D-efficient experimental designs have produced results that are statistically equivalent. This said, we have identified several qualitative findings that speak to the potential differences in these results that may have been statistically identified in a larger sample. While more comparative studies focused on the statistical efficiency of competing design strategies are needed, a more pressing research problem is to document the impact the experimental design has on respondent efficiency.
Small Independent Action Force (SIAF), Vegetation Classification Study
1976-03-01
CONTENTS: I. INTRODUCTION 8; II. BACKGROUND AND PURPOSE 10; III. METHOD 16; A. EXPERIMENTAL DESIGN 16; B. SUBJECTS 17; C. APPARATUS 17; D. STIMULUS... reliability of subjects will be obtained. III. METHOD. A. EXPERIMENTAL DESIGN. The experiment involved a continuous stream of stimuli. Phase 1 stimuli... the attribute to be scaled. The subject must designate one of the pair as greater. No equality judgments are permitted. In order to obtain data from
Accelerated testing for studying pavement design and performance (FY 2004) : research summary.
DOT National Transportation Integrated Search
2009-03-01
The thirteenth full-scale Accelerated Pavement Test (APT) experiment at the Civil Infrastructure Laboratory (CISL) of Kansas State University aimed to determine the response and the failure mode of thin concrete overlays.
Development and psychometric testing of the rural pregnancy experience scale (RPES).
Kornelsen, Jude; Stoll, Kathrin; Grzybowski, Stefan
2011-01-01
Rural pregnant women who lack local access to maternity care due to their remote living circumstances may experience stress and anxiety related to pregnancy and parturition. The Rural Pregnancy Experience Scale (RPES) was designed to assess the unique worries and concerns reflective of the stress and anxiety of rural pregnant women related to pregnancy and parturition. The items of the scale were designed based on the results of a qualitative study of the experiences of pregnant rural women, thereby building a priori content validity into the measure. The relevancy content validity index (CVI) for this instrument was 1.0 and the clarity CVI was .91, as rated by maternity care specialists. A field test of the RPES with 187 pregnant rural women from British Columbia indicated that it had two factors: financial worries and worries/concerns about maternity care services, which were consistent with the conceptual base of the tool. Cronbach's alpha for the total RPES was .91; for the financial worries subscale and the worries/concerns about maternity care services subscale, alphas were .89 and .88, respectively. Construct validity was supported by significant correlations between the total scores of the RPES and the Depression Anxiety Stress Scales (DASS; r = .39, p < .01), and subscale scores on the RPES were significantly correlated and converged with the depression, anxiety, and stress subscales of the DASS, supporting convergent validity (correlations ranged between .20, p < .05, and .43, p < .01). Construct validity was also supported by findings that the level of access and availability of maternity care services were significantly associated with RPES scores. It was concluded that the RPES is a reliable and valid measure of worries and concerns reflective of rural pregnant women's stress and anxiety related to pregnancy and parturition.
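The internal-consistency coefficients quoted above (Cronbach's alpha for the total scale and its subscales) can be reproduced from item-level responses with the standard formula α = k/(k−1) · (1 − Σ var(item)/var(total)). A minimal illustrative sketch; the data below are hypothetical, not the RPES field data:

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]                         # number of items
    item_vars = items.var(axis=0, ddof=1)      # per-item variances
    total_var = items.sum(axis=1).var(ddof=1)  # variance of total scores
    return (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)

# Perfectly correlated items give alpha = 1.0
scores = np.array([[1, 1], [2, 2], [3, 3], [4, 4]])
print(round(cronbach_alpha(scores), 3))  # -> 1.0
```

In practice alpha is computed per subscale on its own item columns, as the authors did for the two RPES factors.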
Large scale rigidity-based flexibility analysis of biomolecules
Streinu, Ileana
2016-01-01
KINematics And RIgidity (KINARI) is an on-going project for in silico flexibility analysis of proteins. The new version of the software, Kinari-2, extends the functionality of our free web server KinariWeb, incorporates advanced web technologies, emphasizes the reproducibility of its experiments, and makes substantially improved tools available to the user. It is designed specifically for large-scale experiments, in particular for (a) very large molecules, including bioassemblies with a high degree of symmetry such as viruses and crystals; (b) large collections of related biomolecules, such as those obtained through simulated dilutions, mutations, or conformational changes from various types of dynamics simulations; and (c) the large, idiosyncratic, publicly available repository of biomolecules, the Protein Data Bank, on which it is intended to work as seamlessly as possible. We describe the system design, along with the main data processing, computational, mathematical, and validation challenges underlying this phase of the KINARI project. PMID:26958583
Discovering the Highest Energy Neutrinos Using a Radio Phased Array
NASA Astrophysics Data System (ADS)
Vieregg, Abigail
2018-06-01
The detection of high-energy neutrinos is an important step toward understanding the most energetic cosmic accelerators and would enable tests of fundamental physics at energy scales that cannot easily be achieved on Earth. IceCube has detected astrophysical neutrinos at lower energies, and at higher energies the best limits to date on the flux come from IceCube and the ANITA experiment, a NASA balloon-borne radio telescope designed to detect coherent radio Cherenkov emission from cosmogenic ultra-high-energy neutrinos. I will discuss a new radio phased-array design that will push the achievable sensitivity and lower the energy threshold. I will discuss the initial deployment and performance of an 8-channel system in a ground-based experiment at the South Pole (ARA), and the plans for scaling to O(100) channels and lowering the power consumption for future balloon-borne and ground-based applications.
Sequential Design of Experiments to Maximize Learning from Carbon Capture Pilot Plant Testing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Soepyan, Frits B.; Morgan, Joshua C.; Omell, Benjamin P.
Pilot plant test campaigns can be expensive and time-consuming. Therefore, it is of interest to maximize the amount of learning and the efficiency of the test campaign given the limited number of experiments that can be conducted. This work investigates the use of sequential design of experiments (SDOE) to overcome these challenges by demonstrating its usefulness for a recent solvent-based CO2 capture plant test campaign. Unlike traditional design of experiments methods, SDOE regularly uses information from ongoing experiments to determine the optimum locations in the design space for subsequent runs within the same experiment. However, there are challenges that need to be addressed, including reducing the high computational burden to efficiently update the model, and the need to incorporate the methodology into a computational tool. We address these challenges by applying SDOE in combination with a software tool, the Framework for Optimization, Quantification of Uncertainty and Surrogates (FOQUS) (Miller et al., 2014a, 2016, 2017). The results of applying SDOE on a pilot plant test campaign for CO2 capture suggest that, relative to traditional design of experiments methods, SDOE can more effectively reduce the uncertainty of the model, thus decreasing technical risk. Future work includes integrating SDOE into FOQUS and using SDOE to support additional large-scale pilot plant test campaigns.
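Production SDOE tools such as FOQUS choose follow-up runs by updating a statistical model after each batch and targeting the region of highest predictive uncertainty. As a rough, hypothetical stand-in for that criterion, the sketch below picks the candidate condition farthest from all completed runs (a maximin space-filling heuristic), which captures the "place the next run where you know least" idea without the Bayesian machinery:

```python
import numpy as np

def next_run(observed: np.ndarray, candidates: np.ndarray) -> np.ndarray:
    """Pick the candidate point farthest from every completed run
    (a maximin space-filling proxy for the posterior-uncertainty
    criterion used by real SDOE tools)."""
    # distance of each candidate to its nearest observed run
    d = np.linalg.norm(candidates[:, None, :] - observed[None, :, :], axis=2)
    return candidates[d.min(axis=1).argmax()]

done = np.array([[0.0, 0.0], [1.0, 1.0]])            # completed runs
grid = np.array([[0.0, 1.0], [1.0, 0.0], [0.5, 0.5]])  # remaining candidates
print(next_run(done, grid))  # picks an unexplored corner, not the centre
```

After each new run is executed and its response recorded, the point is appended to `done` and the selection repeats; a real implementation would rank candidates by model variance rather than raw distance.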
NASA Astrophysics Data System (ADS)
Kurucz, Charles N.; Waite, Thomas D.; Otaño, Suzana E.; Cooper, William J.; Nickelsen, Michael G.
2002-11-01
The effectiveness of using high-energy electron beam irradiation for the removal of toxic organic chemicals from water and wastewater has been demonstrated by commercial-scale experiments conducted at the Electron Beam Research Facility (EBRF) located in Miami, Florida and elsewhere. The EBRF treats various waste and water streams up to 450 l min^-1 (120 gal min^-1) with doses up to 8 kilogray (kGy). Many experiments have been conducted by injecting toxic organic compounds into various plant feed streams and measuring the concentrations of compound(s) before and after exposure to the electron beam at various doses. Extensive experimentation has also been performed by dissolving selected chemicals in 22,700 l (6000 gal) tank trucks of potable water to simulate contaminated groundwater, and pumping the resulting solutions through the electron beam. These large-scale experiments, although necessary to demonstrate the commercial viability of the process, require a great deal of time and effort. This paper compares the results of large-scale electron beam irradiations to those obtained from bench-scale irradiations using gamma rays generated by a 60Co source. Dose constants from exponential contaminant removal models are found to depend on the source of radiation and the initial contaminant concentration. Possible reasons for observed differences, such as a dose rate effect, are discussed. Models for estimating electron beam dose constants from bench-scale gamma experiments are presented. Data used to compare the removal of organic compounds using gamma irradiation and electron beam irradiation are taken from the literature and a series of experiments designed to examine the effects of pH, the presence of turbidity, and initial concentration on the removal of various organic compounds (benzene, toluene, phenol, PCE, TCE and chloroform) from simulated groundwater.
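The "dose constants" referred to above come from an exponential removal model, C(D) = C0 · exp(−k · D), where D is the absorbed dose in kGy and k is the dose constant. Given concentrations measured at several doses, k can be estimated by a log-linear least-squares fit; a small sketch with synthetic numbers (not the paper's data):

```python
import numpy as np

def dose_constant(dose_kGy: np.ndarray, conc: np.ndarray) -> float:
    """Estimate the dose constant k in C(D) = C0 * exp(-k * D)
    by least squares on the log-transformed concentrations."""
    slope, _intercept = np.polyfit(dose_kGy, np.log(conc), 1)
    return -slope

# Synthetic data generated with k = 0.5 kGy^-1 (illustration only)
D = np.array([0.0, 2.0, 4.0, 8.0])
C = 100.0 * np.exp(-0.5 * D)
print(round(dose_constant(D, C), 3))  # -> 0.5
```

Comparing k fitted to gamma bench data against k fitted to electron-beam data is what exposes the source- and concentration-dependence the abstract describes.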
NASA Astrophysics Data System (ADS)
Watts, Duncan; CLASS Collaboration
2018-01-01
The Cosmology Large Angular Scale Surveyor (CLASS) will use large-scale measurements of the polarized cosmic microwave background (CMB) to constrain the physics of inflation, reionization, and massive neutrinos. The experiment is designed to characterize the largest scales, which are inaccessible to most ground-based experiments, and to remove Galactic foregrounds from the CMB maps. In this dissertation talk, I present simulations of CLASS data and demonstrate their ability to constrain the simplest single-field models of inflation and to reduce the uncertainty of the optical depth to reionization, τ, to near the cosmic variance limit, significantly improving on current constraints. These constraints will bring a qualitative shift in our understanding of standard ΛCDM cosmology. In particular, CLASS's measurement of τ breaks cosmological parameter degeneracies. Probes of large-scale structure (LSS) test the effect of neutrino free-streaming at small scales, which depends on the mass of the neutrinos. CLASS's τ measurement, when combined with next-generation LSS and BAO measurements, will enable a 4σ detection of neutrino mass, compared with 2σ without CLASS data. I will also briefly discuss the CLASS experiment's measurements of circular polarization of the CMB and the implications of the first such near-all-sky map.
Learning by Doing, Scale Effects, or Neither? Cardiac Surgeons after Residency
Huesch, Marco D
2009-01-01
Objective To examine impacts of operating surgeon scale and cumulative experience on postoperative outcomes for patients treated with coronary artery bypass grafts (CABG) by "new" surgeons. Pooled linear, fixed effects panel, and instrumented regressions were estimated. Data Sources The administrative data included comorbidities, procedures, and outcomes for 19,978 adult CABG patients in Florida in 1998–2006, and public data on 57 cardiac surgeons who completed residencies after 1997. Study Design Analysis was at the patient level. Controls for risk, hospital scale and scope, and operating surgeon characteristics were made. Patient choice model instruments were constructed. Experience was estimated allowing for "forgetting" effects. Principal Findings Panel regressions with surgeon fixed effects showed that neither surgeon scale nor cumulative volume significantly impacted mortality or consistently impacted morbidity. Estimation of "forgetting" suggests that almost all prior experience is depreciated from one quarter to the next. Instruments were strong, but exogeneity of volume was not rejected. Conclusions In postresidency surgeons, no persuasive evidence is found for learning by doing, scale, or selection effects. More research is needed to support the cautious view that, for these "new" cardiac surgeons, patient volume could be redistributed based on realized outcomes without disruption. PMID:19732169
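The "forgetting" specification described above treats experience as a depreciating stock: each quarter's effective experience is the depreciated prior stock plus new case volume, E_t = δ · E_{t−1} + q_t. The abstract's finding that almost all prior experience is depreciated corresponds to an estimated δ near zero. A toy sketch with illustrative numbers:

```python
def experience_stock(quarterly_volumes, delta):
    """Cumulative experience with per-quarter depreciation delta:
    E_t = delta * E_{t-1} + q_t.  delta = 1 keeps all prior volume;
    delta = 0 forgets everything but the current quarter."""
    e = 0.0
    for q in quarterly_volumes:
        e = delta * e + q
    return e

vols = [10, 10, 10, 10]
print(experience_stock(vols, 1.0))  # 40.0 : no forgetting
print(experience_stock(vols, 0.0))  # 10.0 : full depreciation
```

In the study's regressions this stock, rather than a raw cumulative count, enters as the experience regressor, with δ estimated from the data.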
Scaled Jump in Gravity-Reduced Virtual Environments.
Kim, MyoungGon; Cho, Sunglk; Tran, Tanh Quang; Kim, Seong-Pil; Kwon, Ohung; Han, JungHyun
2017-04-01
The reduced gravity experienced in lunar or Martian surfaces can be simulated on the earth using a cable-driven system, where the cable lifts a person to reduce his or her weight. This paper presents a novel cable-driven system designed for the purpose. It is integrated with a head-mounted display and a motion capture system. Focusing on jump motion within the system, this paper proposes to scale the jump and reports the experiments made for quantifying the extent to which a jump can be scaled without the discrepancy between physical and virtual jumps being noticed by the user. With the tolerable range of scaling computed from these experiments, an application named retargeted jump is developed, where a user can jump up onto virtual objects while physically jumping in the real-world flat floor. The core techniques presented in this paper can be extended to develop extreme-sport simulators such as parasailing and skydiving.
Roessl, Ulrich; Humi, Sebastian; Leitgeb, Stefan; Nidetzky, Bernd
2015-09-01
Freezing constitutes an important unit operation of biotechnological protein production. Effects of freeze-and-thaw (F/T) process parameters on stability and other quality attributes of the protein product are usually not well understood. Here a design of experiments (DoE) approach was used to characterize the F/T behavior of L-lactic dehydrogenase (LDH) in a 700-mL pilot-scale freeze container equipped with internal temperature and pH probes. In 24-hour experiments, target temperature between -10 and -38°C most strongly affected LDH stability whereby enzyme activity was retained best at the highest temperature of -10°C. Cooling profile and liquid fill volume also had significant effects on LDH stability and affected the protein aggregation significantly. Parameters of the thawing phase had a comparably small effect on LDH stability. Experiments in which the standard sodium phosphate buffer was exchanged by Tris-HCl and the non-ionic surfactant Tween 80 was added to the protein solution showed that pH shift during freezing and protein surface exposure were the main factors responsible for LDH instability at the lower freeze temperatures. Collectively, evidence is presented that supports the use of DoE-based systematic analysis at pilot scale in the identification of F/T process parameters critical for protein stability and in the development of suitable process control strategies. Copyright © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
ERIC Educational Resources Information Center
Chou, Huey-Wen; Wang, Yu-Fang
1999-01-01
Compares the effects of two training methods on computer attitude and performance in a World Wide Web page design program in a field experiment with high school students in Taiwan. Discusses individual differences, Kolb's Experiential Learning Theory and Learning Style Inventory, Computer Attitude Scale, and results of statistical analyses.…
High subsonic flow tests of a parallel pipe followed by a large area ratio diffuser
NASA Technical Reports Server (NTRS)
Barna, P. S.
1975-01-01
Experiments were performed on a pilot model duct system in order to explore its aerodynamic characteristics. The model was scaled from a design projected for the high speed operation mode of the Aircraft Noise Reduction Laboratory. The test results show that the model performed satisfactorily and therefore the projected design will most likely meet the specifications.
Ju, Jinyong; Li, Wei; Wang, Yuqiao; Fan, Mengbao; Yang, Xuefeng
2016-01-01
Effective feedback control requires all state-variable information of the system. However, in the translational flexible-link manipulator (TFM) system, it is unrealistic to measure the vibration signals and their time derivatives at arbitrary points of the TFM with infinitely many sensors. With the rigid-flexible coupling between the global motion of the rigid base and the elastic vibration of the flexible-link manipulator considered, a two-time-scale virtual sensor, comprising a speed observer and a vibration observer, is designed to estimate the vibration signals and their time derivatives for the TFM; the speed observer and the vibration observer are separately designed for the slow and fast subsystems, which are decomposed from the dynamic model of the TFM by singular perturbation. Additionally, based on linear-quadratic differential games, the observer gains of the two-time-scale virtual sensor are optimized, with the aim of minimizing the estimation error while keeping the observer stable. Finally, numerical calculation and experiment verify the efficiency of the designed two-time-scale virtual sensor. PMID:27801840
DOE Office of Scientific and Technical Information (OSTI.GOV)
Crater, Jason; Galleher, Connor; Lievense, Jeff
NREL is developing an advanced aerobic bubble column model using Aspen Custom Modeler (ACM). The objective of this work is to integrate the new fermentor model with existing techno-economic models in Aspen Plus and Excel to establish a new methodology for guiding process design. To assist this effort, NREL has contracted Genomatica to critique and make recommendations for improving NREL's bioreactor model and large-scale aerobic bioreactor design for biologically producing lipids at commercial scale. Genomatica has highlighted a few areas for improving the functionality and effectiveness of the model. Genomatica recommends using a compartment model approach with an integrated black-box kinetic model of the production microbe. We also suggest including calculations for stirred tank reactors to extend the model's functionality and adaptability for future process designs. Genomatica also suggests making several modifications to NREL's large-scale lipid production process design. The recommended process modifications are based on Genomatica's internal techno-economic assessment experience and are focused primarily on minimizing capital and operating costs. These recommendations include selecting/engineering a thermotolerant yeast strain with lipid excretion; using bubble column fermentors; increasing the size of production fermentors; reducing the number of vessels; employing semi-continuous operation; and recycling cell mass.
Patient satisfaction with a hospital-based neuropsychology service.
Foran, Amie; Millar, Elisa; Dorstyn, Diana
2016-09-01
Objective The aim of the present study was to develop and pilot a measure of patient satisfaction that encompasses themes, activities, settings and interactions specific to the neuropsychological assessment process. Methods A focus group of out-patients (n=15) was surveyed to identify the factors commonly associated with a satisfactory neuropsychological experience. Responses informed a purposely designed 14-item patient satisfaction scale (α=0.88) that was completed by 66 hospital out-patients with mild to moderate cognitive impairment. Results Satisfaction with the neuropsychological assessment process was generally reported, with the testing phase (85%) rated significantly more favourably than the pre-assessment (79%) and feedback (70%) phases. Commentaries provided by 32 respondents identified interpersonal facilitators to a satisfactory neuropsychological assessment experience, but also dissatisfaction with physical aspects of the testing environment in addition to service availability. Conclusions The patient satisfaction scale can be used as a quality assurance tool to evaluate neuropsychological service delivery. Large-scale research is needed to confirm the scale's psychometric properties. Further research may also include a broader perspective on the consumers' experience of neuropsychological services.
Using Generative Representations to Evolve Robots. Chapter 1
NASA Technical Reports Server (NTRS)
Hornby, Gregory S.
2004-01-01
Recent research has demonstrated the ability of evolutionary algorithms to automatically design both the physical structure and software controller of real physical robots. One of the challenges for these automated design systems is to improve their ability to scale to the high complexities found in real-world problems. Here we claim that for automated design systems to scale in complexity they must use a representation which allows for the hierarchical creation and reuse of modules, which we call a generative representation. Not only is the ability to reuse modules necessary for functional scalability, but it is also valuable for improving efficiency in testing and construction. We then describe an evolutionary design system with a generative representation capable of hierarchical modularity and demonstrate it for the design of locomoting robots in simulation. Finally, results from our experiments show that evolution with our generative representation produces better robots than those evolved with a non-generative representation.
NASA Astrophysics Data System (ADS)
Gao, Dongyue; Wang, Yishou; Wu, Zhanjun; Rahim, Gorgin; Bai, Shengbao
2014-05-01
The detection capability of a given structural health monitoring (SHM) system strongly depends on its sensor network placement. In order to minimize the number of sensors while maximizing the detection capability, optimal design of the PZT sensor network placement is necessary for structural health monitoring (SHM) of a full-scale composite horizontal tail. In this study, the sensor network optimization was simplified as a problem of determining the sensor array placement between stiffeners to achieve the desired the coverage rate. First, an analysis of the structural layout and load distribution of a composite horizontal tail was performed. The constraint conditions of the optimal design were presented. Then, the SHM algorithm of the composite horizontal tail under static load was proposed. Based on the given SHM algorithm, a sensor network was designed for the full-scale composite horizontal tail structure. Effective profiles of cross-stiffener paths (CRPs) and uncross-stiffener paths (URPs) were estimated by a Lamb wave propagation experiment in a multi-stiffener composite specimen. Based on the coverage rate and the redundancy requirements, a seven-sensor array-network was chosen as the optimal sensor network for each airfoil. Finally, a preliminary SHM experiment was performed on a typical composite aircraft structure component. The reliability of the SHM result for a composite horizontal tail structure under static load was validated. In the result, the red zone represented the delamination damage. The detection capability of the optimized sensor network was verified by SHM of a full-scale composite horizontal tail; all the diagnosis results were obtained in two minutes. The result showed that all the damage in the monitoring region was covered by the sensor network.
Canadian Hydrogen Intensity Mapping Experiment (CHIME) pathfinder
NASA Astrophysics Data System (ADS)
Bandura, Kevin; Addison, Graeme E.; Amiri, Mandana; Bond, J. Richard; Campbell-Wilson, Duncan; Connor, Liam; Cliche, Jean-François; Davis, Greg; Deng, Meiling; Denman, Nolan; Dobbs, Matt; Fandino, Mateus; Gibbs, Kenneth; Gilbert, Adam; Halpern, Mark; Hanna, David; Hincks, Adam D.; Hinshaw, Gary; Höfer, Carolin; Klages, Peter; Landecker, Tom L.; Masui, Kiyoshi; Mena Parra, Juan; Newburgh, Laura B.; Pen, Ue-li; Peterson, Jeffrey B.; Recnik, Andre; Shaw, J. Richard; Sigurdson, Kris; Sitwell, Mike; Smecher, Graeme; Smegal, Rick; Vanderlinde, Keith; Wiebe, Don
2014-07-01
A pathfinder version of CHIME (the Canadian Hydrogen Intensity Mapping Experiment) is currently being commissioned at the Dominion Radio Astrophysical Observatory (DRAO) in Penticton, BC. The instrument is a hybrid cylindrical interferometer designed to measure the large scale neutral hydrogen power spectrum across the redshift range 0.8 to 2.5. The power spectrum will be used to measure the baryon acoustic oscillation (BAO) scale across this poorly probed redshift range where dark energy becomes a significant contributor to the evolution of the Universe. The instrument revives the cylinder design in radio astronomy with a wide field survey as a primary goal. Modern low-noise amplifiers and digital processing remove the necessity for the analog beam forming that characterized previous designs. The Pathfinder consists of two cylinders 37m long by 20m wide oriented north-south for a total collecting area of 1,500 square meters. The cylinders are stationary with no moving parts, and form a transit instrument with an instantaneous field of view of ~100 degrees by 1-2 degrees. Each CHIME Pathfinder cylinder has a feedline with 64 dual polarization feeds placed every ~30 cm which Nyquist sample the north-south sky over much of the frequency band. The signals from each dual-polarization feed are independently amplified, filtered to 400-800 MHz, and directly sampled at 800 MSps using 8 bits. The correlator is an FX design, where the Fourier transform channelization is performed in FPGAs, which are interfaced to a set of GPUs that compute the correlation matrix. The CHIME Pathfinder is a 1/10th scale prototype version of CHIME and is designed to detect the BAO feature and constrain the distance-redshift relation. The lessons learned from its implementation will be used to inform and improve the final CHIME design.
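The FX correlator architecture described above can be sketched in a few lines: the "F" stage Fourier-transforms each feed's digitized time stream into frequency channels, and the "X" stage forms the time-averaged cross-correlation (visibility) matrix per channel. A toy NumPy illustration with made-up dimensions; the real instrument does the F stage in FPGAs and the X stage in GPUs on 8-bit samples at 800 MSps:

```python
import numpy as np

rng = np.random.default_rng(0)
n_feeds, n_samples, n_chan = 4, 4096, 64

# "F" stage: channelize each feed's time stream, frame by frame, with an FFT
ts = rng.standard_normal((n_feeds, n_samples))
spec = np.fft.rfft(ts.reshape(n_feeds, -1, n_chan), axis=2)  # (feed, frame, chan)

# "X" stage: time-averaged correlation matrix for every frequency channel
vis = np.einsum('ifc,jfc->ijc', spec, spec.conj()) / spec.shape[1]

print(vis.shape)  # (4, 4, 33) -- rfft of 64 samples yields 33 channels
```

The resulting matrix is Hermitian in the feed indices for each channel, which is why hardware correlators compute and store only the upper triangle.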
A dynamic multi-scale Markov model based methodology for remaining life prediction
NASA Astrophysics Data System (ADS)
Yan, Jihong; Guo, Chaozhong; Wang, Xing
2011-05-01
The ability to accurately predict the remaining life of partially degraded components is crucial in prognostics. In this paper, a performance degradation index is designed using multi-feature fusion techniques to represent the deterioration severity of facilities. Based on this indicator, an improved Markov model is proposed for remaining life prediction. The Fuzzy C-Means (FCM) algorithm is employed to perform state division for the Markov model in order to avoid the uncertainty of state division caused by the hard division approach. Considering the influence of both historical and real-time data, a dynamic prediction method is introduced into the Markov model by a weighted coefficient. Multi-scale theory is employed to solve the state division problem of multi-sample prediction. Consequently, a dynamic multi-scale Markov model is constructed. An experiment based on a Bently-RK4 rotor testbed was designed to validate the dynamic multi-scale Markov model; the experimental results illustrate the effectiveness of the methodology.
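Once the degradation index has been discretized into states and a transition matrix estimated from run-to-failure histories, the mean remaining life from each health state follows from standard absorbing-Markov-chain algebra: with Q the transient-to-transient block of the transition matrix P, the expected steps to failure are t = (I − Q)^(-1) · 1. A minimal sketch with an illustrative transition matrix, not the paper's Bently-RK4 data:

```python
import numpy as np

def remaining_life(P: np.ndarray) -> np.ndarray:
    """Expected number of steps to reach the absorbing failure state
    (assumed to be the last row of P) from each transient state,
    via the fundamental-matrix relation t = (I - Q)^{-1} 1."""
    Q = P[:-1, :-1]                      # transitions among transient states
    n = Q.shape[0]
    return np.linalg.solve(np.eye(n) - Q, np.ones(n))

# Three health states degrading toward an absorbing "failed" state
P = np.array([[0.8, 0.2, 0.0, 0.0],
              [0.0, 0.7, 0.3, 0.0],
              [0.0, 0.0, 0.5, 0.5],
              [0.0, 0.0, 0.0, 1.0]])
print(remaining_life(P))  # mean steps-to-failure from each transient state
```

The paper's dynamic, multi-scale variant re-estimates P as new condition data arrive and blends historical and current estimates with a weighted coefficient; the absorbing-chain calculation above is the common final step.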
Green infrastructure retrofits on residential parcels: Ecohydrologic modeling for stormwater design
NASA Astrophysics Data System (ADS)
Miles, B.; Band, L. E.
2014-12-01
To meet water quality goals, stormwater utilities and not-for-profit watershed organizations in the U.S. are working with citizens to design and implement green infrastructure on residential land. Green infrastructure, as an alternative and complement to traditional (grey) stormwater infrastructure, has the potential to contribute to multiple ecosystem benefits, including stormwater volume reduction, carbon sequestration, and urban heat island mitigation, and to provide amenities to residents. However, in small (1-10 km2) medium-density urban watersheds with heterogeneous land cover, it is unclear whether stormwater retrofits on residential parcels contribute significantly to reducing stormwater volume at the watershed scale. In this paper, we seek to improve understanding of how small-scale redistribution of water at the parcel scale as part of green infrastructure implementation affects urban water budgets and stormwater volume across spatial scales. As study sites we use two medium-density headwater watersheds in Baltimore, MD and Durham, NC. We develop ecohydrology modeling experiments to evaluate the effectiveness of redirecting residential rooftop runoff to unaltered pervious surfaces and to engineered rain gardens to reduce stormwater runoff. As baselines for these experiments, we performed field surveys of residential rooftop hydrologic connectivity to adjacent impervious surfaces, and found low rates of connectivity. Through simulations of pervasive adoption of downspout disconnection to unaltered pervious areas or to rain garden stormwater control measures (SCMs) in these catchments, we find that most parcel-scale changes in stormwater fate are attenuated at larger spatial scales and that neither SCM alone is likely to provide significant changes in streamflow at the watershed scale.
NASA Astrophysics Data System (ADS)
Goodman, H.
2017-12-01
This investigation seeks to develop sealant technology that can restore containment to completed wells suffering CO2 gas leakage that is currently untreatable with conventional technologies. Experimentation is performed at the Mont Terri Underground Research Laboratory (MT-URL), located in NW Switzerland. The laboratory affords investigators an intermediate-scale test site that bridges the gap between the laboratory bench and full field-scale conditions. The project focus is the development of CO2 leakage remediation capability using sealant technology. The experimental concept includes the design and installation of a field-scale completion package designed to mimic the heating-cooling conditions of well systems that may result in the development of micro-annuli detachments between the casing-cement-formation boundaries (Figure 1). Of particular interest is to test novel sealants that can be injected into relatively narrow micro-annuli flow paths of less than 120 microns aperture. Per a special report on CO2 storage submitted to the IPCC [1], active injection wells, along with inactive wells that have been abandoned, are identified as one of the most probable sources of leakage pathways for CO2 escape to the surface. Origins of pressure leakage common to injection well and completion architectures often arise from tensile cracking under temperature cycles, micro-annulus formation by casing contraction (differential casing-to-cement-sheath movement), and cement sheath channel development. This discussion summarizes the experiment capability and sealant testing results. The experiment concludes with overcoring of the entire mock-completion test site to assess sealant performance in 2018. [1] IPCC Special Report on Carbon Dioxide Capture and Storage (September 2005), section 5.7.2, Processes and pathways for release of CO2 from geological storage sites, page 244.
VIBRATING PERVAPORATION MODULES: EFFECT OF MODULE DESIGN ON PERFORMANCE
A third commercial-scale vibrating pervaporation membrane module was fabricated and evaluated for the separation of volatile organic compounds (VOCs) from aqueous solutions. Experiments with surrogate solutions of four hydrophobic VOCs (1,1,1-trichloroethane (TCA), trichloroethy...
Materials and Process Activities for NASA's Composite Crew Module
NASA Technical Reports Server (NTRS)
Polis, Daniel L.
2012-01-01
In January 2007, the NASA Administrator and Associate Administrator for the Exploration Systems Mission Directorate chartered the NASA Engineering and Safety Center (NESC) to design, build, and test a full-scale Composite Crew Module (CCM). The overall goal of the CCM project was to develop a team from the NASA family with hands-on experience in composite design, manufacturing, and testing in anticipation of future space exploration systems being made of composite materials. The CCM project was planned to run concurrently with the Orion project's baseline metallic design within the Constellation Program so that features could be compared and discussed without inducing risk to the overall Program. The materials and process activities were prioritized based on a rapid prototype approach. This approach focused developmental activities on design details with greater risk and uncertainty, such as out-of-autoclave joining, over some of the more traditional lamina and laminate building-block levels. While process development and associated building-block testing were performed, several anomalies were still observed at the full-scale level due to interactions between process robustness and manufacturing scale-up. This paper describes the process anomalies that were encountered during the CCM development and the subsequent root cause investigations that led to the final design solutions. These investigations highlight the importance of full-scale developmental work early in the schedule of a complex composite design/build project.
Pretest characterization of WIPP experimental waste
DOE Office of Scientific and Technical Information (OSTI.GOV)
Johnson, J.; Davis, H.; Drez, P.E.
The Waste Isolation Pilot Plant (WIPP) near Carlsbad, New Mexico, is an underground repository designed for the storage and disposal of transuranic (TRU) wastes from US Department of Energy (DOE) facilities across the country. The Performance Assessment (PA) studies for WIPP address compliance of the repository with applicable regulations, and include full-scale experiments to be performed at the WIPP site. These experiments are the bin-scale and alcove tests to be conducted by Sandia National Laboratories (SNL). Prior to conducting these experiments, the waste to be used in these tests needs to be characterized to provide data on the initial conditions for these experiments. This characterization is referred to as the Pretest Characterization of WIPP Experimental Waste, and is also expected to provide input to other programmatic efforts related to waste characterization. The purpose of this paper is to describe the pretest waste characterization activities currently in progress for the WIPP bin-scale waste, and to discuss the program plan and specific analytical protocols being developed for this characterization. The relationship between different programs and documents related to waste characterization efforts is also highlighted in this paper.
Large Scale Flutter Data for Design of Rotating Blades Using Navier-Stokes Equations
NASA Technical Reports Server (NTRS)
Guruswamy, Guru P.
2012-01-01
A procedure to compute flutter boundaries of rotating blades is presented, based on the Navier-Stokes equations and a frequency-domain method compatible with industry practice. The procedure is first validated against a flapping-wing experiment for unsteady loads and a fixed-wing experiment for the flutter boundary. Large-scale flutter computation is then demonstrated for a rotating blade using a single job-submission script, producing a flutter boundary in 24 hours of wall-clock time on 100 cores. The computation scales linearly with the number of cores: a test with 1000 cores produced data for 10 flutter boundaries in 25 hours. Further wall-clock speed-up is possible by performing parallel computations within each case.
Thatcher, T L; Wilson, D J; Wood, E E; Craig, M J; Sextro, R G
2004-08-01
Scale modeling is a useful tool for analyzing complex indoor spaces. Scale model experiments can reduce experimental costs, improve control of flow and temperature conditions, and provide a practical method for pretesting full-scale system modifications. However, changes in physical scale and working fluid (air or water) can complicate interpretation of the equivalent effects in the full-scale structure. This paper presents a detailed scaling analysis of a water tank experiment designed to model a large indoor space, and experimental results obtained with this model to assess the influence of furniture and people on the pollutant concentration field at breathing height. Theoretical calculations are derived for predicting the effects of losses of molecular diffusion, small-scale eddies, turbulent kinetic energy, and turbulent mass diffusivity in a scale model, even without Reynolds number matching. Pollutant dispersion experiments were performed in a water-filled 30:1 scale model of a large room, using uranine dye injected continuously from a small point source. Pollutant concentrations were measured in a plane, using laser-induced fluorescence techniques, for three interior configurations: unobstructed, table-like obstructions, and table-like plus figure-like obstructions. Concentrations within the measurement plane varied by more than an order of magnitude, even after the concentration field was fully developed. Objects in the model interior had a significant effect on both the concentration field and the fluctuation intensity in the measurement plane. PRACTICAL IMPLICATION: This scale model study demonstrates both the utility of scale models for investigating dispersion in indoor environments and the significant impact of turbulence created by furnishings and people on pollutant transport from floor-level sources. In a room with no furniture or occupants, the average concentration can vary by about a factor of 3 across the room. Adding furniture and occupants can increase this spatial variation by another factor of 3.
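For a sense of the scaling arithmetic such an analysis involves, the short sketch below computes the velocity and time ratios that strict Reynolds-number matching would demand of a 30:1 water model of an air-filled room; the fluid properties are nominal values assumed here, and the paper's analysis goes further by treating the non-matched case.

```python
# Strict Reynolds-number matching between a full-scale room (air) and a
# 30:1 water scale model: Re = u * L / nu must be equal in both.
nu_air = 1.5e-5     # kinematic viscosity of air, m^2/s (nominal)
nu_water = 1.0e-6   # kinematic viscosity of water, m^2/s (nominal)
scale = 30.0        # geometric ratio L_full / L_model

# u_m * L_m / nu_w = u_f * L_f / nu_a  =>  u_m / u_f = (nu_w / nu_a) * scale
velocity_ratio = (nu_water / nu_air) * scale   # model flows about 2x faster
# advective time ratio t_m / t_f = (L_m / u_m) / (L_f / u_f)
time_ratio = (1.0 / scale) / velocity_ratio    # events run about 60x faster
```

The factor-of-60 time compression is one reason water models are convenient for dispersion studies: a fully developed concentration field is reached in minutes rather than hours.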
Microgravity Particle Dynamics
NASA Technical Reports Server (NTRS)
Clark, Ivan O.; Johnson, Edward J.
1996-01-01
This research seeks to identify the experiment design parameters for future flight experiments to better resolve the effects of thermal and velocity gradients on gas-solid flows. By exploiting the reduced body forces and minimized thermal convection currents of reduced-gravity experiments, features of gas-solid flow normally masked by gravitationally induced effects can be studied using flow regimes unattainable under normal gravity. This paper assesses the physical scales of velocity, length, time, thermal gradient magnitude, and velocity gradient magnitude likely to be involved in laminar gas-solid multiphase flight experiments for 1-100 μm particles.
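One standard way to bound the particle time scales in such flows is the Stokes response time, τp = ρp d²/(18 μ). The sketch below evaluates it across the stated 1-100 μm range; the particle density and gas viscosity are illustrative assumed values, not parameters from the paper.

```python
# Stokes response time tau_p = rho_p * d^2 / (18 * mu) for small particles
# relaxing to the local gas velocity.
rho_p = 2500.0    # particle density, kg/m^3 (assumed, mineral-like particle)
mu = 1.8e-5       # dynamic viscosity of air, Pa*s (approximate)

def response_time(d_m):
    """Stokes relaxation time for a particle of diameter d_m (meters)."""
    return rho_p * d_m**2 / (18.0 * mu)

taus = {d_um: response_time(d_um * 1e-6) for d_um in (1, 10, 100)}
# tau scales with d^2, so the 1-100 um range spans four decades in time
```

Comparing these times with flow time scales (a Stokes number) indicates which particle sizes track the gas and which decouple, which is exactly the kind of scale assessment the abstract describes.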
Methods of Scientific Research: Teaching Scientific Creativity at Scale
NASA Astrophysics Data System (ADS)
Robbins, Dennis; Ford, K. E. Saavik
2016-01-01
We present a scaling-up plan for AstroComNYC's Methods of Scientific Research (MSR), a course designed to improve undergraduate students' understanding of science practices. The course format and goals, notably the open-ended, hands-on, investigative nature of the curriculum, are reviewed. We discuss how the course's interactive pedagogical techniques empower students to learn creativity within the context of experimental design and control-of-variables thinking. To date the course has been offered to a limited number of students in specific programs. The goal of broadly implementing MSR is to reach more students, early in their education, with the specific purpose of supporting and improving retention of students pursuing STEM careers. However, we also discuss challenges in preserving the effectiveness of the teaching and learning experience at scale.
NASA Astrophysics Data System (ADS)
Aziz, Mohammad Abdul; Al-khulaidi, Rami Ali; Rashid, MM; Islam, M. R.; Rashid, MAN
2017-03-01
In this research, the development and performance testing of a fixed-bed batch-type pyrolysis reactor for pilot-scale pyrolysis oil production was successfully completed. The characteristics of the pyrolysis oil were compared with other experimental results. A solid horizontal condenser, a burner for furnace heating, and a reactor shield were designed. The pilot-scale pyrolytic oil production encountered numerous problems during the plant's operation. This fixed-bed batch-type pyrolysis method demonstrates the energy-saving concept of recovering energy from solid waste tires, contributing to energy stability. From this experiment, product yields (wt.%) were 49% liquid (pyrolytic) oil, 38.3% char, and 12.7% pyrolytic gas, with an operation running time of 185 minutes.
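The reported yields close the mass balance, which can be checked directly; the 50 kg feed charge below is a hypothetical figure used only to convert the weight percentages into masses.

```python
# Product yields (wt.%) reported for the fixed-bed batch pyrolysis run.
yields_wt_pct = {"pyrolytic oil": 49.0, "char": 38.3, "pyrolytic gas": 12.7}
assert abs(sum(yields_wt_pct.values()) - 100.0) < 1e-9  # balance closes

feed_kg = 50.0  # hypothetical batch charge of shredded tire (not from paper)
masses = {k: feed_kg * v / 100.0 for k, v in yields_wt_pct.items()}
# e.g. 24.5 kg oil, 19.15 kg char, 6.35 kg gas from a 50 kg charge
```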
ERIC Educational Resources Information Center
Heard, Courtney Christian Charisse
2013-01-01
The purpose of this study was to assess the sex-role egalitarian attitudes and gender role socialization experiences of African American men and women. A sequential mixed-methods design was employed to research this phenomenon. The Sex-Role Egalitarianism Scale-Short Form BB (SRES-BB) was utilized to assess sex-role egalitarian attitudes (King…
Do We Need to Design Course-Based Undergraduate Research Experiences for Authenticity?
Rowland, Susan; Pedwell, Rhianna; Lawrie, Gwen; Lovie-Toon, Joseph; Hung, Yu
2016-01-01
The recent push for more authentic teaching and learning in science, technology, engineering, and mathematics indicates a shared agreement that undergraduates require greater exposure to professional practices. There is considerable variation, however, in how “authentic” science education is defined. In this paper we present our definition of authenticity as it applies to an “authentic” large-scale undergraduate research experience (ALURE); we also look to the literature and the student voice for alternate perceptions around this concept. A metareview of science education literature confirmed the inconsistency in definitions and application of the notion of authentic science education. An exploration of how authenticity was explained in 604 reflections from ALURE and traditional laboratory students revealed contrasting and surprising notions and experiences of authenticity. We consider the student experience in terms of alignment with 1) the intent of our designed curriculum and 2) the literature definitions of authentic science education. These findings contribute to the conversation surrounding authenticity in science education. They suggest two things: 1) educational experiences can have significant authenticity for the participants, even when there is no purposeful design for authentic practice, and 2) the continuing discussion of and design for authenticity in UREs may be redundant. PMID:27909029
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stepinski, Dominique C.; Vandegrift, G. F.
2015-09-30
Argonne is assisting SHINE Medical Technologies (SHINE) in their efforts to develop SHINE, an accelerator-driven process that will utilize a uranyl-sulfate solution for the production of fission product Mo-99. An integral part of the process is the development of a column for the separation and recovery of Mo-99, followed by a concentration column to reduce the product volume from 15-25 L to <1 L. Argonne has collected data from batch studies and breakthrough column experiments to utilize the VERSE (Versatile Reaction Separation) simulation program (Purdue University) to design plant-scale product recovery and concentration processes.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pourrezaei, K.
1982-01-01
A neutral beam probe capable of measuring plasma space potential in a fully 3-dimensional magnetic field geometry has been developed. This neutral beam was successfully used to measure an arc target plasma contained within the ALEX baseball magnetic coil. A computer simulation of the experiment was performed to refine the experimental design and to develop a numerical model for scaling the ALEX neutral beam probe to other cases of fully 3-dimensional magnetic fields. Based on this scaling, a 30 to 50 keV neutral cesium beam probe capable of measuring space potential in the thermal barrier region of TMX Upgrade was designed.
Sajadi, Seyede Fateme; Arshadi, Nasrin; Zargar, Yadolla; Mehrabizade Honarmand, Mahnaz; Hajjari, Zahra
2015-06-01
Numerous studies have suggested that early maladaptive schemas and emotional dysregulation form the defining core of borderline personality disorder. Many studies have also found a strong association between the diagnosis of borderline personality and the occurrence of suicide ideation and dissociative symptoms. The present study was designed to investigate the relationship between borderline personality features and schema, emotion regulation, dissociative experiences, and suicidal ideation among high school students in Shiraz City, Iran. In this descriptive correlational study, 300 students (150 boys and 150 girls) were selected from the high schools in Shiraz, Iran, using multi-stage random sampling. Data were collected using instruments including the Borderline Personality Features Scale for Children, the Young Schema Questionnaire-Short Form, the Difficulties in Emotion Regulation Scale (DERS), the Dissociative Experiences Scale, and the Beck Suicide Ideation Scale. Data were analyzed using the Pearson correlation coefficient and multivariate regression analysis. The results showed a significant positive correlation of schema, emotion regulation difficulties, dissociative experiences, and suicide ideation with borderline personality features. Moreover, the results of the multivariate regression analysis suggested that among the studied variables, schema was the most effective predictor of borderline features (P < 0.001). The findings of this study accord with those of previous studies, and show a meaningful association between schema, emotion regulation, dissociative experiences, and suicide ideation and borderline personality features.
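The analysis pipeline (Pearson correlations followed by multivariate regression) can be sketched with NumPy as below; the scores are synthetic stand-ins with invented effect sizes, since the study's data are not available.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 300  # matches the study's sample size
# Synthetic stand-ins for the measured scores (NOT the real data):
schema = rng.normal(size=n)
emotion_dysregulation = 0.5 * schema + rng.normal(scale=0.9, size=n)
borderline = (0.6 * schema + 0.3 * emotion_dysregulation
              + rng.normal(scale=0.7, size=n))

r = np.corrcoef(schema, borderline)[0, 1]  # Pearson correlation
X = np.column_stack([np.ones(n), schema, emotion_dysregulation])
beta, *_ = np.linalg.lstsq(X, borderline, rcond=None)  # multivariate OLS
# With these invented effects, the schema coefficient beta[1] dominates,
# mirroring the study's finding that schema was the strongest predictor.
```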
Progress of LMJ-relevant implosions experiments on OMEGA
NASA Astrophysics Data System (ADS)
Casner, A.; Philippe, F.; Tassin, V.; Seytor, P.; Monteil, M.-C.; Gauthier, P.; Park, H. S.; Robey, H.; Ross, J.; Amendt, P.; Girard, F.; Villette, B.; Reverdin, C.; Loiseau, P.; Caillaud, T.; Landoas, O.; Li, C. K.; Petrasso, R.; Seguin, F.; Rosenberg, M.; Renaudin, P.
2013-11-01
In preparation for the first ignition attempts on the Laser Mégajoule (LMJ), an experimental program is being pursued on OMEGA to investigate LMJ-relevant hohlraums. First, radiation temperature levels close to 300 eV were recently achieved in reduced-scale hohlraums with modest backscatter losses. Regarding the baseline target design for fusion experiments on LMJ, an extensive experimental database has also been collected for scaled implosion experiments in both empty and gas-filled rugby-shaped hohlraums. We acquired a full picture of hohlraum energetics and implosion dynamics. Not only did the rugby hohlraums show significantly higher x-ray drive energy than the cylindrical hohlraums, but symmetry control by power balance was demonstrated, as well as high-performance D2 implosions enabling the use of a complete suite of neutron diagnostics. Charged-particle diagnostics provide complementary insights into the physics of these x-ray driven implosions. An overview of these results demonstrates our ability to control the key parameters driving the implosion, lending more confidence in extrapolations to ignition-scale targets.
ERIC Educational Resources Information Center
King, Gary; Gakidou, Emmanuela; Ravishankar, Nirmala; Moore, Ryan T.; Lakin, Jason; Vargas, Manett; Tellez-Rojo, Martha Maria; Avila, Juan Eugenio Hernandez; Avila, Mauricio Hernandez; Llamas, Hector Hernandez
2007-01-01
We develop an approach to conducting large-scale randomized public policy experiments intended to be more robust to the political interventions that have ruined some or all parts of many similar previous efforts. Our proposed design is insulated from selection bias in some circumstances even if we lose observations; our inferences can still be…
NASA Technical Reports Server (NTRS)
Schlesinger, R. E.
1984-01-01
The present investigation is concerned with results from an initial set of comparative experiments in a project utilizing a three-dimensional convective storm model. The modeling results presented relate to four comparative experiments, designated Cases A through D. One of two scientific questions considered involves the dynamical processes, either near the cloud top or well within the cloud interior, that act to organize cloud thermal patterns such as those revealed by IR satellite imagery for some storms having strong internal cloud-scale rotation. The second question concerns differences in cloud-top height and temperature field characteristics between thunderstorms with and without significant internal cloud-scale rotation. The four experiments A-D are compared with regard to both interior and cloud-top configurations in the context of the second question. A particular strong-shear experiment, Case B, is analyzed to address question one.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rae, Philip J; Bauer, Clare L; Stennett, C
A small scale cook-off experiment has been designed to provide a violence metric for both booster and IHE materials, singly and in combination. The experiment has a simple, axisymmetric geometry provided by a 10 mm internal diameter cylindrical steel confinement up to 80 mm in length. Heating is applied from one end of the sample length, creating a pseudo 1-D heating profile and a thermal gradient across the sample(s). At the opposite end of the confinement to the heating block, a machined groove provides a point of rupture that generates a cylindrical fragment. The displacement of the external face of the fragment is detected by Heterodyne Velocimetry. Proof-of-concept experiments are reported focusing on HMX and TATB formulations, and are described in relation to confinement, ullage, and heating profile. The development of a violence metric based upon fragment velocity records is discussed.
DESIGN OF A MTBE REMEDIATION TECHNOLOGY EVALUATION
This study examines the intrinsic variability of dissolved MTBE concentrations in ground water during the course of a pilot-scale bioremedial technology trial in Port Hueneme, California. A pre-trial natural gradient tracer experiment using bromide was conducted in an anaerobic t...
Experiment to Form and Characterize a Section of a Spherically Imploding Plasma Liner
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hsu, S. C.; Langendorf, S. J.; Yates, K. C.
Here, we describe an experiment to form and characterize a section of a spherically imploding plasma liner by merging six supersonic plasma jets that are launched by newly designed contoured-gap coaxial plasma guns. This experiment is a prelude to forming a fully spherical imploding plasma liner using many dozens of plasma guns, as a standoff driver for plasma-jet-driven magneto-inertial fusion. The objectives of the six-jet experiments are to assess the evolution and scalings of liner Mach number and uniformity, which are important metrics for spherically imploding plasma liners to compress magnetized target plasmas to fusion conditions. Lastly, this article describes the design of the coaxial plasma guns, experimental characterization of the plasma jets, the six-jet experimental setup and diagnostics, initial diagnostic data from three- and six-jet experiments, and the high-level objectives of associated numerical modeling.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Paguio, R. R.; Smith, G. E.; Taylor, J. L.
Z-Beamlet (ZBL) experiments conducted at the PECOS test facility at Sandia National Laboratories (SNL) investigated the nonlinear processes in laser-plasma interaction (laser-plasma instabilities, LPI) that complicate the deposition of laser energy through the enhanced absorption, backscatter, filamentation, and beam-spray that can occur in large-scale laser-heated gas cell targets. These targets and experiments were designed to provide better insight into the physics of the laser preheat stage of the Magnetized Liner Inertial Fusion (MagLIF) scheme being tested on the SNL Z-machine. The experiments aim to understand the tradeoffs between laser spot size, laser pulse shape, laser entrance hole (LEH) window thickness, and fuel density for laser preheat. Gas cell target design evolution and fabrication adaptations to accommodate the evolving experiment and scientific requirements are also described in this paper.
Design of the Madison Dynamo Experiment
NASA Astrophysics Data System (ADS)
Kendrick, R. D.; Bayliss, R. A.; Forest, C. B.; Nornberg, M. D.; O'Connell, R.; Spence, E. J.
2003-10-01
A spherical dynamo experiment has been constructed at the University of Wisconsin's liquid sodium facility. The goals of the experiment are to observe and understand magnetic instabilities driven by flow shear in MHD systems, investigate MHD turbulence for magnetic Reynolds numbers of ~100, and understand the role of fluid turbulence in current generation. Magnetic field generation is possible for only specific flow geometries. We have studied and achieved simple roll flow geometries in a full-scale water experiment. Results from this experiment have guided the design of the sodium experiment. The experiment consists of a 1 m diameter spherical stainless steel vessel filled with liquid sodium at 110 °C. Two 100 hp motors with impellers drive flows in the liquid sodium with flow velocities of ~15 m/s. A grid of Hall probes on the surface of the sodium vessel measures the generated external magnetic field. Hall probe feed-through arrays measure the internal field. Preliminary investigations include measurements of the turbulent electromotive force and excitation of magnetic eigenmodes.
Cummins, Steven; Petticrew, Mark; Higgins, Cassie; Findlay, Anne; Sparks, Leigh
2005-12-01
To assess the effect on fruit and vegetable consumption, self-reported health, and psychological health of a "natural experiment": the introduction of large-scale food retailing in a deprived Scottish community. Prospective quasi-experimental design comparing baseline and follow-up data in an "intervention" community with a matched "comparison" community in Glasgow, UK. 412 men and women aged 16 or over for whom follow-up data on fruit and vegetable consumption and GHQ-12 were available. Fruit and vegetable consumption in portions per day, poor self-reported health, and poor psychological health (GHQ-12). Adjusting for age, sex, educational attainment, and employment status, there was no population impact on daily fruit and vegetable consumption, self-reported health, or psychological health. There was some evidence of a net reduction in the prevalence of poor psychological health for residents who directly engaged with the intervention. Government policy has advocated using large-scale food retailing as a social intervention to improve diet and health in poor communities. In contrast with a previous uncontrolled study, this study did not find evidence for a net intervention effect on fruit and vegetable consumption, although there was evidence of an improvement in psychological health for those who directly engaged with the intervention. Although definitive conclusions about the effect of large-scale retailing on diet and health in deprived communities cannot be drawn from non-randomised controlled study designs, evaluations of the impacts of natural experiments may offer the best opportunity to generate evidence about the health impacts of retail interventions in poor communities.
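The intervention-versus-comparison, baseline-versus-follow-up contrast described here is essentially a difference-in-differences estimate; below is a sketch under an assumed null intervention effect, with entirely synthetic data (group assignment, effect sizes, and noise are all invented).

```python
import numpy as np

rng = np.random.default_rng(0)
n = 412  # follow-up sample size reported in the study
group = rng.integers(0, 2, size=n)    # 1 = intervention community
period = rng.integers(0, 2, size=n)   # 1 = follow-up survey wave
true_effect = 0.0                     # simulate "no population impact"
portions = (2.5 + 0.3 * group + 0.2 * period
            + true_effect * group * period
            + rng.normal(scale=0.5, size=n))  # fruit/veg portions per day

# OLS with an interaction term; its coefficient is the DiD estimate.
X = np.column_stack([np.ones(n), group, period, group * period])
beta, *_ = np.linalg.lstsq(X, portions, rcond=None)
did = beta[3]  # should hover near zero under the simulated null effect
```

In the real study covariates (age, sex, education, employment) enter the model as additional columns; the interaction coefficient plays the same role.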
Flight Experiment Verification of Shuttle Boundary Layer Transition Prediction Tool
NASA Technical Reports Server (NTRS)
Berry, Scott A.; Berger, Karen T.; Horvath, Thomas J.; Wood, William A.
2016-01-01
Boundary layer transition at hypersonic conditions is critical to the design of future high-speed aircraft and spacecraft. Accurate methods to predict transition would directly impact the aerothermodynamic environments used to size a hypersonic vehicle's thermal protection system. A transition prediction tool, based on wind tunnel derived discrete roughness correlations, was developed and implemented for the Space Shuttle return-to-flight program. This tool was also used to design a boundary layer transition flight experiment in order to assess correlation uncertainties, particularly with regard to high Mach-number transition and tunnel-to-flight scaling. A review is provided of the results obtained from the flight experiment in order to evaluate the transition prediction tool implemented for the Shuttle program.
2013-11-11
View of Flight Engineer (FE) Mike Hopkins initiating a CFE-2 (Capillary Flow Experiment - 2) Interior Corner Flow - 5 (ICF-5) test run. Liquids behave differently in space than they do on Earth, so containers that process, hold, or transport them must be designed carefully to work in microgravity. The Capillary Flow Experiment-2 furthers research on wetting, a liquid's ability to spread across a surface, and its impact over large length scales and unusual container shapes in microgravity environments. This work will improve our ability to quickly and accurately predict how related processes occur, and allow us to design better systems to process liquids aboard spacecraft (e.g., liquid fuel tanks, thermal fluids, and water processing for life support). Image was released by the astronaut on Twitter.
Guler, N; Volegov, P; Danly, C R; Grim, G P; Merrill, F E; Wilde, C H
2012-10-01
Inertial confinement fusion experiments at the National Ignition Facility are designed to understand the basic principles of creating self-sustaining fusion reactions by laser driven compression of deuterium-tritium (DT) filled cryogenic plastic capsules. The neutron imaging diagnostic provides information on the distribution of the central fusion reaction region and the surrounding DT fuel by observing neutron images in two different energy bands for primary (13-17 MeV) and down-scattered (6-12 MeV) neutrons. From this, the final shape and size of the compressed capsule can be estimated and the symmetry of the compression can be inferred. These experiments provide small sources with high yield neutron flux. An aperture design that includes an array of pinholes and penumbral apertures has provided the opportunity to image the same source with two different techniques. This allows for an evaluation of these different aperture designs and reconstruction algorithms.
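The two-band selection described (13-17 MeV primary, 6-12 MeV down-scattered) can be illustrated on an invented neutron energy sample; the spectral shapes and counts below are assumptions, and the down-scatter ratio is shown only as the kind of quantity such energy banding enables.

```python
import numpy as np

rng = np.random.default_rng(2)
# Invented spectrum: a DT primary peak near 14.1 MeV plus a flat
# down-scattered tail (real spectra are more structured than this).
primary = rng.normal(14.1, 0.3, size=9000)
tail = rng.uniform(6.0, 12.0, size=500)
energies = np.concatenate([primary, tail])

# Band selection as described in the abstract.
n_primary = np.count_nonzero((energies >= 13.0) & (energies <= 17.0))
n_down = np.count_nonzero((energies >= 6.0) & (energies <= 12.0))
dsr = n_down / n_primary  # down-scatter ratio, sensitive to fuel areal density
```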
Sensorless position estimator applied to nonlinear IPMC model
NASA Astrophysics Data System (ADS)
Bernat, Jakub; Kolota, Jakub
2016-11-01
This paper addresses the issue of estimating position for an ionic polymer metal composite (IPMC), a type of electroactive polymer (EAP). The key step is the construction of a sensorless mode that relies on current feedback alone. This work takes into account nonlinearities caused by electrochemical effects in the material. Using a recent observer design technique, the authors obtained both a Lyapunov-function-based estimation law and a sliding-mode observer. To accomplish the observer design, the IPMC model was identified through a series of experiments comprising time-domain measurements. The identification process was completed by means of geometric scaling of three test samples. In the proposed design, the estimated position accurately tracks the polymer position, as illustrated by the experiments.
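For flavor, here is a classical Luenberger observer on a hypothetical linear two-state surrogate, not the paper's nonlinear IPMC model or its Lyapunov/sliding-mode designs: the gain L is chosen so the error dynamics A - LC are stable, and the unmeasured "position" state is reconstructed from the measured output alone.

```python
import numpy as np

# Hypothetical 2-state linear surrogate (NOT the paper's IPMC model):
# x[0] ~ electrical state (measured, current-like), x[1] ~ position.
A = np.array([[-1.0, 0.3],
              [0.5, -0.2]])
B = np.array([1.0, 0.0])
C = np.array([1.0, 0.0])          # only the electrical state is measured
L = np.array([2.8, 11.3])         # places both eigenvalues of A - L C at -2

dt, steps = 1e-3, 5000
x = np.array([0.0, 0.0])          # true state
xh = np.array([1.0, -1.0])        # observer starts from a wrong guess
for k in range(steps):
    u = np.sin(2 * np.pi * 0.5 * k * dt)                # excitation input
    y = C @ x                                           # measured output
    x = x + dt * (A @ x + B * u)                        # plant (Euler step)
    xh = xh + dt * (A @ xh + B * u + L * (y - C @ xh))  # observer correction
position_error = abs(xh[1] - x[1])  # decays toward zero as the observer locks
```

The same structure underlies sensorless estimation generally: a model run in parallel with the plant, continuously corrected by the mismatch between measured and predicted output.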
NASA Astrophysics Data System (ADS)
Kinsman, L.; Gerhard, J.; Torero, J.; Scholes, G.; Murray, C.
2013-12-01
Self-sustaining Treatment for Active Remediation (STAR) is a relatively new remediation approach for soil contaminated with organic industrial liquids. This technology uses smouldering combustion, a controlled, self-sustaining burning reaction, to destroy nonaqueous phase liquids (NAPLs) and thereby render soil clean. While STAR has been proven at the bench scale, success at industrial scales requires the process to be scaled up significantly. The objective of this study was to conduct an experimental investigation into how liquid smouldering combustion phenomena scale. A suite of detailed forward smouldering experiments was conducted in short (16 cm dia. x 22 cm high), intermediate (16 cm dia. x 127 cm high), and large (97 cm dia. x 300 cm high; a prototype ex-situ reactor) columns; this represents scaling of up to 530 times based on the volume treated. A range of fuels was investigated, with the majority of experiments conducted using crude oil sludge as well as canola oil as a non-toxic surrogate for hazardous contaminants. To provide directly comparable data sets and to isolate changes in the smouldering reaction that occurred solely due to scaling effects, sand grain size, contaminant type, contaminant concentration, and air injection rates were controlled between the experimental scales. Several processes could not be controlled and were identified to be susceptible to changes in scale, including mobility of the contaminant, heat losses, and buoyant flow effects. For each experiment, the propagation of the smouldering front was recorded using thermocouples and analyzed by way of temperature-time and temperature-distance plots. In combination with the measurement of continuous mass loss and gaseous emissions, these results were used to evaluate the fundamental differences in the way the reaction front propagates through the mixture of sand and fuel across the various scales.
Key governing parameters were compared between the small, intermediate, and large scale experiments, including: peak temperatures, velocities and thicknesses of the smouldering front, rates of mass destruction of the contaminant, and rates of gaseous emissions during combustion. Additionally, upward and downward smouldering experiments were compared at the column scale to assess the significance of buoyant flow effects. An understanding of these scaling relationships will provide important information to aid in the design of field-scale applications of STAR.
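As a quick sanity check on the quoted scale-up factor, the nominal column dimensions given in the abstract can be compared directly. This is only a sketch: the quoted "up to 530 times" presumably reflects the as-built treated volumes, while nominal dimensions give a factor on the order of 500.

```python
# Hypothetical check of the scale-up factor from the nominal column
# dimensions in the abstract (treating each column as a cylinder).
import math

def column_volume(diameter_cm: float, height_cm: float) -> float:
    """Volume of a cylindrical column in cm^3."""
    return math.pi * (diameter_cm / 2.0) ** 2 * height_cm

small = column_volume(16.0, 22.0)    # short column
large = column_volume(97.0, 300.0)   # prototype ex-situ reactor

scale_factor = large / small
print(round(scale_factor))  # on the order of 500
```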
Ye, Yong; Deng, Jiahao; Shen, Sanmin; Hou, Zhuo; Liu, Yuting
2016-01-01
A novel method for proximity detection of moving targets (with high dielectric constants) using a large-scale (the size of each sensor is 31 cm × 19 cm) planar capacitive sensor system (PCSS) is proposed. The capacitive variation with distance is derived, and a pair of electrodes in a planar capacitive sensor unit (PCSU) with a spiral shape is found to have better performance on sensitivity distribution homogeneity and dynamic range than three other shapes (comb shape, rectangular shape, and circular shape). A driving excitation circuit with a Clapp oscillator is proposed, and a capacitance measuring circuit with sensitivity of 0.21 Vp−p/pF is designed. The results of static experiments and dynamic experiments demonstrate that the voltage curves of static experiments are similar to those of dynamic experiments; therefore, the static data can be used to simulate the dynamic curves. The dynamic range of proximity detection for three projectiles is up to 60 cm, and the results of the following static experiments show that the PCSU with four neighboring units has the highest sensitivity (the sensitivities of other units are at least 4% lower); when the attack angle decreases, the intensity of sensor signal increases. This proposed method leads to the design of a feasible moving target detector with simple structure and low cost, which can be applied in the interception system. PMID:27196905
An Automatic Instrument to Study the Spatial Scaling Behavior of Emissivity
Tian, Jing; Zhang, Renhua; Su, Hongbo; Sun, Xiaomin; Chen, Shaohui; Xia, Jun
2008-01-01
In this paper, the design of an automatic instrument for measuring the spatial distribution of land surface emissivity is presented, which makes the direct in situ measurement of the spatial distribution of emissivity possible. The significance of this new instrument lies in two aspects. One is that it helps to investigate the spatial scaling behavior of emissivity and temperature; the other is that the design of the instrument provides theoretical and practical foundations for implementing measurements of the distribution of surface emissivity from airborne or spaceborne platforms. To improve the accuracy of the measurements, the emissivity measurement and its uncertainty are examined in a series of carefully designed experiments. The impact of the variation of target temperature and the environmental irradiance on the measurement of emissivity is analyzed as well. In addition, the ideal temperature difference between hot environment and cool environment is obtained based on numerical simulations. Finally, the scaling behavior of surface emissivity caused by the heterogeneity of the target is discussed. PMID:27879735
The World's Largest Photovoltaic Concentrator System.
ERIC Educational Resources Information Center
Smith, Harry V.
1982-01-01
The Mississippi County Community College large-scale energy experiment, featuring the emerging high technology of solar electricity, is described. The project includes a building designed for solar electricity and a power plant consisting of a total energy photovoltaic system, and features two experimental developments. (MLW)
Looby, Mairead; Ibarra, Neysi; Pierce, James J; Buckley, Kevin; O'Donovan, Eimear; Heenan, Mary; Moran, Enda; Farid, Suzanne S; Baganz, Frank
2011-01-01
This study describes the application of quality by design (QbD) principles to the development and implementation of a major manufacturing process improvement for a commercially distributed therapeutic protein produced in Chinese hamster ovary cell culture. The intent of this article is to focus on QbD concepts, and provide guidance and understanding on how the various components combine together to deliver a robust process in keeping with the principles of QbD. A fed-batch production culture and a virus inactivation step are described as representative examples of upstream and downstream unit operations that were characterized. A systematic approach incorporating QbD principles was applied to both unit operations, involving risk assessment of potential process failure points, small-scale model qualification, design and execution of experiments, definition of operating parameter ranges and process validation acceptance criteria followed by manufacturing-scale implementation and process validation. Statistical experimental designs were applied to the execution of process characterization studies evaluating the impact of operating parameters on product quality attributes and process performance parameters. Data from process characterization experiments were used to define the proven acceptable range and classification of operating parameters for each unit operation. Analysis of variance and Monte Carlo simulation methods were used to assess the appropriateness of process design spaces. Successful implementation and validation of the process in the manufacturing facility and the subsequent manufacture of hundreds of batches of this therapeutic protein verifies the approaches taken as a suitable model for the development, scale-up and operation of any biopharmaceutical manufacturing process. Copyright © 2011 American Institute of Chemical Engineers (AIChE).
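The design-space assessment described above can be illustrated with a minimal Monte Carlo sketch. The response model, operating-parameter ranges, and acceptance criterion below are hypothetical stand-ins, not values from the study:

```python
# Illustrative Monte Carlo design-space robustness check, in the spirit of
# the approach described in the abstract. All names and numbers below are
# hypothetical examples, not the study's actual model or ranges.
import random

random.seed(0)

def response(temp, ph):
    # Hypothetical fitted model of a product-quality attribute.
    return 100.0 - 0.8 * abs(temp - 36.5) - 5.0 * abs(ph - 7.0)

N = 100_000
ok = 0
for _ in range(N):
    # Sample operating parameters uniformly over their proven acceptable ranges.
    temp = random.uniform(35.5, 37.5)  # culture temperature, deg C
    ph = random.uniform(6.8, 7.2)      # culture pH
    if response(temp, ph) >= 99.0:     # hypothetical acceptance criterion
        ok += 1

print(f"Fraction of sampled conditions meeting the criterion: {ok / N:.3f}")
```

A fraction well below 1.0 would flag regions of the proposed operating ranges where the acceptance criterion is at risk, suggesting the ranges be tightened.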
High Voltage Tests in the LUX-ZEPLIN System Test
NASA Astrophysics Data System (ADS)
Whitis, Thomas; Lux-Zeplin Collaboration
2016-03-01
The LUX-ZEPLIN (LZ) project is a dark matter direct detection experiment using liquid xenon. The detector is a time projection chamber (TPC) requiring the establishment of a large electric field inside of the detector in order to drift ionization electrons. Historically, many xenon TPC designs have been unable to reach their design fields due to light production and breakdown. The LZ System Test is scaled so that with a cathode voltage of -50 kV, it will have the fields that will be seen in the LZ detector at -100 kV. It will use a fully instrumented but scaled-down version of the LZ TPC design with a vessel set and gas system designed for quick turnaround, allowing for iterative modifications to the TPC prototype and instrumentation. This talk will present results from the high voltage tests performed during the first runs of the LZ System Test.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Srivastava, V.; Fannin, K.F.; Biljetina, R.
1986-07-01
The Institute of Gas Technology (IGT) conducted a comprehensive laboratory-scale research program to develop and optimize the anaerobic digestion process for producing methane from water hyacinth and sludge blends. This study focused on digester design and operating techniques, which gave improved methane yields and production rates over those observed using conventional digesters. The final digester concept and the operating experience were utilized to design and operate a large-scale experimental test unit (ETU) at Walt Disney World, Florida. This paper describes the novel digester design, operating techniques, and the results obtained in the laboratory. The paper also discusses a kinetic model which predicts methane yield, methane production rate, and digester effluent solids as a function of retention time. This model was successfully utilized to predict the performance of the ETU. 15 refs., 6 figs., 6 tabs.
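The abstract does not give the form of the kinetic model. A common choice for this application is a Chen-Hashimoto-type expression relating methane yield to hydraulic retention time; the sketch below uses that generic form with purely illustrative parameter values, not IGT's model or data:

```python
# Generic Chen-Hashimoto-type kinetic sketch of methane yield versus
# retention time. NOT the IGT model; all parameter values are illustrative.
def methane_yield(hrt_days, b0=0.4, mu_m=0.3, k=0.8):
    """Methane yield (m^3 CH4 per kg volatile solids fed) at hydraulic
    retention time hrt_days, with ultimate yield b0, maximum specific
    growth rate mu_m (1/day), and dimensionless kinetic parameter k."""
    return b0 * (1.0 - k / (mu_m * hrt_days - 1.0 + k))

# Yield rises toward the ultimate yield b0 as retention time increases.
for hrt in (10, 20, 40):
    print(hrt, round(methane_yield(hrt), 3))
```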
A decade of discovery: Experiments with the Get Away Special (GAS) canister
NASA Technical Reports Server (NTRS)
Brienzo, Robert
1992-01-01
Students from Booker T. Washington High School for Engineering Professions designed an experiment for a Shuttle flight. The experiment, which was flown on STS-42, was contained in three layers of a Get Away Special canister. The first layer housed the Heterogeneous Flow Experiment, a test of the commercial application of space exploration; layer two housed an Artemia salina growth experiment, a test to determine the success and range of food production in microgravity for longer future missions; and layer three was reserved for the computer and monitoring equipment. What was learned from these experiments, and more importantly, what impact they had on education on a broader scale, is the subject of this article.
2006-05-01
a significant design project that requires development of a large-scale software project. A distinct shortcoming of Purdue ECE ... 18-540: Rapid Prototyping of Computer Systems. This is a project-oriented course which will deal with all four aspects of project development; the ... instructors, will develop specifications for a mobile computer to assist in inspection and maintenance. The application will be partitioned
NASA Astrophysics Data System (ADS)
Cho, Y. J.; Zullah, M. A.; Faizal, M.; Choi, Y. D.; Lee, Y. H.
2012-11-01
A variety of technologies has been proposed to capture the energy from waves. Some of the more promising designs are undergoing demonstration testing at commercial scales. Due to the complexity of most offshore wave energy devices and their motion response in different sea states, physical tank tests are common practice for WEC design. Full scale tests are also necessary, but are expensive and only considered once the design has been optimized. Computational Fluid Dynamics (CFD) is now recognized as an important complement to traditional physical testing techniques in offshore engineering. Once properly calibrated and validated to the problem, CFD offers a high density of test data and results in a reasonable timescale to assist with design changes and improvements to the device. The purpose of this study is to investigate the performance of a newly developed direct drive hydro turbine (DDT), which will be built in a caisson for extraction of wave energy. Experiments and CFD analysis are conducted to clarify the turbine performance and internal flow characteristics. The results show that commercial CFD code can be applied successfully to the simulation of the wave motion in the water tank. The performance of the turbine for the wave energy converter continues to be studied as part of an ongoing project.
Development of a low energy electron spectrometer for SCOPE
NASA Astrophysics Data System (ADS)
Tominaga, Y.; Saito, Y.; Yokota, S.
2010-12-01
We are developing a new electrostatic analyzer that measures low energy electrons for the future satellite mission SCOPE (cross Scale COupling in the Plasma universE). The main purpose of the SCOPE mission is to understand the cross-scale coupling between macroscopic MHD scale phenomena and microscopic ion and electron scale phenomena. In order to understand the dynamics of plasma at such small scales, we need to observe the plasma with an analyzer that has high time resolution. In the Earth's magnetosphere, the typical timescale of plasma cyclotron motion is ~10 sec for ions and ~10 msec for electrons. In order to conduct electron-scale observations, an analyzer with a very high time resolution (~10 msec) is therefore necessary for the experiment. So far, we have settled on a design for the analyzer. The analyzer has three nested spherical/toroidal deflectors, which enables us to measure two different energies simultaneously and shorten the time resolution of the experiment. In order to obtain 3D velocity distribution functions of electrons, the analyzer must have a 4-pi steradian field of view. We will install 8 sets of the analyzers on the satellite; using all of them, we will secure a 4-pi steradian field of view at all times. In the experiment, we plan to measure electrons from 10 eV to 22.5 keV in 32 steps. Given that the sampling time of the experiment is 0.5 msec, it takes about 8 msec to measure the whole energy range, so the time resolution of the experiment is 8 msec. The energy and angular resolutions of the inner analyzer are 0.23 and 16 degrees, respectively, and those of the outer analyzer are 0.17 and 11.5 degrees, respectively. To measure enough electrons within the sampling time, the analyzer is designed to have geometric factors (sensitivities) of 7.5e-3 (inner analyzer) and 1.0e-2 (outer analyzer) cm2 sr. However, it is not apparent that these characteristics of the analyzer are really appropriate for the experiment.
There are also some operational problems which we have to consider and resolve. In this study, we (1) confirm that the analyzer we designed has characteristics appropriate for the experiment and that it can measure the 3D distribution function and velocity moments of electrons; (2) estimate how the non-uniformity of the analyzer's efficiency affects the velocity moments; and (3) estimate how the spin motion of the satellite affects the velocity moments. Assuming a Maxwellian electron distribution function with known density, bulk velocity, and temperature, we calculated the counts that the analyzer will measure, taking into account the characteristics of the analyzer. Using these counts, we calculated the distribution function and velocity moments, and compared the results with the assumed density, bulk velocity, and temperature in order to assess the precision of the experiment. From these calculations we found that (1) the characteristics of the analyzer are good enough to measure the velocity moments of electrons with an error of less than several percent; (2) the non-uniformity of the efficiency of the analyzers will severely affect the bulk velocity of electrons; and (3) we should have special observation modes (changing the time resolution or energy range) which depend on the observation area.
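The quoted 8 msec time resolution follows directly from the stated design parameters, since the nested deflectors measure two energies per sampling interval:

```python
# Arithmetic check of the quoted time resolution: 32 energy steps swept
# with two energies measured simultaneously (nested deflectors), at a
# 0.5 ms sampling time per step.
energy_steps = 32
energies_per_sample = 2        # inner + outer deflector measured at once
sampling_time_ms = 0.5

sweep_time_ms = (energy_steps / energies_per_sample) * sampling_time_ms
print(sweep_time_ms)  # 8.0 ms, matching the stated time resolution
```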
DOE Office of Scientific and Technical Information (OSTI.GOV)
Narlesky, Joshua Edward; Berg, John M.; Duque, Juan
A set of six long-term, full-scale experiments was initiated to determine the type and extent of corrosion that occurs in 3013 containers packaged with chloride-bearing plutonium oxide materials. The materials were exposed to a high relative humidity environment representative of actual packaging conditions for the materials in storage. The materials were sealed in instrumented, inner 3013 containers with corrosion specimens designed to test the corrosiveness of the environment inside the containers under various conditions. This report focuses on initial loading conditions that are used to establish a baseline to show how the conditions change throughout the storage lifetime of the containers.
Dykema, John A.; Keith, David W.; Anderson, James G.; Weisenstein, Debra
2014-01-01
Although solar radiation management (SRM) through stratospheric aerosol methods has the potential to mitigate impacts of climate change, our current knowledge of stratospheric processes suggests that these methods may entail significant risks. In addition to the risks associated with current knowledge, the possibility of ‘unknown unknowns’ exists that could significantly alter the risk assessment relative to our current understanding. While laboratory experimentation can improve the current state of knowledge and atmospheric models can assess large-scale climate response, they cannot capture possible unknown chemistry or represent the full range of interactive atmospheric chemical physics. Small-scale, in situ experimentation under well-regulated circumstances can begin to remove some of these uncertainties. This experiment—provisionally titled the stratospheric controlled perturbation experiment—is under development and will only proceed with transparent and predominantly governmental funding and independent risk assessment. We describe the scientific and technical foundation for performing, under external oversight, small-scale experiments to quantify the risks posed by SRM to activation of halogen species and subsequent erosion of stratospheric ozone. The paper's scope includes selection of the measurement platform, relevant aspects of stratospheric meteorology, operational considerations and instrument design and engineering. PMID:25404681
Work environment impact scale: testing the psychometric properties of the Swedish version.
Ekbladh, Elin; Fan, Chia-Wei; Sandqvist, Jan; Hemmingsson, Helena; Taylor, Renée
2014-01-01
The Work Environment Impact Scale (WEIS) is an assessment that focuses on the fit between a person and his or her work environment. It is based on Kielhofner's Model of Human Occupation and designed to gather information on how clients experience their work environment. The aim of this study was to examine the psychometric properties of the Swedish version of the WEIS assessment instrument. In total, 95 ratings on the 17-item WEIS were obtained from a sample of clients with experience of sick leave due to different medical conditions. Rasch analysis was used to analyze the data. Overall, the WEIS items together cohered to form a single construct of increasingly challenging work environmental factors. The hierarchical ordering of the items along the continuum followed a logical and expected pattern, and the participants were validly measured by the scale. The three occupational therapists serving as raters validly used the scale, but demonstrated a relatively high rater separation index, indicating differences in rater severity. The findings provide evidence that the Swedish version of the WEIS is a psychometrically sound assessment across diagnoses and occupations, which can provide valuable information about experiences of work environment challenges.
Numerical Simulation Applications in the Design of EGS Collab Experiment 1
DOE Office of Scientific and Technical Information (OSTI.GOV)
Johnston, Henry; White, Mark D.; Fu, Pengcheng
The United States Department of Energy, Geothermal Technologies Office (GTO) is funding a collaborative investigation of enhanced geothermal systems (EGS) processes at the meso-scale. This study, referred to as the EGS Collab project, is a unique opportunity for scientists and engineers to investigate the creation of fracture networks and circulation of fluids across those networks under in-situ stress conditions. The EGS Collab project is envisioned to comprise three experiments; the site for the first experiment is on the 4850 Level (4,850 feet below ground surface) in phyllite of the Precambrian Poorman formation, at the Sanford Underground Research Facility, located at the former Homestake Gold Mine, in Lead, South Dakota. Principal objectives of the project are to develop a number of intermediate-scale field sites and to conduct well-controlled in situ experiments focused on rock fracture behavior and permeability enhancement. Data generated during these experiments will be compared against predictions of a suite of computer codes specifically designed to solve problems involving coupled thermal, hydrological, geomechanical, and geochemical processes. Comparisons between experimental and numerical simulation results will provide code developers with direction for improvements and verification of process models, build confidence in the suite of available numerical tools, and ultimately identify critical future development needs for the geothermal modeling community. Moreover, conducting thorough comparisons of models, modelling approaches, measurement approaches, and measured data via the EGS Collab project will serve to identify techniques that are most likely to succeed at the Frontier Observatory for Research in Geothermal Energy (FORGE), the GTO's flagship EGS research effort.
As noted, outcomes from the EGS Collab project experiments will serve as benchmarks for computer code verification, but numerical simulation additionally plays an essential role in designing these meso-scale experiments. This paper describes specific numerical simulations supporting the design of Experiment 1, a field test involving hydraulic stimulation of two fractures from notched sections of the injection borehole and fluid circulation between sub-horizontal injection and production boreholes in each fracture individually and collectively, including the circulation of chilled water. Whereas the mine drift allows for accurate and close placement of monitoring instrumentation near the developed fractures, active ventilation in the drift cooled the rock mass within the experimental volume. Numerical simulations were executed to predict seismic events and magnitudes during stimulation, initial fracture orientations for smooth horizontal wellbores, pressure requirements for fracture initiation from notched wellbores, fracture propagation during stimulation between the injection and production boreholes, tracer travel times between the injection and production boreholes, produced fluid temperatures with chilled water injections, pressure limits on fluid circulation to avoid fracture growth, the temperature environment surrounding the 4850 Level drift, and fracture propagation within a stress field altered by drift excavation, ventilation cooling, and dewatering.
NASA Technical Reports Server (NTRS)
Deyoung, James A.; Klepczynski, William J.; Mckinley, Angela Davis; Powell, William M.; Mai, Phu V.; Hetzel, P.; Bauch, A.; Davis, J. A.; Pearce, P. R.; Baumont, Francoise S.
1995-01-01
The international transatlantic time and frequency transfer experiment was designed by the participating laboratories and has been implemented during 1994 to test the international communications path involving a large number of transmitting stations. This paper presents empirically determined clock and time scale differences, time and frequency domain instabilities, and a representative power spectral density analysis. Experiments by the method of co-location, which will allow absolute calibration of the participating laboratories, have been performed. Absolute time differences and accuracy levels of this experiment will be assessed in the near future.
Wetted foam liquid fuel ICF target experiments
Olson, R. E.; Leeper, R. J.; Yi, S. A.; ...
2016-05-26
We are developing a new NIF experimental platform that employs wetted foam liquid fuel layer ICF capsules. We will use the liquid fuel layer capsules in a NIF sub-scale experimental campaign to explore the relationship between hot spot convergence ratio (CR) and the predictability of hot spot formation. DT liquid layer ICF capsules allow for flexibility in hot spot CR via the adjustment of the initial cryogenic capsule temperature and, hence, DT vapor density. Our hypothesis is that the predictive capability of hot spot formation is robust and 1D-like for a relatively low CR hot spot (CR~15), but will become less reliable as hot spot CR is increased to CR>20. Simulations indicate that backing off on hot spot CR is an excellent way to reduce capsule instability growth and to improve robustness to low-mode x-ray flux asymmetries. In the initial experiments, we will test our hypothesis by measuring hot spot size, neutron yield, ion temperature, and burn width to infer hot spot pressure and compare to predictions for implosions with hot spot CRs in the range of 12 to 25. Larger scale experiments are also being designed, and we will advance from sub-scale to full-scale NIF experiments to determine if 1D-like behavior at low CR is retained as the scale-size is increased. The long-term objective is to develop a liquid fuel layer ICF capsule platform with robust thermonuclear burn, modest CR, and significant α-heating with burn propagation.
CONTROL OF CRYPTOSPORIDIUM OOCYSTS BY STEADY-STATE CONVENTIONAL TREATMENT
Pilot-scale experiments have been performed to assess the ability of conventional treatment to control Cryptosporidium oocysts under steady-state conditions. The work was performed with a pilot plant that was designed to minimize flow rates and, as a result, the number of oocyst...
Scaling of Two-Phase Flows to Partial-Earth Gravity
NASA Technical Reports Server (NTRS)
Hurlbert, Kathryn M.; Witte, Larry C.
2003-01-01
A report presents a method of scaling, to partial-Earth gravity, of parameters that describe pressure drops and other characteristics of two-phase (liquid/ vapor) flows. The development of the method was prompted by the need for a means of designing two-phase flow systems to operate on the Moon and on Mars, using fluid-properties and flow data from terrestrial two-phase-flow experiments, thus eliminating the need for partial-gravity testing. The report presents an explicit procedure for designing an Earth-based test bed that can provide hydrodynamic similarity with two-phase fluids flowing in partial-gravity systems. The procedure does not require prior knowledge of the flow regime (i.e., the spatial orientation of the phases). The method also provides for determination of pressure drops in two-phase partial-gravity flows by use of a generalization of the classical Moody chart (previously applicable to single-phase flow only). The report presents experimental data from Mars- and Moon-activity experiments that appear to demonstrate the validity of this method.
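The report's actual scaling procedure is not reproduced here, but the basic idea of matching a dimensionless group between gravity environments can be illustrated with the Froude number. This is an assumption for illustration only; the actual method handles full two-phase parameters and pressure drops:

```python
# Illustrative dimensional-analysis sketch (not the report's procedure):
# matching the Froude number Fr = v^2 / (g * L) between a partial-gravity
# system and an Earth-based test bed with the same characteristic length L
# gives the required test velocity on Earth.
import math

G_EARTH = 9.81   # m/s^2
G_MARS = 3.71    # m/s^2

def earth_test_velocity(v_partial_g: float, g_partial: float) -> float:
    """Earth velocity that matches the partial-gravity Froude number."""
    return v_partial_g * math.sqrt(G_EARTH / g_partial)

v_mars = 1.0  # m/s, hypothetical design flow velocity on Mars
print(round(earth_test_velocity(v_mars, G_MARS), 2))  # ≈ 1.63 m/s
```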
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jostsons, A.; Ridal, A.; Mercer, D.J.
1996-05-01
The Synroc Demonstration Plant (SDP) was designed and constructed at Lucas Heights to demonstrate the feasibility of Synroc production on a commercial scale (10 kg/hr) with simulated Purex liquid HLW. Since commissioning of the SDP in 1987, over 6000 kg of Synroc has been fabricated with a range of feeds and waste loadings. The SDP utilises uniaxial hot-pressing to consolidate Synroc. Pressureless sintering and hot-isostatic pressing have also been studied at smaller scales. The results of this extensive process development have been incorporated in a conceptual design for a radioactive plant to condition HLW from a reprocessing plant with a capacity to treat 800 tpa of spent LWR fuel. Synroc containing TRU, including Pu, and fission products has been fabricated and characterised in a glove-box facility and hot cells, respectively. The extensive experience in processing of Synroc over the past 15 years is summarised and its relevance to immobilization of surplus plutonium is discussed.
The Amazon Boundary-Layer Experiment (ABLE 2B) - A meteorological perspective
NASA Technical Reports Server (NTRS)
Garstang, Michael; Greco, Steven; Scala, John; Swap, Robert; Ulanski, Stanley; Fitzjarrald, David; Martin, David; Browell, Edward; Shipman, Mark; Connors, Vickie
1990-01-01
The Amazon Boundary-Layer Experiments (ABLE) 2A and 2B, which were performed near Manaus, Brazil in July-August 1985 and April-May 1987, are discussed. The experiments were performed to study the sources, sinks, concentrations, and transports of trace gases and aerosols in rain forest soils, wetlands, and vegetation. Consideration is given to the design and preliminary results of the experiment, focusing on the relationships between meteorological scales of motion and the flux, transports, and reactions of chemical species and aerosols embedded in the atmospheric fluid. Meteorological results are presented and the role of the meteorological results in the atmospheric chemistry experiment is examined.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ishii, Mamoru
The NEUP-funded project, NEUP-3496, aims to experimentally investigate two-phase natural circulation flow instability that could occur in Small Modular Reactors (SMRs), especially natural circulation SMRs. The objective has been achieved by systematically performing tests to study the general natural circulation instability characteristics and the natural circulation behavior under start-up or design basis accident conditions. Experimental data sets highlighting the effect of void reactivity feedback as well as the effect of power ramp-up rate and system pressure have been used to develop a comprehensive stability map. The safety analysis code, RELAP5, has been used to evaluate experimental results and models. Improvements to the constitutive relations for flashing have been made in order to develop a reliable analysis tool. This research has been focusing on two generic SMR designs, i.e., a small modular Simplified Boiling Water Reactor (SBWR) like design and a small integral Pressurized Water Reactor (PWR) like design. A BWR-type natural circulation test facility was first built based on the three-level scaling analysis of the Purdue Novel Modular Reactor (NMR) with an electric output of 50 MWe, namely NMR-50, which represents a BWR-type SMR with a significantly reduced reactor pressure vessel (RPV) height. The experimental facility was installed with various equipment to measure thermal-hydraulic parameters such as pressure, temperature, mass flow rate, and void fraction. Characterization tests were performed before the startup transient tests and quasi-steady tests to determine the loop flow resistance. The control system and data acquisition system were programmed with LabVIEW to realize real-time control and data storage. The thermal-hydraulic and nuclear coupled startup transients were performed to investigate the flow instabilities at low pressure and low power conditions for NMR-50.
Two different power ramps were chosen to study the effect of startup power density on the flow instability. The experimental startup transient results showed the existence of three different flow instability mechanisms, i.e., flashing instability, condensation-induced flow instability, and density wave oscillations. In addition, the void-reactivity feedback did not have significant effects on the flow instability during the startup transients for NMR-50. Several initial startup procedures with different power ramp rates were experimentally investigated to eliminate the flow instabilities observed in the startup transients. In particular, the very slow startup transient and pressurized startup transient tests were performed and compared. It was found that very slow startup transients, applying a very small power density, can eliminate the flashing oscillations in single-phase natural circulation and stabilize the flow oscillations in the phase of net vapor generation. The initially pressurized startup procedure was also tested to eliminate the flashing instability during the startup transients. The pressurized startup procedure included the initial pressurization, heat-up, and venting process. The startup transient tests showed that the pressurized startup procedure could eliminate the flow instability during the transition from single-phase flow to two-phase flow at low pressure conditions. The experimental results indicated that both startup procedures were applicable to the initial startup of the NMR. However, the pressurized startup procedure might be preferred due to the shorter operating hours required. In order to have a deeper understanding of natural circulation flow instability, quasi-steady tests were performed using the test facility installed with a preheater and subcooler.
The effects of system pressure, core inlet subcooling, core power density, inlet flow resistance coefficient, and void reactivity feedback were investigated in the quasi-steady-state tests. The experimental stability boundaries between unstable and stable flow conditions were determined in the dimensionless stability plane of inlet subcooling number versus Zuber number. To predict the stability boundary theoretically, a linear stability analysis in the frequency domain was performed for four sections of the natural circulation test loop. The flashing phenomenon in the chimney section was treated as an axially uniform heat source, and the dimensionless characteristic equation of the pressure drop perturbation was obtained by considering the void fraction effect and the outlet flow resistance in the core section. The theoretical flashing boundary showed some discrepancies with the experimental data from the quasi-steady-state tests; accounting for thermal non-equilibrium is recommended in future work to improve the accuracy of the flashing instability boundary. As another part of the funded research, flow instabilities of a PWR-type SMR under low-pressure and low-power conditions were investigated experimentally as well. The NuScale reactor design was selected as the prototype for the PWR-type SMR. To experimentally study the natural circulation behavior of the NuScale reactor during accident scenarios, detailed scaling analyses are necessary to ensure that the scaled phenomena can be reproduced in a laboratory test facility. The three-level scaling method was again used to obtain scaling ratios derived from various non-dimensional numbers. The ideally scaled facility (ISF) was initially designed based on these scaling ratios; the engineering scaled facility (ESF) was then designed and constructed based on the ISF, taking into account engineering limitations including laboratory space, pipe size, and pipe connections.
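The stability plane mentioned above is spanned by two standard dimensionless groups from the two-phase flow stability literature. As a minimal sketch (using textbook definitions and illustrative water properties, not the experimental values from this work), an operating point can be mapped onto the subcooling-number/Zuber-number plane as follows:

```python
# Hedged sketch: place an operating point on the N_sub vs N_Zu stability
# plane. Definitions are the standard ones; all property values below are
# illustrative (saturated water near atmospheric pressure), not test data.

def subcooling_number(h_f, h_in, h_fg, rho_f, rho_g):
    """N_sub = (h_f - h_in)/h_fg * (rho_f - rho_g)/rho_g"""
    return (h_f - h_in) / h_fg * (rho_f - rho_g) / rho_g

def zuber_number(power_w, mdot, h_fg, rho_f, rho_g):
    """N_Zu (phase-change number) = Q/(m_dot*h_fg) * (rho_f - rho_g)/rho_g"""
    return power_w / (mdot * h_fg) * (rho_f - rho_g) / rho_g

rho_f, rho_g = 958.0, 0.6        # kg/m^3, liquid and vapor densities
h_f, h_fg = 419.0e3, 2257.0e3    # J/kg, saturated liquid enthalpy and latent heat
n_sub = subcooling_number(h_f, h_f - 50e3, h_fg, rho_f, rho_g)  # 50 kJ/kg subcooling
n_zu = zuber_number(20e3, 0.05, h_fg, rho_f, rho_g)             # 20 kW, 0.05 kg/s
```

Sweeping core power and inlet subcooling through such a mapping traces the trajectory of an operating point relative to an experimentally determined stability boundary.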
PWR-type SMR experiments were performed in this well-scaled test facility to investigate the potential thermal-hydraulic flow instabilities during blowdown events, which might occur during a loss of coolant accident (LOCA) or a loss of heat sink accident (LOHS) of the prototype PWR-type SMR. Two kinds of experiments, a normal blowdown event and a cold blowdown event, were investigated and compared with code predictions. The normal blowdown event was experimentally simulated from an initial condition at a pressure lower than the design pressure of the experimental facility, while the code prediction of the blowdown started from the normal operating condition. Important thermal-hydraulic parameters, including reactor pressure vessel (RPV) pressure, containment pressure, local void fraction and temperature, pressure drop, and natural circulation flow rate, were measured and analyzed during the blowdown event. The pressure and water level transients are similar to the experimental results published by NuScale [51], which demonstrates the capability of the current loop to simulate the thermal-hydraulic transients of a real PWR-type SMR. During the 20,000 s blowdown experiment, the water level in the core remained above the active fuel assembly throughout, supporting the safety of the natural circulation cooling and water recycling design of the PWR-type SMR. The pressure, temperature, and water level transients were accurately predicted by the RELAP5 code. However, oscillations of the natural circulation flow rate, water level, and pressure drops were observed during the blowdown transients. These flow oscillations are related to the water level and the location of the upper plenum, which is a path for coolant flow from the chimney to the steam generator and downcomer. To investigate the transients starting from the opening of the ADS valve both experimentally and numerically, the cold blowdown experiment was conducted.
For the cold blowdown event, instead of setting both the reactor pressure vessel (RPV) and the containment at high temperature and pressure, only the RPV was heated to near the maximum design pressure before the ADS valve was opened; the same process was predicted using the RELAP5 code. The cold blowdown experiment allows the entire transient, from the opening of the ADS, to be investigated with the code and benchmarked against experimental data. A flow instability similar to that of the normal blowdown was observed in the cold blowdown experiment. The comparison between code predictions and experimental data showed that the RELAP5 code can predict the pressure, void fraction, and temperature transients during the cold blowdown event with limited error, but numerical instability exists in the prediction of the natural circulation flow rate. In addition, the code lacks the capability to predict the water-level-related flow instability observed in the experiments.
NASA Astrophysics Data System (ADS)
Colette, A.; Ciarelli, G.; Otero, N.; Theobald, M.; Solberg, S.; Andersson, C.; Couvidat, F.; Manders-Groot, A.; Mar, K. A.; Mircea, M.; Pay, M. T.; Raffort, V.; Tsyro, S.; Cuvelier, K.; Adani, M.; Bessagnet, B.; Bergstrom, R.; Briganti, G.; Cappelletti, A.; D'isidoro, M.; Fagerli, H.; Ojha, N.; Roustan, Y.; Vivanco, M. G.
2017-12-01
The Eurodelta-Trends multi-model chemistry-transport experiment has been designed to better understand the evolution of air pollution and its drivers in Europe over the period 1990-2010. The main objective of the experiment is to assess the efficiency of air pollutant emission mitigation measures in improving regional-scale air quality. The experiment is designed in three tiers with increasing computational demand, in order to facilitate the participation of as many modelling teams as possible. The basic experiment consists of simulations for the years 1990, 2000, and 2010, complemented by sensitivity analyses for the same three years using various combinations of (i) anthropogenic emissions, (ii) chemical boundary conditions, and (iii) meteorology. The most demanding tier consists of two complete time series from 1990 to 2010, simulated using either time-varying emissions for the corresponding years or constant emissions. Eight chemistry-transport models have contributed calculation results to at least one experiment tier, and six models have completed the 21-year trend simulations. The modelling results are publicly available for further use by the scientific community. We assess the skill of the models in capturing observed air pollution trends over the 1990-2010 period. The average particulate matter relative trends are well captured by the models, even if they display the usual low bias in reproducing absolute levels. Ozone trends are also well reproduced, yet slightly overestimated in the 1990s. The attribution study emphasizes the efficiency of mitigation measures in reducing air pollution over Europe, although a strong impact of long-range transport on ozone trends is pointed out. Meteorological variability is also an important factor in some regions of Europe. The results of the first health and ecosystem impact studies building upon a regional-scale multi-model ensemble over a 20-year period will also be presented.
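The trend-skill assessment described above amounts to comparing fitted trends between the model ensemble and observations. A minimal sketch, with synthetic series standing in for the actual model and observational data:

```python
# Hedged sketch: compare an ensemble-mean pollutant trend with an observed
# trend over 1990-2010 via ordinary least-squares slopes. All series below
# are synthetic placeholders, not Eurodelta-Trends data.
import numpy as np

def ols_slope(years, values):
    """Least-squares trend in units per year."""
    return np.polyfit(np.asarray(years, float), np.asarray(values, float), 1)[0]

years = np.arange(1990, 2011)
observed = 30.0 - 0.5 * (years - 1990)                    # synthetic PM, ug/m^3
models = [30.0 - s * (years - 1990) for s in (0.4, 0.5, 0.6)]  # 3 synthetic models

obs_trend = ols_slope(years, observed)
ens_trend = np.mean([ols_slope(years, m) for m in models])
relative_bias = (ens_trend - obs_trend) / obs_trend       # ensemble trend bias
```

A relative (percent-per-year) comparison of this kind is less sensitive to the absolute concentration bias that the abstract notes the models display.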
2010-08-01
petroleum industry. Moreover, heterogeneity control strategies can be applied to improve the efficiency of a variety of in situ remediation technologies ... conditions that differ significantly from those found in environmental systems. Therefore, many of the design criteria used by the petroleum industry for ... were helpful in constructing numerical models in up-scaled systems (2-D tanks). The UTCHEM model was able to successfully simulate 2-D experimental
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ho, Sa V.; Athmer, C.J.; Sheridan, P.W.
Contamination in low-permeability soils poses a significant technical challenge to in-situ remediation efforts. Poor accessibility to the contaminants and difficulty in delivering treatment reagents have rendered existing in-situ treatments such as bioremediation, vapor extraction, and pump-and-treat rather ineffective when applied to the low-permeability soils present at many contaminated sites. This technology is an integrated in-situ treatment in which established geotechnical methods are used to install degradation zones directly in the contaminated soil, and electro-osmosis is utilized to move the contaminants back and forth through those zones until the treatment is completed. This topical report summarizes the results of the lab- and pilot-scale Lasagna{trademark} experiments conducted at Monsanto. Experiments were conducted with kaolinite and an actual Paducah soil in units ranging from bench scale, containing kilogram quantities of soil, to pilot scale, containing about half a ton of soil, with various treatment zone configurations. The data obtained support the feasibility of scaling up this technology with respect to electrokinetic parameters as well as removal of organic contaminants. A mathematical model was developed that successfully predicted the temperature rises in the soil. The information and experience gained from these experiments, along with the modeling effort, enabled us to successfully design and operate a larger field experiment at a DOE TCE-contaminated clay site.
Bacterial Transport in Heterogeneous Porous Media: Laboratory and Field Experiments
NASA Astrophysics Data System (ADS)
Fuller, M. E.
2001-12-01
A fully instrumented research site for examining field-scale bacterial transport has been established on the eastern shore of Virginia. Studies employing intact sediment cores from the South Oyster site have been performed to examine the effects of physical and chemical heterogeneity, to derive transport parameters, and to aid in the selection of bacterial strains for use in field experiments. A variety of innovative methods for tracking bacteria were developed and evaluated under both laboratory and field conditions, providing the tools to detect target cell concentrations in groundwater down to <20 cells/ml and to perform real-time monitoring in the field. Comprehensive modeling efforts have provided a framework for the layout and instrumentation of the field site and have aided in the design and interpretation of field-scale bacterial transport experiments. Field transport experiments were conducted in both aerobic and anoxic flow cells to determine the effects of physical and chemical heterogeneity on field-scale bacterial transport. The results of this research not only contribute to the development of more effective bioremediation strategies, but also have implications for a better understanding of bacterial movement in the subsurface as it relates to public health microbiology and general microbial ecology.
A Fundamental Study of Smoldering with Emphasis on Experimental Design for Zero-G
NASA Technical Reports Server (NTRS)
Fernandez-Pello, Carlos; Pagni, Patrick J.
1995-01-01
A research program to study smoldering combustion, with emphasis on the design of an experiment to be conducted on the Space Shuttle, was carried out at the Department of Mechanical Engineering, University of California, Berkeley. The motivation for the research is the interest in smoldering both as a fundamental combustion problem and as a serious fire risk. The research included theoretical and experimental studies that have yielded considerable new information about smolder combustion, the effect that buoyancy has on the process, and specific information for the design of a space experiment. Experiments were conducted at normal gravity, in opposed and forward modes of propagation and in the upward and downward directions, to determine the effect and range of influence of gravity on smolder. Experiments were also conducted in microgravity, in a drop tower and in parabolic aircraft flights, where the brief microgravity periods were used to analyze transient aspects of the problem. Significant progress was made in the study of one-dimensional smolder, particularly in the opposed-flow configuration. These studies provided enough information to design a small-scale space-based experiment that was successfully conducted in the Spacelab Glovebox during the June 1992 USML-1/STS-50 mission of the Space Shuttle Columbia.
Kassab, Salah Eldin; Al-Shafei, Ahmad I; Salem, Abdel Halim; Otoom, Sameer
2015-01-01
This study examined the relationships between the different aspects of students' course experience, self-regulated learning, and academic achievement of medical students in a blended learning curriculum. Perceptions of medical students (n=171) from the Royal College of Surgeons in Ireland, Medical University of Bahrain (RCSI Bahrain), on the blended learning experience were measured using the Student Course Experience Questionnaire (SCEQ), with an added e-Learning scale. In addition, self-regulated learning was measured using the Motivated Strategies for Learning Questionnaire (MSLQ). Academic achievement was measured by the scores of the students at the end of the course. A path analysis was created to test the relationships between the different study variables. Path analysis indicated that the perceived quality of the face-to-face component of the blended experience directly affected the motivation of students. The SCEQ scale "quality of teaching" directly affected two aspects of motivation: control of learning and intrinsic goal orientation. Furthermore, appropriate course workload directly affected the self-efficacy of students. Moreover, the e-Learning scale directly affected students' peer learning and critical thinking but indirectly affected metacognitive regulation. The resource management regulation strategies, time and study environment, and effort regulation directly affected students' examination scores (17% of the variance explained). However, there were no significant direct relationships between the SCEQ scales and cognitive learning strategies or examination scores. The results of this study will have important implications for designing blended learning courses in medical schools.
Scale-dependent compensational stacking of channelized sedimentary deposits
NASA Astrophysics Data System (ADS)
Wang, Y.; Straub, K. M.; Hajek, E. A.
2010-12-01
Compensational stacking, the tendency of sediment transport systems to preferentially fill topographic lows and thus smooth out topographic relief, is a concept used in the interpretation of the stratigraphic record. Recently, a metric was developed to quantify the strength of compensation in sedimentary basins by comparing observed stacking patterns to what would be expected from simple, uncorrelated stacking. This method uses the rate of decay of spatial variability in sedimentation between picked depositional horizons with increasing vertical stratigraphic averaging distance. We explore how this metric varies as a function of stratigraphic scale using data from physical experiments, stratigraphy exposed in outcrops, and numerical models. In an experiment conducted at Tulane University's Sediment Dynamics Laboratory, the topography of a channelized delta formed by weakly cohesive sediment was monitored along flow-perpendicular transects at a high temporal resolution relative to channel kinematics. Over the course of this experiment, a uniform relative subsidence pattern, designed to isolate autogenic processes, resulted in the construction of a stratigraphic package 25 times as thick as the depth of the experimental channels. We observe a scale dependence of the compensational stacking of deposits set by the system's avulsion time-scale. Above the avulsion time-scale, deposits stack purely compensationally, but below this time-scale deposits stack somewhere between randomly and deterministically. The well-exposed Ferris Formation (Cretaceous/Paleogene, Hanna Basin, Wyoming, USA) also shows scale-dependent stratigraphic organization, which appears to be set by an avulsion time-scale. Finally, we utilize simple object-based models to illustrate how channel avulsions influence compensation in alluvial basins.
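The compensation metric referenced above measures how the variability of deposition decays with vertical averaging distance. A minimal sketch with a synthetic, uncorrelated deposition series follows; the function name and simple windowing scheme are illustrative simplifications of the published method, not a reimplementation of it:

```python
# Hedged sketch: the standard deviation of window-averaged deposition decays
# as a power law of the averaging window, sigma ~ T^(-kappa). kappa near 1
# indicates purely compensational stacking; kappa near 0.5, uncorrelated
# (random) stacking. The series below is synthetic, not experimental data.
import numpy as np

rng = np.random.default_rng(0)

def compensation_index(deposition, windows):
    """Fit kappa from the decay of sigma with averaging window length."""
    sigmas = []
    for t in windows:
        n = len(deposition) // t
        chunks = deposition[: n * t].reshape(n, t).mean(axis=1)
        sigmas.append(chunks.std())
    # slope of log(sigma) vs log(T) is -kappa
    return -np.polyfit(np.log(windows), np.log(sigmas), 1)[0]

# An uncorrelated deposition series should give kappa near 0.5
series = rng.normal(1.0, 0.3, 4096)
kappa = compensation_index(series, [1, 2, 4, 8, 16, 32])
```

A purely compensational system, by contrast, would show deviations from the mean cancelling faster than this random-walk baseline, pushing the fitted exponent toward 1.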
2013-11-21
View of Flight Engineer (FE) Koichi Wakata posing for a photo during a CFE-2 (Capillary Flow Experiment - 2) Interior Corner Flow - 8 (ICF-8) test run. Liquids behave differently in space than they do on Earth, so containers that process, hold, or transport them must be designed carefully to work in microgravity. The Capillary Flow Experiment-2 furthers research on wetting, a liquid's ability to spread across a surface, and its impact over large length scales in unusual container shapes in microgravity environments. This work will improve capabilities to quickly and accurately predict how related processes occur, and allow us to design better systems to process liquids aboard spacecraft (i.e., liquid fuel tanks, thermal fluids, and water processing for life support). Image was released by the astronaut on Twitter.
Tracer tomography: design concepts and field experiments using heat as a tracer.
Doro, Kennedy O; Cirpka, Olaf A; Leven, Carsten
2015-04-01
Numerical and laboratory studies have provided evidence that combining hydraulic tomography with tomographic tracer tests could improve the estimation of hydraulic conductivity compared with using hydraulic data alone. Field demonstrations, however, have been lacking so far, which we attribute to experimental difficulties. In this study, we present a conceptual design and experimental applications of tracer tomography at the field scale using heat as a tracer. In our experimental design, we improve active heat tracer testing by minimizing possible effects of heat losses, buoyancy, viscosity, and changing boundary conditions. We also utilize a cost-effective approach of measuring temperature changes in situ at high resolution. We apply the presented method to the 8 m thick heterogeneous, sandy gravel, alluvial aquifer at the Lauswiesen Hydrogeological Research Site in Tübingen, Germany. Results of our tomographic heat-tracer experiments are in line with earlier work on characterizing the aquifer at the test site. We demonstrate from the experimental perspective that tracer tomography is applicable and suitable at the field scale using heat as a tracer. The experimental results also demonstrate the potential of heat-tracer tomography as a cost-effective means for characterizing aquifer heterogeneity. © 2014, National Ground Water Association.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhang, Weizhao; Ren, Huaqing; Wang, Zequn
2016-10-19
An integrated computational materials engineering method is proposed in this paper for analyzing the design and preforming process of woven carbon fiber composites. The goal is to reduce the cost and time needed for the mass production of structural composites. The method integrates simulation methods from the micro-scale to the macro-scale to capture the behavior of the composite material in the preforming process. In this way, the time-consuming and high-cost physical experiments and prototypes in the development of the manufacturing process can be circumvented. This method contains three parts: the micro-scale representative volume element (RVE) simulation to characterize the material; the metamodeling algorithm to generate the constitutive equations; and the macro-scale preforming simulation to predict the behavior of the composite material during forming. The results show the potential of this approach as a guide to the design of composite materials and their manufacturing process.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Atamturktur, Sez; Unal, Cetin; Hemez, Francois
The project proposed to provide a Predictive Maturity Framework with companion metrics that (1) introduce a formalized, quantitative means to communicate information between interested parties, (2) provide scientifically dependable means to claim completion of Validation and Uncertainty Quantification (VU) activities, and (3) guide decision makers in the allocation of Nuclear Energy's resources for code development and physical experiments. The project team proposed to develop this framework based on two complementary criteria: (1) the extent of experimental evidence available for the calibration of simulation models and (2) the sophistication of the physics incorporated in simulation models. The proposed framework is capable of quantifying the interaction between the required number of physical experiments and the degree of physics sophistication. The project team developed this framework and implemented it with a multi-scale model for simulating creep of core reactor cladding. The multi-scale model is composed of the viscoplastic self-consistent (VPSC) code at the meso-scale, which represents the visco-plastic behavior and changing properties of a highly anisotropic material, and a finite element (FE) code at the macro-scale to represent the elastic behavior and apply the loading. The framework takes advantage of the transparency provided by partitioned analysis, in which independent constituent codes are coupled in an iterative manner. This transparency allows model developers to better understand and remedy the sources of biases and uncertainties, whether they stem from the constituents or the coupling interface, by exploiting separate-effect experiments conducted within the constituent domain and integral-effect experiments conducted within the full-system domain. The project team implemented this procedure with the multi-scale VPSC-FE model and demonstrated its ability to improve the predictive capability of the model.
Within this framework, the project team focused on optimizing resource allocation for improving numerical models through further code development and experimentation. Related to further code development, we developed a code prioritization index (CPI) for coupled numerical models. The CPI is used to effectively improve the predictive capability of the coupled model by increasing the sophistication of the constituent codes. In relation to designing new experiments, we investigated the information gained by the addition of each new experiment used for calibration and bias correction of a simulation model. Additionally, the variability of 'information gain' through the design domain was investigated in order to identify the experiment settings where maximum information gain occurs and thus guide experimenters in the selection of experiment settings. This idea was extended to show how the information gain from each experiment can be improved by intelligently selecting the experiments, leading to the development of the Batch Sequential Design (BSD) technique. Additionally, we evaluated the importance of sufficiently exploring the domain of applicability in experiment-based validation of high-consequence modeling and simulation by developing a new metric to quantify coverage. This metric has also been incorporated into the design of new experiments. Finally, we proposed a data-aware calibration approach for the calibration of numerical models. This new method considers the complexity of a numerical model (the number of parameters to be calibrated, parameter uncertainty, and form of the model) and seeks to identify the number of experiments necessary to calibrate the model based on the level of sophistication of the physics. The final component of the project team's work to improve model calibration and validation methods is the incorporation of robustness to non-probabilistic uncertainty in the input parameters.
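The 'information gain' idea for experiment selection can be illustrated in its simplest Bayesian-design form: for a linear-Gaussian model the expected gain from a measurement has a closed form, so the most informative candidate setting is the one that maximizes it. This is a generic sketch of the concept, not the project's specific formulation:

```python
# Hedged sketch: expected information gain for measuring y = a*x + noise,
# with Gaussian prior on a (std s_prior) and Gaussian noise (std s_noise).
# Closed form: 0.5 * log(1 + (x*s_prior)^2 / s_noise^2). Numbers illustrative.
import numpy as np

def expected_information_gain(x, s_prior=1.0, s_noise=0.5):
    """Expected KL gain (nats) about slope a from one measurement at x."""
    return 0.5 * np.log(1.0 + (x * s_prior) ** 2 / s_noise**2)

candidate_settings = np.array([0.1, 0.5, 1.0, 2.0])
gains = expected_information_gain(candidate_settings)
best = candidate_settings[np.argmax(gains)]   # most informative setting
```

Batch sequential design generalizes this one-at-a-time picture by scoring whole batches of candidate experiments and updating the posterior between batches.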
This is an improvement to model validation and uncertainty quantification extending beyond the originally proposed scope of the project. We introduced a new metric for incorporating the concept of robustness into experiment-based validation of numerical models. This project has supported the graduation of two Ph.D. students (Kendra Van Buren and Josh Hegenderfer) and two M.S. students (Matthew Egeberg and Parker Shields). One of the doctoral students is now working in the nuclear engineering field, and the other is a post-doctoral fellow at Los Alamos National Laboratory. Additionally, two more Ph.D. students (Garrison Stevens and Tunc Kulaksiz), who are working towards graduation, have been supported by this project.
Overview of the Fusion Z-Pinch Experiment FuZE
NASA Astrophysics Data System (ADS)
Weber, T. R.; Shumlak, U.; Nelson, B. A.; Golingo, R. P.; Claveau, E. L.; McLean, H. S.; Tummel, K. K.; Higginson, D. P.; Schmidt, A. E.; UW/LLNL Team
2016-10-01
Previously, the ZaP device, at the University of Washington, demonstrated sheared flow stabilized (SFS) Z-pinch plasmas. Instabilities that have historically plagued Z-pinch plasma confinement were mitigated using sheared flows generated from a coaxial plasma gun of the Marshall type. Based on these results, a new SFS Z-pinch experiment, the Fusion Z-pinch Experiment (FuZE), has been constructed. FuZE is designed to investigate the scaling of SFS Z-pinch plasmas towards fusion conditions. The experiment will be supported by high fidelity physics modeling using kinetic and fluid simulations. Initial plans are in place for a pulsed fusion reactor following the results of FuZE. Notably, the design relies on proven commercial technologies, including a modest discharge current (1.5 MA) and voltage (40 kV), and liquid metal electrodes. Supported by DoE FES, NNSA, and ARPA-E ALPHA.
Shrink-film microfluidic education modules: Complete devices within minutes.
Nguyen, Diep; McLane, Jolie; Lew, Valerie; Pegan, Jonathan; Khine, Michelle
2011-06-01
As advances in microfluidics continue to make contributions to diagnostics and life sciences, broader awareness of this expanding field becomes necessary. By leveraging low-cost microfabrication techniques that require no capital equipment or infrastructure, simple, accessible, and effective educational modules can be made available for a broad range of educational needs from middle school demonstrations to college laboratory classes. These modules demonstrate key microfluidic concepts such as diffusion and separation as well as "laboratory on-chip" applications including chemical reactions and biological assays. These modules are intended to provide an interdisciplinary hands-on experience, including chip design, fabrication of functional devices, and experiments at the microscale. Consequently, students will be able to conceptualize physics at small scales, gain experience in computer-aided design and microfabrication, and perform experiments-all in the context of addressing real-world challenges by making their own lab-on-chip devices.
NASA Technical Reports Server (NTRS)
Kendall, J. S.; Stoeffler, R. C.
1972-01-01
Investigations of various phases of gaseous nuclear rocket technology have been conducted. The principal research efforts have recently been directed toward the closed-cycle, vortex-stabilized nuclear light bulb engine and toward a small-scale fissioning uranium plasma experiment that could be conducted in the Los Alamos Scientific Laboratory's Nuclear Furnace. The engine concept is based on the transfer of energy by thermal radiation from gaseous fissioning uranium, through a transparent wall, to hydrogen propellant. The reference engine configuration comprises seven unit cavities, each having its own fuel, transparent wall, and propellant duct. The basic design of the engine is described, and subsequent studies performed to supplement and investigate the basic design are reported. Summaries of other nuclear light bulb research programs are included.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Monet, Giath; Bacon, David J; Osetskiy, Yury N
2010-01-01
Given the time and length scales in molecular dynamics (MD) simulations of dislocation-defect interactions, quantitative MD results cannot be used directly in larger-scale simulations or compared directly with experiment. A method to extract fundamental quantities from MD simulations is proposed here. The first quantity is a critical stress defined to characterise the obstacle resistance. This mesoscopic parameter, rather than the obstacle 'strength' designed for a point obstacle, is to be used for an obstacle of finite size. At finite temperature, our analyses of MD simulations allow the activation energy to be determined as a function of temperature. The results confirm the proportionality between activation energy and temperature that is frequently observed in experiment. By coupling the data for the activation energy and the critical stress as functions of temperature, we show how the activation energy can be deduced at a given value of the critical stress.
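The reported proportionality between activation energy and temperature can be checked by fitting a line through the origin to (T, ΔG) pairs. A minimal sketch with synthetic values, not the MD data from this work:

```python
# Hedged sketch: test dG = c * T by a least-squares fit through the origin,
# c = sum(T*dG) / sum(T^2). The (T, dG) pairs below are synthetic.
import numpy as np

def fit_proportionality(temps, energies):
    """Slope of the best-fit line through the origin."""
    t = np.asarray(temps, float)
    g = np.asarray(energies, float)
    return float(t @ g / (t @ t))

temps = np.array([100.0, 300.0, 450.0, 600.0])   # K
energies = 0.0025 * temps                         # eV, exactly proportional here
c = fit_proportionality(temps, energies)
```

With real MD data, the residuals of this fit would quantify how closely the extracted activation energies follow the proportionality the abstract reports.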
The Multidimensional Loss Scale: validating a cross-cultural instrument for measuring loss.
Vromans, Lyn; Schweitzer, Robert D; Brough, Mark
2012-04-01
The Multidimensional Loss Scale (MLS) represents the first instrument designed specifically to index Experience of Loss Events and Loss Distress across multiple domains (cultural, social, material, and intrapersonal) relevant to refugee settlement. Recently settled Burmese adult refugees (N = 70) completed a questionnaire battery, including MLS items. Analyses explored MLS internal consistency, convergent and divergent validity, and factor structure. Cronbach alphas indicated satisfactory internal consistency for Experience of Loss Events (0.85) and Loss Distress (0.92), reflecting a unitary construct of multidimensional loss. Loss Distress did not correlate with depression or anxiety symptoms and correlated moderately with interpersonal grief and trauma symptoms, supporting divergent and convergent validity. Factor analysis provided preliminary support for a five-factor model: Loss of Symbolic Self, Loss of Interdependence, Loss of Home, Interpersonal Loss, and Loss of Intrapersonal Integrity. Received well by participants, the new scale shows promise for application in future research and practice.
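The Cronbach alpha figures quoted above follow the standard formula alpha = k/(k-1) * (1 - sum of item variances / variance of the total score). A minimal sketch with synthetic item responses, not the study data:

```python
# Hedged sketch: Cronbach's alpha for a k-item scale. Responses below are
# synthetic (one latent factor plus noise), not MLS data.
import numpy as np

def cronbach_alpha(items):
    """items: (n_respondents, k_items) array of item scores."""
    items = np.asarray(items, float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)          # per-item variances
    total_var = items.sum(axis=1).var(ddof=1)      # variance of total score
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

rng = np.random.default_rng(1)
latent = rng.normal(0, 1, (200, 1))                # shared construct
responses = latent + rng.normal(0, 0.5, (200, 6))  # 6 correlated items
alpha = cronbach_alpha(responses)
```

Alphas of 0.85 and 0.92, as reported for the two MLS subscales, indicate that the items co-vary strongly relative to their individual noise.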
DOE Office of Scientific and Technical Information (OSTI.GOV)
Guinn, I.; Buuck, M.; Cuesta, C.
The MAJORANA Collaboration will seek neutrinoless double beta decay (0νββ) in {sup 76}Ge using isotopically enriched p-type point contact (PPC) high-purity germanium (HPGe) detectors. A tonne-scale array of HPGe detectors would require background levels below 1 count/ROI-tonne-year in the 4 keV region of interest (ROI) around the 2039 keV Q-value of the decay. In order to demonstrate the feasibility of such an experiment, the MAJORANA DEMONSTRATOR, a 40 kg HPGe detector array, is being constructed with a background goal of < 3 counts/ROI-tonne-year, which is expected to scale down to < 1 count/ROI-tonne-year for a tonne-scale experiment. The signal readout electronics, which must be placed in close proximity to the detectors, present a challenge to reaching this background goal. This talk will discuss the materials and design used to construct signal readout electronics with low enough backgrounds for the MAJORANA DEMONSTRATOR.
Modeling and Analysis of Realistic Fire Scenarios in Spacecraft
NASA Technical Reports Server (NTRS)
Brooker, J. E.; Dietrich, D. L.; Gokoglu, S. A.; Urban, D. L.; Ruff, G. A.
2015-01-01
An accidental fire inside a spacecraft is an unlikely but very real emergency that can easily have dire consequences. While much has been learned over the past 25+ years of dedicated research on flame behavior in microgravity, a quantitative understanding of the initiation, spread, detection, and extinguishment of a realistic fire aboard a spacecraft is lacking. Virtually all combustion experiments in microgravity have been small-scale, by necessity (hardware limitations in ground-based facilities and safety concerns in space-based facilities). Large-scale, realistic fire experiments are unlikely for the foreseeable future (unlike in terrestrial situations). Therefore, NASA will have to rely on scale modeling, extrapolation of small-scale experiments, and detailed numerical modeling to provide the data necessary for vehicle and safety system design. This paper presents the results of parallel efforts to better model the initiation, spread, detection, and extinguishment of fires aboard spacecraft. The first is a detailed numerical model using the freely available Fire Dynamics Simulator (FDS). FDS is a CFD code that numerically solves a large eddy simulation form of the Navier-Stokes equations and provides a detailed treatment of the smoke and energy transport from a fire. The simulations provide a wealth of information, but are computationally intensive and not suitable for parametric studies where a detailed treatment of the mass and energy transport is unnecessary. The second path extends a model previously documented at ICES meetings that attempted to predict the maximum survivable fire aboard a spacecraft. This one-dimensional model simplifies the heat and mass transfer as well as the toxic species production from a fire. These simplifications result in a code that is faster and more suitable for parametric studies (it has already been used to support the hatch design of the Multi-Purpose Crew Vehicle, MPCV).
Encouraging Gender Analysis in Research Practice
ERIC Educational Resources Information Center
Thien, Deborah
2009-01-01
Few resources for practical teaching or fieldwork exercises exist which address gender in geographical contexts. This paper adds to teaching and fieldwork resources by describing an experience with designing and implementing a "gender intervention" for a large-scale, multi-university, bilingual research project that brought together a group of…
Deep multi-scale convolutional neural network for hyperspectral image classification
NASA Astrophysics Data System (ADS)
Zhang, Feng-zhe; Yang, Xia
2018-04-01
In this paper, we propose a multi-scale convolutional neural network for the hyperspectral image classification task. First, in contrast with conventional convolution, we utilize multi-scale convolutions, which possess larger receptive fields, to extract the spectral features of hyperspectral images. We design a deep neural network with a multi-scale convolution layer that contains 3 different convolution kernel sizes. Second, to avoid overfitting of the deep neural network, dropout is utilized, which randomly deactivates neurons and improves classification accuracy. In addition, deep-learning techniques such as the ReLU activation are also utilized. We conduct experiments on the University of Pavia and Salinas datasets and obtain better classification accuracy than other methods.
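The multi-scale layer described in the abstract can be sketched in a few lines. The following is a toy NumPy illustration, not the authors' implementation: the kernel sizes (3, 5, 7), the single random stand-in filter per scale, and all names are assumptions for illustration only.

```python
import numpy as np

def conv1d_same(x, kernel):
    """1-D 'same' convolution of a spectral vector with one kernel."""
    pad = len(kernel) // 2
    xp = np.pad(x, pad, mode="edge")
    return np.array([xp[i:i + len(kernel)] @ kernel for i in range(len(x))])

def multi_scale_features(spectrum, kernel_sizes=(3, 5, 7), rng=None):
    """Concatenate ReLU feature maps produced by several kernel sizes."""
    if rng is None:
        rng = np.random.default_rng(0)
    maps = []
    for k in kernel_sizes:
        kernel = rng.standard_normal(k) / k  # random filter stands in for learned weights
        maps.append(np.maximum(conv1d_same(spectrum, kernel), 0.0))  # ReLU
    return np.concatenate(maps)

spectrum = np.linspace(0.0, 1.0, 16)  # toy per-pixel spectral signature
features = multi_scale_features(spectrum)
print(features.shape)  # one feature map per kernel size: (48,)
```

Each kernel size sees a different neighborhood of the spectrum, which is the sense in which the larger kernels have larger receptive fields; a trained network would learn many such filters per scale.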
JPRS Report, Science & Technology, China: Energy
1988-06-29
capacity. There are currently two types of HTGR reactor designs: the particle-bed core, which uses spherical fuel elements, and the rod-type core, in... and trial operating experience with the HTGR reactor. Its main design features are as follows: 1. A particle-bed core, continuous fueling and... Favorable for Development of Small-Scale HTGR (Xu Jiming; HE DONGLI GONGCHENG, Feb 88). ERRATUM: In JPRS-CEN-88-003 of 25 April 1988 in article
Problems in the design of multifunction meteor-radar networks
NASA Astrophysics Data System (ADS)
Nechitailenko, V. A.; Voloshchuk, Iu. I.
The design of meteor-radar networks is examined in connection with the need to conduct experiments on a mass scale in meteor geophysics and astronomy. Attention is given to network architecture features and procedures of communication-path selection in the organization of information transfer, with allowance for the features of the meteor communication link. The meteor link is considered as the main means to ensure traffic in the meteor-radar network.
Multidisciplinary Design, Analysis, and Optimization Tool Development using a Genetic Algorithm
NASA Technical Reports Server (NTRS)
Pak, Chan-gi; Li, Wesley
2008-01-01
Multidisciplinary design, analysis, and optimization using a genetic algorithm is being developed at the National Aeronautics and Space Administration Dryden Flight Research Center to automate the analysis and design process by leveraging existing tools such as NASTRAN, ZAERO, and CFD codes to enable true multidisciplinary optimization in the preliminary design stage of subsonic, transonic, supersonic, and hypersonic aircraft. This is a promising technology, but it faces many challenges in large-scale, real-world application. This paper describes current approaches, recent results, and challenges for MDAO as demonstrated by our experience with the Ikhana fire pod design.
Tai, Mitchell; Ly, Amanda; Leung, Inne; Nayar, Gautam
2015-01-01
The burgeoning pipeline for new biologic drugs has increased the need for high-throughput process characterization to efficiently use process development resources. Breakthroughs in highly automated and parallelized upstream process development have led to technologies such as the 250-mL automated mini bioreactor (ambr250™) system. Furthermore, developments in modern design of experiments (DoE) have promoted the use of definitive screening design (DSD) as an efficient method to combine factor screening and characterization. Here we utilize the 24-bioreactor ambr250™ system with 10-factor DSD to demonstrate a systematic experimental workflow to efficiently characterize an Escherichia coli (E. coli) fermentation process for recombinant protein production. The generated process model is further validated by laboratory-scale experiments and shows how the strategy is useful for quality by design (QbD) approaches to control strategies for late-stage characterization. © 2015 American Institute of Chemical Engineers.
Joint U.S./Japan Conference on Adaptive Structures, 1st, Maui, HI, Nov. 13-15, 1990, Proceedings
NASA Technical Reports Server (NTRS)
Wada, Ben K. (Editor); Fanson, James L. (Editor); Miura, Koryo (Editor)
1991-01-01
The present volume of adaptive structures discusses the development of control laws for an orbiting tethered antenna/reflector system test scale model, the sizing of active piezoelectric struts for vibration suppression on a space-based interferometer, the control design of a space station mobile transporter with multiple constraints, and optimum configuration control of an intelligent truss structure. Attention is given to the formulation of full state feedback for infinite order structural systems, robustness issues in the design of smart structures, passive piezoelectric vibration damping, shape control experiments with a functional model for large optical reflectors, and a mathematical basis for the design optimization of adaptive trusses in precision control. Topics addressed include approaches to the optimal adaptive geometries of intelligent truss structures, the design of an automated manufacturing system for tubular smart structures, the Sandia structural control experiments, and the zero-gravity dynamics of space structures in parabolic aircraft flight.
Cummins, S.; Petticrew, M.; Higgins, C.; Findlay, A.; Sparks, L.
2005-01-01
Design: Prospective quasi-experimental design comparing baseline and follow up data in an "intervention" community with a matched "comparison" community in Glasgow, UK. Participants: 412 men and women aged 16 or over for whom follow up data on fruit and vegetable consumption and GHQ-12 were available. Main outcome measures: Fruit and vegetable consumption in portions per day, poor self reported health, and poor psychological health (GHQ-12). Main results: Adjusting for age, sex, educational attainment, and employment status there was no population impact on daily fruit and vegetable consumption, self reported, and psychological health. There was some evidence for a net reduction in the prevalence of poor psychological health for residents who directly engaged with the intervention. Conclusions: Government policy has advocated using large scale food retailing as a social intervention to improve diet and health in poor communities. In contrast with a previous uncontrolled study this study did not find evidence for a net intervention effect on fruit and vegetable consumption, although there was evidence for an improvement in psychological health for those who directly engaged with the intervention. Although definitive conclusions about the effect of large scale retailing on diet and health in deprived communities cannot be drawn from non-randomised controlled study designs, evaluations of the impacts of natural experiments may offer the best opportunity to generate evidence about the health impacts of retail interventions in poor communities. PMID:16286490
Biogasification products of water hyacinth wastewater reclamation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chynoweth, D.P.; Biljetina, R.; Srivastava, V.J.
1984-01-01
This paper describes the results of research in progress to evaluate the use of water hyacinth for wastewater treatment and subsequent conversion of hyacinth and sludge to methane by anaerobic digestion. Laboratory studies have been directed toward evaluating advanced biogasification concepts and establishing a data base for the design and operation of an experimental test unit (ETU) located at the water hyacinth wastewater treatment facility at Walt Disney World (WDW) in Kissimmee, Florida. Laboratory-scale kinetic experiments have been conducted using continuously-stirred tank reactors (CSTR) and a novel non-mixed upflow solids reactor (USR) receiving a hyacinth/sludge blend at retention times of 15 down to 2.1 days. The data suggest that best performance is achieved in the USR, which has longer solids and organism retention. A larger-scale ETU (160 cu ft) was designed and installed at WDW in 1983 and started up in 1984. The purpose of this unit is to validate laboratory experiments and to evaluate larger-scale equipment used for chopping, slurry preparation, mixing, and effluent dewatering. The ETU includes a front end designed for multiple feed processing and storage, a fully instrumented USR digester, and tanks for effluent and gas storage. The ETU is currently being operated on a 2:1 blend (dry wt basis) of water hyacinth and primary sludge. Performance is good without major operational problems. Results of laboratory studies and start-up and operation of the ETU will be presented. 7 references, 4 figures, 1 table.
Preliminary plan for a Shuttle Coherent Atmospheric Lidar Experiment (SCALE)
NASA Technical Reports Server (NTRS)
Fitzjarrald, D.; Beranek, R.; Bilbro, J.; Mabry, J.
1985-01-01
A study has been completed to define a Shuttle experiment that solves the most crucial scientific and engineering problems involved in building a satellite Doppler wind profiler for making global wind measurements. The study includes: (1) a laser study to determine the feasibility of using the existing NOAA Windvan laser in the Space Shuttle spacecraft; (2) a preliminary optics and telescope design; (3) an accommodations study including power, weight, thermal, and control system requirements; and (4) a flight trajectory and operations plan designed to accomplish the required scientific and engineering goals. The experiment will provide much-needed data on the global distribution of atmospheric aerosols and demonstrate the technique of making wind measurements from space, including scanning the laser beam and interpreting the data. Engineering accomplishments will include space qualification of the laser, development of signal processing and lag angle compensation hardware and software, and telescope and optics design. All of the results of this limited Spacelab experiment will be directly applicable to a complete satellite wind profiler for the Earth Observation System/Space Station or other free-flying satellite.
Chemical disinfection of combined sewer overflow waters using performic acid or peracetic acids.
Chhetri, Ravi Kumar; Thornberg, Dines; Berner, Jesper; Gramstad, Robin; Öjstedt, Ulrik; Sharma, Anitha Kumari; Andersen, Henrik Rasmus
2014-08-15
We investigated the possibility of applying performic acid (PFA) and peracetic acid (PAA) for disinfection of combined sewer overflow (CSO) in existing CSO management infrastructures. The disinfection power of PFA and PAA towards Escherichia coli (E. coli) and Enterococcus was studied in batch-scale and pre-field experiments. In the batch-scale experiment, 2.5 mg L(-1) PAA removed approximately 4 log units of E. coli and Enterococcus from CSO with a 360 min contact time. The removal of E. coli and Enterococcus from CSO was always around or above 3 log units using 2-4 mg L(-1) PFA with a 20 min contact time in both batch-scale and pre-field experiments. There was no toxicological effect measured by Vibrio fischeri when CSO was disinfected with PFA; a slight toxic effect was observed on CSO disinfected with PAA. When the design for PFA-based disinfection was applied to CSO collected from an authentic event, the disinfection efficiencies were confirmed and degradation rates were slightly higher than predicted in simulated CSO. Copyright © 2014 Elsevier B.V. All rights reserved.
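The "log units" of removal quoted in the abstract are base-10 ratios of organism counts before and after treatment. A minimal sketch of the calculation, using hypothetical counts rather than data from the study:

```python
import math

def log_removal(count_before, count_after):
    """Disinfection efficiency in log10 units, e.g. 4 logs = 99.99% removal."""
    return math.log10(count_before / count_after)

# hypothetical E. coli counts (CFU per 100 mL) before and after dosing
print(log_removal(1_000_000, 100))  # 4.0, i.e. a '4 log unit' reduction
```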
Lamellae spatial distribution modulates fracture behavior and toughness of African pangolin scales
Chon, Michael J.; Daly, Matthew; Wang, Bin; ...
2017-06-10
Pangolin scales form a durable armor whose hierarchical structure offers an avenue towards high performance bio-inspired materials design. In this paper, the fracture resistance of African pangolin scales is examined using single edge crack three-point bend fracture testing in order to understand toughening mechanisms arising from the structures of natural mammalian armors. In these mechanical tests, the influence of material orientation and hydration level are examined. The fracture experiments reveal an exceptional fracture resistance due to crack deflection induced by the internal spatial orientation of lamellae. An order of magnitude increase in the measured fracture resistance due to scale hydration, reaching up to ~25 kJ/m², was measured. Post-mortem analysis of the fracture samples was performed using a combination of optical and electron microscopy, and X-ray computerized tomography. Interestingly, the crack profile morphologies are observed to follow paths outlined by the keratinous lamellae structure of the pangolin scale. Most notably, the inherent structure of pangolin scales offers a pathway for crack deflection and fracture toughening. Finally, the results of this study are expected to be useful as design principles for high performance biomimetic applications.
Lamellae spatial distribution modulates fracture behavior and toughness of African pangolin scales.
Chon, Michael J; Daly, Matthew; Wang, Bin; Xiao, Xianghui; Zaheri, Alireza; Meyers, Marc A; Espinosa, Horacio D
2017-12-01
Pangolin scales form a durable armor whose hierarchical structure offers an avenue towards high performance bio-inspired materials design. In this study, the fracture resistance of African pangolin scales is examined using single edge crack three-point bend fracture testing in order to understand toughening mechanisms arising from the structures of natural mammalian armors. In these mechanical tests, the influence of material orientation and hydration level are examined. The fracture experiments reveal an exceptional fracture resistance due to crack deflection induced by the internal spatial orientation of lamellae. An order of magnitude increase in the measured fracture resistance due to scale hydration, reaching up to ~ 25kJ/m 2 was measured. Post-mortem analysis of the fracture samples was performed using a combination of optical and electron microscopy, and X-ray computerized tomography. Interestingly, the crack profile morphologies are observed to follow paths outlined by the keratinous lamellae structure of the pangolin scale. Most notably, the inherent structure of pangolin scales offers a pathway for crack deflection and fracture toughening. The results of this study are expected to be useful as design principles for high performance biomimetic applications. Copyright © 2017 Elsevier Ltd. All rights reserved.
A study of rotor and platform design trade-offs for large-scale floating vertical axis wind turbines
NASA Astrophysics Data System (ADS)
Griffith, D. Todd; Paquette, Joshua; Barone, Matthew; Goupee, Andrew J.; Fowler, Matthew J.; Bull, Diana; Owens, Brian
2016-09-01
Vertical axis wind turbines are receiving significant attention for offshore siting. In general, offshore wind offers proximity to large population centers, a vast and more consistent wind resource, and a scale-up opportunity, to name a few beneficial characteristics. On the other hand, offshore wind suffers from high levelized cost of energy (LCOE) and in particular high balance of system (BoS) costs owing to accessibility challenges and limited project experience. To address these challenges associated with offshore wind, Sandia National Laboratories is researching large-scale (MW class) offshore floating vertical axis wind turbines (VAWTs). The motivation for this work is that floating VAWTs are a potential transformative technology solution to reduce offshore wind LCOE in deep-water locations. This paper explores performance and cost trade-offs within the design space for floating VAWTs between the configurations for the rotor and platform.
Modeling of impulsive propellant reorientation
NASA Technical Reports Server (NTRS)
Hochstein, John I.; Patag, Alfredo E.; Chato, David J.
1988-01-01
The impulsive propellant reorientation process is modeled using the Energy Calculations for Liquid Propellants in a Space Environment (ECLIPSE) code. A brief description of the process and the computational model is presented. Code validation is documented via comparison to experimentally derived data for small-scale tanks. Predictions of reorientation performance are presented for two tanks designed for use in flight experiments and for a proposed full-scale OTV tank. A new dimensionless parameter is developed to correlate reorientation performance in geometrically similar tanks. Its success is demonstrated.
Brepols, Ch; Schäfer, H; Engelhardt, N
2010-01-01
Based on the practical experience in design and operation of three full-scale membrane bioreactors (MBR) for municipal wastewater treatment that were commissioned since 1999, an overview on the different design concepts that were applied to the three MBR plants is given. The investment costs and the energy consumption of the MBRs and conventional activated sludge (CAS) plants (with and without tertiary treatment) in the Erft river region are compared. It is found that the specific investment costs of the MBR plants are lower than those of comparable CAS with tertiary treatment. A comparison of the specific energy demand of MBRs and conventional WWTPs is given. The structure of the MBRs actual operational costs is analysed. It can be seen that energy consumption is only responsible for one quarter to one third of all operational expenses. Based on a rough design and empirical cost data, a cost comparison of a full-scale MBR and a CAS is carried out. In this example the CAS employs a sand filtration and a disinfection in order to achieve comparable effluent quality. The influence of membrane lifetime on life cycle cost is assessed.
Design of a shape-memory alloy actuated macro-scale morphing aircraft mechanism
NASA Astrophysics Data System (ADS)
Manzo, Justin; Garcia, Ephrahim; Wickenheiser, Adam; Horner, Garnett C.
2005-05-01
As more alternative, lightweight actuators have become available, the conventional fixed-wing configuration seen on modern aircraft is under investigation for efficiency on a broad scale. If an aircraft could be designed with multiple functional equilibria of drastically varying aerodynamic parameters, one craft capable of 'morphing' its shape could be used to replace two or three designed with particular intentions. One proposed shape for large-scale (geometry change on the same order of magnitude as wingspan) morphing is the Hyper-Elliptical Cambered Span (HECS) wing, designed at NASA Langley to be implemented on an unmanned aerial vehicle (UAV). Proposed mechanisms to accomplish the spanwise curvature (in the y-z plane of the craft) that allow near-continuous bending of the wing are narrowed to a tendon-based DC motor actuated system, and a shape memory alloy-based (SMA) mechanism. At Cornell, simulations and wind tunnel experiments assess the validity of the HECS wing as a potential shape for a blended-wing body craft with the potential to effectively serve the needs of two conventional UAVs, and analyze the energetics of actuation associated with a morphing maneuver accomplished with both a DC motor and SMA wire.
Spatial reorientation experiments for NMR of solids and partially oriented liquids.
Martin, Rachel W; Kelly, John E; Collier, Kelsey A
2015-11-01
Motional reorientation experiments are extensions of Magic Angle Spinning (MAS) where the rotor axis is changed in order to average out, reintroduce, or scale anisotropic interactions (e.g. dipolar couplings, quadrupolar interactions or chemical shift anisotropies). This review focuses on Variable Angle Spinning (VAS), Switched Angle Spinning (SAS), and Dynamic Angle Spinning (DAS), all of which involve spinning at two or more different angles sequentially, either in successive experiments or during a multidimensional experiment. In all of these experiments, anisotropic terms in the Hamiltonian are scaled by changing the orientation of the spinning sample relative to the static magnetic field. These experiments vary in experimental complexity and instrumentation requirements. In VAS, many one-dimensional spectra are collected as a function of spinning angle. In SAS, dipolar couplings and/or chemical shift anisotropies are reintroduced by switching the sample between two different angles, often 0° or 90° and the magic angle, yielding a two-dimensional isotropic-anisotropic correlation spectrum. Dynamic Angle Spinning (DAS) is a related experiment that is used to simultaneously average out the first- and second-order quadrupolar interactions, which cannot be accomplished by spinning at any unique rotor angle in physical space. Although motional reorientation experiments generally require specialized instrumentation and data analysis schemes, some are accessible with only minor modification of standard MAS probes. In this review, the mechanics of each type of experiment are described, with representative examples. Current and historical probe and coil designs are discussed from the standpoint of how each one accomplishes the particular objectives of the experiment(s) it was designed to perform. Finally, applications to inorganic materials and liquid crystals, which present very different experimental challenges, are discussed. 
The review concludes with perspectives on how motional reorientation experiments can be applied to current problems in chemistry, molecular biology, and materials science, given the many advances in high-field NMR magnets, fast spinning, and sample preparation realized in recent years. Copyright © 2015 Elsevier B.V. All rights reserved.
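The angle-dependent scaling these spinning experiments exploit is the second Legendre polynomial: spinning the sample at angle θ to the static field scales first-order anisotropic terms by P2(cos θ) = (3cos²θ - 1)/2, which vanishes at the magic angle (about 54.74°). A quick numerical check of this standard relation:

```python
import math

def p2_scaling(theta_deg):
    """Scaling of first-order anisotropic interactions when spinning at theta."""
    c = math.cos(math.radians(theta_deg))
    return (3.0 * c * c - 1.0) / 2.0

magic = math.degrees(math.acos(1.0 / math.sqrt(3.0)))  # ~54.7356 degrees
print(round(p2_scaling(0.0), 3))          # 1.0  : full anisotropy retained
print(round(p2_scaling(90.0), 3))         # -0.5 : scaled and sign-inverted
print(round(abs(p2_scaling(magic)), 12))  # 0.0  : averaged out at the magic angle
```

The 0° or 90° versus magic-angle switching in SAS follows directly from these values, while second-order quadrupolar terms involve a fourth-rank polynomial with different zeros, which is why DAS must visit two distinct angles.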
Phase I Development of Neutral Beam Injector Solid-State Power System
NASA Astrophysics Data System (ADS)
Prager, James; Ziemba, Timothy; Miller, Kenneth E.; Slobodov, Ilia; Anderson, Seth
2017-10-01
Neutral beam injection (NBI) is an important tool for plasma heating, current drive and a diagnostic at fusion science experiments around the United States, including tokamaks, validation platform experiments, and privately funded fusion concepts. Currently, there are no vendors in the United States for NBI power systems. Eagle Harbor Technologies (EHT), Inc. is developing a new power system for NBI that takes advantage of the latest developments in solid-state switching. EHT has developed a resonant converter that can be scaled to the power levels required for NBI at small-scale validation platform experiments like the Lithium Tokamak Experiment. This power system can be used to modulate the NBI voltages over the course of a plasma shot, which can lead to improved control over the plasma. EHT will present initial modeling used to design this system as well as experimental data showing operation at 15 kV and 40 A for 10 ms into a test load. With support of DOE SBIR.
PandaX-III neutrinoless double beta decay experiment
NASA Astrophysics Data System (ADS)
Wang, Shaobo; PandaX-III Collaboration
2017-09-01
The PandaX-III experiment uses high pressure Time Projection Chambers (TPCs) to search for neutrinoless double-beta decay of Xe-136 with high energy resolution and sensitivity at the China Jin-Ping underground Laboratory II (CJPL-II). Fine-pitch Microbulk Micromegas will be used for charge amplification and readout in order to reconstruct both the energy and track of the neutrinoless double-beta decay event. In the first phase of the experiment, the detector, which contains 200 kg of 90% Xe-136 enriched gas operated at 10 bar, will be immersed in a large water tank to ensure 5 m of water shielding. For the second phase, a ton-scale experiment with multiple TPCs will be constructed to improve the detection probability and sensitivity. A 20-kg scale prototype TPC with 7 Micromegas modules has been built to optimize the design of Micromegas readout module, study the energy calibration of TPC and develop algorithm of 3D track reconstruction.
NASA Technical Reports Server (NTRS)
Aguilar, Jerry L.
1989-01-01
The technical requirements for a shuttle-attached Moving Belt Radiator (MBR) experiment are defined. The MBR is an advanced radiator concept in which a rotating belt radiates thermal energy to space. The requirements for integrating the MBR experiment in the shuttle bay are discussed. Requirements for the belt material and working fluid are outlined along with some possible options. The proposed size and relationship to a full scale Moving Belt Radiator are defined. The experiment is defined with the primary goal of dynamic testing and a secondary goal of demonstrating the sealing and heat transfer characteristics. A perturbation system which will simulate a docking maneuver or other type of short term acceleration is proposed for inclusion in the experimental apparatus. A deployment and retraction capability which will aid in evaluating the dynamics of a belt during such a maneuver is also described. The proposed test sequence for the experiment is presented. Details of the conceptual design are not presented herein, but rather in a separate Final Report.
Criticality as a Set-Point for Adaptive Behavior in Neuromorphic Hardware
Srinivasa, Narayan; Stepp, Nigel D.; Cruz-Albrecht, Jose
2015-01-01
Neuromorphic hardware is designed by drawing inspiration from biology to overcome limitations of current computer architectures while forging the development of a new class of autonomous systems that can exhibit adaptive behaviors. Several designs in the recent past are capable of emulating large-scale networks but avoid complexity in network dynamics by minimizing the number of dynamic variables that are supported and tunable in hardware. We believe that this is due to the lack of a clear understanding of how to design self-tuning complex systems. It has been widely demonstrated that criticality appears to be the default state of the brain and manifests in the form of spontaneous scale-invariant cascades of neural activity. Experiment, theory, and recent models have shown that neuronal networks at criticality demonstrate optimal information transfer, learning, and information processing capabilities that affect behavior. In this perspective article, we argue that understanding how large-scale neuromorphic electronics can be designed to enable emergent adaptive behavior will require an understanding of how networks emulated by such hardware can self-tune local parameters to maintain criticality as a set-point. We believe that such capability will enable the design of truly scalable intelligent systems using neuromorphic hardware that embrace complexity in network dynamics rather than avoiding it. PMID:26648839
Improving Design Efficiency for Large-Scale Heterogeneous Circuits
NASA Astrophysics Data System (ADS)
Gregerson, Anthony
Despite increases in logic density, many Big Data applications must still be partitioned across multiple computing devices in order to meet their strict performance requirements. Among the most demanding of these applications is high-energy physics (HEP), which uses complex computing systems consisting of thousands of FPGAs and ASICs to process the sensor data created by experiments at particles accelerators such as the Large Hadron Collider (LHC). Designing such computing systems is challenging due to the scale of the systems, the exceptionally high-throughput and low-latency performance constraints that necessitate application-specific hardware implementations, the requirement that algorithms are efficiently partitioned across many devices, and the possible need to update the implemented algorithms during the lifetime of the system. In this work, we describe our research to develop flexible architectures for implementing such large-scale circuits on FPGAs. In particular, this work is motivated by (but not limited in scope to) high-energy physics algorithms for the Compact Muon Solenoid (CMS) experiment at the LHC. To make efficient use of logic resources in multi-FPGA systems, we introduce Multi-Personality Partitioning, a novel form of the graph partitioning problem, and present partitioning algorithms that can significantly improve resource utilization on heterogeneous devices while also reducing inter-chip connections. To reduce the high communication costs of Big Data applications, we also introduce Information-Aware Partitioning, a partitioning method that analyzes the data content of application-specific circuits, characterizes their entropy, and selects circuit partitions that enable efficient compression of data between chips. We employ our information-aware partitioning method to improve the performance of the hardware validation platform for evaluating new algorithms for the CMS experiment. 
Together, these research efforts help to improve the efficiency and decrease the cost of developing the large-scale, heterogeneous circuits needed to enable large-scale applications in high-energy physics and other important areas.
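The entropy characterization behind information-aware partitioning can be illustrated with a small sketch. This is a generic Shannon-entropy estimate over hypothetical signal traces, not the dissertation's algorithm: low-entropy inter-chip signals compress well, so a partitioner can prefer to place the cut across them.

```python
import math
from collections import Counter

def shannon_entropy(samples):
    """Empirical Shannon entropy (bits/symbol) of observed signal values."""
    counts = Counter(samples)
    n = len(samples)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# hypothetical traces of two candidate cut signals
mostly_zero = [0] * 14 + [1, 1]  # highly compressible
uniform = list(range(16))        # incompressible

print(round(shannon_entropy(mostly_zero), 3))  # ~0.544 bits/symbol
print(shannon_entropy(uniform))                # 4.0 bits/symbol
```

Under this view, cutting the netlist at the mostly-zero signal costs far less effective inter-chip bandwidth than cutting at the uniform one, even though both are one wire wide.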
Cold Flow Testing for Liquid Propellant Rocket Injector Scaling and Throttling
NASA Technical Reports Server (NTRS)
Kenny, Jeremy R.; Moser, Marlow D.; Hulka, James; Jones, Gregg
2006-01-01
Scaling and throttling of combustion devices are important capabilities to demonstrate in development of liquid rocket engines for NASA's Space Exploration Mission. Scaling provides the ability to design new injectors and injection elements with predictable performance on the basis of test experience with existing injectors and elements, and could be a key aspect of future development programs. Throttling is the reduction of thrust with fixed designs and is a critical requirement in lunar and other planetary landing missions. A task in the Constellation University Institutes Program (CUIP) has been designed to evaluate spray characteristics when liquid propellant rocket engine injectors are scaled and throttled. The specific objectives of the present study are to characterize injection and primary atomization using cold flow simulations of the reacting sprays. These simulations can provide relevant information because the injection and primary atomization are believed to be the spray processes least affected by the propellant reaction. Cold flow studies also provide acceptable test conditions for a university environment. Three geometric scales - 1/4-scale, 1/2-scale, and full-scale - of two different injector element types - swirl coaxial and shear coaxial - will be designed, fabricated, and tested. A literature review is currently being conducted to revisit and compile the previous scaling documentation. Because it is simple to perform, throttling will also be examined in the present work by measuring primary atomization characteristics as the mass flow rate and pressure drop of the six injector element concepts are reduced, with corresponding changes in chamber backpressure. Simulants will include water and gaseous nitrogen, and an optically accessible chamber will be used for visual and laser-based diagnostics.
The chamber will include curtain flow capability to suppress recirculation, and additional gas injection to provide independent control of the backpressure. This paper provides a short review of the appropriate literature, as well as descriptions of plans for experimental hardware, test chamber instrumentation, diagnostics, and testing.
A practical approach for the scale-up of roller compaction process.
Shi, Weixian; Sprockel, Omar L
2016-09-01
An alternative approach for the scale-up of ribbon formation during roller compaction was investigated, which required only one batch at the commercial scale to set the operational conditions. The scale-up of ribbon formation was based on a probability method. It was sufficient in describing the mechanism of ribbon formation at both scales. In this method, a statistical relationship between roller compaction parameters and ribbon attributes (thickness and density) was first defined with DoE using a pilot Alexanderwerk WP120 roller compactor. While the milling speed was included in the design, it has no practical effect on granule properties within the study range despite its statistical significance. The statistical relationship was then adapted to a commercial Alexanderwerk WP200 roller compactor with one experimental run. The experimental run served as a calibration of the statistical model parameters. The proposed transfer method was then confirmed by conducting a mapping study on the Alexanderwerk WP200 using a factorial DoE, which showed a match between the predictions and the verification experiments. The study demonstrates the applicability of the roller compaction transfer method using the statistical model from the development scale calibrated with one experiment point at the commercial scale. Copyright © 2016 Elsevier B.V. All rights reserved.
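The one-batch transfer idea above can be sketched numerically: fit a statistical model to pilot-scale DoE data, then use a single commercial-scale run to calibrate the model (here as a simple intercept offset). All factor settings, responses, and coefficients below are invented for illustration; the paper's actual model form and values are not reproduced here.

```python
import numpy as np

# Hypothetical pilot-scale DoE: ribbon density as a function of roll force
# and roll gap (illustrative numbers only).
X_pilot = np.array([[4, 2], [4, 3], [8, 2], [8, 3], [6, 2.5]], float)  # kN/cm, mm
y_pilot = np.array([0.95, 0.88, 1.12, 1.05, 1.00])                     # g/cm^3

# Least-squares fit of a linear model: density = b0 + b1*force + b2*gap.
A = np.c_[np.ones(len(X_pilot)), X_pilot]
beta = np.linalg.lstsq(A, y_pilot, rcond=None)[0]

# One calibration run on the commercial compactor: same settings, but the
# measured density differs from the pilot-scale prediction by an offset.
x_cal, y_cal = np.array([6.0, 2.5]), 1.04
offset = y_cal - (beta[0] + beta[1:] @ x_cal)

def predict_commercial(x):
    """Pilot-scale model shifted by the single-point calibration offset."""
    return beta[0] + beta[1:] @ np.asarray(x, float) + offset

# The calibrated model reproduces the commercial run exactly by construction.
assert abs(predict_commercial(x_cal) - y_cal) < 1e-9
```

A mapping DoE at the commercial scale, as in the paper, would then be used to verify that the calibrated predictions hold across the operating range, not just at the calibration point.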
Flow topologies and turbulence scales in a jet-in-cross-flow
Oefelein, Joseph C.; Ruiz, Anthony M.; Lacaze, Guilhem
2015-04-03
This study presents a detailed analysis of the flow topologies and turbulence scales in the jet-in-cross-flow experiment of [Su and Mungal JFM 2004]. The analysis is performed using the Large Eddy Simulation (LES) technique with a highly resolved grid and time-step and well controlled boundary conditions. This enables quantitative agreement with the first and second moments of turbulence statistics measured in the experiment. LES is used to perform the analysis since experimental measurements of time-resolved 3D fields are still in their infancy and because sampling periods are generally limited with direct numerical simulation. A major focal point is the comprehensive characterization of the turbulence scales and their evolution. Time-resolved probes are used with long sampling periods to obtain maps of the integral scales, Taylor microscales, and turbulent kinetic energy spectra. Scalar-fluctuation scales are also quantified. In the near-field, coherent structures are clearly identified, both in physical and spectral space. Along the jet centerline, turbulence scales grow according to a classical one-third power law. However, the derived maps of turbulence scales reveal strong inhomogeneities in the flow. From the modeling perspective, these insights are useful to design optimized grids and improve numerical predictions in similar configurations.
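The integral-scale maps described above rest on a standard estimate: integrate the autocorrelation coefficient of a time-resolved probe signal from zero lag to its first zero crossing. A minimal sketch with a synthetic, exponentially correlated signal (illustrative only, not the LES data):

```python
import numpy as np

def integral_time_scale(u, dt, max_lag=1000):
    """Integral time scale: trapezoidal integral of the autocorrelation
    coefficient from zero lag to its first zero crossing."""
    u = np.asarray(u, float) - np.mean(u)
    n = len(u)
    var = np.dot(u, u) / n
    lags = min(max_lag, n // 2)
    rho = np.array([np.dot(u[:n - k], u[k:]) / (n * var) for k in range(lags)])
    zero = np.argmax(rho <= 0.0)   # first non-positive lag (0 if none found)
    if zero == 0:
        zero = len(rho)
    # trapezoidal rule over rho[0:zero]; rho[0] is 1 by construction
    return dt * (np.sum(rho[:zero]) - 0.5 * (rho[0] + rho[zero - 1]))

# Synthetic probe signal: AR(1) noise with known correlation time tau;
# the estimator should recover approximately tau.
rng = np.random.default_rng(0)
dt, tau, n = 1e-3, 0.05, 100_000
a = np.exp(-dt / tau)
u = np.empty(n)
u[0] = 0.0
for i in range(1, n):
    u[i] = a * u[i - 1] + np.sqrt(1.0 - a * a) * rng.standard_normal()
T = integral_time_scale(u, dt)
assert 0.5 * tau < T < 2.0 * tau
```

The integral length scale follows the same recipe applied to spatial correlations (or, with a convection velocity, via Taylor's hypothesis); the zero-crossing cutoff is one common convention among several for truncating the noisy tail of the correlation.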
A Simulated Research Problem for Undergraduate Metamorphic Petrology.
ERIC Educational Resources Information Center
Amenta, Roddy V.
1984-01-01
Presents a laboratory problem in metamorphic petrology designed to simulate a research experience. The problem deals with data on scales ranging from a geologic map to hand specimens to thin sections. Student analysis includes identifying metamorphic index minerals, locating their isograds on the map, and determining the folding sequence. (BC)
The Energy-Environment Simulator as a Classroom Aid.
ERIC Educational Resources Information Center
Sell, Nancy J.; Van Koevering, Thomas E.
1981-01-01
Energy-Environment Simulators, provided by the U.S. Department of Energy, can be used to help individuals experience the effects of unbridled energy consumption for the next century on a national or worldwide scale. The simulator described is a specially designed analog computer which models the real-world energy situation. (MP)
Understanding Ocean Acidification
ERIC Educational Resources Information Center
National Oceanic and Atmospheric Administration, 2011
2011-01-01
This curriculum module is designed for students who are taking high school chemistry. Students should already have some experience with the following: (1) Understanding and reading the pH scale; (2) Knowledge of the carbon cycle; (3) Using scientific notation to express large and small values; and (4) Reading chemical equations. This curriculum…
IMPLICATIONS OF INTER-HABITAT VARIATION FOR MONITORING GREAT RIVER ECOSYSTEMS: EMAP-UMR EXPERIENCE
Great River ecosystems (GREs) are complex mosaics of habitats that vary at multiple scales. GRE monitoring designs can capture some but not all of this variation. Each discrete habitat, however defined, must either be sampled as a separate stratum or "resource population", combine...
Partial Identification of Treatment Effects: Applications to Generalizability
ERIC Educational Resources Information Center
Chan, Wendy
2016-01-01
Results from large-scale evaluation studies form the foundation of evidence-based policy. The randomized experiment is often considered the gold standard among study designs because the causal impact of a treatment or intervention can be assessed without threats of confounding from external variables. Policy-makers have become increasingly…
DEVELOPMENT OF AN AFFORDABLE FAMILY-SCALE BIOGAS GENERATOR
From laboratory experiments we calculated that our system would have to deliver 262 liters/hr of biogas to cook a meal. Biogas produced by slurries of various wastes was measured with a two liter bench-top digester system designed by the team. Gas volume was measured by displa...
Promoting Resiliency in Adolescent Girls through Adventure Programming
ERIC Educational Resources Information Center
Whittington, Anja; Aspelmeier, Jeffery E.; Budbill, Nadine W.
2016-01-01
This study examined whether participation in an adventure program increased the resiliency of adolescent girls. Eighty-seven girls who participated in Dirt Divas, a non-profit, adventure program, completed the Resiliency Scale for Children and Adolescents® before and after their experience. Means-comparison tests for within-subjects designs were…
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
Critical infrastructures around the world are at constant risk from earthquakes. Most of these critical structures were designed using archaic seismic simulation methods built on the early digital computers of the 1970s. Idaho National Laboratory's Seismic Research Group is working to modernize these simulation methods through computational research and large-scale laboratory experiments.
Influence of contouring and hardness of foot orthoses on ratings of perceived comfort.
Mills, Kathryn; Blanch, Peter; Vicenzino, Bill
2011-08-01
Comfort is a vital component of orthosis therapy. The purpose of this study was to examine what features of orthoses (design or hardness) influence the perception of comfort by using previously established footwear comfort measures: 100-mm visual analog scale (VAS) and ranking scale. Twenty subjects were consecutively allocated to two experiments consisting of five sessions of repeated measures. Comfort measures were taken from four prefabricated orthoses in each session using the VAS (experiment 1) and ranking scale (experiment 2). Subjects in experiment 1 were also asked to rate each orthosis relative to their shoe using a criterion scale. Measures were taken in both walking and jogging. A soft-flat orthosis was found to be significantly more comfortable than all contoured orthoses, including one of the same hardness, using both the VAS and ranking scale. Using the VAS, differences between the soft-flat and contoured orthoses were also found to be clinically meaningful for dimensions of overall comfort and arch cushioning (>10.2 mm). Perceived comfort of orthoses significantly differed between walking and jogging on the VAS but was not clinically meaningful. Comparisons between the VAS and criterion scale detected a VAS difference of 11.34 mm between orthoses judged as comfortable as my shoe and slightly more comfortable than my shoe. There was a VAS difference of 17.49 mm between orthoses judged as comfortable as my shoe and slightly less comfortable than my shoe. Healthy subjects prioritize contouring over hardness when judging the comfort of orthoses. Clinically meaningful changes were required to change or enhance the comfort of orthoses standardized in material type and fabrication.
NASA Astrophysics Data System (ADS)
Karpudewan, Mageswary; Ismail, Zurida; Roth, Wolff-Michael
2012-10-01
The global environmental crisis intensifies particularly in developing nations. Environmental educators have begun to understand that changing the environmental impact requires not only changes in pro-environmental knowledge and attitudes but also in associated, self-determined motivation. This study was designed to test the hypothesis that a green chemistry curriculum changes Malaysian pre-service teachers' environmental motivation. Two comparable groups of pre-service teachers participated in this study. The students in the experimental group (N = 140) did green chemistry experiments whereas the control group (N = 123) did equivalent experiments in a traditional manner. Posttest results indicate a significant difference between the two groups on the intrinsic motivation, integration, identification, and introjection scales, and no differences on the external regulation and amotivation scales. The qualitative analysis of interview data suggests that the changes are predominantly due to the personal satisfaction that participants derived from engaging in pro-environmental behavior.
NASA Astrophysics Data System (ADS)
Strawitz, Barbara M.; Malone, Mark R.
The purpose of the study was to determine whether the field experience component of an undergraduate science methods course influenced teachers' concerns and attitudes toward science and science teaching. Age, grade-point average, openmindedness, and school assignment were examined as factors which might explain some of the variance in the dependent measures. A one-group pretest-posttest design was used. Students were administered the Teacher Concerns Questionnaire, the Science Teaching Attitude Scales, and the Rokeach Dogmatism Scale approximately eight weeks after the pretest. Results indicated that field experiences did not significantly change student concerns about teaching science but significantly improved student attitudes toward science and science teaching. Students differing in age, grade-point average, and openmindedness did not differ significantly in changes in concerns and changes in attitude toward science and science teaching. Students assigned to different schools differed significantly in changes in attitude toward science.
Observation of Flat Electron Temperature Profiles in the Lithium Tokamak Experiment
Boyle, D. P.; Majeski, R.; Schmitt, J. C.; ...
2017-07-05
It has been predicted for over a decade that low-recycling plasma-facing components in fusion devices would allow high edge temperatures and flat or nearly flat temperature profiles. In recent experiments with lithium wall coatings in the Lithium Tokamak Experiment (LTX), a hot edge (>200 eV) and flat electron temperature profiles have been measured following the termination of external fueling. In this work, reduced recycling was demonstrated by retention of ~60% of the injected hydrogen in the walls following the discharge. Electron energy confinement followed typical Ohmic confinement scaling during fueling, but did not decrease with density after fueling terminated, ultimately exceeding the scaling by ~200%. Lastly, achievement of the low-recycling, hot edge regime has been an important goal of LTX and lithium plasma-facing component research in general, as it has potentially significant implications for the operation, design, and cost of fusion devices.
Simulation of German PKL refill/reflood experiment K9A using RELAP4/MOD7. [PWR
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hsu, M.T.; Davis, C.B.; Behling, S.R.
This paper describes a RELAP4/MOD7 simulation of West Germany's Kraftwerk Union (KWU) Primary Coolant Loop (PKL) refill/reflood experiment K9A. RELAP4/MOD7, a best-estimate computer program for the calculation of thermal and hydraulic phenomena in a nuclear reactor or related system, is the latest version in the RELAP4 code development series. This study was the first major simulation using RELAP4/MOD7 since its release by the Idaho National Engineering Laboratory (INEL). The PKL facility is a reduced scale (1:134) representation of a typical West German four-loop 1300 MW pressurized water reactor (PWR). A prototypical scale of the total volume to power ratio was maintained. The test facility was designed specifically for an experiment simulating the refill/reflood phase of a Loss-of-Coolant Accident (LOCA).
NASA Technical Reports Server (NTRS)
Finger, Barry W.; Strayer, Richard F.
1994-01-01
Three Intermediate-Scale Aerobic Bioreactors were designed, fabricated, and operated. They utilized mixed microbial communities to biodegrade plant residues. The continuously stirred tank reactors operated at a working volume of 8 L, and the average oxygen mass transfer coefficient, k_La, was 0.01 s^-1. Mixing time was 35 s. An experiment using inedible wheat residues, a replenishment rate of 0.125/day, and a solids loading rate of 20 gdw/day yielded a 48% reduction in biomass. Bioreactor effluent was successfully used to regenerate a wheat hydroponic nutrient solution. Over 80% of available potassium, calcium, and other minerals were recovered and recycled in the 76-day wheat growth experiment.
An industrial approach to design compelling VR and AR experience
NASA Astrophysics Data System (ADS)
Richir, Simon; Fuchs, Philippe; Lourdeaux, Domitile; Buche, Cédric; Querrec, Ronan
2013-03-01
The convergence of technologies currently observed in the fields of VR, AR, robotics and consumer electronics reinforces the trend of new applications appearing every day. But when transferring knowledge acquired from research to businesses, research laboratories are often at a loss because of a lack of knowledge of the design and integration processes involved in creating an industrial-scale product. In fact, the innovation approaches that take a good idea from the laboratory to a successful industrial product are often little known to researchers. The objective of this paper is to present the results of the work of several research teams that have finalized a working method for researchers and manufacturers, allowing them to design virtual or augmented reality systems and enable their users to enjoy "a compelling VR experience". That approach, called "the I2I method", presents 11 phases, from "Establishing technological and competitive intelligence and industrial property" to "Improvements" through the "Definition of the Behavioral Interface, Virtual Environment and Behavioral Software Assistance". As a result of the experience gained by various research teams, this design approach benefits from contributions from current VR and AR research. Our objective is to validate and continuously move such multidisciplinary design team methods forward.
Intuitive web-based experimental design for high-throughput biomedical data.
Friedrich, Andreas; Kenar, Erhan; Kohlbacher, Oliver; Nahnsen, Sven
2015-01-01
Big data bioinformatics aims at drawing biological conclusions from huge and complex biological datasets. Added value from the analysis of big data, however, is only possible if the data is accompanied by accurate metadata annotation. Particularly in high-throughput experiments, intelligent approaches are needed to keep track of the experimental design, including the conditions that are studied as well as information that might be interesting for failure analysis or further experiments in the future. In addition to the management of this information, means for an integrated design and interfaces for structured data annotation are urgently needed by researchers. Here, we propose a factor-based experimental design approach that enables scientists to easily create large-scale experiments with the help of a web-based system. We present a novel implementation of a web-based interface allowing the collection of arbitrary metadata. To exchange and edit information we provide a spreadsheet-based, human-readable format. Subsequently, sample sheets with identifiers and metainformation for data generation facilities can be created. Data files created after measurement of the samples can be uploaded to a datastore, where they are automatically linked to the previously created experimental design model.
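The factor-based design idea can be sketched as a full-factorial expansion of a factor dictionary into a sample sheet with identifiers. The factor names and levels below are invented for illustration and are not part of the described system:

```python
import itertools

def sample_sheet(factors, replicates=2):
    """Expand a dict of factors/levels into a full-factorial sample sheet."""
    rows = []
    combos = itertools.product(*factors.values())
    expanded = (c for c in combos for _ in range(replicates))
    for i, combo in enumerate(expanded):
        rows.append({"sample_id": f"S{i + 1:03d}", **dict(zip(factors, combo))})
    return rows

# Hypothetical study factors; each combination yields `replicates` rows.
factors = {
    "genotype": ["wild-type", "mutant"],
    "treatment": ["control", "drug"],
    "timepoint_h": [0, 24],
}
rows = sample_sheet(factors)
assert len(rows) == 2 * 2 * 2 * 2   # 8 combinations x 2 replicates
assert rows[0]["sample_id"] == "S001"
```

Such rows map naturally onto the spreadsheet exchange format mentioned in the abstract, with the stable identifiers serving as the link between sample sheets and uploaded measurement files.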
NASA Astrophysics Data System (ADS)
Bastian, N.; O'Connell, R.; Kendrick, R.; Goldwin, J.; Forest, C. B.
1998-11-01
A liquid metal magneto-hydrodynamic (MHD) experiment at the University of Wisconsin is being constructed in order to validate 3 key elements of MHD dynamo theory: magnetic instabilities driven by flow shear, the effects of turbulence on current generation (primarily the α and β effects), and the nature of saturation for these processes. The experiment consists of two main stages, the first of which uses water to test impeller designs that are used to generate flows capable of supporting a dynamo. Since water has nearly the same viscosity and mass density as sodium, it is the ideal substance with which to test our impeller designs. The second stage of the experiment uses a one meter diameter sphere filled with ≈ 200 gallons of liquid sodium to directly test MHD theory. Impellers will be used to impose flows on the liquid sodium that are predicted by MHD theory to lead to a growing magnetic field. In addition, large scale flows will lead to small-scale turbulence which can produce a dynamo effect and a current. This is known as the turbulent α-effect which we will attempt to observe. The MHD theory also predicts an anomalously high resistivity or magnetic diffusivity (the β-effect). Once a growing magnetic field is present it should be possible to measure the effect that the growing magnetic field has on the flow that created it.
Solenoid for Laser Induced Plasma Experiments at Janus
NASA Astrophysics Data System (ADS)
Klein, Sallee; Leferve, Heath; Kemp, Gregory; Mariscal, Derek; Rasmus, Alex; Williams, Jackson; Gillespie, Robb; Manuel, Mario; Kuranz, Carolyn; Keiter, Paul; Drake, R.
2017-10-01
Creating invariant magnetic fields for experiments involving laser induced plasmas is particularly challenging due to the high voltages at which the solenoid must be pulsed. Creating a solenoid resilient enough to survive through large numbers of voltage discharges, enabling it to endure a campaign lasting several weeks, is exceptionally difficult. Here we present a solenoid that is robust through 40 μs pulses at a 13 kV potential. This solenoid is a vast improvement over our previously fielded designs in peak magnetic field capabilities and robustness. Designed to be operated at small-scale laser facilities, the solenoid housing allows for versatility of experimental set-ups among diagnostic and target positions. Within the perpendicular field axis at the center there is 300 degrees of clearance which can be easily modified to meet the needs of a specific experiment, as well as an f/3 cone for transmitted or backscattered light. After initial design efforts, these solenoids are relatively inexpensive to manufacture.
Demonstration of coal reburning for cyclone boiler NOx control. Appendix, Book 1
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
Based on the industry need for a pilot-scale cyclone boiler simulator, Babcock & Wilcox (B&W) designed, fabricated, and installed such a facility at its Alliance Research Center (ARC) in 1985. The project involved conversion of an existing pulverized coal-fired facility to be cyclone-firing capable. Additionally, convective section tube banks were installed in the upper furnace in order to simulate a typical boiler convection pass. The small boiler simulator (SBS) is designed to simulate most fireside aspects of full-size utility boilers, such as combustion and flue gas emissions characteristics, fireside deposition, etc. Prior to the design of the pilot-scale cyclone boiler simulator, the various cyclone boiler types were reviewed in order to identify the inherent cyclone boiler design characteristics which are applicable to the majority of these boilers. The cyclone boiler characteristics that were reviewed include NOx emissions, furnace exit gas temperature (FEGT), carbon loss, and total furnace residence time. Previous pilot-scale cyclone-fired furnace experience identified the following concerns: (1) Operability of a small cyclone furnace (e.g., continuous slag tapping capability). (2) The optimum cyclone(s) configuration for the pilot-scale unit. (3) Compatibility of NOx levels, carbon burnout, cyclone ash carryover to the convection pass, cyclone temperature, furnace residence time, and FEGT.
Computational Modeling as a Design Tool in Microelectronics Manufacturing
NASA Technical Reports Server (NTRS)
Meyyappan, Meyya; Arnold, James O. (Technical Monitor)
1997-01-01
Plans to introduce pilot lines or fabs for 300 mm processing are in progress. The IC technology is simultaneously moving towards 0.25/0.18 micron. The convergence of these two trends places unprecedented stringent demands on processes and equipment. More than ever, computational modeling is called upon to play a complementary role in equipment and process design. The pace in hardware/process development needs a matching pace in software development: an aggressive move towards developing "virtual reactors" is desirable and essential to reduce design cycle and costs. This goal has three elements: reactor scale model, feature level model, and database of physical/chemical properties. With these elements coupled, the complete model should function as a design aid in a CAD environment. This talk aims to describe these various elements. At the reactor level, continuum, DSMC (or particle) and hybrid models will be discussed and compared using examples of plasma and thermal process simulations. In microtopography evolution, approaches such as level set methods compete with conventional geometric models. Regardless of the approach, the reliance on empiricism is to be eliminated through coupling to reactor model and computational surface science. This coupling poses challenging issues of orders of magnitude variation in length and time scales. Finally, database development has fallen behind; the current situation is rapidly aggravated by the ever newer chemistries emerging to meet process metrics. The virtual reactor would be a useless concept without an accompanying reliable database that consists of: thermal reaction pathways and rate constants, electron-molecule cross sections, thermochemical properties, transport properties, and finally, surface data on the interaction of radicals, atoms and ions with various surfaces.
Large scale computational chemistry efforts are critical as experiments alone cannot meet database needs due to the difficulties associated with such controlled experiments and costs.
HPG operating experience at CEM-UT
NASA Astrophysics Data System (ADS)
Gully, J. H.; Aanstoos, T. A.; Nalty, K.; Walls, W. A.
1986-11-01
Design and functional features are presented for three homopolar generators (HPG) used in experiments during the last decade at the Center for Electromechanics at the University of Texas. The first, a disk-type, 10 MJ HPG, was built in 1973 as a prototype power source for fusion experiments. A second, compact HPG was built in 1980 for opening switch experiments as part of railgun research. The third device is an iron-core, full-scale, high speed bearing and brush test facility for supplying an energy density of 60 MJ/cu m. Engineering data obtained during studies of armature reactions, actively cooled brushes, morganite-copper graphite rim brushes, and peak currents are summarized.
Xie, Yi; Mun, Sungyong; Kim, Jinhyun; Wang, Nien-Hwa Linda
2002-01-01
A tandem simulated moving bed (SMB) process for insulin purification has been proposed and validated experimentally. The mixture to be separated consists of insulin, high molecular weight proteins, and zinc chloride. A systematic approach based on the standing wave design, rate model simulations, and experiments was used to develop this multicomponent separation process. The standing wave design was applied to specify the SMB operating conditions of a lab-scale unit with 10 columns. The design was validated with rate model simulations prior to experiments. The experimental results show 99.9% purity and 99% yield, which closely agree with the model predictions and the standing wave design targets. The agreement proves that the standing wave design can ensure high purity and high yield for the tandem SMB process. Compared to a conventional batch SEC process, the tandem SMB has 10% higher yield, 400% higher throughput, and 72% lower eluant consumption. In contrast, a design that ignores the effects of mass transfer and nonideal flow cannot meet the purity requirement and gives less than 96% yield.
Operational experience of the OC-OTEC experiments at NELH
DOE Office of Scientific and Technical Information (OSTI.GOV)
Link, H
1989-02-01
The Solar Energy Research Institute, under funding and program direction from the US Department of Energy, has been operating a small-scale test apparatus to investigate key components of open-cycle ocean thermal energy conversion (OC-OTEC). The apparatus started operations in October 1987 and continues to provide valuable information on heat- and mass-transfer processes in evaporators and condensers, gas sorption processes as seawater is depressurized and repressurized, and control and instrumentation characteristics of open-cycle systems. Although other test facilities have been used to study some of these interactions, this is the largest apparatus of its kind to use seawater since Georges Claude's efforts in 1926. The information obtained from experiments conducted in this apparatus is being used to design a larger scale experiment in which a positive net power production is expected to be demonstrated for the first time with OC-OTEC. This paper describes the apparatus, the major tests conducted during its first 18 months of operation, and the experience gained in OC-OTEC system operation. 13 refs., 8 figs.
Evaluating large-scale health programmes at a district level in resource-limited countries.
Svoronos, Theodore; Mate, Kedar S
2011-11-01
Recent experience in evaluating large-scale global health programmes has highlighted the need to consider contextual differences between sites implementing the same intervention. Traditional randomized controlled trials are ill-suited for this purpose, as they are designed to identify whether an intervention works, not how, when and why it works. In this paper we review several evaluation designs that attempt to account for contextual factors that contribute to intervention effectiveness. Using these designs as a base, we propose a set of principles that may help to capture information on context. Finally, we propose a tool, called a driver diagram, traditionally used in implementation, that would allow evaluators to systematically monitor changing dynamics in project implementation and identify contextual variation across sites. We describe an implementation-related example from South Africa to underline the strengths of the tool. If used across multiple sites and multiple projects, the resulting driver diagrams could be pooled together to form a generalized theory for how, when and why a widely-used intervention works. Mechanisms similar to the driver diagram are urgently needed to complement existing evaluations of large-scale implementation efforts.
Bench-Scale Process for Low-Cost Carbon Dioxide (CO2) Capture Using a Phase-Changing Absorbent
DOE Office of Scientific and Technical Information (OSTI.GOV)
Westendorf, Tiffany; Caraher, Joel; Chen, Wei
2015-03-31
The objective of this project is to design and build a bench-scale process for a novel phase-changing aminosilicone-based CO2-capture solvent. The project will establish scalability and technical and economic feasibility of using a phase-changing CO2-capture absorbent for post-combustion capture of CO2 from coal-fired power plants with 90% capture efficiency and 95% CO2 purity at a cost of $40/tonne of CO2 captured by 2025 and a cost of <$10/tonne of CO2 captured by 2035. In the first budget period of this project, the bench-scale phase-changing CO2 capture process was designed using data and operating experience generated under a previous project (ARPA-E project DE-AR0000084). Sizing and specification of all major unit operations was completed, including detailed process and instrumentation diagrams. The system was designed to operate over a wide range of operating conditions to allow for exploration of the effect of process variables on CO2 capture performance.
ESF-X: a low-cost modular experiment computer for space flight experiments
NASA Astrophysics Data System (ADS)
Sell, Steven; Zapetis, Joseph; Littlefield, Jim; Vining, Joanne
2004-08-01
The high cost associated with spaceflight research often compels experimenters to scale back their research goals significantly purely for budgetary reasons; among experiment systems, control and data collection electronics are a major contributor to total project cost. ESF-X was developed as an architecture demonstration in response to this need: it is a highly capable, radiation-protected experiment support computer, designed to be configurable on demand to each investigator's particular experiment needs, and operational in LEO for missions lasting up to several years (e.g., ISS EXPRESS) without scheduled service or maintenance. ESF-X can accommodate up to 255 data channels (I/O, A/D, D/A, etc.), allocated per customer request, with data rates up to 40kHz. Additionally, ESF-X can be programmed using the graphical block-diagram based programming languages Simulink and MATLAB. This represents a major cost saving opportunity for future investigators, who can now obtain a customized, space-qualified experiment controller at steeply reduced cost compared to 'new' design, and without the performance compromises associated with using preexisting 'generic' systems. This paper documents the functional benchtop prototype, which utilizes a combination of COTS and space-qualified components, along with unit-gravity-specific provisions appropriate to laboratory environment evaluation of the ESF-X design concept and its physical implementation.
Development and Validation of the Elder Learning Barriers Scale Among Older Chinese Adults.
Wang, Renfeng; De Donder, Liesbeth; De Backer, Free; He, Tao; Van Regenmortel, Sofie; Li, Shihua; Lombaerts, Koen
2017-12-01
This study describes the development and validation of the Elder Learning Barriers (ELB) scale, which seeks to identify the obstacles that affect the level of educational participation of older adults. The process of item pool design and scale development is presented, as well as the testing and scale refinement procedure. The data were collected from a sample of 579 older Chinese adults (aged over 55) in the Xi'an region of China. After randomly splitting the sample for cross-validation purposes, the construct validity of the ELB scale was confirmed as containing five dimensions: dispositional, informational, physical, situational, and institutional barriers. Furthermore, developmental differences in factor structure were examined among older age groups. The results indicated that the scale demonstrated good reliability and validity. We conclude that, in general, the ELB scale appears to be a valuable instrument for examining the learning barriers that older Chinese citizens experience in participating in organized educational activities.
NASA Astrophysics Data System (ADS)
Frolov, Sergey; Garau, Bartolame; Bellingham, James
2014-08-01
Regular grid ("lawnmower") survey is a classical strategy for synoptic sampling of the ocean. Is it possible to achieve a more effective use of available resources if one takes into account a priori knowledge about variability in magnitudes of uncertainty and decorrelation scales? In this article, we develop and compare the performance of several path-planning algorithms: optimized "lawnmower," a graph-search algorithm (A*), and a fully nonlinear genetic algorithm. We use the machinery of the best linear unbiased estimator (BLUE) to quantify the ability of a vehicle fleet to synoptically map distribution of phytoplankton off the central California coast. We used satellite and in situ data to specify covariance information required by the BLUE estimator. Computational experiments showed that two types of sampling strategies are possible: a suboptimal space-filling design (produced by the "lawnmower" and the A* algorithms) and an optimal uncertainty-aware design (produced by the genetic algorithm). Unlike the space-filling designs that attempted to cover the entire survey area, the optimal design focused on revisiting areas of high uncertainty. Results of the multivehicle experiments showed that fleet performance predictors, such as cumulative speed or the weight of the fleet, predicted the performance of a homogeneous fleet well; however, these were poor predictors for comparing the performance of different platforms.
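The BLUE machinery used above to score survey designs can be illustrated with a minimal sketch. Assuming a squared-exponential prior covariance over a small grid (the kernel, length scale, noise level, and transect below are hypothetical, not those of the study), the map uncertainty left after a candidate survey is the trace of the posterior covariance, which is what a planner would minimize:

```python
import numpy as np

def blue_posterior_variance(grid, samples, length_scale=1.0, noise=0.1):
    """Total posterior variance over `grid` after observing `samples`,
    under a squared-exponential prior covariance (BLUE / Gaussian update)."""
    def k(a, b):
        d2 = ((a[:, None, :] - b[None, :, :]) ** 2).sum(-1)
        return np.exp(-0.5 * d2 / length_scale**2)
    K_ss = k(grid, grid)                       # prior covariance of the map
    K_so = k(grid, samples)                    # map-to-observation covariance
    K_oo = k(samples, samples) + noise**2 * np.eye(len(samples))
    post = K_ss - K_so @ np.linalg.solve(K_oo, K_so.T)
    return np.trace(post)

# Hypothetical 5x5 survey area and a single "lawnmower" transect through it.
grid = np.array([[x, y] for x in range(5) for y in range(5)], float)
lawnmower = np.array([[x, 2.0] for x in range(5)])
print(blue_posterior_variance(grid, lawnmower))
```

A genetic or graph-search planner would score candidate paths with exactly this objective, which is why the optimal designs in the article concentrate samples where residual uncertainty is highest rather than filling space uniformly.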
Vedam, Saraswathi; Stoll, Kathrin; Martin, Kelsey; Rubashkin, Nicholas; Partridge, Sarah; Thordarson, Dana; Jolicoeur, Ganga
2017-01-01
Shared decision making (SDM) is core to person-centered care and is associated with improved health outcomes. Despite this, there are no validated scales measuring women’s agency and ability to lead decision making during maternity care. Objective: To develop and validate a new instrument that assesses women’s autonomy and role in decision making during maternity care. Design: Through a community-based participatory research process, service users designed, content validated, and administered a cross-sectional quantitative survey, including 31 items on the experience of decision-making. Setting and participants: Pregnancy experiences (n = 2514) were reported by 1672 women who saw a single type of primary maternity care provider in British Columbia. They described care by a midwife, family physician or obstetrician during 1, 2 or 3 maternity care cycles. We conducted psychometric testing in three separate samples. Main outcome measures: We assessed reliability, item-to-total correlations, and the factor structure of the Mothers’ Autonomy in Decision Making (MADM) scale. We report MADM scores by care provider type, length of prenatal appointments, preferences for role in decision-making, and satisfaction with experience of decision-making. Results: The MADM scale measures a single construct: autonomy in decision-making during maternity care. Cronbach alphas for the scale exceeded 0.90 for all samples and all provider groups. All item-to-total correlations were replicable across three samples and exceeded 0.7. Eigenvalue and scree plots exhibited a clear 90-degree angle, and factor analysis generated a one-factor scale. MADM median scores were highest among women who were cared for by midwives, and 10 or more points lower for those who saw physicians. Increased time for prenatal appointments was associated with higher scale scores, and there were significant differences between providers with respect to average time spent in prenatal appointments.
Midwifery care was associated with higher MADM scores, even during short prenatal appointments (<15 minutes). Among women who preferred to lead decisions around their care (90.8%), and who were dissatisfied with their experience of decision making, MADM scores were very low (median 14). Women with physician carers were consistently more likely to report dissatisfaction with their involvement in decision making. Discussion: The Mothers’ Autonomy in Decision Making (MADM) scale is a reliable instrument for assessment of the experience of decision making during maternity care. This new scale was developed and content validated by community members representing various populations of childbearing women in BC, including women from vulnerable populations. MADM measures women’s ability to lead decision making, whether they are given enough time to consider their options, and whether their choices are respected. Women who experienced midwifery care reported greater autonomy than women under physician care when engaging in decision-making around maternity care options. Differences in models of care, professional education, regulatory standards, and compensation for prenatal visits between midwives and physicians likely affect the time available for these discussions and the prioritization of a shared decision making process. Conclusion: The MADM scale reflects person-driven priorities, and reliably assesses interactions with maternity providers related to a person’s ability to lead decision-making over the course of maternity care. PMID:28231285
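The reliability statistics reported for the MADM scale (Cronbach alphas above 0.90, item-to-total correlations above 0.7) can be computed with a short sketch. The four-item simulated data below are hypothetical stand-ins for real Likert responses:

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, n_items) score matrix:
    alpha = k/(k-1) * (1 - sum of item variances / variance of total score)."""
    items = np.asarray(items, float)
    k = items.shape[1]
    item_var = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_var / total_var)

def item_total_correlations(items):
    """Corrected item-to-total correlation: each item vs. the sum of the others."""
    items = np.asarray(items, float)
    return np.array([
        np.corrcoef(items[:, j], np.delete(items, j, axis=1).sum(axis=1))[0, 1]
        for j in range(items.shape[1])
    ])

# Hypothetical data: four noisy indicators of a single latent construct.
rng = np.random.default_rng(7)
trait = rng.normal(size=(200, 1))
items = trait + 0.4 * rng.normal(size=(200, 4))
print(cronbach_alpha(items), item_total_correlations(items))
```

When the items all load on one construct, as factor analysis found for MADM, alpha is high and every corrected item-total correlation is strong.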
Earthquake source properties from instrumented laboratory stick-slip
Kilgore, Brian D.; McGarr, Arthur F.; Beeler, Nicholas M.; Lockner, David A.; Thomas, Marion Y.; Mitchell, Thomas M.; Bhat, Harsha S.
2017-01-01
Stick-slip experiments were performed to determine the influence of the testing apparatus on source properties, develop methods to relate stick-slip to natural earthquakes and examine the hypothesis of McGarr [2012] that the product of stiffness, k, and slip duration, Δt, is scale-independent and the same order as for earthquakes. The experiments use the double-direct shear geometry, Sierra White granite at 2 MPa normal stress and a remote slip rate of 0.2 µm/sec. To determine apparatus effects, disc springs were added to the loading column to vary k. Duration, slip, slip rate, and stress drop decrease with increasing k, consistent with a spring-block slider model. However, neither for the data nor model is kΔt constant; this results from varying stiffness at fixed scale. In contrast, additional analysis of laboratory stick-slip studies from a range of standard testing apparatuses is consistent with McGarr's hypothesis. kΔt is scale-independent, similar to that of earthquakes, equivalent to the ratio of static stress drop to average slip velocity, and similar to the ratio of shear modulus to wavespeed of rock. These properties result from conducting experiments over a range of sample sizes, using rock samples with the same elastic properties as the Earth, and scale-independent design practices.
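The spring-block slider behaviour invoked above can be reproduced with a minimal sketch. The friction coefficients, normal stress, and mass below are hypothetical; the point is only that slip, duration, and stress drop all fall as stiffness k rises, while kΔt is not constant when k varies at fixed scale (for this model Δt = π·sqrt(m/k)):

```python
def slip_event(k, m=1.0, mu_s=0.6, mu_d=0.5, sigma_n=2.0, dt=1e-4):
    """One slip event of a single-degree-of-freedom spring-block slider.
    Slip starts when the spring force reaches the static friction threshold
    and ends when the block's velocity returns to zero.
    Returns (slip, duration, stress_drop)."""
    x, v, t = 0.0, 0.0, 0.0
    x_load = mu_s * sigma_n / k                       # load-point position at failure
    while True:
        a = (k * (x_load - x) - mu_d * sigma_n) / m   # dynamic friction resists slip
        v += a * dt                                   # semi-implicit Euler step
        if v <= 0.0:
            break
        x += v * dt
        t += dt
    return x, t, k * x                                # stress drop = spring-force reduction

for k in (1.0, 4.0):
    slip, dur, drop = slip_event(k)
    print(f"k={k}: slip={slip:.3f}  duration={dur:.3f}  k*dt={k * dur:.3f}")
```

With these numbers, quadrupling k halves the duration but doubles kΔt, mirroring the paper's observation that varying stiffness at fixed scale does not keep kΔt constant.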
Narayan, Angela J; Rivera, Luisa M; Bernstein, Rosemary E; Harris, William W; Lieberman, Alicia F
2018-04-01
This pilot study examined the psychometric properties of the Benevolent Childhood Experiences (BCEs) scale, a new instrument designed to assess positive early life experiences in adults with histories of childhood maltreatment and other adversities. A counterpart to the Adverse Childhood Experiences (ACEs) questionnaire, the BCEs was developed to be multiculturally-sensitive and applicable regardless of socioeconomic position, urban-rural background, or immigration status. Higher levels of BCEs were hypothesized to predict lower levels of psychopathology and stress beyond the effects of ACEs in a sample of ethnically diverse, low-income pregnant women. BCEs were also expected to show adequate internal validity across racial/ethnic groups and test-retest stability from the prenatal to the postnatal period. Participants were 101 pregnant women (M=29.10 years, SD=6.56, range=18-44; 37% Latina, 22% African-American, 20% White, 21% biracial/multiracial/other; 37% foreign-born, 26% Spanish-speaking) who completed the BCEs and ACEs scales; assessments of prenatal depression and post-traumatic stress disorder (PTSD) symptoms, perceived stress, and exposure to stressful life events (SLEs) during pregnancy; and demographic information. Higher levels of BCEs predicted fewer PTSD symptoms and SLEs, above and beyond ACEs. The BCEs showed excellent test-retest reliability, and mean levels were comparable across racial/ethnic and Spanish-English groups of women. Person-oriented analyses also showed that higher levels of BCEs offset the effects of ACEs on prenatal stress and psychopathology. The BCEs scale indexes promising promotive factors associated with lower trauma-related symptomatology and stress exposure during pregnancy and illuminates how favorable childhood experiences may counteract long-term effects of childhood adversity. Copyright © 2017 Elsevier Ltd. All rights reserved.
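The "above and beyond ACEs" claim corresponds to a hierarchical (incremental) regression, which can be sketched as follows; the simulated ACE/BCE counts and effect sizes below are hypothetical, not values from the study:

```python
import numpy as np

def r_squared(X, y):
    """R^2 of an ordinary least-squares fit with an intercept."""
    X1 = np.column_stack([np.ones(len(X)), X])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    resid = y - X1 @ beta
    return 1.0 - resid.var() / y.var()

# Hypothetical data: adversity raises symptoms, benevolent experiences lower them.
rng = np.random.default_rng(1)
aces = rng.poisson(2, 200).astype(float)
bces = rng.poisson(6, 200).astype(float)
symptoms = 0.5 * aces - 0.4 * bces + rng.normal(0, 1, 200)

r2_base = r_squared(aces[:, None], symptoms)                   # step 1: ACEs only
r2_full = r_squared(np.column_stack([aces, bces]), symptoms)   # step 2: add BCEs
print(r2_full - r2_base)   # incremental variance explained by BCEs
```

A nonzero increment at step 2 is the statistical content of "predicted fewer symptoms above and beyond ACEs"; the published analysis would additionally test the increment for significance.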
Small-Scale Hybrid Rocket Test Stand & Characterization of Swirl Injectors
NASA Astrophysics Data System (ADS)
Summers, Matt H.
Derived from the need to increase testing capabilities of hybrid rocket motor (HRM) propulsion systems for Daedalus Astronautics at Arizona State University, a small-scale motor and test stand were designed and developed to characterize all components of the system. The motor is designed for simple integration and setup, such that both the forward-end enclosure and end cap can be easily removed for rapid integration of components during testing. Each of the components of the motor is removable, allowing for a broad range of testing capabilities. In evaluating injectors and their potential, the goal is to obtain the highest regression rates and overall motor performance possible. The oxidizer and fuel are N2O and hydroxyl-terminated polybutadiene (HTPB), respectively, chosen for previous experience and simplicity. The injector designs, selected for the same reasons, vary only in their swirl angle. This system provides the platform for characterizing the effects of the swirl angle on HRM performance.
The Atacama B-Mode Search: CMB Polarimetry with Transition-Edge-Sensor Bolometers
NASA Astrophysics Data System (ADS)
Essinger-Hileman, T.; Appel, J. W.; Beal, J. A.; Cho, H. M.; Fowler, J.; Halpern, M.; Hasselfield, M.; Irwin, K. D.; Marriage, T. A.; Niemack, M. D.; Page, L.; Parker, L. P.; Pufu, S.; Staggs, S. T.; Stryzak, O.; Visnjic, C.; Yoon, K. W.; Zhao, Y.
2009-12-01
The Atacama B-mode Search (ABS) experiment is a 145 GHz polarimeter designed to measure the B-mode polarization of the Cosmic Microwave Background (CMB) at large angular scales. The ABS instrument will ship to the Atacama Desert of Chile fully tested and ready to observe in 2010. ABS will image large-angular-scale CMB polarization anisotropies onto a focal plane of 240 feedhorn-coupled, transition-edge sensor (TES) polarimeters, using a cryogenic crossed-Dragone design. The ABS detectors, which are fabricated at NIST, use orthomode transducers to couple orthogonal polarizations of incoming radiation onto separate TES bolometers. The incoming radiation is modulated by an ambient-temperature half-wave plate in front of the vacuum window at an aperture stop. Preliminary detector characterization indicates that the ABS detectors can achieve a sensitivity of 300 μK√s in the field. This paper describes the ABS optical design and detector readout scheme, including feedhorn design and performance, magnetic shielding, focal plane architecture, and cryogenic electronics.
The aesthetics of city-scale preservation policy in Beijing.
Abramson, Daniel Benjamin
2007-01-01
Chinese cities today represent a historically important case of the relation between city-scale preservation policy and urban design, and the role they play in the rapid transformation of urban environments. This article reviews Beijing's preservation and urban design policies as they existed in 1990, and as they evolved and responded over the following fifteen years of radical change. Beijing's master plan in the 1990s ambitiously attempted to define the preservation-worthy image of the entire old city, but did so in narrowly picturesque terms. The practice of 'protecting' designated historic structures by clearing the space around them, and the dependence on a totalizing view-from-on-high to define Beijing's overall characteristic form (as opposed to an experience of the city from its myriad public and private spaces), produced a city-wide preservation policy that was particularly handicapped in its ability to accommodate change.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mathieu, Johanna L.; Gadgil, Ashok J.; Kowolik, Kristin
2009-09-14
Researchers have invented a material called ARUBA -- Arsenic Removal Using Bottom Ash -- that effectively and affordably removes arsenic from Bangladesh groundwater. Through analysis of studies across a range of disciplines, observations, and informal interviews conducted over three trips to Bangladesh, we have applied mechanical engineering design methodology to develop eight key design strategies, which were used in the development of a low-cost, community-scale water treatment system that uses ARUBA to remove arsenic from drinking water. We have constructed, tested, and analysed a scale version of the system. Experiments have shown that the system is capable of reducing high levels of arsenic (nearly 600 ppb) to below the Bangladesh standard of 50 ppb, while remaining affordable to people living on less than US$2/day. The system could be sustainably implemented as a public-private partnership in rural Bangladesh.
Biomorphic architectures for autonomous Nanosat designs
NASA Technical Reports Server (NTRS)
Hasslacher, Brosl; Tilden, Mark W.
1995-01-01
Modern space tool design is the science of making a machine both massively complex while at the same time extremely robust and dependable. We propose a novel nonlinear control technique that produces capable, self-organizing, micron-scale space machines at low cost and in large numbers by parallel silicon assembly. Experiments using biomorphic architectures (with ideal space attributes) have produced a wide spectrum of survival-oriented machines that are reliably domesticated for work applications in specific environments. In particular, several one-chip satellite prototypes show interesting control properties that can be turned into numerous application-specific machines for autonomous, disposable space tasks. We believe that the real power of these architectures lies in their potential to self-assemble into larger, robust, loosely coupled structures. Assembly takes place at hierarchical space scales, with different attendant properties, allowing for inexpensive solutions to many daunting work tasks. The nature of biomorphic control, design, engineering options, and applications are discussed.
New designs of LMJ targets for early ignition experiments
NASA Astrophysics Data System (ADS)
C-Clérouin, C.; Bonnefille, M.; Dattolo, E.; Fremerye, P.; Galmiche, D.; Gauthier, P.; Giorla, J.; Laffite, S.; Liberatore, S.; Loiseau, P.; Malinie, G.; Masse, L.; Poggi, F.; Seytor, P.
2008-05-01
The LMJ experimental plans include the attempt of ignition and burn of an ICF capsule with 40 laser quads, delivering up to 1.4 MJ and 380 TW. New targets requiring reduced laser energy, with only a small decrease in robustness, have been designed for this purpose. A first strategy is to use scaled-down cylindrical hohlraums and capsules, taking advantage of our better understanding of the problem, based on theoretical modelling, simulations and experiments. Another strategy is to work specifically on the coupling-efficiency parameter, i.e. the ratio of the energy absorbed by the capsule to the laser energy, which, together with parametric instabilities, is a crucial drawback of indirect drive. An alternative design is proposed, made up of the nominal 60-quad capsule, named A1040, in a rugby-shaped hohlraum. Robustness evaluations of these different targets are in progress.
The Full Scale Seal Experiment - A Seal Industrial Prototype for Cigeo - 13106
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lebon, P.; Bosgiraud, J.M.; Foin, R.
2013-07-01
The Full Scale Seal (FSS) Experiment is one of various experiments implemented by Andra, within the frame of the Cigeo (the French Deep Geological Repository) Project development, to demonstrate the technical construction feasibility and performance of the seals to be constructed at the time of progressive closure of the Repository components (shafts, ramps, drifts, disposal vaults). FSS is built inside a drift model fabricated on the surface for the purpose. Prior to the scale 1:1 seal construction test, various design tasks are scheduled. They include the engineering work on the drift model to make it fit the experimental needs, on the various work sequences anticipated for the swelling clay core emplacement and the concrete containment plug construction, on the specialized handling tools (and installation equipment) manufactured and delivered for the purpose, and of course on the various swelling clay materials and low-pH (below 11) concrete formulations developed for the application. The engineering of the 'seal-as-built' commissioning means (tools and methodology) must also be dealt with. The FSS construction experiment is a technological demonstrator; it is therefore not focused on phenomenological surveying (and, by consequence, on performance and behaviour forecasting). As such, no hydration (forced or natural) is planned. However, the FSS implementation (in particular via the construction and commissioning activities carried out) is a key milestone in view of supporting phenomenological extrapolation in time and scale. The FSS experiment also allows for qualifying the commissioning methods of a real sealing system in the Repository, as built, at the time of industrial operations. (authors)
Kassab, Salah Eldin; Al-Shafei, Ahmad I; Salem, Abdel Halim; Otoom, Sameer
2015-01-01
Purpose: This study examined the relationships between the different aspects of students’ course experience, self-regulated learning, and academic achievement of medical students in a blended learning curriculum. Methods: Perceptions of medical students (n=171) from the Royal College of Surgeons in Ireland, Medical University of Bahrain (RCSI Bahrain), on the blended learning experience were measured using the Student Course Experience Questionnaire (SCEQ), with an added e-Learning scale. In addition, self-regulated learning was measured using the Motivated Strategies for Learning Questionnaire (MSLQ). Academic achievement was measured by the scores of the students at the end of the course. A path analysis was created to test the relationships between the different study variables. Results: Path analysis indicated that the perceived quality of the face-to-face component of the blended experience directly affected the motivation of students. The SCEQ scale “quality of teaching” directly affected two aspects of motivation: control of learning and intrinsic goal orientation. Furthermore, appropriate course workload directly affected the self-efficacy of students. Moreover, the e-Learning scale directly affected students’ peer learning and critical thinking but indirectly affected metacognitive regulation. The resource management regulation strategies, time and study environment, and effort regulation directly affected students’ examination scores (17% of the variance explained). However, there were no significant direct relationships between the SCEQ scales and cognitive learning strategies or examination scores. Conclusion: The results of this study will have important implications for designing blended learning courses in medical schools. PMID:25610011
Single-Vector Calibration of Wind-Tunnel Force Balances
NASA Technical Reports Server (NTRS)
Parker, P. A.; DeLoach, R.
2003-01-01
An improved method of calibrating a wind-tunnel force balance involves the use of a unique load application system integrated with formal experimental design methodology. The Single-Vector Force Balance Calibration System (SVS) overcomes the productivity and accuracy limitations of prior calibration methods. A force balance is a complex structural spring element instrumented with strain gauges for measuring three orthogonal components of aerodynamic force (normal, axial, and side force) and three orthogonal components of aerodynamic torque (rolling, pitching, and yawing moments). Force balances remain the state-of-the-art instruments that provide these measurements on a scale model of an aircraft during wind tunnel testing. Ideally, each electrical channel of the balance would respond only to its respective component of load, and it would have no response to other components of load. This is not entirely possible even though balance designs are optimized to minimize these undesirable interaction effects. Ultimately, a calibration experiment is performed to obtain the necessary data to generate a mathematical model and determine the force measurement accuracy. In order to set the independent variables of applied load for the calibration experiment, a high-precision mechanical system is required. Manual deadweight systems have been in use at Langley Research Center (LaRC) since the 1940s. These simple methodologies produce high-confidence results, but the process is mechanically complex and labor-intensive, requiring three to four weeks to complete. Over the past decade, automated balance calibration systems have been developed. In general, these systems were designed to automate the tedious manual calibration process, resulting in even more complex systems that degrade load application quality.
The current calibration approach relies on a one-factor-at-a-time (OFAT) methodology, where each independent variable is incremented individually throughout its full-scale range, while all other variables are held at a constant magnitude. This OFAT approach has been widely accepted because of its inherent simplicity and intuitive appeal to the balance engineer. LaRC has been conducting research in a "modern design of experiments" (MDOE) approach to force balance calibration. Formal experimental design techniques provide an integrated view of the entire calibration process, covering all three major aspects of an experiment: the design of the experiment, the execution of the experiment, and the statistical analysis of the data. In order to overcome the weaknesses in the available mechanical systems and to apply formal experimental techniques, a new mechanical system was required. The SVS enables the complete calibration of a six-component force balance with a series of single force vectors.
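The contrast between OFAT and combined-load calibration can be sketched with a toy linear balance model. The 3x3 sensitivity matrix, interaction magnitudes, and noise level below are hypothetical; the point is that a single least-squares fit over combined-load design points recovers both the direct sensitivities and the off-diagonal interaction terms at once:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 3-component balance: each bridge output responds mostly to its
# own load component, plus small interaction terms (off-diagonal entries).
true_C = np.array([[1.00, 0.02, 0.01],
                   [0.03, 0.95, 0.02],
                   [0.01, 0.04, 1.05]])

# Combined-load design points (all components varied together), as in MDOE.
loads = rng.uniform(-1, 1, size=(60, 3))
readings = loads @ true_C.T + rng.normal(0, 1e-3, size=(60, 3))

# One least-squares regression estimates the full calibration matrix.
C_hat, *_ = np.linalg.lstsq(loads, readings, rcond=None)
C_hat = C_hat.T
print(np.max(np.abs(C_hat - true_C)))
```

A real balance calibration adds quadratic and cross-product terms to the model, but the regression machinery is the same; the MDOE view treats the choice of the 60 load points, their run order, and this fit as one integrated experiment.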
Blaauwendraat, Conny; Levy Berg, Adrienne; Gyllensten, Amanda Lundvik
2017-07-01
The present study, with a mixed-methods design, evaluated the long-term effects of Basic Body Awareness Therapy (BBAT) for patients with posttraumatic stress disorder (PTSD). Fifteen patients received 12 individual sessions of BBAT in addition to treatment as usual (TAU) when needed. The patients were assessed at baseline (T0), directly after treatment (T1) and at one-year follow-up (T2), using the Body Awareness Scale Movement Quality and Experience (BAS MQ-E), the Visual Analog Scale (VAS), and the Impact of Event Scale-Revised (IES-R). The results at T1 showed significant improvement in the quality of movement (p = 0.001), body experience (p = 0.007), and symptoms (p = 0.001). At T2, the improvements were sustained. Pain in stillness (p = 0.017) and during movement (p = 0.007) had decreased. The verbal ability to describe body experiences in words was poor at T0, but became more detailed at T1 and even more so at T2. Our findings suggest that BBAT in addition to TAU can be a viable physiotherapeutic treatment for patients with PTSD. This knowledge may influence future treatment strategies for patients with PTSD and be of guidance to physiotherapists working with persons with trauma experiences in the community or in psychiatry/mental healthcare settings.
Design, construction, and testing of the direct absorption receiver panel research experiment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chavez, J.M.; Rush, E.E.; Matthews, C.W.
1990-01-01
A panel research experiment (PRE) was designed, built, and tested as a scaled-down model of a direct absorption receiver (DAR). The PRE is a 3-MWt DAR experiment that will allow flow testing with molten nitrate salt and provide a test bed for DAR testing with actual solar heating. In a solar central receiver system DAR, the heat absorbing fluid (a blackened molten nitrate salt) flows in a thin film down a vertical panel (rather than through tubes as in conventional receiver designs) and absorbs the concentrated solar flux directly. The ability of the flowing salt film to absorb the incident solar flux depends on the panel design, hydraulic and thermal fluid flow characteristics, and fluid blackener properties. Testing of the PRE is being conducted to demonstrate the engineering feasibility of the DAR concept. The DAR concept is being investigated because it offers numerous potential performance and economic advantages for production of electricity when compared to other solar receiver designs. The PRE utilized a 1-m wide by 6-m long absorber panel. The salt flow tests are being used to investigate component performance, panel deformations, and fluid stability. Salt flow testing has demonstrated that all the DAR components work as designed and that there are fluid stability issues that need to be addressed. Future solar testing will include steady-state and transient experiments, thermal loss measurements, responses to severe flux and temperature gradients and determination of peak flux capability, and optimized operation. In this paper, we describe the design, construction, and some preliminary flow test results of the Panel Research Experiment. 11 refs., 8 figs., 2 tabs.
Plastic scintillator detector for pulsed flux measurements
NASA Astrophysics Data System (ADS)
Kadilin, V. V.; Kaplun, A. A.; Taraskin, A. A.
2017-01-01
A neutron detector, providing charged particle detection capability, has been designed. The main purpose of the detector is to measure pulsed fluxes of both charged particles and neutrons during scientific experiments. The detector consists of commonly used neutron-sensitive ZnS(Ag) / 6LiF scintillator screens wrapping a layer of polystyrene based scintillator (BC-454, EJ-254 or equivalent boron loaded plastic). This type of detector design is able to log a spatial distribution of events and may be scaled to any size. Different variations of the design were considered and modelled in specialized toolkits. The article presents a review of the detector design features as well as simulation results.
Mirsaleh, Y R; Rezai, H; Kivi, S R; Ghorbani, R
2010-12-01
Objective: To investigate the relationship between religiosity, coping styles, self-efficacy and personality dimensions as predictors of satisfaction with clinical experience in rehabilitation interns during transition from academic study to clinical internship. Design: A cross-sectional survey design. Setting: Five rehabilitation faculties. Participants: Three hundred and eighteen undergraduate rehabilitation interns, including physical therapy, occupational therapy and speech and language pathology students. Measures: Islamic Religiosity Scale, Ways of Coping Questionnaire, General Self-efficacy Scale, NEO Five Factor Inventory, and Satisfaction with Clinical Experiences Questionnaire. Results: Religiosity, problem-focused coping and general self-efficacy had significant positive correlations with satisfaction with clinical internship in rehabilitation students. Among personality dimensions, openness, agreeableness and conscientiousness had significant positive correlations with satisfaction with clinical experience, and neuroticism had a significant negative correlation. The results of regression analysis demonstrated that religiosity and self-efficacy had important roles in the prediction of satisfaction with clinical experience in rehabilitation intern students across all three disciplines (physical therapy, occupational therapy, and speech and language pathology). Conclusion: Religiosity, problem-focused coping and general self-efficacy seem to be good predictors of satisfaction with clinical internship in rehabilitation students.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Agnes, P.; et al.
The DarkSide-50 experiment, located at the Laboratori Nazionali del Gran Sasso (INFN), is based on a low-radioactivity argon double-phase time projection chamber, surrounded by an active liquid scintillator veto, designed to achieve zero background. The liquid argon provides sufficient self-shielding and easy scalability to the multi-ton scale. The impressive reduction of the 39Ar isotope (compared to atmospheric argon), along with excellent pulse shape discrimination, makes this technology a strong candidate for the forthcoming generation of multi-ton Dark Matter experiments.
2009-02-27
exchanged by means of line-of-sight sensors that experience periodic communication dropouts due to agent motion. Variation in network topology in...respiratory, and cardiovascular function by manual control based on the clinician's experience and intuition. Open-loop control by clinical personnel can be...to appear. [29] W. M. Haddad and J. M. Bailey, "Closed-Loop Control for Intensive Care Unit Sedation," Best Prac. Res. Clinical Anaesthesiology
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stepinski, Dominique C.; Youker, Amanda J.; Krahn, Elizabeth O.
2017-03-01
Molybdenum-99 is the parent of the most widely used medical isotope, technetium-99m. Proliferation concerns have prompted development of alternative Mo-99 production methods utilizing low-enriched uranium. Alumina and titania sorbents were evaluated for separation of Mo from concentrated uranyl nitrate solutions. System, mass transfer, and isotherm parameters were determined to enable design of Mo separation processes under a wide range of conditions. A model-based approach was used to design representative commercial-scale column processes, and the designs and parameters were verified with bench-scale experiments. The results are essential for the design of Mo separation processes from irradiated uranium solutions, selection of support material, and process optimization. Mo uptake studies show that adsorption decreases with increasing concentration of uranyl nitrate; however, examination of Mo adsorption as a function of nitrate ion concentration shows no dependency, indicating that uranium competes with Mo for adsorption sites. These results are consistent with reports indicating that Mo forms inner-sphere complexes with titania and alumina surface groups.
Use of a Modern Polymerization Pilot-Plant for Undergraduate Control Projects.
ERIC Educational Resources Information Center
Mendoza-Bustos, S. A.; And Others
1991-01-01
Described is a project where students gain experience in handling large volumes of hazardous materials, process start up and shut down, equipment failures, operational variations, scaling up, equipment cleaning, and run-time scheduling while working in a modern pilot plant. Included are the system design, experimental procedures, and results. (KR)
C. -Y. Hse; P. Koch; C.W. McMillin; E.W. Price
1975-01-01
A series of experiments was conducted to develop a 1/2-inch-thick, structural, exterior, mixed-species flakeboard functionally competitive with sheathing grades of plywood. The board design settled on is composed of equal-weight portions throughout of Carya spp., Quercus alba L., Quercus falcata Michx.,
On the Hedges Correction for a "t"-Test
ERIC Educational Resources Information Center
VanHoudnos, Nathan M.; Greenhouse, Joel B.
2016-01-01
When cluster randomized experiments are analyzed as if units were independent, test statistics for treatment effects can be anticonservative. Hedges proposed a correction for such tests by scaling them to control their Type I error rate. This article generalizes the Hedges correction from a posttest-only experimental design to more common designs…
The Disabled Student Experience: Does the SERVQUAL Scale Measure Up?
ERIC Educational Resources Information Center
Vaughan, Elizabeth; Woodruffe-Burton, Helen
2011-01-01
Purpose: The purpose of this paper is to empirically test a new disabled service user-specific service quality model ARCHSECRET against a modified SERVQUAL model in the context of disabled students within higher education. Design/methodology/approach: The application of SERVQUAL in the voluntary sector had raised serious issues on its portability…
Modeling disturbance and succession in forest landscapes using LANDIS: introduction
Brian R. Sturtevant; Eric J. Gustafson; Hong S. He
2004-01-01
Modeling forest landscape change is challenging because it involves the interaction of a variety of factors and processes, such as climate, succession, disturbance, and management. These processes occur at various spatial and temporal scales, and the interactions can be complex on heterogeneous landscapes. Because controlled field experiments designed to investigate...
Turkish Prospective English Teachers' Reflections on Teaching Practice
ERIC Educational Resources Information Center
Yildiz, Mine; Geçikli, Merve; Yesilyurt, Savas
2016-01-01
This study is an attempt to present the reflections of prospective English teachers in Turkey on teaching practice, drawing on their experiences and perceptions. A mixed-method research design was conducted through the use of a questionnaire involving a 5-point Likert scale and one open-ended question. The participants were 120 senior students at ELT…
Extensive Listening in a Colombian University: Process, Product, and Perceptions
ERIC Educational Resources Information Center
Mayora, Carlos A.
2017-01-01
The current paper reports an experience implementing a small-scale narrow listening scheme (one of the varieties of extensive listening) with intermediate learners of English as a foreign language in a Colombian university. The paper presents (a) how the scheme was designed and implemented, including materials and procedures (the process); (b) how…
Li, Hua; Zhu, Jia; Flamming, James J; O'Connell, Jack; Shrader, Michael
2015-01-01
Many wastewater treatment plants in the USA, originally designed as secondary treatment systems with no or partial nitrification requirements, are facing increased flows, loads, and more stringent ammonia discharge limits. Plant expansion is often not cost-effective due to either high construction costs or lack of land. Under these circumstances, integrated fixed-film activated sludge (IFAS) systems, which use both suspended growth and biofilms that grow attached to a fixed plastic structured sheet media, have been found to be a viable solution to these challenges. Multiple plants have been retrofitted with such IFAS systems in the past few years. The system has proven efficient and reliable in achieving not only consistent nitrification, but also enhanced biochemical oxygen demand removal and improved sludge settling characteristics. This paper presents long-term practical experience with IFAS system design, operation and maintenance, and performance at three full-scale plants with distinct processes: a trickling filter/solids contact process, a conventional plug-flow activated sludge process, and an extended aeration process.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chu, T.Y.; Bentz, J.H.; Bergeron, K.D.
1994-04-01
The possibility of achieving in-vessel core retention by flooding the reactor cavity, or the "flooded cavity", is an accident management concept currently under consideration for advanced light water reactors (ALWRs), as well as for existing light water reactors (LWRs). The CYBL (CYlindrical BoiLing) facility is a facility specifically designed to perform large-scale confirmatory testing of the flooded cavity concept. CYBL has a tank-within-a-tank design; the inner 3.7 m diameter tank simulates the reactor vessel, and the outer tank simulates the reactor cavity. The energy deposition on the bottom head is simulated with an array of radiant heaters. The array can deliver a tailored heat flux distribution corresponding to that resulting from core melt convection. The present paper provides a detailed description of the capabilities of the facility, as well as results of recent experiments with heat fluxes in the range of interest for in-vessel retention in typical ALWRs. The paper concludes with a discussion of other experiments for flooded cavity applications.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Xu, Zhijie; Lai, Canhai; Marcy, Peter William
2017-05-01
A challenging problem in designing pilot-scale carbon capture systems is to predict, with uncertainty, the adsorber performance and capture efficiency under various operating conditions where no direct experimental data exist. Motivated by this challenge, we previously proposed a hierarchical framework in which relevant parameters of physical models were sequentially calibrated from different laboratory-scale carbon capture unit (C2U) experiments. Specifically, three models of increasing complexity were identified based on the fundamental physical and chemical processes of the sorbent-based carbon capture technology. Results from the corresponding laboratory experiments were used to statistically calibrate the physical model parameters while quantifying some of their inherent uncertainty. The parameter distributions obtained from laboratory-scale C2U calibration runs are used in this study to facilitate prediction at a larger scale where no corresponding experimental results are available. In this paper, we first describe the multiphase reactive flow model for a sorbent-based 1-MW carbon capture system and then analyze results from an ensemble of simulations with the upscaled model. The simulation results are used to quantify uncertainty regarding the design's predicted efficiency in carbon capture. In particular, we determine the minimum gas flow rate necessary to achieve 90% capture efficiency with 95% confidence.
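The "90% capture with 95% confidence" criterion described above can be sketched as a one-sided quantile check over an ensemble of model runs. The sketch below is illustrative only: the ensemble is synthetic, standing in for the paper's multiphase simulation outputs.

```python
import random

# For a candidate gas flow rate, evaluate the capture efficiency over an
# ensemble of calibrated parameter draws, then require that the lower
# 5th percentile of predicted efficiency still exceeds the 90% target.

def meets_target(efficiencies, target=0.90, confidence=0.95):
    """True if the (1 - confidence) empirical quantile meets the target."""
    ordered = sorted(efficiencies)
    k = int((1.0 - confidence) * len(ordered))  # conservative lower quantile
    return ordered[k] >= target

# Synthetic ensemble (hypothetical mean/spread, for illustration only):
random.seed(0)
ensemble = [random.gauss(0.93, 0.01) for _ in range(1000)]
print(meets_target(ensemble))  # prints True
```

Sweeping `meets_target` over candidate flow rates and taking the smallest rate that passes would give the minimum-flow figure the abstract refers to.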
Gong, Yang; Zhang, Jiajie
2011-04-01
In a distributed information search task, data representation and cognitive distribution jointly affect user search performance in terms of response time and accuracy. Guided by UFuRT (User, Function, Representation, Task), a human-centered framework, we proposed a search model and task taxonomy. The model is defined for application in a healthcare setting. The taxonomy clarifies the legitimate operations for each type of search task over relational data. We then developed experimental prototypes of hyperlipidemia data displays. Based on these displays, we tested search task performance through two experiments of a within-subject design with a random sample of 24 participants. The results support our hypotheses and validate the predictions of the model and task taxonomy. In this study, representation dimensions, data scales, and search task types were the main factors determining search efficiency and effectiveness. Specifically, the more external representations the interface provided, the better users' search task performance. The results also suggest that ideal search performance occurs when the question type matches its corresponding data-scale representation. The implications of the study lie in contributing to the effective design of search interfaces for relational data, especially laboratory results, which could be more effectively designed in electronic medical records.
Kim, Moonkeun; Lee, Sang-Kyun; Yang, Yil Suk; Jeong, Jaehwa; Min, Nam Ki; Kwon, Kwang-Ho
2013-12-01
We fabricated dual-beam cantilevers on the microelectromechanical system (MEMS) scale with an integrated Si proof mass. A Pb(Zr,Ti)O3 (PZT) cantilever was designed as a mechanical vibration energy-harvesting system for low-power applications. The resonant frequencies of the multilayer cantilevers were simulated using the finite element method (FEM), with parametric analysis carried out in the design process. According to the simulations, the resonant frequency, voltage, and average power of a dual-beam cantilever were 69.1 Hz, 113.9 mV, and 0.303 microW, respectively, at optimal resistance and 0.5 g (where g is the gravitational acceleration, in m/s2). Based on these data, we subsequently fabricated devices using dual-beam cantilevers. The harvested power density of the dual-beam cantilever compared favorably with the simulation. Experiments revealed the resonant frequency, voltage, and average power density to be 78.7 Hz, 118.5 mV, and 0.34 microW, respectively. The error between the measured and simulated results was about 10%. The maximum average power and power density of the fabricated dual-beam cantilever at 1 g were 0.803 microW and 1322.80 microW cm(-3), respectively. Furthermore, the possibility of a MEMS-scale power source for energy-conversion experiments was also tested.
NASA Astrophysics Data System (ADS)
Hu, S. X.; Michel, D. T.; Edgell, D. H.; Froula, D. H.; Follett, R. K.; Goncharov, V. N.; Myatt, J. F.; Skupsky, S.; Yaakobi, B.
2013-03-01
Direct-drive-ignition designs with plastic CH ablators create plasmas of long density scale lengths (Ln ≥ 500 μm) at the quarter-critical density (Nqc) region of the driving laser. The two-plasmon-decay (TPD) instability can exceed its threshold in such long-scale-length plasmas (LSPs). To investigate the scaling of TPD-induced hot electrons with laser intensity and plasma conditions, a series of planar experiments was conducted at the Omega Laser Facility with 2-ns square pulses at the maximum laser energies available on OMEGA and OMEGA EP. Radiation-hydrodynamic simulations have been performed for these LSP experiments using the two-dimensional hydrocode draco. The simulated hydrodynamic evolution of such long-scale-length plasmas has been validated with time-resolved full-aperture backscattering and Thomson-scattering measurements. draco simulations for the CH ablator indicate that (1) ignition-relevant long-scale-length plasmas with Ln approaching ~400 μm have been created; (2) the density scale length at Nqc scales as Ln(μm) ≃ RDPP × I^(1/4)/2; and (3) the electron temperature Te at Nqc scales as Te(keV) ≃ 0.95 × √I, with the incident intensity I measured in 10^14 W/cm^2, for plasmas created in both OMEGA and OMEGA EP configurations with different-sized (RDPP) distributed phase plates. These intensity scalings are in good agreement with self-similar model predictions. The measured conversion fraction of laser energy into hot electrons, fhot, shows similar behavior in both configurations: rapid growth [fhot ≃ fc × (Gc/4)^6 for Gc < 4] followed by saturation of the form fhot ≃ fc × (Gc/4)^1.2 for Gc ≥ 4, where the common-wave gain is defined as Gc = 3 × 10^(-2) × Iqc·Ln·λ0/Te, with the laser intensity contributing to the common-wave gain Iqc, Ln, Te at Nqc, and the laser wavelength λ0, respectively, measured in [10^14 W/cm^2], [μm], [keV], and [μm]. The saturation level fc is observed to be fc ≃ 10^(-2) at around Gc ≃ 4.
The hot-electron temperature scales roughly linearly with Gc. Furthermore, to mitigate the TPD instability in long-scale-length plasmas, different ablator materials such as saran and aluminum have been investigated on OMEGA EP. Hot-electron generation was reduced by a factor of 3-10 for saran and aluminum plasmas compared to the CH case at the same incident laser intensity. draco simulations suggest that saran might be a better ablator for direct-drive-ignition designs, as it balances TPD mitigation with acceptable hydro-efficiency.
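As a numerical cross-check, the empirical scalings quoted in the abstract above can be written directly in code. This is only a sketch using the quoted constants, not the authors' analysis code; units follow the abstract (intensities in 10^14 W/cm^2, Ln and λ0 in μm, Te in keV).

```python
import math

def te_keV(I14):
    """Electron temperature at quarter-critical density: Te ~ 0.95 * sqrt(I)."""
    return 0.95 * math.sqrt(I14)

def common_wave_gain(Iqc14, Ln_um, lambda0_um, Te_keV):
    """Common-wave gain Gc = 3e-2 * Iqc * Ln * lambda0 / Te."""
    return 3e-2 * Iqc14 * Ln_um * lambda0_um / Te_keV

def f_hot(Gc, fc=1e-2):
    """Hot-electron fraction: fc*(Gc/4)^6 below Gc = 4, fc*(Gc/4)^1.2 above."""
    exponent = 6 if Gc < 4 else 1.2
    return fc * (Gc / 4.0) ** exponent
```

For example, at I = 9 (i.e. 9 × 10^14 W/cm^2), `te_keV(9)` gives 0.95 × 3 = 2.85 keV.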
Design and implementation of a distributed large-scale spatial database system based on J2EE
NASA Astrophysics Data System (ADS)
Gong, Jianya; Chen, Nengcheng; Zhu, Xinyan; Zhang, Xia
2003-03-01
With the increasing maturity of distributed object technology, CORBA, .NET, and EJB are widely used in traditional IT fields. However, the theory and practice of distributed spatial databases need further improvement, owing to the tension between large-scale spatial data and limited network bandwidth, and between short-lived sessions and long transaction processing. Differences and trends among CORBA, .NET, and EJB are discussed in detail; the concept, architecture, and characteristics of a distributed large-scale seamless spatial database system based on J2EE are then presented, comprising a GIS client application, a web server, a GIS application server, and a spatial data server. The design and implementation of the GIS client application components based on JavaBeans, the GIS engine based on servlets, and the GIS application server based on GIS Enterprise JavaBeans (session beans and entity beans) are explained. In addition, experiments on the relationship between spatial data volume and response time under different conditions were conducted, showing that a distributed spatial database system based on J2EE can be used to manage, distribute, and share large-scale spatial data on the Internet. Finally, a distributed large-scale seamless image database on the Internet is presented.
Kim, Hyerin; Kang, NaNa; An, KyuHyeon; Koo, JaeHyung; Kim, Min-Soo
2016-01-01
Design of high-quality primers for multiple target sequences is essential for qPCR experiments, but is challenging because both homology tests against off-target sequences and stringent filtering constraints on the primers must be considered. Existing web servers for primer design have major drawbacks, including requiring the use of BLAST-like tools for homology tests and lacking support for ranking of primers, for TaqMan probes, and for simultaneous design of primers against multiple targets. Owing to the large-scale computational overhead, the few web servers supporting homology tests use heuristic approaches or perform homology tests within a limited scope. Here, we describe MRPrimerW, which performs complete homology testing, supports batch design of primers for multi-target qPCR experiments, supports design of TaqMan probes, and ranks the resulting primers to return the best primers to the user. To ensure high accuracy, we adopted the core algorithm of a previously reported MapReduce-based method, MRPrimer, but completely redesigned it so that users receive query results quickly in a web interface, without requiring a MapReduce cluster or a long computation. MRPrimerW provides primer design services and a complete set of 341 963 135 in silico validated primers covering 99% of human and mouse genes. Free access: http://MRPrimerW.com. PMID:27154272
Evaluation of ground motion scaling methods for analysis of structural systems
O'Donnell, A. P.; Beltsar, O.A.; Kurama, Y.C.; Kalkan, E.; Taflanidis, A.A.
2011-01-01
Ground motion selection and scaling is undoubtedly the most important component of any seismic risk assessment study that involves time-history analysis. Ironically, it is also the single parameter with the least guidance provided in current building codes, resulting in mostly subjective choices in design. The relevant research to date has focused primarily on single-degree-of-freedom systems, with only a few studies using multi-degree-of-freedom systems. Furthermore, the previous research is based solely on numerical simulations, with no experimental data available for validation of the results. By contrast, the research effort described in this paper focuses on an experimental evaluation of selected ground motion scaling methods based on small-scale shake-table experiments of re-configurable linear-elastic and nonlinear multi-story building frame structure models. Ultimately, the experimental results will lead to the development of guidelines and procedures to achieve reliable demand estimates from nonlinear response history analysis in seismic design. In this paper, an overview of this research effort is discussed and preliminary results based on linear-elastic dynamic response are presented. © ASCE 2011.
Performance of a pilot-scale constructed wetland system for treating simulated ash basin water.
Dorman, Lane; Castle, James W; Rodgers, John H
2009-05-01
A pilot-scale constructed wetland treatment system (CWTS) was designed and built to decrease the concentration and toxicity of constituents of concern in ash basin water from coal-burning power plants. The CWTS was designed to promote the following treatment processes for metals and metalloids: precipitation as non-bioavailable sulfides, co-precipitation with iron oxyhydroxides, and adsorption onto iron oxides. Concentrations of Zn, Cr, Hg, As, and Se in simulated ash basin water were reduced by the CWTS to less than USEPA-recommended water quality criteria. The removal efficiency (defined as the percent concentration decrease from influent to effluent) was dependent on the influent concentration of the constituent, while the extent of removal (defined as the concentration of a constituent of concern in the CWTS effluent) was independent of the influent concentration. Results from toxicity experiments illustrated that the CWTS eliminated influent toxicity with regard to survival and reduced influent toxicity with regard to reproduction. Reduction in potential for scale formation and biofouling was achieved through treatment of the simulated ash basin water by the pilot-scale CWTS.
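The distinction the abstract draws between "removal efficiency" and "extent of removal" can be made concrete in a couple of lines. The concentrations below are hypothetical, for illustration only.

```python
# Removal efficiency (percent concentration decrease from influent to
# effluent) depends on the influent concentration; the "extent of removal"
# is simply the constituent's concentration in the effluent itself.

def removal_efficiency(c_in, c_out):
    """Percent decrease from influent (c_in) to effluent (c_out)."""
    return 100.0 * (c_in - c_out) / c_in

# Hypothetical influent/effluent concentrations (mg/L), illustration only:
print(removal_efficiency(0.50, 0.05))  # -> 90.0
```

Two influents of 0.50 and 5.0 mg/L treated down to the same 0.05 mg/L effluent share the same extent of removal but have efficiencies of 90% and 99%, which is the dependence the abstract describes.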
Model Wind Turbines Tested at Full-Scale Similarity
NASA Astrophysics Data System (ADS)
Miller, M. A.; Kiefer, J.; Westergaard, C.; Hultmark, M.
2016-09-01
The enormous length scales associated with modern wind turbines complicate any efforts to predict their mechanical loads and performance. Both experiments and numerical simulations are constrained by the large Reynolds numbers governing the full-scale aerodynamics. The limited fundamental understanding of Reynolds number effects, in combination with the lack of empirical data, affects our ability to predict, model, and design improved turbines and wind farms. A new experimental approach is presented, which utilizes a highly pressurized wind tunnel (up to 220 bar). It allows exact matching of the Reynolds number (no matter how it is defined), tip speed ratios, and Mach numbers on a geometrically similar, small-scale model. The design of a measurement and instrumentation stack to control the turbine and measure the loads in the pressurized environment is discussed. Results are then presented in the form of power coefficients as a function of Reynolds number and tip speed ratio. Due to gearbox power loss, a preliminary study has also been completed to determine the gearbox efficiency, and the resulting correction has been applied to the data set.
Remote sensing technology research and instrumentation platform design
NASA Technical Reports Server (NTRS)
1992-01-01
An instrumented pallet concept and definition of an aircraft with performance and payload capability to meet NASA's airborne turbulent flux measurement needs for advanced multiple global climate research and field experiments is presented. The report addresses airborne measurement requirements for general circulation model sub-scale parameterization research, specifies instrumentation capable of making these measurements, and describes a preliminary support pallet design. Also, a review of aircraft types and a recommendation of a manned and an unmanned aircraft capable of meeting flux parameterization research needs is given.
Design of the NASA Robonaut Hand
NASA Technical Reports Server (NTRS)
Lovchik, Chris S.; Aldridge, H. A.; Driftler, Myron A.
1999-01-01
The design of a highly anthropomorphic human scale robot hand for space based operations is described. This five finger hand combined with its integrated wrist and forearm has fourteen independent degrees of freedom. The device approximates very well the kinematics and required strength of an astronaut's hand when operating through a pressurized space suit glove. The mechanisms used to meet these requirements are explained in detail along with the design philosophy behind them. Integration experiences reveal the challenges associated with obtaining the required capabilities within the desired size. The initial finger control strategy is presented along with examples of obtainable grasps.
Los Alamos Explosives Performance Key to Stockpile Stewardship
Dattelbaum, Dana
2018-02-14
As the U.S. nuclear deterrent ages, one essential factor in making sure that the weapons will continue to perform as designed is understanding the fundamental properties of the high explosives that are part of a nuclear weapons system. As nuclear weapons go through life extension programs, some changes may be advantageous, particularly through the addition of what are known as "insensitive" high explosives that are much less likely to accidentally detonate than the already very safe "conventional" high explosives that are used in most weapons. At Los Alamos National Laboratory, explosives research includes a wide variety of both large- and small-scale experiments, including small contained detonations, gas and powder gun firings, larger outdoor detonations, large-scale hydrodynamic tests, and, at the Nevada National Security Site, underground sub-critical experiments.
Reawakening reflective capacity in the psychotherapy of schizophrenia: a case study.
Bargenquast, Rebecca; Schweitzer, Robert D; Drake, Suzanne
2015-02-01
Disturbed sense of self has long been identified as a common experience among people suffering with schizophrenia. More recently, metacognitive deficits have been found to be a stable and independent feature of schizophrenia that contributes to disturbed self-experience and impedes recovery. Individual psychotherapy designed to target poor metacognition has been shown to promote a more coherent sense of self and enhanced recovery in people with schizophrenia. We provide a report of a 2-year individual psychotherapy with a patient suffering with chronic schizophrenia. Progress was assessed over the course of treatment using the Metacognition Assessment Scale and the Brief Psychiatric Rating Scale. The patient experienced improved metacognitive capacity and reduced symptom severity over the course of therapy. Implications for clinical practice are discussed. © 2015 Wiley Periodicals, Inc.
From Tomography to Material Properties of Thermal Protection Systems
NASA Technical Reports Server (NTRS)
Mansour, Nagi N.; Panerai, Francesco; Ferguson, Joseph C.; Borner, Arnaud; Barnhardt, Michael; Wright, Michael
2017-01-01
A NASA Ames Research Center (ARC) effort, under the Entry Systems Modeling (ESM) project, aims at developing micro-tomography (micro-CT) experiments and simulations for studying materials used in hypersonic entry systems. X-ray micro-tomography allows for non-destructive 3D imaging of a material's microstructure at the sub-micron scale, providing fiber-scale representations of porous thermal protection system (TPS) materials. The technique has also allowed for in-situ experiments that can resolve response phenomena under realistic environmental conditions such as high temperature, mechanical loads, and oxidizing atmospheres. Simulation tools have been developed at NASA Ames Research Center to determine material properties and material response from the high-fidelity tomographic representations of the porous materials, with the goal of informing macroscopic TPS response models and guiding future TPS design.
Practicing universal design to actual hand tool design process.
Lin, Kai-Chieh; Wu, Chih-Fu
2015-09-01
UD evaluation principles are difficult to implement in product design. This study proposes a methodology for implementing UD in the design process through user participation. The original UD principles and user experience were used to develop the evaluation items, and differences between product types were considered. Factor analysis and Quantification Theory Type I were used to eliminate evaluation items considered inappropriate and to examine the relationship between evaluation items and product design factors. Product design specifications were established for verification. The results showed that converting user evaluations into key design-verification factors through a generalized evaluation scale based on product attributes, and applying these design factors in product design, can improve users' UD evaluations. The design process of this study is expected to contribute to user-centered UD application. Copyright © 2015 Elsevier Ltd and The Ergonomics Society. All rights reserved.
Design-of-experiments to Reduce Life-cycle Costs in Combat Aircraft Inlets
NASA Technical Reports Server (NTRS)
Anderson, Bernhard H.; Baust, Henry D.; Agrell, Johan
2003-01-01
It is the purpose of this study to demonstrate the viability and economy of Design-of-Experiments (DOE) methods for arriving at micro-secondary flow control installation designs that achieve optimal inlet performance for different mission strategies. These statistical design concepts were used to investigate the properties of "low unit strength" micro-effector installations. "Low unit strength" micro-effectors are micro-vanes set at a very low angle of incidence, with very long chord lengths. They are designed to influence the near-wall inlet flow over an extended streamwise distance. In this study, however, the long chord lengths were replicated by a series of short-chord-length effectors arranged in series over multiple bands of effectors. To properly evaluate the performance differences between the single-band extended-chord-length installation designs and the segmented multi-band short-chord-length designs, both sets of installations must be optimal. Critical to achieving optimal micro-secondary flow control installation designs is an understanding of the factor interactions that occur between the multiple bands of micro-scale vane effectors. These factor interactions are best understood and brought together in an optimal manner through a structured DOE process, or more specifically Response Surface Methods (RSM).
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cheraghi, S. Hossein; Madden, Frank
The goal of this collaborative effort between Western New England University's College of Engineering and FloDesign Wind Turbine (FDWT) Corporation was to work on a novel aerodynamic concept that could potentially lead to the next generation of wind turbines. Analytical studies and early scale-model tests of FDWT's Mixer/Ejector Wind Turbine (MEWT) concept, which exploits jet-age advanced fluid dynamics, indicate that the concept has the potential to significantly reduce the cost of electricity relative to conventional horizontal-axis wind turbines while reducing land usage. This project involved the design, fabrication, and wind tunnel testing of components of the MEWT to provide the research and engineering data necessary to validate the design iterations and optimize system performance. Based on these tests, a scale-model prototype called Briza was designed, fabricated, installed, and tested on a portable tower to investigate and improve the design in real-world conditions. The results of these scale prototype efforts were very promising and have contributed significantly to FDWT's ongoing development of a product-scale wind turbine for deployment in multiple locations around the U.S. This research was mutually beneficial to Western New England University, FDWT, and the DOE, engaging over 30 student interns and a number of faculty in all efforts. It brought real-world wind turbine experience into the classroom to further enhance the Green Engineering Program at WNEU. It also provided on-the-job training to many students, improving their future employment opportunities, while providing valuable information to further advance FDWT's mixer-ejector wind turbine technology, creating opportunities for future project innovation and job creation.
COPS: Large-scale nonlinearly constrained optimization problems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bondarenko, A.S.; Bortz, D.M.; More, J.J.
2000-02-10
The authors have started the development of COPS, a collection of large-scale nonlinearly Constrained Optimization Problems. The primary purpose of this collection is to provide difficult test cases for optimization software. Problems in the current version of the collection come from fluid dynamics, population dynamics, optimal design, and optimal control. For each problem they provide a short description of the problem, notes on the formulation of the problem, and results of computational experiments with general optimization solvers. They currently have results for DONLP2, LANCELOT, MINOS, SNOPT, and LOQO.
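To illustrate the problem class COPS collects (not an actual COPS test case), a small, hypothetical equality-constrained problem can be solved with a quadratic-penalty method. Real COPS problems are far larger and are intended for industrial solvers such as MINOS, SNOPT, or LOQO; this sketch only shows the structure.

```python
# minimize (x-2)^2 + (y-1)^2  subject to  x + y = 2
# via a quadratic penalty: add mu * (x + y - 2)^2 to the objective and
# run plain gradient descent on the penalized function.

def solve_penalty(mu=1e4, lr=1e-5, steps=500_000):
    x, y = 0.0, 0.0
    for _ in range(steps):
        c = x + y - 2.0                          # constraint violation
        gx = 2.0 * (x - 2.0) + 2.0 * mu * c      # d/dx of penalized objective
        gy = 2.0 * (y - 1.0) + 2.0 * mu * c      # d/dy of penalized objective
        x -= lr * gx
        y -= lr * gy
    return x, y

x, y = solve_penalty()
print(round(x, 3), round(y, 3))  # -> 1.5 0.5
```

The analytic optimum is (1.5, 0.5); a large penalty weight `mu` forces the iterate onto the constraint while the quadratic objective pulls it toward (2, 1).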
Experimental characterization of vertical-axis wind turbine noise.
Pearson, C E; Graham, W R
2015-01-01
Vertical-axis wind turbines are wind-energy generators suitable for use in urban environments. Their associated noise thus needs to be characterized and understood. As a first step, this work investigates the relative importance of harmonic and broadband contributions via model-scale wind-tunnel experiments. Cross-spectra from a pair of flush-mounted wall microphones exhibit both components, but further analysis shows that the broadband dominates at frequencies corresponding to the audible range in full-scale operation. This observation has detrimental implications for noise-prediction reliability and hence also for acoustic design optimization.
Implementation of the Large-Scale Operations Management Test in the State of Washington.
1982-12-01
During FY 79, the U.S. Army Engineer Waterways Experiment Station (WES), Vicksburg, Miss., completed the first phase of its 3-year Large-Scale Operations Management Test (LSOMT). The LSOMT was designed to develop an operational plan to identify methodologies that can be implemented by the U.S. Army Engineer District, Seattle (NPS), to prevent the exotic aquatic macrophyte Eurasian watermilfoil (Myriophyllum spicatum L.) from reaching problem-level proportions in water bodies in the state of Washington. The WES developed specific plans as integral elements
NASA Technical Reports Server (NTRS)
Stutzman, Warren L.; Safaai-Jazi, A.; Pratt, Timothy; Nelson, B.; Laster, J.; Ajaz, H.
1993-01-01
Virginia Tech has performed a comprehensive propagation experiment using the Olympus satellite beacons at 12.5, 19.77, and 29.66 GHz (which we refer to as 12, 20, and 30 GHz). Four receive terminals were designed and constructed, one terminal at each frequency plus a portable one with 20 and 30 GHz receivers for microscale and scintillation studies. Total power radiometers were included in each terminal in order to set the clear air reference level for each beacon and also to predict path attenuation. More details on the equipment and the experiment design are found elsewhere. Statistical results for one year of data collection were analyzed. In addition, the following studies were performed: a microdiversity experiment in which two closely spaced 20 GHz receivers were used; a comparison of total power and Dicke switched radiometer measurements, frequency scaling of scintillations, and adaptive power control algorithm development. Statistical results are reported.
The Experience of Cognitive Intrusion of Pain: scale development and validation
Attridge, Nina; Crombez, Geert; Van Ryckeghem, Dimitri; Keogh, Edmund; Eccleston, Christopher
2015-01-01
Abstract Patients with chronic pain often report their cognition to be impaired by pain, and this observation has been supported by numerous studies measuring the effects of pain on cognitive task performance. Furthermore, cognitive intrusion by pain has been identified as one of 3 components of pain anxiety, alongside general distress and fear of pain. Although cognitive intrusion is a critical characteristic of pain, no specific measure designed to capture its effects exists. In 3 studies, we describe the initial development and validation of a new measure of pain interruption: the Experience of Cognitive Intrusion of Pain (ECIP) scale. In study 1, the ECIP scale was administered to a general population sample to assess its structure and construct validity. In study 2, the factor structure of the ECIP scale was confirmed in a large general population sample experiencing no pain, acute pain, or chronic pain. In study 3, we examined the predictive value of the ECIP scale in pain-related disability in fibromyalgia patients. The ECIP scale scores followed a normal distribution with good variance in a general population sample. The scale had high internal reliability and a clear 1-component structure. It differentiated between chronic pain and control groups, and it was a significant predictor of pain-related disability over and above pain intensity. Repairing attentional interruption from pain may become a novel target for pain management interventions, both pharmacologic and nonpharmacologic. PMID:26067388
Soneral, Paula A. G.; Wyse, Sara A.
2017-01-01
Student-centered learning environments with upside-down pedagogies (SCALE-UP) are widely implemented at institutions across the country, and learning gains from these classrooms have been well documented. This study investigates the specific design feature(s) of the SCALE-UP classroom most conducive to teaching and learning. Using pilot survey data from instructors and students to prioritize the most salient SCALE-UP classroom features, we created a low-tech “Mock-up” version of this classroom and tested the impact of these features on student learning, attitudes, and satisfaction using a quasi-experimental setup. The same instructor taught two sections of an introductory biology course in the SCALE-UP and Mock-up rooms. Although students in both sections were equivalent in terms of gender, grade point average, incoming ACT, and drop/fail/withdraw rate, the Mock-up classroom enrolled significantly more freshmen. Controlling for class standing, multiple regression modeling revealed no significant differences in exam, in-class, preclass, and Introduction to Molecular and Cellular Biology Concept Inventory scores between the SCALE-UP and Mock-up classrooms. Thematic analysis of student comments highlighted that collaboration and whiteboards enhanced the learning experience, but technology was not important. Student satisfaction and attitudes were comparable. These results suggest that the benefits of a SCALE-UP experience can be achieved at lower cost without technology features. PMID:28213582
Laser-driven magnetized liner inertial fusion
Davies, J. R.
2017-06-05
A laser-driven, magnetized liner inertial fusion (MagLIF) experiment is designed in this paper for the OMEGA Laser System by scaling down the Z point design to provide the first experimental data on MagLIF scaling. OMEGA delivers roughly 1000× less energy than Z, so target linear dimensions are reduced by factors of ~10. A magneto-inertial fusion electrical discharge system could provide an axial magnetic field of 10 T. Two-dimensional hydrocode modeling indicates that a single OMEGA beam can preheat the fuel to a mean temperature of ~200 eV, limited by mix caused by heat flow into the wall. One-dimensional magnetohydrodynamic (MHD) modeling is used to determine the pulse duration and fuel density that optimize neutron yield at a fuel convergence ratio of roughly 25 or less, matching the Z point design, for a range of shell thicknesses. A relatively thinner shell, giving a higher implosion velocity, is required to give adequate fuel heating on OMEGA compared to Z because of the increase in thermal losses in smaller targets. Two-dimensional MHD modeling of the point design gives roughly a 50% reduction in compressed density, temperature, and magnetic field from 1-D because of end losses. Finally, scaling up the OMEGA point design to the MJ laser energy available on the National Ignition Facility gives a 500-fold increase in neutron yield in 1-D modeling.
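The tenfold reduction in linear dimensions follows from a simple volumetric argument; as a hedged sketch (the cube-root relation is the reasoning implied by the abstract, not an exact design rule):

```python
# If the required driver energy scales with target volume (E ~ L^3), then a
# 1000x energy reduction from Z to OMEGA implies a cube-root reduction in
# linear target dimensions.
energy_ratio = 1000.0
linear_scale_factor = energy_ratio ** (1.0 / 3.0)
print(round(linear_scale_factor))  # ~10x smaller target dimensions
```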
Developing the RAL front end test stand source to deliver a 60 mA, 50 Hz, 2 ms H- beam
NASA Astrophysics Data System (ADS)
Faircloth, Dan; Lawrie, Scott; Letchford, Alan; Gabor, Christoph; Perkins, Mike; Whitehead, Mark; Wood, Trevor; Tarvainen, Olli; Komppula, Jani; Kalvas, Taneli; Dudnikov, Vadim; Pereira, Hugo; Izaola, Zunbeltz; Simkin, John
2013-02-01
All the Front End Test Stand (FETS) beam requirements have been achieved, but not simultaneously [1]. At 50 Hz repetition rates, beam current droop becomes unacceptable for pulse lengths longer than 1 ms. This is a fundamental limitation of the present source design. Previous researchers [2] have demonstrated that using a physically larger Penning surface plasma source should overcome these limitations. The scaled source development strategy is outlined in this paper. A study of time-varying plasma behavior has been performed using a V-UV spectrometer. Initial experiments to test scaled plasma volumes are outlined. A dedicated plasma and extraction test stand (VESPA: Vessel for Extraction and Source Plasma Analysis) is being developed to allow new source and extraction designs to be appraised. The experimental work is backed up by modeling and simulations. A detailed ANSYS thermal model has been developed. IBSimu is being used to design extraction and beam transport. A novel 3D plasma modeling code using beamlets is being developed by Cobham Vector Fields using SCALA OPERA; early source modeling results are very promising. Hardware on FETS is also being developed in preparation to run the scaled source. A new 2 ms, 50 Hz, 25 kV pulsed extraction voltage power supply has been constructed and a new discharge power supply is being designed. The design of the post acceleration electrode assembly has been improved.
Development and Characterization of 6Li-doped Liquid Scintillator Detectors for PROSPECT
NASA Astrophysics Data System (ADS)
Gaison, Jeremy; Prospect Collaboration
2016-09-01
PROSPECT, the Precision Reactor Oscillation and Spectrum experiment, is a phased reactor antineutrino experiment designed to search for eV-scale sterile neutrinos via short-baseline neutrino oscillations and to make a precision measurement of the 235U reactor antineutrino spectrum. A multi-ton, optically segmented detector will be deployed at Oak Ridge National Laboratory's (ORNL) High Flux Isotope Reactor (HFIR) to measure the reactor spectrum at baselines ranging from 7 to 12 m. A two-segment detector prototype with 50 liters of active liquid scintillator target has been built to verify the detector design and to benchmark its performance. In this presentation, we will summarize the performance of this detector prototype and describe the optical and energy calibration of the segmented PROSPECT detectors.
Design of experiments for microencapsulation applications: A review.
Paulo, Filipa; Santos, Lúcia
2017-08-01
Microencapsulation techniques have been intensively explored by many research sectors, such as the pharmaceutical and food industries. Microencapsulation protects the active ingredient from the external environment, masks undesired flavours, and can provide controlled release of compounds, among other benefits. The purpose of this review is to provide a background on design of experiments in the context of microencapsulation research. Optimization processes are required for accurate research in these fields and, therefore, for the proper implementation of micro-sized techniques at industrial scale. This article critically reviews the use of response surface methodologies in pharmaceutical and food microencapsulation research areas. A survey of optimization procedures reported in the literature over the last few years is also presented. Copyright © 2017 Elsevier B.V. All rights reserved.
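As a hedged sketch of the response surface methodology such reviews cover (the design points and coefficients below are synthetic, purely for illustration), a second-order model can be fitted to a small face-centred central composite design by least squares:

```python
import numpy as np

# Illustrative face-centred central composite design in coded units:
# 2^2 factorial + 4 axial points + 1 center point (9 runs).
x1 = np.array([-1, 1, -1, 1, -1, 1, 0, 0, 0])
x2 = np.array([-1, -1, 1, 1, 0, 0, -1, 1, 0])

# Synthetic responses generated by a known second-order surface:
# y = 5 + 2*x1 - 1*x2 + 0.5*x1*x2 + 0.3*x1^2 + 0*x2^2
y = 5 + 2 * x1 - 1 * x2 + 0.5 * x1 * x2 + 0.3 * x1**2

# Fit the full quadratic model by least squares.
X = np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1**2, x2**2])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
print(np.round(coef, 3))  # recovers the generating coefficients
```

With noisy real responses the same fit yields an estimated surface whose stationary point guides optimization of the encapsulation process.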
International Linear Collider Reference Design Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brau, James; Okada, Yasuhiro; Walker, Nicholas J.
2007-08-13
• What is the universe? How did it begin? • What are matter and energy? What are space and time? These basic questions have been the subject of scientific theories and experiments throughout human history. The answers have revolutionized the enlightened view of the world, transforming society and advancing civilization. Universal laws and principles govern everyday phenomena, some of them manifesting themselves only at scales of time and distance far beyond everyday experience. Particle physics experiments using particle accelerators transform matter and energy, to reveal the basic workings of the universe. Other experiments exploit naturally occurring particles, such as solar neutrinos or cosmic rays, and astrophysical observations, to provide additional insights.
Shrink-film microfluidic education modules: Complete devices within minutes
Nguyen, Diep; McLane, Jolie; Lew, Valerie; Pegan, Jonathan; Khine, Michelle
2011-01-01
As advances in microfluidics continue to make contributions to diagnostics and life sciences, broader awareness of this expanding field becomes necessary. By leveraging low-cost microfabrication techniques that require no capital equipment or infrastructure, simple, accessible, and effective educational modules can be made available for a broad range of educational needs from middle school demonstrations to college laboratory classes. These modules demonstrate key microfluidic concepts such as diffusion and separation as well as “laboratory on-chip” applications including chemical reactions and biological assays. These modules are intended to provide an interdisciplinary hands-on experience, including chip design, fabrication of functional devices, and experiments at the microscale. Consequently, students will be able to conceptualize physics at small scales, gain experience in computer-aided design and microfabrication, and perform experiments—all in the context of addressing real-world challenges by making their own lab-on-chip devices. PMID:21799715
Adiabatic model and design of a translating field reversed configuration
DOE Office of Scientific and Technical Information (OSTI.GOV)
Intrator, T. P.; Siemon, R. E.; Sieck, P. E.
We apply an adiabatic evolution model to predict the behavior of a field reversed configuration (FRC) during decompression and translation, as well as during boundary compression. Semi-empirical scaling laws, which were developed and benchmarked primarily for collisionless FRCs, are expected to remain valid even for the collisional regime of the FRX-L experiment. We use this approach to outline the design implications for FRX-L, the high density translated FRC experiment at Los Alamos National Laboratory. A conical theta coil is used to accelerate the FRC to the largest practical velocity so it can enter a mirror bounded compression region, where it must be a suitable target for a magnetized target fusion (MTF) implosion. FRX-L provides the physics basis for the integrated MTF plasma compression experiment at the Shiva-Star pulsed power facility at Kirtland Air Force Research Laboratory, where the FRC will be compressed inside a flux conserving cylindrical shell.
Ground Testing of a 10 K Sorption Cryocooler Flight Experiment (BETSCE)
NASA Technical Reports Server (NTRS)
Bard, S.; Wu, J.; Karlmann, P.; Cowgill, P.; Mirate, C.; Rodriguez, J.
1994-01-01
The Brilliant Eyes Ten-Kelvin Sorption Cryocooler Experiment (BETSCE) is a Space Shuttle side-wall-mounted flight experiment designed to demonstrate 10 K sorption cryocooler technology in a space environment. The BETSCE objectives are to: (1) provide a thorough end-to-end characterization and space performance validation of a complete, multistage, automated, closed-cycle hydride sorption cryocooler in the 10 to 30 K temperature range, (2) acquire the quantitative microgravity database required to provide confident engineering design, scaling, and optimization, (3) advance the enabling technologies and resolve integration issues, and (4) provide hardware qualification and safety verification heritage. BETSCE ground tests were the first-ever demonstration of a complete closed-cycle 10 K sorption cryocooler. Test results exceeded functional requirements. This paper summarizes functional and environmental ground test results, planned characterization tests, important development challenges that were overcome, and valuable lessons-learned.
NASA Technical Reports Server (NTRS)
Hoerz, Friedrich; Cintala, Mark; See, Thomas; Bernhard, Ronald; Cardenas, Frank; Davidson, William; Haynes, Jerry
1992-01-01
An experimental inquiry into the utility of discontinuous bumpers was conducted to investigate the collisional outcomes of impacts into single grid-like targets and to compare the results with more traditional bumper designs that employ continuous sheet stock. We performed some 35 experiments using 6.3 and 3.2 mm diameter spherical soda-lime glass projectiles at low velocities (less than 2.5 km/s) and 13 at velocities between 5 and 6 km/s, using 3.2 mm spheres only. The thrust of the experiments was the characterization of collisional fragments as a function of target thickness or areal shield mass for both bumper designs. The primary product of these experiments was witness plates that record the resulting population of collisional fragments. Substantial interpretive and predictive insights into bumper performance were obtained. All qualitative observations (on the witness plates) and detailed measurements of displaced masses seem simply and consistently related only to the bumper mass available for interaction with the impactor. This makes the grid bumper the superior shield design. These findings present evidence that discontinuous bumpers are a viable concept for collisional shields, possibly superior to continuous geometries.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jernigan, Dann A.; Blanchat, Thomas K.
It is necessary to improve understanding and develop temporally- and spatially-resolved integral scale validation data of the heat flux incident to a complex object in addition to measuring the thermal response of said object located within the fire plume for the validation of the SIERRA/FUEGO/SYRINX fire and SIERRA/CALORE codes. To meet this objective, a complex calorimeter with sufficient instrumentation to allow validation of the coupling between FUEGO/SYRINX/CALORE has been designed, fabricated, and tested in the Fire Laboratory for Accreditation of Models and Experiments (FLAME) facility. Validation experiments are specifically designed for direct comparison with the computational predictions. Making meaningful comparison between the computational and experimental results requires careful characterization and control of the experimental features or parameters used as inputs into the computational model. Validation experiments must be designed to capture the essential physical phenomena, including all relevant initial and boundary conditions. This report presents the data validation steps and processes, the results of the penlight radiant heat experiments (for the purpose of validating the CALORE heat transfer modeling of the complex calorimeter), and the results of the fire tests in FLAME.
Joint US/Russia TU-144 Engine Ground Tests
NASA Technical Reports Server (NTRS)
Acosta, Waldo A.; Balser, Jeffrey S.; McCartney, Timothy P.; Richter, Charles A.; Woike, Mark R.
1997-01-01
Two engine research experiments were recently completed in Moscow, Russia using an engine from the Tu-144 supersonic transport airplane. This was a joint project between the United States and Russia. Personnel from the NASA Lewis Research Center, General Electric Aircraft Engines, Pratt & Whitney, the Tupolev Design Bureau, and EBP Aircraft LTD worked together as a team to overcome the many technical and cultural challenges. The objective was to obtain large scale inlet data that could be used in the development of a supersonic inlet system for a future High Speed Civil Transport (HSCT). The first experiment studied the impact of typical inlet structures that have trailing edges in close proximity to the inlet/engine interface plane on the flow characteristics at that plane. The inlet structure simulated the subsonic diffuser of a supersonic inlet using a bifurcated splitter design. The centerbody maximum diameter was designed to permit choking and slightly supercritical operation. The second experiment measured the reflective characteristics of the engine face to incoming perturbations of pressure amplitude. The basic test rig from the first experiment was used with a longer spacer equipped with fast actuated doors. All the objectives set forth at the beginning of the project were met.
Stygar, William A.; Reisman, David B.; Stoltzfus, Brian S.; ...
2016-07-07
In this study, we have developed a conceptual design of a next-generation pulsed-power accelerator that is optimized for driving megajoule-class dynamic-material-physics experiments at pressures as high as 1 TPa. The design is based on an accelerator architecture that is founded on three concepts: single-stage electrical-pulse compression, impedance matching, and transit-time-isolated drive circuits. Since much of the accelerator is water insulated, we refer to this machine as Neptune. The prime power source of Neptune consists of 600 independent impedance-matched Marx generators. As much as 0.8 MJ and 20 MA can be delivered in a 300-ns pulse to a 16-mΩ physics load; hence Neptune is a megajoule-class 20-MA arbitrary waveform generator. Neptune will allow the international scientific community to conduct dynamic equation-of-state, phase-transition, mechanical-property, and other material-physics experiments with a wide variety of well-defined drive-pressure time histories. Because Neptune can deliver on the order of a megajoule to a load, such experiments can be conducted on centimeter-scale samples at terapascal pressures with time histories as long as 1 μs.
Impact of adverse life events on individuals with low and high schizotypy in a nonpatient sample.
Kocsis-Bogár, Krisztina; Miklósi, Mónika; Forintos, Dóra Perczel
2013-03-01
The aims of this study were to gain a better understanding of adverse life events connected with the development of schizotypal personality traits and, also, to examine whether subclinical schizotypy has a relationship with vulnerability to traumatic intrusions and avoidance. In a cross-sectional design, 198 undergraduate students completed the Oxford-Liverpool Inventory of Feelings and Experiences (O-LIFE), the Impact of Event Scale (IES), and Paykel's Life Events Scale, together with other relevant scales. The number of adverse life events was significantly related to overall schizotypy measured by O-LIFE scores and positive schizotypy measured by the Unusual Experiences (UnEx) subscale. The subjective severity of life events was significantly related to Cognitive Disorganization (CogDis). Measures of positive schizotypy (UnEx and CogDis) were significantly related to the scores on the IES and on the intrusion and avoidance subscales, too. Adverse life events are associated with schizotypal personality traits, which contribute to a tendency for traumatic intrusions, even in a nonpatient sample.
Recent Results from the MAJORANA DEMONSTRATOR
NASA Astrophysics Data System (ADS)
Gilliss, T.; Alvis, S. I.; Arnquist, I. J.; Avignone, F. T.; Barabash, A. S.; Barton, C. J.; Bertrand, F. E.; Bode, T.; Brudanin, V.; Busch, M.; Buuck, M.; Caldwell, T. S.; Chan, Y.-D.; Christofferson, C. D.; Chu, P.-H.; Cuesta, C.; Detwiler, J. A.; Dunagan, C.; Efremenko, Yu; Ejiri, H.; Elliott, S. R.; Giovanetti, G. K.; Green, M. P.; Gruszko, J.; Guinn, I. S.; Guiseppe, V. E.; Haufe, C. R.; Hehn, L.; Henning, R.; Hoppe, E. W.; Howe, M. A.; Keeter, K. J.; Kidd, M. F.; Konovalov, S. I.; Kouzes, R. T.; Lopez, A. M.; Martin, R. D.; Massarczyk, R.; Meijer, S. J.; Mertens, S.; Myslik, J.; O’Shaughnessy, C.; Othman, G.; Pettus, W.; Poon, A. W. P.; Radford, D. C.; Rager, J.; Reine, A. L.; Rielage, K.; Robertson, R. G. H.; Ruof, N. W.; Shanks, B.; Shirchenko, M.; Suriano, A. M.; Tedeschi, D.; Varner, R. L.; Vasilyev, S.; Vetter, K.; Vorren, K.; White, B. R.; Wilkerson, J. F.; Wiseman, C.; Xu, W.; Yakushev, E.; Yu, C.-H.; Yumatov, V.; Zhitnikov, I.; Zhu, B. X.
The MAJORANA Collaboration has completed construction and is now operating an array of high purity Ge detectors searching for neutrinoless double-beta decay (0νββ) in 76Ge. The array, known as the MAJORANA DEMONSTRATOR, comprises 44 kg of Ge detectors (30 kg enriched to 88% in 76Ge) installed in an ultra-low background compact shield at the Sanford Underground Research Facility in Lead, South Dakota. The primary goal of the DEMONSTRATOR is to establish a low-background design that can be scaled to a next-generation tonne-scale experiment. This work reports initial background levels in the 0νββ region of interest. Also presented are recent physics results leveraging P-type point-contact detectors with sub-keV energy thresholds to search for physics beyond the Standard Model; first results from searches for bosonic dark matter, solar axions, Pauli exclusion principle violation, and electron decay have been published. Finally, this work discusses the proposed tonne-scale 76Ge 0νββ LEGEND experiment.
Perception of speaker size and sex of vowel sounds
NASA Astrophysics Data System (ADS)
Smith, David R. R.; Patterson, Roy D.
2005-04-01
Glottal-pulse rate (GPR) and vocal-tract length (VTL) are both related to speaker size and sex; however, it is unclear how they interact to determine our perception of speaker size and sex. Experiments were designed to measure the relative contribution of GPR and VTL to judgements of speaker size and sex. Vowels were scaled to represent people with different GPRs and VTLs, including many well beyond the normal population values. In a single-interval, two-response rating paradigm, listeners judged the size (using a 7-point scale) and sex/age of the speaker (man, woman, boy, or girl) of these scaled vowels. Results from the size-rating experiments show that VTL has a much greater influence upon judgements of speaker size than GPR. Results from the sex-categorization experiments show that judgements of speaker sex are influenced about equally by GPR and VTL for vowels with normal GPR and VTL values. For abnormal combinations of GPR and VTL, where low GPRs are combined with short VTLs, VTL has more influence than GPR in sex judgements. [Work supported by the UK MRC (G9901257) and the German Volkswagen Foundation (VWF 1/79 783).]
Summary of spin technology as related to light general-aviation airplanes
NASA Technical Reports Server (NTRS)
Bowman, J. S., Jr.
1971-01-01
A summary was made of all NASA (and NACA) research and experience related to the spin and recovery characteristics of light personal-owner-type general-aviation airplanes. Very little of the research deals with light general-aviation airplanes as such, but many of the airplanes and models tested before and during World War II were similar to present-day light general-aviation airplanes with regard to the factors that are important in spinning. The material is based mainly on the results of spin-tunnel tests of free-spinning dynamically scaled models of about 100 different airplane designs and, whenever possible, includes correlation with full-scale spin tests. The research results are discussed in terms of airplane design considerations and the proper use of controls for recovery.
Modeling and Design of a Full-Scale Rotor Blade with Embedded Piezocomposite Actuators
NASA Astrophysics Data System (ADS)
Kovalovs, A.; Barkanov, E.; Ruchevskis, S.; Wesolowski, M.
2017-05-01
An optimization methodology for the design of a full-scale rotor blade with an active twist in order to enhance its ability to reduce vibrations and noise is presented. It is based on a 3D finite-element model, the planning of experiments, and the response surface technique to obtain high piezoelectric actuation forces and displacements with a minimum actuator weight and energy applied. To investigate an active twist of the helicopter rotor blade, a structural static analysis using a 3D finite-element model was carried out. Optimum results were obtained at two possible applications of macrofiber composite actuators. The torsion angle found from the finite-element simulation of helicopter rotor blades was successfully validated by its experimental values, which confirmed the modeling accuracy.
Barrett, Lisa Feldman; Barsalou, Lawrence W.
2015-01-01
The tremendous variability within categories of human emotional experience receives little empirical attention. We hypothesized that atypical instances of emotion categories (e.g. pleasant fear of thrill-seeking) would be processed less efficiently than typical instances of emotion categories (e.g. unpleasant fear of violent threat) in large-scale brain networks. During a novel fMRI paradigm, participants immersed themselves in scenarios designed to induce atypical and typical experiences of fear, sadness or happiness (scenario immersion), and then focused on and rated the pleasant or unpleasant feeling that emerged (valence focus) in most trials. As predicted, reliably greater activity in the ‘default mode’ network (including medial prefrontal cortex and posterior cingulate) was observed for atypical (vs typical) emotional experiences during scenario immersion, suggesting atypical instances require greater conceptual processing to situate the socio-emotional experience. During valence focus, reliably greater activity was observed for atypical (vs typical) emotional experiences in the ‘salience’ network (including anterior insula and anterior cingulate), suggesting atypical instances place greater demands on integrating shifting body signals with the sensory and social context. Consistent with emerging psychological construction approaches to emotion, these findings demonstrate that it is important to study the variability within common categories of emotional experience. PMID:24563528
Structural Similitude and Scaling Laws
NASA Technical Reports Server (NTRS)
Simitses, George J.
1998-01-01
Aircraft and spacecraft comprise the class of aerospace structures that require efficiency and wisdom in design, sophistication and accuracy in analysis and numerous and careful experimental evaluations of components and prototype, in order to achieve the necessary system reliability, performance and safety. Preliminary and/or concept design entails the assemblage of system mission requirements, system expected performance and identification of components and their connections as well as of manufacturing and system assembly techniques. This is accomplished through experience based on previous similar designs, and through the possible use of models to simulate the entire system characteristics. Detail design is heavily dependent on information and concepts derived from the previous steps. This information identifies critical design areas which need sophisticated analyses, and design and redesign procedures to achieve the expected component performance. This step may require several independent analysis models, which, in many instances, require component testing. The last step in the design process, before going to production, is the verification of the design. This step necessitates the production of large components and prototypes in order to test component and system analytical predictions and verify strength and performance requirements under the worst loading conditions that the system is expected to encounter in service. Clearly then, full-scale testing is in many cases necessary and always very expensive. In the aircraft industry, in addition to full-scale tests, certification and safety necessitate large component static and dynamic testing. Such tests are extremely difficult, time consuming and definitely absolutely necessary. Clearly, one should not expect that prototype testing will be totally eliminated in the aircraft industry. It is hoped, though, that we can reduce full-scale testing to a minimum. 
Full-scale large-component testing is necessary in other industries as well. Shipbuilding, automobile, and railway car construction all rely heavily on testing. Regardless of the application, a model scaled down by a large factor (a scale model) which closely represents the structural behavior of the full-scale system (the prototype) can prove to be an extremely beneficial tool. Such a development must be based on the existence of certain structural parameters that control the behavior of a structural system when acted upon by static and/or dynamic loads. If such structural parameters exist, a scaled-down replica can be built which will duplicate the response of the full-scale system. The two systems are then said to be structurally similar, and the term that best describes this similarity is structural similitude. Similarity of systems requires that the relevant system parameters be identical and that the systems be governed by a unique set of characteristic equations. Thus, if a relation or equation of variables is written for a system, it is valid for all systems similar to it. Each variable in a model is proportional to the corresponding variable of the prototype. This ratio, which plays an essential role in predicting the relationship between the model and its prototype, is called the scale factor.
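The scale-factor idea can be sketched numerically. In this hypothetical illustration (assumed material properties and loads, not from the report), a 1/10-scale cantilever beam of the same material, with the applied load scaled by the square of the length scale factor, reproduces the prototype's dimensionless tip deflection exactly, i.e. the two systems are structurally similar:

```python
def tip_deflection(P, L, E, b, h):
    """Tip deflection of a rectangular cantilever: delta = P L^3 / (3 E I)."""
    I = b * h**3 / 12.0          # second moment of area of the cross-section
    return P * L**3 / (3.0 * E * I)

# Prototype (hypothetical numbers): aluminium-like modulus, SI units
E = 70e9
P_p, L_p, b_p, h_p = 1000.0, 2.0, 0.05, 0.01

# 1/10-scale model: all lengths scaled by s, load scaled by s**2 so that the
# dimensionless group P L^2 / (E I) governing the response is preserved
s = 0.1
P_m, L_m, b_m, h_m = P_p * s**2, L_p * s, b_p * s, h_p * s

ratio_p = tip_deflection(P_p, L_p, E, b_p, h_p) / L_p
ratio_m = tip_deflection(P_m, L_m, E, b_m, h_m) / L_m
print(ratio_p, ratio_m)   # identical dimensionless deflections
```

The same reasoning generalizes: any response expressed in the governing dimensionless groups is shared by model and prototype, and the scale factor for each physical variable follows from the group definitions.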
Development of a design model for airfoil leading edge film cooling
NASA Astrophysics Data System (ADS)
Wadia, A. R.; Nealy, D. A.
1985-03-01
A series of experiments on scaled cylinder models having injection through holes inclined at 20, 30, 45, and 90 degrees is presented. The experiments were conducted in a wind tunnel on several stainless steel test specimens in which flow and heat transfer parameters were measured over simulated airfoil leading edge surfaces. On the basis of the experimental results, an engineering design model is proposed that treats the gas-to-surface heat transfer coefficient with film cooling in a manner suggested by Luckey and L'Ecuyer (1981). It is shown that the main factor influencing the averaged film cooling effectiveness in the showerhead region is the inclination of the injection holes. The effectiveness parameter was not affected by variations in the coolant-to-gas stream pressure ratio, the freestream Mach number, the gas-to-coolant temperature ratio, or the gas stream Reynolds number. Experience from the wind tunnel tests is reflected in the design of the model, in which the coolant-side heat transfer coefficient is offset by a simultaneous increase in the gas-side film coefficient. The design applications of the analytical model are discussed, with emphasis given to high-temperature first-stage turbine vanes and rotor blades.
The National Ignition Facility: alignment from construction to shot operations
NASA Astrophysics Data System (ADS)
Burkhart, S. C.; Bliss, E.; Di Nicola, P.; Kalantar, D.; Lowe-Webb, R.; McCarville, T.; Nelson, D.; Salmon, T.; Schindler, T.; Villanueva, J.; Wilhelmsen, K.
2010-08-01
The National Ignition Facility in Livermore, California, completed its commissioning milestone on March 10, 2009, when it fired all 192 beams at a combined energy of 1.1 MJ at 351 nm. Subsequently, a target shot series from August through December of 2009 culminated in scale ignition target design experiments up to 1.2 MJ in the National Ignition Campaign. Preparations are underway through the first half of 2010 leading to DT ignition and gain experiments in the fall of 2010 into 2011. The top-level requirement for beam pointing to target of 50 μm rms is the culmination of 15 years of engineering design of a stable facility, commissioning of precision alignment, and precise shot operations controls. Key design documents which guided this project were published in the mid-1990s, driving systems designs. Precision survey methods were used throughout construction, commissioning, and operations for precision placement. Rigorous commissioning processes were used to ensure and validate placement and alignment throughout commissioning and in present-day operations. Accurate and rapid system alignment during operations is accomplished by a comprehensive controls system that aligns and validates alignment readiness, assuring machine safety and productive experiments.
Fluid flow and heat convection studies for actively cooled airframes
NASA Technical Reports Server (NTRS)
Mills, A. F.
1993-01-01
This report details progress made on the jet impingement - liquid crystal - digital imaging experiment. With the design phase complete, the experiment is currently in the construction phase. In order to reach this phase, two design-related issues were resolved. The first issue was to determine NASP leading edge active cooling design parameters. Meetings were arranged with personnel at SAIC International, Torrance, CA, in order to obtain recent publications that characterized expected leading edge heat fluxes as well as other details of NASP operating conditions. The information in these publications was used to estimate the minimum and maximum jet Reynolds numbers needed to accomplish the required leading edge cooling and to determine the parameters of the experiment. The details of this analysis are shown in Appendix A. One of the concerns for the NASP design is that of thermal stress due to large surface temperature gradients. Using a series of circular jets to cool the leading edge will cause a non-uniform temperature distribution and potentially large thermal stresses. Therefore, it was decided to explore the feasibility of using a slot jet to cool the leading edge. The literature contains many investigations of circular jet heat transfer but few investigations of slot jet heat transfer. The first experiments will be done on circular jets impinging on a flat plate, and the results will be compared to previously published data to establish the accuracy of the method. Subsequent experiments will use slot jets impinging on full-scale models of the NASP leading edge. Table 1 shows the range of parameters to be explored. Next, a preliminary design of the experiment was done. Previous papers which used a similar experimental technique were studied and elements of those experiments adapted to the jet impingement study. Trade-off studies were conducted to determine which design was the least expensive, easiest to construct, and easiest to use.
Once the final design was settled, vendors were contacted to verify that equipment could be obtained to meet our specifications. Much of the equipment required to complete the construction of the experiment has been ordered or received. The material status list is shown in Appendix B.
Intelligent systems engineering methodology
NASA Technical Reports Server (NTRS)
Fouse, Scott
1990-01-01
An added challenge for the designers of large scale systems such as Space Station Freedom is the appropriate incorporation of intelligent system technology (artificial intelligence, expert systems, knowledge-based systems, etc.) into their requirements and design. This presentation will describe a view of systems engineering which successfully addresses several aspects of this complex problem: design of large scale systems, design with requirements that are so complex they only completely unfold during the development of a baseline system and even then continue to evolve throughout the system's life cycle, design that involves the incorporation of new technologies, and design and development that takes place with many players in a distributed manner yet can be easily integrated to meet a single view of the requirements. The first generation of this methodology was developed and evolved jointly by ISX and the Lockheed Aeronautical Systems Company over the past five years on the Defense Advanced Research Projects Agency/Air Force Pilot's Associate Program, one of the largest, most complex, and most successful intelligent systems constructed to date. As the methodology has evolved it has also been applied successfully to a number of other projects. Some of the lessons learned from this experience may be applicable to Freedom.
Harnessing Alternative Energy Sources to Enhance the Design of a Wave Generator
NASA Astrophysics Data System (ADS)
Bravo, A.
2017-12-01
Wave energy has the power to replace a non-renewable source of electricity for a home near the ocean. I built a small-scale wave generator capable of producing approximately 5 volts of electricity. The generator is an array of 16 small generators, each consisting of 200 feet of copper wire, 12 magnets, and a buoy. I tested my design in the Pacific Ocean and was able to power a string of lights I had attached to the generator. While the waves in the ocean moved my buoys, my design was powered by the vertical motion of the waves. My generator was hit with significant horizontal wave motion, and I realized I wasn't taking advantage of that direction of motion. To make my generator produce more electricity, I experimented with capturing the energy of the horizontal motion of water and incorporated that into my generator design. My generator, installed in the ocean, is also exposed to sun and wind, and I am exploring the potential of solar and wind energy collection in my design to increase the electricity output. Once I have maximized my electricity output, I would like to explore scaling up my design.
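The voltage scale of such a device follows from Faraday's law. A back-of-the-envelope estimate (hypothetical coil and wave parameters, not the author's measured values) shows how the vertical wave motion sets the induced EMF of one coil-and-magnet unit:

```python
import math

# Hypothetical parameters for one coil-and-magnet unit (all assumed)
N = 100        # turns of copper wire in the coil
B = 0.3        # magnet field through the coil, tesla
A = 5e-4       # effective coil area, m^2
T_wave = 5.0   # ocean wave period, seconds

# If the wave drives the flux sinusoidally, phi(t) = B*A*sin(omega*t),
# Faraday's law EMF = -N dphi/dt gives a peak value of N*B*A*omega.
omega = 2.0 * math.pi / T_wave
emf_peak = N * B * A * omega
print(f"peak EMF per unit ~ {emf_peak:.3f} V")
```

Wiring an array of such units in series, as in the 16-unit generator described above, multiplies this per-unit voltage; capturing the horizontal motion adds a second flux-variation term of the same form.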
"Genetically Engineered" Nanoelectronics
NASA Technical Reports Server (NTRS)
Klimeck, Gerhard; Salazar-Lazaro, Carlos H.; Stoica, Adrian; Cwik, Thomas
2000-01-01
The quantum mechanical functionality of nanoelectronic devices such as resonant tunneling diodes (RTDs), quantum well infrared photodetectors (QWIPs), quantum well lasers, and heterostructure field effect transistors (HFETs) is enabled by material variations on an atomic scale. The design and optimization of such devices requires a fundamental understanding of electron transport in such dimensions. The Nanoelectronic Modeling Tool (NEMO) is a general-purpose quantum device design and analysis tool based on a fundamental non-equilibrium electron transport theory. NEMO was combined with a parallelized genetic algorithm package (PGAPACK) to evolve structural and material parameters to match a desired set of experimental data. A numerical experiment that evolves structural variations such as layer widths and doping concentrations is performed to analyze an experimental current-voltage characteristic. The genetic algorithm is found to drive the NEMO simulation parameters close to the experimentally prescribed layer thicknesses and doping profiles. With such quantitative agreement between theory and experiment, design synthesis can be performed.
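The NEMO/PGAPACK coupling itself cannot be reproduced here, but the optimization loop it describes can be sketched with a toy genetic algorithm. Everything below (the two-parameter toy device model, population size, mutation width) is a hypothetical stand-in for the real simulator, which would be called where `simulate_iv` is:

```python
import random

random.seed(42)

# Toy stand-in for a device simulator: current depends on a "layer width" w
# and a "doping" d (arbitrary units); NEMO would be called here in practice.
def simulate_iv(w, d, voltages):
    return [d * v / (1.0 + w * v) for v in voltages]

voltages = [0.1 * i for i in range(1, 11)]
target = simulate_iv(2.0, 5.0, voltages)   # "experimental" I-V, known truth

def misfit(params):
    model = simulate_iv(params[0], params[1], voltages)
    return sum((m - t) ** 2 for m, t in zip(model, target))

# Simple elitist GA: keep the 10 best, breed children by averaging two
# elite parents and adding Gaussian mutation.
pop = [[random.uniform(0, 10), random.uniform(0, 10)] for _ in range(30)]
init_err = min(misfit(p) for p in pop)
for _ in range(100):
    pop.sort(key=misfit)
    elite = pop[:10]
    children = []
    for _ in range(20):
        a, b = random.sample(elite, 2)
        children.append([(x + y) / 2 + random.gauss(0, 0.2)
                         for x, y in zip(a, b)])
    pop = elite + children
best = min(pop, key=misfit)
print(best, misfit(best))   # parameters drift toward the "experimental" (2.0, 5.0)
```

Elitism guarantees the best misfit never worsens between generations, the same monotone-improvement property that lets the GA drive simulator parameters toward the experimentally prescribed structure.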
Gopalakrishnan, V; Subramanian, V; Baskaran, R; Venkatraman, B
2015-07-01
A wireless-based, custom-built aerosol sampling network has been designed, developed, and implemented for environmental aerosol sampling. These aerosol sampling systems were used in a field measurement campaign in which sodium aerosol dispersion experiments were conducted as part of environmental impact studies related to a sodium-cooled fast reactor. The sampling network contains 40 aerosol sampling units, each with a custom-built sampling head and wireless control networking designed with Programmable System on Chip (PSoC™) and XBee Pro RF modules. The base station control is designed using the graphical programming language LabVIEW. The sampling network is programmed to operate at a preset time, and the running status of the samplers in the network is visualized from the base station. The system is developed in such a way that it can be used for any other environmental sampling system deployed over a wide area and uneven terrain, where manual operation is difficult because of the requirement for simultaneous operation and status logging.
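The control flow described, a preset start time broadcast to all units plus status logging at the base station, can be sketched as a small simulation. The class and field names below are hypothetical; the actual system runs on PSoC/XBee hardware with a LabVIEW front end:

```python
class SamplerUnit:
    """Toy model of one wireless aerosol sampling unit."""
    def __init__(self, unit_id):
        self.unit_id = unit_id
        self.running = False

    def handle_command(self, command, now, start_at):
        # Start sampling only once the preset time has been reached
        if command == "START" and now >= start_at:
            self.running = True
        return {"id": self.unit_id, "running": self.running}


class BaseStation:
    """Toy base station: broadcasts the preset start time, logs statuses."""
    def __init__(self, units):
        self.units = units
        self.status_log = []

    def run_campaign(self, start_at, now):
        for unit in self.units:
            self.status_log.append(
                unit.handle_command("START", now, start_at))


network = BaseStation([SamplerUnit(i) for i in range(40)])
network.run_campaign(start_at=100, now=120)   # preset time already reached
print(sum(s["running"] for s in network.status_log), "of 40 units running")
```

The key design point this mirrors is that the units act on a shared schedule rather than individual manual commands, so 40 samplers on uneven terrain can start simultaneously while the base station keeps a status record.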
Numerical investigation of design and operation parameters on CHI spheromak performance
NASA Astrophysics Data System (ADS)
O'Bryan, J. B.; Romero-Talamás, C. R.; Woodruff, S.
2017-10-01
Nonlinear numerical computation with the NIMROD code is used to explore magnetic self-organization in spheromaks formed with coaxial helicity injection, particularly with regard to how externally controllable parameters affect the resulting spheromak performance. The overall goal of our study is to inform the design and operational parameters of a future proof-of-principle spheromak experiment. Our calculations start from vacuum magnetic fields and model multiple distinct phases of evolution. Results indicate that modest changes to the design and operation of past experiments, e.g., SSPX [E.B. Hooper et al., PPCF 2012], could have significantly improved the plasma-current injector coupling efficiency and performance, particularly with respect to peak temperature and lifetime. While we frequently characterize performance relative to SSPX, our conclusions extrapolate to fundamentally different experimental designs. We also explore adiabatic magnetic compression of spheromaks, which may allow for a small-scale, high-performance, high-yield pulsed neutron source. This work is supported by DARPA under Grant No. N66001-14-1-4044.
NASA Astrophysics Data System (ADS)
Galaev, S. A.; Ris, V. V.; Smirnov, E. M.; Babiev, A. N.
2018-06-01
Experience gained from designing exhaust hoods for modernized versions of the K-175/180-12.8 and K-330-23.5-1 steam turbines is presented. The hood flow path is optimized based on the results of analyzing equilibrium wet-steam 3D flow fields calculated using up-to-date computational fluid dynamics techniques. The mathematical model, constructed on the basis of the Reynolds-averaged Navier-Stokes equations, is validated by comparing the calculated kinetic energy loss with published data on full-scale experiments for the hood used in the K-160-130 turbine produced by the Kharkiv Turbine-Generator Works. Test calculations were carried out for four turbine operation modes. The results of validating the model with the K-160-130 turbine hood taken as an example were found to be as positive as the results of the previously performed calculations of the flow pattern in the K-300-240 turbine hood. It is shown that the calculated coefficients of total losses in the K-160-130 turbine hood differ from the full-scale test data by no more than 5%. As a result of optimizing the K-175/180-12.8 turbine hood flow path, the total loss coefficient was decreased from 1.50 for the initial design to 1.05 for the best of the modified versions. The optimized hood is almost completely free from supersonic flow areas, and the flow through it has become essentially more uniform, both inside the hood and at its outlet. In the modified version of the K-330-23.5-1 turbine hood, the total loss coefficient was decreased by more than a factor of 2: from 2.3 in the initial design to 1.1 for the final design version and the sizes adopted for developing the detailed design.
The markedly better performance of both hoods relative to their initial designs was achieved through multicase calculations in which the flow path's geometrical characteristics were varied sequentially, including options involving its maximum possible expansion and the removal of guide plates that produced an adverse effect.
Hsu, Li-Ling; Hsieh, Suh-Ing
2011-11-01
This article is a report of a quasi-experimental study of the effects of blended modules on nursing students' learning of ethics course content. There is yet to be an empirically supported mix of strategies on which a working blended learning model can be built for nursing education. This was a two-group pretest and post-test quasi-experimental study in 2008 involving a total of 233 students. Two of the five clusters were designated the experimental group to experience a blended learning model, and the rest were designated the control group to be given classroom lectures only. The Case Analysis Attitude Scale, Case Analysis Self-Evaluation Scale, Blended Learning Satisfaction Scale, and Metacognition Scale were used in pretests and post-tests for the students to rate their own performance. In this study, the experimental group did not register significantly higher mean scores on the Case Analysis Attitude Scale at post-test or higher mean ranks on the Case Analysis Self-Evaluation Scale, the Blended Learning Satisfaction Scale, and the Metacognition Scale at post-test than the control group. Moreover, the experimental group registered significant progress in the mean ranks on the Case Analysis Self-Evaluation Scale and the Metacognition Scale from pretest to post-test. No between-subjects effects on the four scales at post-test were found. Newly developed course modules, be they blended learning or a combination of traditional and innovative components, should be tested repeatedly for effectiveness and popularity for the purpose of facilitating the ultimate creation of a most effective course module for nursing education. © 2011 Blackwell Publishing Ltd.
Rhyme as reason in commercial and social advertising.
Filkuková, Petra; Klempe, Sven Hroar
2013-10-01
This study investigated the rhyme-as-reason effect on newly created artificial advertising slogans. Rhymes and non-rhymes were compared in a between-subjects design in Experiments 1 and 2 and in a within-subjects design in Experiment 3. The quality of the form and content of the slogans was always evaluated by separate groups. In Experiment 1, we found a strong preference for rhyming slogans as opposed to their non-rhyming counterparts. Rhymes were rated as more likeable, more original, easier to remember, more suitable for campaigns, more persuasive, and more trustworthy. In Experiment 2, social advertising messages were evaluated favorably in both rhyming and non-rhyming versions. However, when participants directly compared rhymes and non-rhymes on the same scale (Experiment 3), the difference between commercial and social advertising disappeared, and for all slogans rhymes were clearly preferred to non-rhymes in terms of both form and content. A detailed analysis revealed that the rhymes scoring high on formal aspects were also favored in the questionnaire investigating content aspects. © 2013 The Scandinavian Psychological Associations.
Bor, Jacob; Geldsetzer, Pascal; Venkataramani, Atheendar; Bärnighausen, Till
2015-01-01
Purpose of review: Randomized, population-representative trials of clinical interventions are rare. Quasi-experiments have been used successfully to generate causal evidence on the cascade of HIV care in a broad range of real-world settings. Recent findings: Quasi-experiments exploit exogenous, or quasi-random, variation occurring naturally in the world or because of an administrative rule or policy change to estimate causal effects. Well designed quasi-experiments have greater internal validity than typical observational research designs. At the same time, quasi-experiments may also have potential for greater external validity than experiments and can be implemented when randomized clinical trials are infeasible or unethical. Quasi-experimental studies have established the causal effects of HIV testing and initiation of antiretroviral therapy on health, economic outcomes and sexual behaviors, as well as indirect effects on other community members. Recent quasi-experiments have evaluated specific interventions to improve patient performance in the cascade of care, providing causal evidence to optimize clinical management of HIV. Summary: Quasi-experiments have generated important data on the real-world impacts of HIV testing and treatment and on interventions to improve the cascade of care. With the growth in large-scale clinical and administrative data, quasi-experiments enable rigorous evaluation of policies implemented in real-world settings. PMID:26371463
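One canonical quasi-experimental estimator, difference-in-differences, can be illustrated with made-up numbers. Under the assumption of a common trend between clinics exposed and not exposed to a policy change (all values below are hypothetical), subtracting the control group's change removes the shared trend and recovers the policy effect:

```python
# Hypothetical retention rates (%) in the HIV care cascade, before and after
# a policy change, for exposed ("treated") and unexposed ("control") clinics.
treated_pre, control_pre = 50.0, 40.0
common_trend = 5.0      # secular improvement affecting both groups
true_effect = 8.0       # causal effect of the policy (what we want to recover)

treated_post = treated_pre + common_trend + true_effect
control_post = control_pre + common_trend

# Difference-in-differences: the control group's change estimates the shared
# trend, so differencing it out isolates the policy effect.
did = (treated_post - treated_pre) - (control_post - control_pre)
print("DiD estimate:", did)   # 8.0
```

The same logic underlies regression-discontinuity and other quasi-experimental designs: an administrative rule supplies the quasi-random variation that a randomized trial would otherwise provide.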
ISMIP6 - initMIP: Greenland ice sheet model initialisation experiments
NASA Astrophysics Data System (ADS)
Goelzer, Heiko; Nowicki, Sophie; Payne, Tony; Larour, Eric; Abe Ouchi, Ayako; Gregory, Jonathan; Lipscomb, William; Seroussi, Helene; Shepherd, Andrew; Edwards, Tamsin
2016-04-01
Earlier large-scale Greenland ice sheet sea-level projections, e.g., those run during the ice2sea and SeaRISE initiatives, have shown that ice sheet initialisation can have a large effect on the projections and gives rise to important uncertainties. This intercomparison exercise (initMIP) aims to compare, evaluate, and improve the initialisation techniques used in the ice sheet modelling community and to estimate the associated uncertainties. It is the first in a series of ice sheet model intercomparison activities within ISMIP6 (Ice Sheet Model Intercomparison Project for CMIP6). The experiments are conceived for the large-scale Greenland ice sheet and are designed to allow intercomparison between participating models of (1) the initial present-day state of the ice sheet and (2) the response in two schematic forward experiments. The latter experiments serve to evaluate the initialisation in terms of model drift (a forward run without any forcing) and the response to a large perturbation (a prescribed surface mass balance anomaly). We present and discuss first results of the intercomparison and highlight important uncertainties with respect to projections of the Greenland ice sheet sea-level contribution.
Computational Fluid Dynamics (CFD) Simulations of Jet Mixing in Tanks of Different Scales
NASA Technical Reports Server (NTRS)
Breisacher, Kevin; Moder, Jeffrey
2010-01-01
For long-duration in-space storage of cryogenic propellants, an axial jet mixer is one concept for controlling tank pressure and reducing thermal stratification. Extensive ground-test data from the 1960s to the present exist for tank diameters of 10 ft or less. The design of axial jet mixers for tanks on the order of 30 ft diameter, such as those planned for the Ares V Earth Departure Stage (EDS) LH2 tank, will require scaling of available experimental data from much smaller tanks, as well as designing for microgravity effects. This study assesses the ability of Computational Fluid Dynamics (CFD) to handle a change of scale of this magnitude by performing simulations of existing ground-based axial jet mixing experiments at two tank sizes differing by a factor of ten. Simulations of several axial jet configurations for an Ares V scale EDS LH2 tank during low Earth orbit (LEO) coast are evaluated, and selected results are also presented. Data from jet mixing experiments performed in the 1960s by General Dynamics with water at two tank sizes (1 and 10 ft diameter) are used to evaluate CFD accuracy. Jet nozzle diameters ranged from 0.032 to 0.25 in. for the 1 ft diameter tank experiments and from 0.625 to 0.875 in. for the 10 ft diameter tank experiments. Thermally stratified layers were created in both tanks prior to turning on the jet mixer. Jet mixer efficiency was determined by monitoring the temperatures on thermocouple rakes in the tanks to determine when the stratified layer was mixed out. Dye was frequently injected into the stratified tank and its penetration recorded. There were no velocities or turbulence quantities available in the experimental data. A commercially available, time-accurate, multi-dimensional CFD code with free-surface tracking (FLOW-3D from Flow Science, Inc.) is used for the simulations presented.
The effect of various modeling parameters on the agreement obtained is assessed.
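A first check in any such scale-up is how the jet Reynolds number shifts between rigs. The sketch below uses assumed room-temperature water properties and a common jet velocity for comparison (illustrative values, not the General Dynamics test matrix), with the nozzle diameters quoted above:

```python
# Jet Reynolds number Re = rho * v * d / mu for water at room temperature
RHO = 998.0      # density, kg/m^3 (assumed)
MU = 1.0e-3      # dynamic viscosity, Pa*s (assumed)
IN_TO_M = 0.0254

def jet_reynolds(velocity, diameter_in):
    """Reynolds number for a round jet, nozzle diameter given in inches."""
    return RHO * velocity * (diameter_in * IN_TO_M) / MU

v = 1.0  # m/s, assumed identical in both rigs for a like-for-like comparison
re_small = jet_reynolds(v, 0.25)    # largest nozzle, 1 ft diameter tank
re_large = jet_reynolds(v, 0.875)   # largest nozzle, 10 ft diameter tank
print(re_small, re_large, re_large / re_small)   # ratio = 0.875/0.25 = 3.5
```

At equal velocity, Re scales directly with nozzle diameter, so matching Re (or another mixing similarity parameter) across the factor-of-ten tank change requires deliberately rescaling the jet conditions, which is exactly where CFD validation against both tank sizes becomes valuable.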
Guidelines for Genome-Scale Analysis of Biological Rhythms.
Hughes, Michael E; Abruzzi, Katherine C; Allada, Ravi; Anafi, Ron; Arpat, Alaaddin Bulak; Asher, Gad; Baldi, Pierre; de Bekker, Charissa; Bell-Pedersen, Deborah; Blau, Justin; Brown, Steve; Ceriani, M Fernanda; Chen, Zheng; Chiu, Joanna C; Cox, Juergen; Crowell, Alexander M; DeBruyne, Jason P; Dijk, Derk-Jan; DiTacchio, Luciano; Doyle, Francis J; Duffield, Giles E; Dunlap, Jay C; Eckel-Mahan, Kristin; Esser, Karyn A; FitzGerald, Garret A; Forger, Daniel B; Francey, Lauren J; Fu, Ying-Hui; Gachon, Frédéric; Gatfield, David; de Goede, Paul; Golden, Susan S; Green, Carla; Harer, John; Harmer, Stacey; Haspel, Jeff; Hastings, Michael H; Herzel, Hanspeter; Herzog, Erik D; Hoffmann, Christy; Hong, Christian; Hughey, Jacob J; Hurley, Jennifer M; de la Iglesia, Horacio O; Johnson, Carl; Kay, Steve A; Koike, Nobuya; Kornacker, Karl; Kramer, Achim; Lamia, Katja; Leise, Tanya; Lewis, Scott A; Li, Jiajia; Li, Xiaodong; Liu, Andrew C; Loros, Jennifer J; Martino, Tami A; Menet, Jerome S; Merrow, Martha; Millar, Andrew J; Mockler, Todd; Naef, Felix; Nagoshi, Emi; Nitabach, Michael N; Olmedo, Maria; Nusinow, Dmitri A; Ptáček, Louis J; Rand, David; Reddy, Akhilesh B; Robles, Maria S; Roenneberg, Till; Rosbash, Michael; Ruben, Marc D; Rund, Samuel S C; Sancar, Aziz; Sassone-Corsi, Paolo; Sehgal, Amita; Sherrill-Mix, Scott; Skene, Debra J; Storch, Kai-Florian; Takahashi, Joseph S; Ueda, Hiroki R; Wang, Han; Weitz, Charles; Westermark, Pål O; Wijnen, Herman; Xu, Ying; Wu, Gang; Yoo, Seung-Hee; Young, Michael; Zhang, Eric Erquan; Zielinski, Tomasz; Hogenesch, John B
2017-10-01
Genome biology approaches have made enormous contributions to our understanding of biological rhythms, particularly in identifying outputs of the clock, including RNAs, proteins, and metabolites, whose abundance oscillates throughout the day. These methods hold significant promise for future discovery, particularly when combined with computational modeling. However, genome-scale experiments are costly and laborious, yielding "big data" that are conceptually and statistically difficult to analyze. There is no obvious consensus regarding design or analysis. Here we discuss the relevant technical considerations to generate reproducible, statistically sound, and broadly useful genome-scale data. Rather than suggest a set of rigid rules, we aim to codify principles by which investigators, reviewers, and readers of the primary literature can evaluate the suitability of different experimental designs for measuring different aspects of biological rhythms. We introduce CircaInSilico, a web-based application for generating synthetic genome biology data to benchmark statistical methods for studying biological rhythms. Finally, we discuss several unmet analytical needs, including applications to clinical medicine, and suggest productive avenues to address them.
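A minimal version of the rhythm-detection problem these guidelines address is harmonic (cosinor) regression against a 24 h period. For noiseless, evenly sampled synthetic data covering whole periods (a textbook construction, not the CircaInSilico tool itself), the cosine and sine projections recover mesor, amplitude, and phase exactly:

```python
import math

PERIOD = 24.0
omega = 2.0 * math.pi / PERIOD

# Synthetic circadian signal: mesor 10, amplitude 3, phase pi/3
mesor, amp, phase = 10.0, 3.0, math.pi / 3.0
t = [2.0 * i for i in range(24)]              # 48 h sampled every 2 h
y = [mesor + amp * math.cos(omega * ti - phase) for ti in t]

# Cosinor fit via discrete orthogonality (exact for whole-period sampling)
n = len(y)
a = 2.0 / n * sum(yi * math.cos(omega * ti) for yi, ti in zip(y, t))
b = 2.0 / n * sum(yi * math.sin(omega * ti) for yi, ti in zip(y, t))
mesor_hat = sum(y) / n
amp_hat = math.hypot(a, b)                    # sqrt(a**2 + b**2)
phase_hat = math.atan2(b, a)
print(mesor_hat, amp_hat, phase_hat)          # ~10.0, ~3.0, ~1.047
```

Real genome-scale data add noise, missing samples, and multiple testing across thousands of transcripts, which is precisely why the design principles (sampling density, duration, replication) discussed above matter.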
Taheri, Mohammadreza; Moazeni-Pourasil, Roudabeh Sadat; Sheikh-Olia-Lavasani, Majid; Karami, Ahmad; Ghassempour, Alireza
2016-03-01
Chromatographic method development for preparative targets is a time-consuming and subjective process. This is particularly problematic because valuable samples are used for isolation and large volumes of solvent are consumed at preparative scale. These processes could be improved by using statistical computations to save time, solvent and experimental effort. Thus, supported by ESI-MS, DryLab software was first applied to gain an overview of the most effective parameters in the separation of synthesized celecoxib and its co-eluted compounds; design-of-experiments software, which relies on multivariate modeling as a chemometric approach, was then used to predict the optimized touching-band overloading conditions through objective functions relating selectivity to stationary-phase properties. The loadability of the method was investigated at the analytical and semi-preparative scales, and the performance of this chemometric approach was confirmed by peak shapes as well as the recovery and purity of the products. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Pretreatment optimization of Sorghum pioneer biomass for bioethanol production and its scale-up.
Koradiya, Manoj; Duggirala, Srinivas; Tipre, Devayani; Dave, Shailesh
2016-01-01
Using one-parameter-at-a-time optimization, saccharification of delignified sorghum biomass with 4% and 70% v/v sulfuric acid resulted in maximum sugar production of 30.8 and 33.8 g% from biomass, respectively. The Box-Behnken design was applied for further optimization of the acid hydrolysis. As a result of the designed experiment, 36.3 g% sugar production was achieved when 3% v/v H2SO4 treatment was given for 60 min at 180°C. The process was scaled up to treat 2 kg of biomass. During the screening of yeast cultures, isolates C, MK-I and N were found to be potent ethanol producers from sorghum hydrolyzate. Culture MK-I performed best and was therefore used to scale up ethanol production to 25 L capacity, which gave a yield of 0.49 g ethanol/g sugar from the hydrolyzate obtained from 2 kg of sorghum biomass. Copyright © 2015 Elsevier Ltd. All rights reserved.
Ji, Yu; Tian, Yang; Ahnfelt, Mattias; Sui, Lili
2014-06-27
Multivalent pneumococcal vaccines are used worldwide to protect human beings from pneumococcal diseases. In order to eliminate the toxic organic solvents used in the traditional vaccine purification process, an alternative chromatographic process for Streptococcus pneumoniae serotype 23F capsular polysaccharide (CPS) was proposed in this study. The strategy of Design of Experiments (DoE) was introduced into process development to manage the complicated design procedure. An initial process analysis reviewed the whole flowchart, identified the critical chromatographic factors through FMEA, and selected the flow-through mode based on the properties of the feed. A resin screening study then followed to select candidate resins. DoE was utilized to generate a resolution IV fractional factorial design to further compare the candidates and narrow down the design space. After Capto Adhere was selected, a Box-Behnken DoE was executed to model the process and characterize the effects of all factors on the responses. Finally, Monte Carlo simulation was used to optimize the process, test the chosen optimal conditions and define the control limits. The results of three scale-up runs at the set points verified the DoE and simulation predictions. The final results were well in accordance with EU pharmacopoeia requirements: protein/CPS (w/w) 1.08%; DNA/CPS (w/w) 0.61%; phosphorus content 3.1%; nitrogen 0.315%; and methyl-pentose percentage 47.9%. Other tests of the final pure CPS also met the pharmacopoeia specifications. This alternative chromatographic purification process for pneumococcal vaccine, free of toxic organic solvents, was successfully developed by the DoE approach and proved scalable, robust and suitable for large-scale manufacturing. Copyright © 2014 Elsevier B.V. All rights reserved.
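The Box-Behnken design used in the modeling step can be generated in coded units as follows; the sketch is generic (factor names, ranges, and run counts are not taken from the study).

```python
from itertools import combinations

def box_behnken(k, center_points=3):
    """Box-Behnken design in coded units: for each pair of factors take the
    four (+/-1, +/-1) corners with all other factors held at 0, then append
    centre runs. For k factors this gives 4*C(k,2) + centre runs."""
    runs = []
    for i, j in combinations(range(k), 2):
        for a in (-1, 1):
            for b in (-1, 1):
                row = [0] * k
                row[i], row[j] = a, b
                runs.append(row)
    runs += [[0] * k for _ in range(center_points)]
    return runs

design = box_behnken(3)
print(len(design))  # 12 edge runs + 3 centre runs = 15
```

Because no run sets all factors to their extremes simultaneously, this design suits processes where corner conditions would be impractical, which is one reason it is popular for response-surface work.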
Kuluski, Kerry; Bechsgaard, Gitte; Ridgway, Jennifer; Katz, Joel
2016-01-01
Introduction. The purpose of this study was to evaluate a specialized yoga intervention for inpatients in a rehabilitation and complex continuing care hospital. Design. Single-cohort repeated measures design. Methods. Participants (N = 10) admitted to a rehabilitation and complex continuing care hospital were recruited to participate in a 50–60 min Hatha Yoga class (modified for wheelchair users/seated position) once a week for eight weeks, with assigned homework practice. Questionnaires on pain (pain, pain interference, and pain catastrophizing), psychological variables (depression, anxiety, and experiences with injustice), mindfulness, self-compassion, and spiritual well-being were collected at three intervals: pre-, mid-, and post-intervention. Results. Repeated measures ANOVAs revealed a significant main effect of time indicating improvements over the course of the yoga program on the (1) anxiety subscale of the Hospital Anxiety and Depression Scale, F(2,18) = 4.74, p < .05, ηp² = .35, (2) Self-Compassion Scale-Short Form, F(2,18) = 3.71, p < .05, ηp² = .29, and (3) Magnification subscale of the Pain Catastrophizing Scale, F(2,18) = 3.66, p < .05, ηp² = .29. Discussion. The results suggest that an 8-week Hatha Yoga program improves pain-related factors and psychological experiences in individuals admitted to a rehabilitation and complex continuing care hospital. PMID:28115969
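The reported F(2, 18) statistics follow from a one-way repeated-measures ANOVA with N = 10 participants and three assessment points. Since the raw scores are not published, the sketch below uses hypothetical data purely to illustrate how the F statistic and its degrees of freedom arise; all numbers are assumptions.

```python
import random

# Hypothetical anxiety-like scores declining over pre/mid/post assessments
# for 10 subjects; only the structure (10 x 3) matches the study.
random.seed(0)
n, k = 10, 3
scores = [[random.gauss(8 - t, 1.5) for t in range(k)] for _ in range(n)]

def rm_anova_f(x):
    """One-way repeated-measures ANOVA: F = MS_time / MS_error, where the
    error term is the subject-by-time interaction."""
    n, k = len(x), len(x[0])
    grand = sum(map(sum, x)) / (n * k)
    subj = [sum(row) / k for row in x]
    time = [sum(row[t] for row in x) / n for t in range(k)]
    ss_time = n * sum((m - grand) ** 2 for m in time)
    ss_err = sum((x[s][t] - subj[s] - time[t] + grand) ** 2
                 for s in range(n) for t in range(k))
    df_time, df_err = k - 1, (k - 1) * (n - 1)
    return (ss_time / df_time) / (ss_err / df_err), df_time, df_err

f, df1, df2 = rm_anova_f(scores)
print(df1, df2)  # 2 18, matching the degrees of freedom reported above
```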
Close, Kristin L; Baxter, Linden S; Ravelojaona, Vaonandianina A; Rakotoarison, Hasiniaina N; Bruno, Emily; Herbert, Alison; Andean, Vanessa; Callahan, James; Andriamanjato, Hery H
2017-01-01
The WHO Surgical Safety Checklist was launched in 2009, and appropriate use reduces mortality, surgical site infections and complications after surgery by up to 50%. Implementation across low-income and middle-income countries has been slow; published evidence is restricted to reports from a few single institutions, and significant challenges to successful implementation have been identified and presented. The Mercy Ships Medical Capacity Building team developed a multidisciplinary 3-day Surgical Safety Checklist training programme designed for rapid wide-scale implementation in all regional referral hospitals in Madagascar. Particular attention was given to addressing previously reported challenges to implementation. We taught 427 participants in 21 hospitals; at 3–4 months postcourse, we collected surveys from 183 participants in 20 hospitals and conducted one focus group per hospital. We used a concurrent embedded approach in this mixed-methods design to evaluate participants’ experiences and behavioural change as a result of the training programme. Quantitative and qualitative data were analysed using descriptive statistics and inductive thematic analysis, respectively. This analysis paper describes our field experiences and aims to report participants’ responses to the training course, identify further challenges to implementation and describe the lessons learnt. Recommendations are given for stakeholders seeking widespread rapid scale up of quality improvement initiatives to promote surgical safety worldwide. PMID:29225958
DEVELOPMENT AND DEPLOYMENT OF VACUUM SALT DISTILLATION AT THE SAVANNAH RIVER SITE
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pierce, R.; Pak, D.; Edwards, T.
2010-10-28
The Savannah River Site has a mission to dissolve fissile materials and disposition them. The primary fissile material is plutonium dioxide (PuO₂). To support dissolution of these materials, the Savannah River National Laboratory (SRNL) designed and demonstrated a vacuum salt distillation (VSD) apparatus using both representative radioactive samples and non-radioactive simulant materials. Vacuum salt distillation, through the removal of chloride salts, increases the quantity of materials suitable for processing in the site's HB-Line Facility. Small-scale non-radioactive experiments at 900-950 °C show that >99.8 wt % of the initial charge of chloride salt distilled from the sample boat with recovery of >99.8 wt % of the ceric oxide (CeO₂) - the surrogate for PuO₂ - as a non-chloride-bearing 'product'. Small-scale radioactive testing in a glovebox demonstrated the removal of sodium chloride (NaCl) and potassium chloride (KCl) from 13 PuO₂ samples. Chloride concentrations were distilled from a starting concentration of 1.8-10.8 wt % to a final concentration <500 mg/kg chloride. Initial testing of a non-radioactive, full-scale production prototype is complete. A designed experiment evaluated the impact of distillation temperature, time at temperature, vacuum, product depth, and presence of a boat cover. Significant effort has been devoted to mechanical considerations to facilitate simplified operation in a glovebox.
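The abstract names five factors for the designed experiment but not the design type. As one plausible sketch, a 2^(5-1) half-fraction (generator E = ABCD, a hypothetical choice, not the study's actual design) would cover all five factors in 16 runs instead of the 32 of a full factorial.

```python
from itertools import product

# Hypothetical half-fraction for the five VSD factors named in the abstract.
factors = ["temperature", "time", "vacuum", "depth", "cover"]

def half_fraction():
    """2^(5-1) fractional factorial in coded units, defining relation
    I = ABCDE: the fifth factor's level is the product of the other four."""
    runs = []
    for a, b, c, d in product((-1, 1), repeat=4):
        e = a * b * c * d
        runs.append(dict(zip(factors, (a, b, c, d, e))))
    return runs

design = half_fraction()
print(len(design))  # 16 runs instead of the 32 of a full factorial
```

With resolution V, all main effects and two-factor interactions remain estimable, which is typically sufficient for a screening experiment of this kind.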
ERIC Educational Resources Information Center
Cheung, Derek
2011-01-01
One of the characteristics of teaching chemistry through inquiry is that teachers need to encourage students to design their experimental procedures. Although the benefits of inquiry teaching are well documented in the literature, few teachers implement it in schools. The purpose of this study was to develop a guided-inquiry scale (GIS) to measure…
Using Longitudinal Assessment Data to Improve Retention and Student Experiences
ERIC Educational Resources Information Center
Trosset, Carol; Weisler, Steven
2010-01-01
The Wabash National Study of Liberal Arts Education presents a longitudinal analysis of how students change on a number of scales that purport to measure many of the outcomes of liberal learning over the span of a college education. The Wabash Study is designed to collect information longitudinally from students at the beginning and end of their…
ERIC Educational Resources Information Center
Looi, Chee-Kit; Wong, Lung-Hsiang
2014-01-01
Many countries, regions and education districts in the world have experimented with models of one-device-per-student as an enabler of new or effective pedagogies supported by mobile technologies. Researchers have also designed innovations or interventions for possible adoption by schools or for informal learning. Of critical interest to the…
Assessment of the Measurement Properties of the NHCAHPS Family Survey: A Rasch Scaling Approach
ERIC Educational Resources Information Center
O'Connor, Matthew S.
2013-01-01
The introduction of the Consumer Assessment of Healthcare Providers and Systems (CAHPS), a family of survey instruments designed to capture and report people's experiences obtaining health care could soon add satisfaction as a consistent dimension of quality that skilled nursing facilities (SNFs) are required to assess and report. The SNF setting…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Duignan, M. R.; Herman, D. T.; Restivo, M. L.
Experiments at several different scales were performed to understand the removal of spherical resorcinol formaldehyde (sRF) ion exchange resin using a gravity drain system with a valve located above the resin screen in the ion exchange column (IXC). This is being considered as part of the design for the Low Activity Waste Pretreatment System (LAWPS) to be constructed at the DOE Hanford Site.
Online Learning Self-Efficacy in Students with and without Online Learning Experience
ERIC Educational Resources Information Center
Zimmerman, Whitney Alicia; Kulikowich, Jonna M.
2016-01-01
A need was identified for an instrument to measure online learning self-efficacy, which encompassed the wide variety of tasks required of successful online students. The Online Learning Self-Efficacy Scale (OLSES) was designed to include tasks required of students enrolled in paced online courses at one university. In the present study, the…
"Life Stage-Specific" Variations in Performance in Response to Age Stereotypes
ERIC Educational Resources Information Center
Hehman, Jessica A.; Bugental, Daphne Blunt
2013-01-01
In a test of life stage-specific responses to age-based stigma, older (n = 54, ages 62-92) and younger (n = 81, ages 17-22) adults were told that a task (Weschler Adult Intelligence Scale-III block design) required either (a) speed/contemporary knowledge (YA; "youth advantage") or (b) life experience/wisdom (OA; "age…
A Multivariate Analysis of Secondary Students' Experience of Web-Based Language Acquisition
ERIC Educational Resources Information Center
Felix, Uschi
2004-01-01
This paper reports on a large-scale project designed to replicate an earlier investigation of tertiary students (Felix, 2001) in a secondary school environment. The new project was carried out in five settings, again investigating the potential of the Web as a medium of language instruction. Data was collected by questionnaires and observational…
NASA Astrophysics Data System (ADS)
Nouri, N. M.; Mostafapour, K.; Kamran, M.
2018-02-01
In a closed water-tunnel circuit, multi-component strain gauge force and moment sensors (also known as balances) are generally used to measure the hydrodynamic forces and moments acting on scaled models. These balances are periodically calibrated by static loading, and their performance and accuracy depend significantly on the calibration rig and method. In this research, a new calibration rig was designed and constructed to calibrate multi-component internal strain gauge balances. The calibration rig has six degrees of freedom and six component-loading structures that can be applied separately or synchronously. The system was designed around the applicability of formal experimental design techniques, using gravity for balance loading and for balance positioning and alignment relative to gravity. To evaluate the calibration rig, a six-component internal balance developed by Iran University of Science and Technology was calibrated using response surface methodology. According to the results, the calibration rig met all design criteria. The rig provides the means by which various formal experimental design techniques can be implemented, and its simplicity saves time and money in the design of experiments and in balance calibration while simultaneously increasing the accuracy of these activities.
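Calibration by static loading typically reduces, per channel, to fitting sensor output against applied load. The sketch below fits a quadratic response surface y = c0 + c1·x + c2·x² by least squares on hypothetical single-channel data; the loads, voltages, and single-variable form are invented for illustration, not taken from the study's response-surface methodology.

```python
# Hypothetical single-channel static-loading data.
loads = [0.0, 1.0, 2.0, 3.0, 4.0]        # applied loads (N), assumed
volts = [0.02, 1.03, 2.10, 3.25, 4.41]   # bridge output (V), assumed

def solve3(A, b):
    """Gaussian elimination with partial pivoting for a 3x3 system."""
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for i in range(3):
        p = max(range(i, 3), key=lambda r: abs(M[r][i]))
        M[i], M[p] = M[p], M[i]
        for r in range(i + 1, 3):
            f = M[r][i] / M[i][i]
            M[r] = [mr - f * mi for mr, mi in zip(M[r], M[i])]
    x = [0.0] * 3
    for i in (2, 1, 0):
        x[i] = (M[i][3] - sum(M[i][j] * x[j] for j in range(i + 1, 3))) / M[i][i]
    return x

def quad_fit(xs, ys):
    """Least-squares quadratic fit via the normal equations
    sum(x^(j+k)) c_k = sum(y x^j), j = 0..2."""
    pw = lambda k: sum(x ** k for x in xs)
    A = [[pw(0), pw(1), pw(2)], [pw(1), pw(2), pw(3)], [pw(2), pw(3), pw(4)]]
    b = [sum(y * x ** k for x, y in zip(xs, ys)) for k in (0, 1, 2)]
    return solve3(A, b)

c0, c1, c2 = quad_fit(loads, volts)
```

In a real six-component calibration the same idea extends to a multivariate polynomial with cross terms capturing interactions between load components.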
Characterization of an Ionization Readout Tile for nEXO
Jewell, M.; Schubert, A.; Cen, W. R.; ...
2018-01-10
Here, a new design for the anode of a time projection chamber, consisting of a charge-detecting "tile", is investigated for use in large scale liquid xenon detectors. The tile is produced by depositing 60 orthogonal metal charge-collecting strips, 3 mm wide, on a 10 cm × 10 cm fused-silica wafer. These charge tiles may be employed by large detectors, such as the proposed tonne-scale nEXO experiment to search for neutrinoless double-beta decay. Modular by design, an array of tiles can cover a sizable area. The width of each strip is small compared to the size of the tile, so a Frisch grid is not required. A grid-less, tiled anode design is beneficial for an experiment such as nEXO, where a wire tensioning support structure and Frisch grid might contribute radioactive backgrounds and would have to be designed to accommodate cycling to cryogenic temperatures. The segmented anode also reduces some degeneracies in signal reconstruction that arise in large-area crossed-wire time projection chambers. A prototype tile was tested in a cell containing liquid xenon. Very good agreement is achieved between the measured ionization spectrum of a 207Bi source and simulations that include the microphysics of recombination in xenon and a detailed modeling of the electrostatic field of the detector. An energy resolution σ/E = 5.5% is observed at 570 keV, comparable to the best intrinsic ionization-only resolution reported in the literature for liquid xenon at 936 V/cm.
A Robust Adaptive Autonomous Approach to Optimal Experimental Design
NASA Astrophysics Data System (ADS)
Gu, Hairong
Experimentation is the fundamental tool of scientific inquiry for understanding the laws governing nature and human behavior. Many complex real-world experimental scenarios, particularly those in quest of prediction accuracy, encounter difficulties conducting experiments with existing experimental procedures, for two reasons. First, existing procedures require a parametric model to serve as a proxy for the latent data structure or data-generating mechanism at the beginning of an experiment; for the experimental scenarios of concern, however, a sound model is often unavailable before the experiment. Second, these scenarios usually contain a large number of design variables, which can lead to a lengthy and costly data collection cycle, and existing procedures are unable to optimize large-scale experiments so as to minimize experimental length and cost. Facing these two challenges, the aim of the present study is to develop a new experimental procedure that allows an experiment to be conducted without assuming a parametric model while still achieving satisfactory prediction, and that optimizes experimental designs to improve the efficiency of an experiment. The new procedure developed in the present study is named the robust adaptive autonomous system (RAAS). RAAS is a procedure for sequential experiments composed of multiple experimental trials, performing function estimation, variable selection, reverse prediction and design optimization on each trial.
Directly addressing the challenges in those experimental scenarios of concern, function estimation and variable selection are performed by data-driven modeling methods to generate a predictive model from data collected during the course of an experiment, thus exempting the requirement of a parametric model at the beginning of an experiment; design optimization is performed to select experimental designs on the fly of an experiment based on their usefulness so that fewest designs are needed to reach useful inferential conclusions. Technically, function estimation is realized by Bayesian P-splines, variable selection is realized by Bayesian spike-and-slab prior, reverse prediction is realized by grid-search and design optimization is realized by the concepts of active learning. The present study demonstrated that RAAS achieves statistical robustness by making accurate predictions without the assumption of a parametric model serving as the proxy of latent data structure while the existing procedures can draw poor statistical inferences if a misspecified model is assumed; RAAS also achieves inferential efficiency by taking fewer designs to acquire useful statistical inferences than non-optimal procedures. Thus, RAAS is expected to be a principled solution to real-world experimental scenarios pursuing robust prediction and efficient experimentation.
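The design-optimization step can be illustrated with a minimal active-learning sketch: an ensemble of bootstrap fits is trained on the data collected so far, and the next design is the candidate where the ensemble's predictions disagree most (maximum predictive variance). The linear model and all numbers below are assumptions for illustration, not RAAS's actual components (which use Bayesian P-splines and spike-and-slab priors).

```python
import random

random.seed(2)
xs = [0.0, 0.5, 1.0, 3.5, 4.0]            # designs already run (assumed)
ys = [0.1, 0.4, 1.1, 3.3, 4.2]            # noisy observations (assumed)
candidates = [0.25, 1.5, 2.0, 2.5, 3.75]  # designs not yet run (assumed)

def line_fit(px, py):
    """Ordinary least-squares line; returns None for degenerate resamples."""
    n = len(px)
    mx, my = sum(px) / n, sum(py) / n
    den = sum((x - mx) ** 2 for x in px)
    if den == 0:
        return None
    b = sum((x - mx) * (y - my) for x, y in zip(px, py)) / den
    return my - b * mx, b

def next_design(xs, ys, cands, n_boot=200):
    """Pick the candidate design with maximum bootstrap predictive variance."""
    preds = {c: [] for c in cands}
    for _ in range(n_boot):
        idx = [random.randrange(len(xs)) for _ in xs]   # bootstrap resample
        fit = line_fit([xs[i] for i in idx], [ys[i] for i in idx])
        if fit is None:
            continue
        a, b = fit
        for c in cands:
            preds[c].append(a + b * c)
    def variance(v):
        m = sum(v) / len(v)
        return sum((p - m) ** 2 for p in v) / len(v)
    return max(cands, key=lambda c: variance(preds[c]))

chosen = next_design(xs, ys, candidates)
print(chosen)
```

The same uncertainty-driven selection logic carries over when the linear fit is replaced by a flexible data-driven model, which is what allows such a procedure to reach useful inferences with fewer design points.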
Influence of Immersive Human Scale Architectural Representation on Design Judgment
NASA Astrophysics Data System (ADS)
Elder, Rebecca L.
Unrealistic visual representations of architecture within our existing environments have lost all reference to the human senses. As a design tool, visual and auditory stimuli can be used to determine how humans perceive a design. This experiment renders varying building inputs within different sites, simulated with corresponding immersive visual and audio sensory cues. Introducing audio has been shown to influence the way a person perceives a space, yet most inhabitants rely strictly on their sense of vision to make design judgments. Though it is not as apparent, users prefer spaces that have better sound quality and comfort. Through a series of questions, we can begin to analyze whether a design is fit for both the acoustic and the visual environment.
The project ownership survey: measuring differences in scientific inquiry experiences.
Hanauer, David I; Dolan, Erin L
2014-01-01
A growing body of research documents the positive outcomes of research experiences for undergraduates, including increased persistence in science. Study of undergraduate lab learning experiences has demonstrated that the design of the experience influences the extent to which students report ownership of the project and that project ownership is one of the psychosocial factors involved in student retention in the sciences. To date, methods for measuring project ownership have not been suitable for the collection of larger data sets. The current study aims to rectify this by developing, presenting, and evaluating a new instrument for measuring project ownership. Eighteen scaled items were generated based on prior research and theory related to project ownership and combined with 30 items shown to measure respondents' emotions about an experience, resulting in the Project Ownership survey (POS). The POS was analyzed to determine its dimensionality, reliability, and validity. The POS had a coefficient alpha of 0.92 and thus has high internal consistency. Known-groups validity was analyzed through the instrument's ability to differentiate between students who studied in traditional versus research-based laboratory courses. The POS scales differentiated between the groups, and the findings paralleled previous results regarding the characteristics of project ownership.
Microgravity Level Measurement of the Beijing Drop Tower Using a Sensitive Accelerometer
Liu, T. Y.; Wu, Q. P.; Sun, B. Q.; Han, F. T.
2016-01-01
The drop tower is the most common ground-based facility for providing a microgravity environment and is widely used in many science experiments. A differential space accelerometer has been proposed to test the spin-gravity interaction between rotating extended bodies onboard a drag-free satellite. To assist the design and testing of this inertial sensor in a series of ground-based pre-flight experiments, it is very important to know the residual acceleration of drop towers accurately. In this report, a sensitive instrument for this purpose was built around a high-performance servo quartz accelerometer, with dedicated interface electronics providing a small full-scale range and high sensitivity, up to 136.8 V/g0. The residual acceleration at the Beijing drop tower was measured using two different drop capsules. The experimental results show that the microgravity level of the free-falling double capsule is better than 2 × 10−4 g0 (Earth's gravity). The measured data provide critical microgravity information for the design of the following ground experiments. PMID:27530726
Scalability, Timing, and System Design Issues for Intrinsic Evolvable Hardware
NASA Technical Reports Server (NTRS)
Hereford, James; Gwaltney, David
2004-01-01
In this paper we address several issues pertinent to intrinsic evolvable hardware (EHW). The first issue is scalability; namely, how the design space scales as the programming string for the programmable device gets longer. We develop a model for population size and the number of generations as a function of the programming string length, L, and show that the number of circuit evaluations is an O(L²) process. We compare our model to several successful intrinsic EHW experiments and discuss the many implications of our model. The second issue that we address is the timing of intrinsic EHW experiments. We show that the processing time is a small part of the overall time to derive or evolve a circuit and that major improvements in processor speed alone will have only a minimal impact on improving the scalability of intrinsic EHW. The third issue we consider is the system-level design of intrinsic EHW experiments. We review what other researchers have done to break the scalability barrier and contend that the type of reconfigurable platform and the evolutionary algorithm are tied together and impose limits on each other.
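The quadratic scaling claim follows directly if both the population size and the generation count grow linearly with the string length L. A toy sketch (the constants k1 and k2 are illustrative, not the paper's fitted values):

```python
# Assumed linear growth of population and generations with string length L;
# their product, the number of circuit evaluations, then scales as O(L^2).
def evaluations(L, k1=0.5, k2=2.0):
    population = k1 * L
    generations = k2 * L
    return population * generations    # = k1 * k2 * L**2

ratio = evaluations(200) / evaluations(100)
print(ratio)  # doubling L quadruples the evaluation count: 4.0
```

This is why faster circuit evaluation alone cannot break the scalability barrier: the evaluation count itself grows quadratically regardless of how quickly each evaluation runs.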
Information on the Advanced Plant Experiment (APEX) Test Facility
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smith, Curtis Lee
This report provides information related to the design of the Oregon State University Advanced Plant Experiment (APEX) test facility. The information has been drawn from the following sources: Reference 1: R. Nourgaliev et al., "Summary Report on NGSAC (Next-Generation Safety Analysis Code) Development and Testing," Idaho National Laboratory, 2011 (note that this report has not been released as an external report). Reference 2: O. Stevens, Characterization of the Advanced Plant Experiment (APEX) Passive Residual Heat Removal System Heat Exchanger, Master Thesis, June 1996. Reference 3: J. Reyes, Jr., Q. Wu, and J. King, Jr., Scaling Assessment for the Design of the OSU APEX-1000 Test Facility, OSU-APEX-03001 (Rev. 0), May 2003. Reference 4: J. Reyes et al., Final Report of the NRC AP600 Research Conducted at Oregon State University, NUREG/CR-6641, July 1999. Reference 5: K. Welter et al., APEX-1000 Confirmatory Testing to Support AP1000 Design Certification (non-proprietary), NUREG-1826, August 2005.
Grant, Yitzchak; Matejtschuk, Paul; Bird, Christopher; Wadhwa, Meenu; Dalby, Paul A
2012-04-01
The lyophilization of proteins in microplates, to assess and optimise formulations rapidly, has been applied for the first time to a therapeutic protein and, in particular, one that requires a cell-based biological assay, in order to demonstrate the broader usefulness of the approach. Factorial design of experiment methods were combined with lyophilization in microplates to identify optimum formulations that stabilised granulocyte colony-stimulating factor during freeze drying. An initial screen rapidly identified key excipients and potential interactions, which was then followed by a central composite face designed optimisation experiment. Human serum albumin and Tween 20 had significant effects on maintaining protein stability. As in previous work, the optimum formulation was then freeze-dried in stoppered vials to verify that the microscale data are relevant at pilot scale. However, to validate the approach further, the selected formulation was also assessed for solid-state shelf-life through the use of accelerated stability studies. This approach allows for a high-throughput assessment of excipient options early on in product development, while also reducing costs in terms of time and quantity of materials required.
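The face-centred central composite design used in the optimisation step can be generated in coded units as follows; the sketch is generic and does not use the study's actual factors or levels.

```python
from itertools import product

def ccf(k, center_points=3):
    """Face-centred central composite design (alpha = 1) in coded units:
    2**k factorial corners, 2*k axial points on the cube faces, plus
    centre runs. All points stay within the original factor ranges."""
    corners = [list(p) for p in product((-1, 1), repeat=k)]
    axial = []
    for i in range(k):
        for a in (-1, 1):
            row = [0] * k
            row[i] = a
            axial.append(row)
    return corners + axial + [[0] * k for _ in range(center_points)]

design = ccf(2)
print(len(design))  # 4 corners + 4 face points + 3 centre runs = 11
```

Keeping the axial points on the faces (rather than outside the cube) is convenient for formulation work, where excipient levels beyond the screened range may be infeasible.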
The Iterative Design Process in Research and Development: A Work Experience Paper
NASA Technical Reports Server (NTRS)
Sullivan, George F. III
2013-01-01
The iterative design process is one of many strategies used in new product development. Top-down development strategies, like waterfall development, place a heavy emphasis on planning and simulation. The iterative process, on the other hand, is better suited to the management of small to medium scale projects. Over the past four months, I have worked with engineers at Johnson Space Center on a multitude of electronics projects. By describing the work I have done these last few months, analyzing the factors that have driven design decisions, and examining the testing and verification process, I will demonstrate that iterative design is the obvious choice for research and development projects.
Greased Lightning (GL-10) Flight Testing Campaign
NASA Technical Reports Server (NTRS)
Fredericks, William J.; McSwain, Robert G.; Beaton, Brian F.; Klassman, David W.; Theodore, Colin R.
2017-01-01
Greased Lightning (GL-10) is an aircraft configuration that combines the characteristics of a cruise-efficient airplane with the ability to perform vertical takeoff and landing (VTOL). This aircraft has been designed, fabricated, and flight tested at the small unmanned aerial system (UAS) scale. This technical memorandum will document the procedures and findings of the flight test experiments. The GL-10 design utilized two key technologies to enable this unique aircraft: namely, distributed electric propulsion (DEP) and inexpensive closed-loop controllers. These technologies enabled the flight of this inherently unstable aircraft. Overall, it has been determined through flight testing that a design leveraging these new technologies can yield a useful, cruise-efficient VTOL aircraft.
High Resolution Model Intercomparison Project (HighResMIP v1.0) for CMIP6
NASA Astrophysics Data System (ADS)
Haarsma, Reindert J.; Roberts, Malcolm J.; Vidale, Pier Luigi; Senior, Catherine A.; Bellucci, Alessio; Bao, Qing; Chang, Ping; Corti, Susanna; Fučkar, Neven S.; Guemas, Virginie; von Hardenberg, Jost; Hazeleger, Wilco; Kodama, Chihiro; Koenigk, Torben; Leung, L. Ruby; Lu, Jian; Luo, Jing-Jia; Mao, Jiafu; Mizielinski, Matthew S.; Mizuta, Ryo; Nobre, Paulo; Satoh, Masaki; Scoccimarro, Enrico; Semmler, Tido; Small, Justin; von Storch, Jin-Song
2016-11-01
Robust projections and predictions of climate variability and change, particularly at regional scales, rely on the driving processes being represented with fidelity in model simulations. The role of enhanced horizontal resolution in improved process representation in all components of the climate system is of growing interest, particularly as some recent simulations suggest both the possibility of significant changes in large-scale aspects of circulation as well as improvements in small-scale processes and extremes. However, such high-resolution global simulations at climate timescales, with resolutions of at least 50 km in the atmosphere and 0.25° in the ocean, have been performed at relatively few research centres and generally without overall coordination, primarily due to their computational cost. Assessing the robustness of the response of simulated climate to model resolution requires a large multi-model ensemble using a coordinated set of experiments. The Coupled Model Intercomparison Project 6 (CMIP6) is the ideal framework within which to conduct such a study, due to the strong link to models being developed for the CMIP DECK experiments and other model intercomparison projects (MIPs). Increases in high-performance computing (HPC) resources, as well as the revised experimental design for CMIP6, now enable a detailed investigation of the impact of increased resolution up to synoptic weather scales on the simulated mean climate and its variability. The High Resolution Model Intercomparison Project (HighResMIP) presented in this paper applies, for the first time, a multi-model approach to the systematic investigation of the impact of horizontal resolution. A coordinated set of experiments has been designed to assess both a standard and an enhanced horizontal-resolution simulation in the atmosphere and ocean. 
The set of HighResMIP experiments is divided into three tiers consisting of atmosphere-only and coupled runs and spanning the period 1950-2050, with the possibility of extending to 2100, together with some additional targeted experiments. This paper describes the experimental set-up of HighResMIP, the analysis plan, the connection with the other CMIP6 endorsed MIPs, as well as the DECK and CMIP6 historical simulations. HighResMIP thereby focuses on one of the CMIP6 broad questions, "what are the origins and consequences of systematic model biases?", but we also discuss how it addresses the World Climate Research Program (WCRP) grand challenges.
Neutrinoless double-beta decay search with CUORE and CUORE-0 experiments
Moggi, N.; Artusa, D. R.; Avignone, F. T.; ...
2015-03-24
The Cryogenic Underground Observatory for Rare Events (CUORE) is an upcoming experiment designed to search for neutrinoless double-beta decay. Observation of this process would unambiguously establish that neutrinos are Majorana particles and provide information on their absolute mass scale and hierarchy. CUORE is now under construction and will consist of an array of 988 TeO2 crystal bolometers operated at 10 mK; the first tower (CUORE-0) is already taking data. The experimental techniques used will be presented, as well as the preliminary CUORE-0 results. The current status of the full-mass experiment and its expected sensitivity will then be discussed.
Garcia, John A.; Sanchez, Gabriel R.; Sanchez-Youngman, Shannon; Vargas, Edward D.; Ybarra, Vickie D.
2015-01-01
A growing body of social science research has sought to conceptualize race as a multidimensional concept in which context, societal relations, and institutional dynamics are key components. Utilizing a specially designed survey, we develop and use multiple measures of race (skin color, ascribed race, and discrimination experiences) to capture race as “lived experience” and assess their impact on Latinos’ self-rated health status. We model these measures of race as a lived experience to test the explanatory power of race, both independently and as an integrated scale with categorical regression, scaling, and dimensional analyses. Our analyses show that our multiple measures of race have significant and negative effects on Latinos’ self-reported health. Skin color is a dominant factor that impacts self-reported health both directly and indirectly. We then advocate for the utilization of multiple measures of race, adding to those used in our analysis, and their application to other health and social outcomes. Our analysis provides important contributions across a wide range of health, illness, social, and political outcomes for communities of color. PMID:26681972
Wang, Yanjun; Li, Haoyu; Liu, Xingbin; Zhang, Yuhui; Xie, Ronghua; Huang, Chunhui; Hu, Jinhai; Deng, Gang
2016-10-14
First, the measuring principle, the weight function, and the magnetic field of the novel downhole inserted electromagnetic flowmeter (EMF) are described. Second, the basic design of the EMF is described. Third, dynamic experiments with two EMFs in oil-water two-phase flow are carried out, and the experimental errors are analyzed in detail. The results show that the maximum absolute value of the full-scale error is better than 5% when the total flowrate is 5-60 m³/d and the water-cut is higher than 60%, and better than 7% when the total flowrate is 2-60 m³/d and the water-cut is higher than 70%. Finally, onsite experiments in high-water-cut oil-producing wells are conducted, and the possible reasons for the errors in the onsite experiments are analyzed. It is found that the EMF can provide an effective technology for measuring downhole oil-water two-phase flow.
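The "full-scale error" figure of merit quoted above normalizes the measurement error by the instrument's full-scale range rather than by the reading itself. A minimal sketch, using invented readings (not the study's data):

```python
def full_scale_error_pct(measured, reference, full_scale):
    """Error as a percentage of the full-scale range (not of the reading)."""
    return 100.0 * (measured - reference) / full_scale

# Hypothetical flowrate readings in m3/d against a 60 m3/d full-scale range.
pairs = [(30.5, 29.0), (58.0, 60.0), (10.2, 9.5)]
errors = [full_scale_error_pct(m, r, full_scale=60.0) for m, r in pairs]
within_5pct = all(abs(e) <= 5.0 for e in errors)  # the 5% criterion reported above
```

Because the denominator is fixed, a full-scale specification is more forgiving at low flowrates than a percent-of-reading specification would be.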
The active movement scale: an evaluative tool for infants with obstetrical brachial plexus palsy.
Curtis, Christine; Stephens, Derek; Clarke, Howard M; Andrews, David
2002-05-01
Newborns with peripheral nerve lesions involving the upper extremity are difficult to evaluate. The reliability of the Active Movement Scale (AMS), a tool for assessing motor function in infants with obstetrical brachial plexus palsy (OBPP), was examined in 2 complementary studies. Part A was an interrater reliability study in which 63 infants younger than 1 year with OBPP were independently evaluated by 2 physical therapists using the AMS. The scores were compared for reliability and controlled for chance agreement by using kappa statistics. Overall kappa analysis of the 15 tested movements showed a moderate strength of score agreement (kappa = 0.51). Quadratic-weighted kappa (kappa(quad)) statistics showed that 8 of the 15 movements tested were in the highest strength of agreement category (kappa(quad) = 0.81-1.00). Five movements showed substantial agreement (kappa(quad) = 0.61-0.80), and 2 movements had moderate agreement (kappa(quad) = 0.41-0.60). The overall kappa(quad) was 0.89. Part B was a variability study designed to examine the dispersion of scores when infants with OBPP were evaluated with the AMS by multiple raters. Ten pediatric physical therapists with varying degrees of experience using the scale attended a 1.5-hour instructional workshop on administration of the tool for infants with OBPP. A chain-block study design was used to obtain 30 assessments of 10 infants by 10 raters. A 2-way analysis of variance indicated that the variability of scores due to rater factors was low compared with the variability due to patient factors and that variation in scores due to rater experience was minimal. The results of part A indicate that the AMS is a reliable tool for the assessment of infants with OBPP when raters familiar with the scale are compared. The results of part B suggest that, with minimal training, raters with a range of experience using the AMS are able to reliably evaluate infants with upper-extremity paralysis.
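Quadratic-weighted kappa, the agreement statistic reported in Part A, penalizes rater disagreements by the squared distance between ordinal categories. A generic sketch (not the authors' code; the score vectors would be two therapists' ordinal AMS grades):

```python
import numpy as np

def quadratic_weighted_kappa(r1, r2, n_cats):
    """Quadratic-weighted Cohen's kappa for two raters' ordinal scores in 0..n_cats-1."""
    obs = np.zeros((n_cats, n_cats))
    for a, b in zip(r1, r2):
        obs[a, b] += 1
    obs /= obs.sum()
    expected = np.outer(obs.sum(axis=1), obs.sum(axis=0))  # chance agreement from marginals
    i, j = np.indices((n_cats, n_cats))
    w = (i - j) ** 2 / (n_cats - 1) ** 2                   # quadratic disagreement weights
    return 1.0 - (w * obs).sum() / (w * expected).sum()
```

Perfect agreement yields 1.0, chance-level agreement yields 0, and systematic disagreement drives the statistic below zero.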
Results from a scaled reactor cavity cooling system with water at steady state
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lisowski, D. D.; Albiston, S. M.; Tokuhiro, A.
We present a summary of steady-state experiments performed with a scaled, water-cooled Reactor Cavity Cooling System (RCCS) at the Univ. of Wisconsin - Madison. The RCCS concept is used for passive decay heat removal in the Next Generation Nuclear Plant (NGNP) design and was based on open literature on the GA-MHTGR, HTR-10, and AVR reactors. The RCCS is a 1/4-scale model of the full-scale prototype system, with a 7.6 m structure housing, a 5 m tall test section, and a 1,200 liter water storage tank. Radiant heaters impose a heat flux onto a three-riser-tube test section, representing a 5 deg. radial sector of the actual 360 deg. RCCS design. The maximum heat flux and power levels are 25 kW/m² and 42.5 kW, and the heaters can be configured for variable, axial, or radial power profiles to simulate prototypic conditions. Experimental results yielded measurements of local surface temperatures, internal water temperatures, volumetric flow rates, and pressure drop along the test section and into the water storage tank. The majority of the tests achieved a steady-state condition while remaining single-phase; a selected number of experiments were allowed to reach saturation and subsequently two-phase flow. RELAP5 simulations have been refined against the experimental data during test facility development and separate-effects validation of the facility. This test series represents the completion of our steady-state testing, with future experiments investigating normal and off-normal accident scenarios with two-phase flow effects. The ultimate goal of the project is to combine experimental data from UW - Madison, UI, ANL, and Texas A&M with system model simulations to ascertain the feasibility of the RCCS as a successful long-term heat removal system during accident scenarios for the NGNP. (authors)
Watershed Allied Telemetry Experimental Research
NASA Astrophysics Data System (ADS)
Li, Xin; Li, Xiaowen; Li, Zengyuan; Ma, Mingguo; Wang, Jian; Xiao, Qing; Liu, Qiang; Che, Tao; Chen, Erxue; Yan, Guangjian; Hu, Zeyong; Zhang, Lixin; Chu, Rongzhong; Su, Peixi; Liu, Qinhuo; Liu, Shaomin; Wang, Jindi; Niu, Zheng; Chen, Yan; Jin, Rui; Wang, Weizhen; Ran, Youhua; Xin, Xiaozhou; Ren, Huazhong
2009-11-01
The Watershed Allied Telemetry Experimental Research (WATER) is a simultaneous airborne, satellite-borne, and ground-based remote sensing experiment aiming to improve the observability, understanding, and predictability of hydrological and related ecological processes at a catchment scale. WATER consists of the cold region, forest, and arid region hydrological experiments as well as a hydrometeorology experiment and took place in the Heihe River Basin, a typical inland river basin in the northwest of China. The field campaigns have been completed, with intensive observation periods lasting from 7 March to 12 April, from 15 May to 22 July, and from 23 August to 5 September 2008: in total, 120 days. Twenty-five airborne missions were flown. Airborne sensors including microwave radiometers at L, K, and Ka bands, an imaging spectrometer, a thermal imager, CCD cameras, and lidar were used. Various satellite data were collected. Ground measurements were carried out at four scales, that is, key experimental area, foci experimental area, experiment site, and elementary sampling plot, using ground-based remote sensing instruments, a densified network of automatic meteorological stations, flux towers, and hydrological stations. On the basis of these measurements, the remote sensing retrieval models and algorithms of water cycle variables are to be developed or improved, and a catchment-scale land/hydrological data assimilation system is being developed. This paper reviews the background, scientific objectives, experiment design, field campaign implementation, and current status of WATER. The analysis of the data will continue over the next 2 years, and limited revisits to the field are anticipated.
Conceptual design of initial opacity experiments on the national ignition facility
NASA Astrophysics Data System (ADS)
Heeter, R. F.; Bailey, J. E.; Craxton, R. S.; Devolder, B. G.; Dodd, E. S.; Garcia, E. M.; Huffman, E. J.; Iglesias, C. A.; King, J. A.; Kline, J. L.; Liedahl, D. A.; McKenty, P. W.; Opachich, Y. P.; Rochau, G. A.; Ross, P. W.; Schneider, M. B.; Sherrill, M. E.; Wilson, B. G.; Zhang, R.; Perry, T. S.
2017-02-01
Accurate models of X-ray absorption and re-emission in partly stripped ions are necessary to calculate the structure of stars, the performance of hohlraums for inertial confinement fusion, and many other systems in high-energy-density plasma physics. Despite theoretical progress, a persistent discrepancy exists with recent experiments at the Sandia Z facility studying iron in conditions characteristic of the solar radiative-convective transition region. The increased iron opacity measured at Z could help resolve a longstanding issue with the standard solar model, but requires a radical departure for opacity theory. To replicate the Z measurements, an opacity experiment has been designed for the National Ignition Facility (NIF). The design uses established techniques scaled to NIF. A laser-heated hohlraum will produce X-ray-heated, uniform iron plasmas in local thermodynamic equilibrium (LTE) at solar-interior-relevant temperatures and electron densities of order 10²¹ cm⁻³. The iron will be probed using continuum X-rays emitted by a picosecond-scale, small-diameter source from a 2 mm diameter polystyrene (CH) capsule implosion. In this design, a subset of the NIF beams delivers 500 kJ to the millimeter-scale hohlraum, and the remaining beams directly drive the CH capsule with 200 kJ. Calculations indicate this capsule backlighter should outshine the iron sample, delivering a point-projection transmission opacity measurement to a time-integrated X-ray spectrometer viewing down the hohlraum axis. Preliminary experiments to develop the backlighter and hohlraum are underway, informing simulated measurements to guide the final design.
Review of design optimization methods for turbomachinery aerodynamics
NASA Astrophysics Data System (ADS)
Li, Zhihui; Zheng, Xinqian
2017-08-01
In today's competitive environment, new turbomachinery designs need to be not only more efficient, quieter, and "greener" but also developed on much shorter time scales and at lower costs. A number of advanced optimization strategies have been developed to achieve these requirements. This paper reviews recent progress in turbomachinery design optimization to solve real-world aerodynamic problems, especially for compressors and turbines. This review covers the following topics that are important for optimizing turbomachinery designs: (1) optimization methods; (2) stochastic optimization combined with blade parameterization methods and design-of-experiment methods; (3) gradient-based optimization methods for compressors and turbines; and (4) data-mining techniques for Pareto fronts. We also present our own insights regarding current research trends and the future optimization of turbomachinery designs.
NASA Technical Reports Server (NTRS)
Pavlock, Kate M.
2011-01-01
The National Aeronautics and Space Administration's Dryden Flight Research Center completed flight testing of adaptive controls research on the Full-Scale Advanced Systems Testbed (FAST) in January of 2011. The research addressed technical challenges involved with reducing risk in an increasingly complex and dynamic national airspace. Specific challenges lie with the development of validated, multidisciplinary, integrated aircraft control design tools and techniques to enable safe flight in the presence of adverse conditions such as structural damage, control surface failures, or aerodynamic upsets. The testbed is an F-18 aircraft serving as a full-scale vehicle to test and validate adaptive flight control research and lends significant confidence to the development, maturation, and acceptance process of incorporating adaptive control laws into follow-on research and the operational environment. The experimental systems integrated into FAST were designed to allow for flexible yet safe flight test evaluation and validation of modern adaptive control technologies and revolve around two major hardware upgrades: the modification of Production Support Flight Control Computers (PSFCC) and integration of two fourth-generation Airborne Research Test Systems (ARTS). Post-hardware-integration verification and validation provided the foundation for safe flight test of Nonlinear Dynamic Inversion and Model Reference Aircraft Control adaptive control law experiments. To ensure success of flight in terms of cost, schedule, and test results, emphasis on risk management was incorporated into early stages of design and flight test planning and continued through the execution of each flight test mission. Specific consideration was made to incorporate safety features within the hardware and software to alleviate user demands, as well as into test processes and training, to reduce human factor impacts to safe and successful flight test.
This paper describes the research configuration, experiment functionality, overall risk mitigation, flight test approach and results, and lessons learned of adaptive controls research of the Full-Scale Advanced Systems Testbed.
Let us keep observing and play in sand boxes (Henry Darcy Medal Lecture)
NASA Astrophysics Data System (ADS)
Illangasekare, T. H.
2012-04-01
Henry Darcy was a civil engineer recognized for a number of technical achievements and scientific discoveries. The sand column experiments for which he is known revealed the linear relationship that exists between fluid motion and driving forces at low velocities. Freeze and Back (1983) stated, "The experiments carried out by Darcy with the help of his assistant, Ritter, in Dijon, France in 1855 and 1856 represent the beginning of groundwater hydrology as a quantitative science." Because of the prominence given to this experiment, two important facts behind Darcy's contributions to subsurface hydrology have not received much attention. First, Darcy was not only a good engineer, but he was also a highly respected scientist whose knowledge of both the fundamentals of fluid mechanics and the natural world of geology led to better conceptualization and quantification of groundwater processes at scales relevant to solving practical problems. The experiments for which he is known may have already been conceived, based on his theoretical understanding, and the results were anticipated (Brown 2002). Second, Darcy, through his contributions with Dupuit, showed that they understood hydrogeology at a regional scale and developed methods for quantification at the scale of geologic strata (Ritz and Bobek, 2008). The primary thesis of this talk is that scientific contributions such as the one Darcy made require appreciation and a thorough understanding of fundamental theory coupled with observation and recording of phenomena both in nature and in the laboratory. 
Along with all of the significant theoretical, mathematical modeling, and computational advances we have made in the last several decades, laboratory experiments designed to observe phenomena and processes for better insight, accurate data generation, and hypothesis development are critically important to make scientific and engineering advances to address some of the emerging and societally important problems in hydrology and water resources engineering. Kleinhans et al. (2010) convincingly argued the same point, noting, "Many major issues of hydrology are open to experimental investigation." Current and emerging problems with water supply and their hydrologic implications are associated with sustainability of water as a resource for global food production, clean water for potable use, protection of human health, and impacts and implications of global warming and climate change on water resources. This talk will address the subsurface hydrologic science issues that are central to these problems and the role laboratory experimentation can play in helping to advance the basic knowledge. Improved understanding of fundamental flow, transport, reactive, and biological processes that occur at the pore-scale and their manifestation at different modeling and observational scales will continue to advance the subsurface science. Challenges also come from the need to integrate porous media systems with bio-geochemical and atmospheric systems, requiring observing and quantifying complex phenomena across interfaces (e.g., fluid/fluid in pores to land/atmospheric in the field). This talk will discuss how carefully designed and theory driven experiments at various test scales can play a central role in providing answers to critical scientific questions and how they will help to fill knowledge gaps. It will also be shown that careful observations will lead to the refinement of existing theories or the development of new ones. 
Focusing on the subsurface, the need to keep observing through controlled laboratory experimentation in various test scales from small cells to large sand boxes will be emphasized. How the insights obtained from such experiments will complement modeling and field investigations are highlighted through examples.
Reitzel, Lorraine R; Smith, Nathan Grant; Obasi, Ezemenari M; Forney, Margot; Leventhal, Adam M
2017-05-01
Sexual orientation-related discrimination experiences have been implicated in elevated rates of anxiety symptoms within sexual minority groups. Theory suggests that chronic discrimination experiences may dampen the ability to tolerate distress, increasing vulnerability for anxiety. This study examined the role of distress tolerance, or the capacity to withstand negative emotions, as a construct underlying associations between discriminatory experiences and anxiety among sexual minority adults. Participants (N = 119; mean age = 36.4 ± 14.8 years; 50% cisgender male, 31% cisgender female, 19% transgender; 37% non-Latino white) were recruited from Houston, Texas. Measures administered included the Heterosexist Harassment, Rejection, and Discrimination Scale (discrimination experiences), the Distress Tolerance Scale (distress tolerance), and the State-Trait Inventory for Cognitive and Somatic Anxiety (anxiety). The association of discrimination experiences and anxiety through distress tolerance was assessed using covariate-adjusted mediation modeling. Results indicated that sexual orientation-related discrimination experiences were significantly and positively associated with anxiety and that this association was mediated through lower distress tolerance. Significant indirect effects were specific to cognitive (versus somatic) anxiety symptoms. Results suggest that distress tolerance may be an explanatory mechanism in the association between discriminatory experiences and cognitive symptoms of anxiety and a potentially relevant target within clinical interventions to address anxiety-related health disparities among sexual minority adults. However, more sophisticated designs are needed to delineate causal associations. Copyright © 2016 Elsevier Ltd. All rights reserved.
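The mediation model described in this abstract tests whether the discrimination-anxiety association runs through distress tolerance. A minimal product-of-coefficients sketch (no covariates; the data are synthetic, with effect directions chosen only to mirror the reported pattern: more discrimination, lower tolerance, higher anxiety):

```python
import numpy as np

def slope(x, y):
    """Slope from an ordinary least-squares fit of y on x (with intercept)."""
    X = np.column_stack([np.ones_like(x), x])
    return np.linalg.lstsq(X, y, rcond=None)[0][1]

def indirect_effect(x, m, y):
    """a*b indirect effect: a = slope of m on x; b = slope of y on m, adjusting for x."""
    a = slope(x, m)
    X = np.column_stack([np.ones_like(x), x, m])
    b = np.linalg.lstsq(X, y, rcond=None)[0][2]
    return a * b

# Synthetic data: discrimination lowers tolerance; low tolerance raises anxiety.
rng = np.random.default_rng(0)
disc = rng.normal(size=200)
tol = -0.5 * disc + rng.normal(scale=0.3, size=200)
anx = -0.8 * tol + 0.2 * disc + rng.normal(scale=0.3, size=200)
ab = indirect_effect(disc, tol, anx)  # positive: two negative paths multiply
```

In practice the indirect effect's significance is assessed with bootstrap confidence intervals rather than a point estimate alone.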
Operation and Development Status of the Spacecraft Fire Experiments (Saffire)
NASA Technical Reports Server (NTRS)
Ruff, Gary A.; Urban, David L.
2016-01-01
Since 2012, a series of Spacecraft Fire Experiments (Saffire) have been under development by the Spacecraft Fire Safety Demonstration (SFS Demo) project, funded by NASA's Advanced Exploration Systems Division. The overall objective of this project is to reduce the uncertainty and risk associated with the design of spacecraft fire safety systems for NASA's exploration missions. The approach to achieving this goal has been to define, develop, and conduct experiments that address gaps in spacecraft fire safety knowledge and capabilities identified by NASA's Fire Safety System Maturation Team. The Spacecraft Fire Experiments (Saffire-I, -II, and -III) are material flammability tests at length scales that are realistic for a spacecraft fire in low-gravity. The specific objectives of these three experiments are to (1) determine how rapidly a large scale fire grows in low-gravity and (2) investigate the low-g flammability limits compared to those obtained in NASA's normal gravity material flammability screening test. The experiments will be conducted in Orbital ATK's Cygnus vehicle after it has unberthed from the International Space Station. The tests will be fully automated with the data downlinked at the conclusion of the test before the Cygnus vehicle reenters the atmosphere. This paper discusses the status of the Saffire-I, II, and III experiments followed by a review of the fire safety technology gaps that are driving the development of objectives for the next series of experiments, Saffire-IV, V, and VI.
Brühlmann, David; Sokolov, Michael; Butté, Alessandro; Sauer, Markus; Hemberger, Jürgen; Souquet, Jonathan; Broly, Hervé; Jordan, Martin
2017-07-01
Rational and high-throughput optimization of mammalian cell culture media has a great potential to modulate recombinant protein product quality. We present a process design method based on parallel design-of-experiment (DoE) of CHO fed-batch cultures in 96-deepwell plates to modulate monoclonal antibody (mAb) glycosylation using medium supplements. To reduce the risk of losing valuable information in an intricate joint screening, 17 compounds were separated into five different groups, considering their mode of biological action. The concentration ranges of the medium supplements were defined according to information encountered in the literature and in-house experience. The screening experiments produced wide glycosylation pattern ranges. Multivariate analysis including principal component analysis and decision trees was used to select the best performing glycosylation modulators. Subsequent D-optimal quadratic design with four factors (three promising compounds and temperature shift) in shake tubes confirmed the outcome of the selection process and provided a solid basis for sequential process development at a larger scale. The glycosylation profile with respect to the specifications for biosimilarity was greatly improved in shake tube experiments: 75% of the conditions were equally close or closer to the specifications for biosimilarity than the best 25% in 96-deepwell plates. Biotechnol. Bioeng. 2017;114: 1448-1458. © 2017 Wiley Periodicals, Inc.
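The multivariate-analysis step (principal component analysis over glycosylation responses) can be sketched generically via SVD; the data matrix below is random stand-in data, not the study's measurements:

```python
import numpy as np

def pca(X, k=2):
    """First k principal components via SVD of the mean-centered matrix."""
    Xc = X - X.mean(axis=0)
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    scores = U[:, :k] * s[:k]        # condition coordinates on the components
    explained = s**2 / (s**2).sum()  # variance fraction per component
    return scores, Vt[:k], explained[:k]

# Hypothetical screen: rows = supplement conditions, columns = glycan fractions.
rng = np.random.default_rng(1)
X = rng.normal(size=(24, 5))
scores, loadings, explained = pca(X, k=2)
```

Plotting the condition scores and overlaying the variable loadings (a biplot) is a common way to see which supplements pull the glycan fractions toward a target profile.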
Donovan, M A; Drasgow, F; Munson, L J
1998-10-01
The Perceptions of Fair Interpersonal Treatment (PFIT) scale was designed to assess employees' perceptions of the interpersonal treatment in their work environment. Analyses of the factor structure and reliability of this new instrument indicate that the PFIT scale is a reliable instrument composed of 2 factors: supervisor treatment and coworker treatment. It was hypothesized that the PFIT scale would be positively correlated with job satisfaction variables and negatively correlated with work withdrawal, job withdrawal, experiences of sexual harassment, and an organization's tolerance of sexual harassment. Results based on 509 employees in a private-sector organization and 217 female faculty and staff members at a large midwestern university supported these hypotheses. Arguments that common method variance and employees' dispositions are responsible for the significant correlations between the PFIT scale and other job-related variables were eliminated. The implications of these results are discussed.
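Scale-reliability analyses like the one reported for the PFIT typically include an internal-consistency coefficient. A generic Cronbach's alpha sketch (not tied to the PFIT data; rows are respondents, columns are items):

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha from an (n_respondents, n_items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_var_sum = items.var(axis=0, ddof=1).sum()  # sum of per-item variances
    total_var = items.sum(axis=1).var(ddof=1)       # variance of total scores
    return (k / (k - 1)) * (1.0 - item_var_sum / total_var)
```

When a scale has multiple factors, as the PFIT does, alpha is usually reported separately for each factor's items (here, supervisor treatment and coworker treatment).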