Large area crop inventory experiment crop assessment subsystem software requirements document
NASA Technical Reports Server (NTRS)
1975-01-01
The functional data processing requirements are described for the Crop Assessment Subsystem of the Large Area Crop Inventory Experiment. These requirements are used as a guide for software development and implementation.
NASA Technical Reports Server (NTRS)
1979-01-01
The performance, design and verification requirements for the space Construction Automated Fabrication Experiment (SCAFE) are defined. The SCAFE program defines, develops, and demonstrates the techniques, processes, and equipment required for the automatic fabrication of structural elements in space and for the assembly of such elements into a large, lightweight structure. The program defines a large structural platform to be constructed in orbit using the space shuttle as a launch vehicle and construction base.
NASA Technical Reports Server (NTRS)
Murphy, J. D.; Dideriksen, R. I.
1975-01-01
The application of remote sensing technology by the U.S. Department of Agriculture (USDA) is examined. The activities of the USDA Remote-Sensing User Requirement Task Force which include cataloging USDA requirements for earth resources data, determining those requirements that would return maximum benefits by using remote sensing technology and developing a plan for acquiring, processing, analyzing, and distributing data to satisfy those requirements are described. Emphasis is placed on the large area crop inventory experiment and its relationship to the task force.
Control of Flexible Structures (COFS) Flight Experiment Background and Description
NASA Technical Reports Server (NTRS)
Hanks, B. R.
1985-01-01
A fundamental problem in designing and delivering large space structures to orbit is to provide sufficient structural stiffness and static configuration precision to meet performance requirements. These requirements are directly related to control requirements and the degree of control system sophistication available to supplement the as-built structure. Background and rationale are presented for a research study in structures, structural dynamics, and controls using a relatively large, flexible beam as a focus. This experiment would address fundamental problems applicable to large, flexible space structures in general and would involve a combination of ground tests, flight behavior prediction, and instrumented orbital tests. Intended to be multidisciplinary but basic within each discipline, the experiment should provide improved understanding and confidence in making design trades between structural conservatism and control system sophistication for meeting static shape and dynamic response/stability requirements. Quantitative results should be obtained for use in improving the validity of ground tests for verifying flight performance analyses.
NASA Technical Reports Server (NTRS)
Brumfield, M. L. (Compiler)
1984-01-01
A plan to develop a space technology experiments platform (STEP) was examined. NASA Langley Research Center held a STEP Experiment Requirements Workshop on June 29-30 and July 1, 1983, at which experiment proposers were invited to present more detailed information on their experiment concepts and requirements. A feasibility and preliminary definition study was conducted, and the preliminary definition of STEP capabilities, experiment concepts, and expected requirements for support services is presented. The preliminary definition of STEP capabilities is based on a detailed review of potential experiment requirements. Topics discussed include: Shuttle on-orbit dynamics; effects of the space environment on damping materials; an erectable beam experiment; technology for development of very large solar array deployers; a thermal energy management process experiment; photovoltaic concentrator pointing dynamics and plasma interactions; vibration isolation technology; and flight tests of a synthetic aperture radar antenna with use of STEP.
Gene Expression Analysis: Teaching Students to Do 30,000 Experiments at Once with Microarray
ERIC Educational Resources Information Center
Carvalho, Felicia I.; Johns, Christopher; Gillespie, Marc E.
2012-01-01
Genome scale experiments routinely produce large data sets that require computational analysis, yet there are few student-based labs that illustrate the design and execution of these experiments. In order for students to understand and participate in the genomic world, teaching labs must be available where students generate and analyze large data…
NASA Astrophysics Data System (ADS)
Matter, John; Gnanvo, Kondo; Liyanage, Nilanga; Solid Collaboration; Moller Collaboration
2017-09-01
The JLab Parity Violation In Deep Inelastic Scattering (PVDIS) experiment will use the upgraded 12 GeV beam and proposed Solenoidal Large Intensity Device (SoLID) to measure the parity-violating electroweak asymmetry in DIS of polarized electrons with high precision in order to search for physics beyond the Standard Model. Unlike many prior Parity-Violating Electron Scattering (PVES) experiments, PVDIS is a single-particle tracking experiment. Furthermore the experiment's high luminosity combined with the SoLID spectrometer's open configuration creates high-background conditions. As such, the PVDIS experiment has the most demanding tracking detector needs of any PVES experiment to date, requiring precision detectors capable of operating at high-rate conditions in PVDIS's full production luminosity. Developments in large-area GEM detector R&D and SoLID simulations have demonstrated that GEMs provide a cost-effective solution for PVDIS's tracking needs. The integrating-detector-based JLab Measurement Of Lepton Lepton Electroweak Reaction (MOLLER) experiment requires high-precision tracking for acceptance calibration. Large-area GEMs will be used as tracking detectors for MOLLER as well. The conceptual designs of GEM detectors for the PVDIS and MOLLER experiments will be presented.
Industrial metrology as applied to large physics experiments
DOE Office of Scientific and Technical Information (OSTI.GOV)
Veal, D.
1993-05-01
A physics experiment is a large, complex 3-D object (typically 1200 m^3, 35,000 tonnes) with sub-millimetric alignment requirements. Two generic survey alignment tasks can be identified: first, an iterative positioning of the apparatus subsystems in space and, second, a quantification of as-built parameters. The most convenient measurement technique is industrial triangulation, but the complexity of the measured object and measurement environment constraints frequently require a more sophisticated approach. To enlarge the "survey alignment toolbox," measurement techniques commonly associated with other disciplines such as geodesy, applied geodesy for accelerator alignment, and mechanical engineering are also used. Disparate observables require a heavy reliance on least squares programs for campaign pre-analysis and calculation. This paper will offer an introduction to the alignment of physics experiments and will identify trends for the next generation of SSC experiments.
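As a rough illustration of the least squares adjustment mentioned above, the following sketch (plain NumPy, not the survey software actually used for such experiments; station coordinates and observed distances are invented) adjusts a single 2-D point from redundant distance observations by Gauss-Newton iteration:

# Illustrative sketch: least-squares adjustment of one 2-D point from
# redundant distance observations to known survey stations.
import numpy as np

stations = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0], [10.0, 10.0]])  # known coordinates (m)
d_obs = np.array([7.07, 7.07, 7.07, 7.07])   # measured distances (m), redundant set
x = np.array([4.0, 4.0])                      # initial guess for the unknown point

for _ in range(10):                           # Gauss-Newton iterations
    diff = x - stations                       # vectors from each station to the point
    d_calc = np.linalg.norm(diff, axis=1)     # computed distances
    A = diff / d_calc[:, None]                # Jacobian of the distances w.r.t. the point
    r = d_obs - d_calc                        # observation residuals
    dx, *_ = np.linalg.lstsq(A, r, rcond=None)  # least-squares correction
    x += dx
    if np.linalg.norm(dx) < 1e-9:
        break

rms = np.sqrt(np.mean((d_obs - np.linalg.norm(x - stations, axis=1)) ** 2))
print("adjusted point:", x, "rms residual (m):", rms)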
Trumbo, Michael C; Leiting, Kari A; McDaniel, Mark A; Hodge, Gordon K
2016-06-01
A robust finding within laboratory research is that structuring information as a test confers a benefit on long-term retention, referred to as the testing effect. Although well characterized in laboratory environments, the testing effect has been explored infrequently within ecologically valid contexts. We conducted a series of 3 experiments within a very large introductory college-level course. Experiment 1 examined the impact of required versus optional frequent low-stakes testing (quizzes) on student grades, revealing that students were much more likely to take advantage of quizzing if it was a required course component. Experiment 2 implemented a method of evaluating pedagogical intervention within a single course (thereby controlling for instructor bias and student self-selection), which revealed a testing effect. Experiment 3 ruled out additional exposure to information as an explanation for the findings of Experiment 2 and suggested that students at the college level, enrolled in very large sections, accept frequent quizzing well. (PsycINFO Database Record (c) 2016 APA, all rights reserved).
Wind-turbine-performance assessment
NASA Astrophysics Data System (ADS)
Vachon, W. A.
1982-06-01
An updated summary of recent test data and experiences is reported from both federally and privately funded large wind turbine (WT) development and test programs, and from key WT programs in Europe. Progress and experiences on the cluster of three MOD-2 2.5-MW WTs, the MOD-1 2-MW WT, and other WT installations are described. An examination of recent test experiences and plans from approximately five privately funded large WT programs in the United States indicates that, during machine checkout and startup, a number of technical problems are identified which will require design changes and create program delays.
2005-06-01
The Office of Personnel Management (OPM) is issuing this final regulation to amend the Federal Employees Health Benefits Acquisition Regulation (FEHBAR). It establishes requirements, including audit, for Federal Employees Health Benefits Program (FEHB) experience-rated carriers' Large Provider Agreements. It also modifies the dollar threshold for review of carriers' subcontract agreements; revises the definitions of Cost or Pricing Data and Experience-rate to reflect mental health parity requirements; updates the contract records retention requirement; updates the FEHB Clause Matrix; and conforms subpart and paragraph references to Federal Acquisition Regulation (FAR) revisions made since we last updated the FEHBAR.
Big questions, big science: meeting the challenges of global ecology.
Schimel, David; Keller, Michael
2015-04-01
Ecologists are increasingly tackling questions that require significant infrastructure, large experiments, networks of observations, and complex data and computation. Key hypotheses in ecology increasingly require more investment, and larger data sets to be tested, than can be collected by a single investigator's or a group of investigators' labs, sustained for longer than a typical grant. Large-scale projects are expensive, so their scientific return on the investment has to justify the opportunity cost: the science foregone because resources were expended on a large project rather than supporting a number of individual projects. In addition, their management must be accountable and efficient in the use of significant resources, requiring the use of formal systems engineering and project management to mitigate the risk of failure. Mapping the scientific method into formal project management requires both scientists able to work in that context and a project implementation team sensitive to the unique requirements of ecology. Sponsoring agencies, under pressure from external and internal forces, are pushed towards counterproductive project management, but a scientific community aware of and experienced in large-project science can mitigate these tendencies. For big ecology to result in great science, ecologists must become informed, aware, and engaged in the advocacy and governance of large ecological projects.
Ultra high molecular weight polyethylene: Optical features at millimeter wavelengths
NASA Astrophysics Data System (ADS)
D'Alessandro, G.; Paiella, A.; Coppolecchia, A.; Castellano, M. G.; Colantoni, I.; de Bernardis, P.; Lamagna, L.; Masi, S.
2018-05-01
The next generation of experiments for the measurement of the Cosmic Microwave Background (CMB) increasingly requires the use of advanced materials with specific physical and structural properties. An example is the material used for the receiver's cryostat windows and internal lenses. The large throughput of current CMB experiments requires a large diameter (of the order of 0.5 m) for these parts, resulting in heavy structural and optical requirements on the material to be used. Ultra High Molecular Weight (UHMW) polyethylene (PE) features high resistance to traction and good transmissivity in the frequency range of interest. In this paper, we discuss the possibility of using UHMW PE for windows and lenses in experiments working at millimeter wavelengths, by measuring its optical properties: emissivity, transmission and refraction index. Our measurements show that the material is well suited to this purpose.
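For orientation, the sketch below applies the standard incoherent-slab relation one would use to estimate the expected in-band transmission of such a window from its refractive index and loss tangent; the numerical values are assumptions chosen for illustration, not results from the paper:

# Illustrative estimate (assumed numbers): normal-incidence transmission of a
# thick dielectric window from its refractive index and loss tangent.
import numpy as np

n = 1.52           # assumed refractive index of polyethylene at mm wavelengths
tan_delta = 3e-4   # assumed loss tangent
d = 0.03           # window thickness (m)
wavelength = 2e-3  # 2 mm, i.e. 150 GHz

R = ((n - 1.0) / (n + 1.0)) ** 2                  # single-surface Fresnel reflectance
alpha = 2.0 * np.pi * n * tan_delta / wavelength  # power absorption coefficient (1/m), small-loss limit
T = (1.0 - R) ** 2 * np.exp(-alpha * d) / (1.0 - (R * np.exp(-alpha * d)) ** 2)  # incoherent slab

print(f"R per surface = {R:.3f}, absorption loss = {1 - np.exp(-alpha * d):.3%}, T = {T:.3f}")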
Two-phase reduced gravity experiments for a space reactor design
NASA Technical Reports Server (NTRS)
Antoniak, Zenen I.
1987-01-01
Researchers envision future space missions using large nuclear reactors with either a single-phase or a two-phase alkali-metal working fluid. The design and analysis of such reactors require state-of-the-art computer codes that can properly treat alkali-metal flow and heat transfer in a reduced-gravity environment. New flow regime maps, models, and correlations are required if the codes are to be successfully applied to reduced-gravity flow and heat transfer. General plans are put forth for the reduced-gravity experiments which will have to be performed, at NASA facilities, with benign fluids. Data from the reduced-gravity experiments with innocuous fluids are to be combined with normal-gravity data from two-phase alkali-metal experiments. Because these reduced-gravity experiments will be very basic, and will employ small test loops of simple geometry, a large measure of commonality exists between them and experiments planned by other organizations. It is recommended that a committee be formed to coordinate all ongoing and planned reduced-gravity flow experiments.
Detonation failure characterization of non-ideal explosives
NASA Astrophysics Data System (ADS)
Janesheski, Robert S.; Groven, Lori J.; Son, Steven
2012-03-01
Non-ideal explosives are currently poorly characterized, which limits efforts to model them. Current characterization requires large-scale testing to obtain steady detonation wave data for analysis, owing to the relatively thick reaction zones. A microwave interferometer applied to small-scale confined transient experiments is being implemented to allow time-resolved characterization of a failing detonation. The microwave interferometer measures the position of a failing detonation wave in a tube that is initiated with a booster charge. Experiments have been performed with ammonium nitrate and various fuel compositions (diesel fuel and mineral oil). It was observed that the failure dynamics are influenced by factors such as chemical composition and confiner thickness. Future work is planned to calibrate models to these small-scale experiments and eventually validate the models with available large-scale experiments. This experiment is shown to be repeatable, shows dependence on reactive properties, and can be performed with little required material.
Mary Beth Adams
2018-01-01
To better understand the impacts of a changing environment and interactions with forest management options for forest resources, including soil, large long-term experiments are required. Such experiments require careful documentation of reference or pre-experimental conditions. This publication describes the Middle Mountain Long-term Soil Productivity (LTSP) Study,...
Development of a verification program for deployable truss advanced technology
NASA Technical Reports Server (NTRS)
Dyer, Jack E.
1988-01-01
Use of large deployable space structures to satisfy the growth demands of space systems is contingent upon reducing the associated risks that pervade many related technical disciplines. The overall objectives of this program were to develop a detailed plan to verify deployable truss advanced technology applicable to future large space structures and to develop a preliminary design of a deployable truss reflector/beam structure for use as a technology demonstration test article. The planning is based on a Shuttle flight experiment program using deployable 5 and 15 meter aperture tetrahedral truss reflectors and a 20 m long deployable truss beam structure. The plan addresses validation of analytical methods, the degree to which ground testing adequately simulates flight, and in-space testing requirements for large precision antenna designs. Based on an assessment of future NASA and DOD space system requirements, the program was developed to verify four critical technology areas: deployment, shape accuracy and control, pointing and alignment, and articulation and maneuvers. The flight experiment technology verification objectives can be met using two Shuttle flights with the total experiment integrated on a single Shuttle Test Experiment Platform (STEP) and a Mission Peculiar Experiment Support Structure (MPESS). First flight of the experiment can be achieved 60 months after go-ahead with a total program duration of 90 months.
NASA Technical Reports Server (NTRS)
Ross, R. G., Jr.
1982-01-01
The Jet Propulsion Laboratory has developed a number of photovoltaic test and measurement specifications to guide the development of modules toward the requirements of future large-scale applications. Experience with these specifications and the extensive module measurement and testing that has accompanied their use is examined. Conclusions are drawn relative to three aspects of product certification: performance measurement, endurance testing and safety evaluation.
NASA Technical Reports Server (NTRS)
1976-01-01
Results of studies performed on the magnetospheric and plasma portion of the AMPS are presented. Magnetospheric and plasma in-space experiments and instruments are described along with packaging (palletization) concepts. The described magnetospheric and plasma experiments were considered as separate entities. Instrumentation requirements and operations were formulated to provide sufficient data for unambiguous interpretation of results without relying upon other experiments of the series. Where ground observations are specified, an assumption was made that large-scale additions or modifications to existing facilities were not required.
The CREAM-CE: First experiences, results and requirements of the four LHC experiments
NASA Astrophysics Data System (ADS)
Mendez Lorenzo, Patricia; Santinelli, Roberto; Sciaba, Andrea; Thackray, Nick; Shiers, Jamie; Renshall, Harry; Sgaravatto, Massimo; Padhi, Sanjay
2010-04-01
In terms of the gLite middleware, the current LCG-CE used by the four LHC experiments is about to be deprecated. The new CREAM-CE service (Computing Resource Execution And Management) has been approved to replace the previous service. CREAM-CE is a lightweight service created to handle job management operations at the CE level. It is able to accept requests both via the gLite WMS service and via direct submission for transmission to the local batch system. This flexible duality provides the experiments with a large degree of freedom to adapt the service to their own computing models, but at the same time it requires a careful follow-up of the experiments' requirements and tests to ensure that their needs are fulfilled before real data taking. In this paper we present the current testing results of the four LHC experiments concerning this new service. The operations procedures, which have been elaborated together with the experiment support teams, are discussed. Finally, the experiments' requirements and the expectations for both the sites and the service itself are described in detail.
Group Size Effect on Cooperation in One-Shot Social Dilemmas II: Curvilinear Effect.
Capraro, Valerio; Barcelo, Hélène
2015-01-01
In a world in which many pressing global issues require large-scale cooperation, understanding the group size effect on cooperative behavior is a topic of central importance. Yet the nature of this effect remains largely unknown, with lab experiments variously finding it to be positive, negative, or null, and field experiments suggesting that it is instead curvilinear. Here we shed light on this apparent contradiction by considering a novel class of public goods games inspired by the realistic scenario in which the natural output limits of the public good imply that the benefit of cooperation increases fast for early contributions and then decelerates. We report on a large lab experiment providing evidence that, in this case, group size has a curvilinear effect on cooperation, according to which intermediate-size groups cooperate more than smaller groups and more than larger groups. In doing so, our findings help fill the gap between lab experiments and field experiments and suggest concrete ways to promote large-scale cooperation among people.
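The toy sketch below illustrates the kind of saturating benefit function described above; the functional form and parameters are hypothetical, not the game actually used in the experiment. It shows how the shared marginal gain from one extra cooperator shrinks as the group grows:

# Toy illustration (hypothetical functional form): a public goods payoff whose
# benefit rises fast for early contributions and then saturates.
import numpy as np

def payoff_cooperator(k, n, cost=1.0, b_max=10.0, k0=3.0):
    """Payoff of a cooperator when k of n players contribute."""
    benefit = b_max * (1.0 - np.exp(-k / k0))  # fast early growth, then saturation
    return benefit / n - cost                   # equal share of the good minus the contribution cost

for n in (3, 6, 12, 24):
    # marginal shared gain produced by a second cooperator, for each group size
    gain = payoff_cooperator(2, n) - payoff_cooperator(1, n)
    print(f"group size {n:2d}: marginal shared gain of a 2nd cooperator = {gain:.3f}")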
Space construction system analysis. Part 2: Space construction experiments concepts
NASA Technical Reports Server (NTRS)
Boddy, J. A.; Wiley, L. F.; Gimlich, G. W.; Greenberg, H. S.; Hart, R. J.; Lefever, A. E.; Lillenas, A. N.; Totah, R. S.
1980-01-01
Technology areas in the orbital assembly of large space structures are addressed. The areas included structures, remotely operated assembly techniques, and control and stabilization. Various large space structure design concepts are reviewed and their construction procedures and requirements are identified.
CERN data services for LHC computing
NASA Astrophysics Data System (ADS)
Espinal, X.; Bocchi, E.; Chan, B.; Fiorot, A.; Iven, J.; Lo Presti, G.; Lopez, J.; Gonzalez, H.; Lamanna, M.; Mascetti, L.; Moscicki, J.; Pace, A.; Peters, A.; Ponce, S.; Rousseau, H.; van der Ster, D.
2017-10-01
Dependability, resilience, adaptability, and efficiency: growing requirements call for tailored storage services and novel solutions. Unprecedented volumes of data coming from the broad number of experiments at CERN need to be quickly available in a highly scalable way for large-scale processing and data distribution, while in parallel they are routed to tape for long-term archival. These activities are critical for the success of HEP experiments. Nowadays we operate at high incoming throughput (14 GB/s during the 2015 LHC Pb-Pb run and 11 PB in July 2016) and with concurrent complex production workloads. In parallel, our systems provide the platform for continuous user- and experiment-driven workloads for large-scale data analysis, including end-user access and sharing. The storage services at CERN cover the needs of our community: EOS and CASTOR as large-scale storage; CERNBox for end-user access and sharing; Ceph as the data back-end for the CERN OpenStack infrastructure, NFS services and S3 functionality; and AFS for legacy distributed-file-system services. In this paper we summarise the experience in supporting LHC experiments and the transition of our infrastructure from static monolithic systems to flexible components providing a more coherent environment, with pluggable protocols, tuneable QoS, sharing capabilities, and fine-grained ACL management, while continuing to guarantee dependable and robust services.
LDR structural experiment definition
NASA Technical Reports Server (NTRS)
Russell, R. A.
1988-01-01
A system study to develop the definition of a structural flight experiment for a large precision segmented reflector on the Space Station was accomplished by the Boeing Aerospace Company for NASA's Langley Research Center. The objective of the study was to use a Large Deployable Reflector (LDR) baseline configuration as the basis for focusing an experiment definition, so that the resulting accommodation requirements and interface constraints could be used as part of the mission requirements data base for Space Station. The primary objectives of the first experiment are to construct the primary mirror support truss and to determine its structural and thermal characteristics. Addition of an optical bench, thermal shield and primary mirror segments, and alignment of the optical components, would occur on a second experiment. The structure would then be moved to the payload pointing system for pointing, optical control, and scientific optical measurement for a third experiment. Experiment 1 will deploy the primary support truss while it is attached to the instrument module structure. The ability to adjust the mirror attachment points and to attach several dummy primary mirror segments with a robotic system will also be demonstrated. Experiment 2 will be achieved by adding new components and equipment to experiment one. Experiment 3 will demonstrate advanced control strategies, active adjustment of the primary mirror alignment, and technologies associated with optical sensing.
Transfer of movement sequences: bigger is better.
Dean, Noah J; Kovacs, Attila J; Shea, Charles H
2008-02-01
Experiment 1 was conducted to determine if proportional transfer from "small to large" scale movements is as effective as transferring from "large to small." We hypothesize that the learning of larger scale movement will require the participant to learn to manage the generation, storage, and dissipation of forces better than when practicing smaller scale movements. Thus, we predict an advantage for transfer of larger scale movements to smaller scale movements relative to transfer from smaller to larger scale movements. Experiment 2 was conducted to determine if adding a load to a smaller scale movement would enhance later transfer to a larger scale movement sequence. It was hypothesized that the added load would require the participants to consider the dynamics of the movement to a greater extent than without the load. The results replicated earlier findings of effective transfer from large to small movements, but consistent with our hypothesis, transfer was less effective from small to large (Experiment 1). However, when a load was added during acquisition, transfer from small to large was enhanced even though the load was removed during the transfer test. These results are consistent with the notion that the transfer asymmetry noted in Experiment 1 was due to factors related to movement dynamics that were enhanced during practice of the larger scale movement sequence, but not during the practice of the smaller scale movement sequence. The finding that the movement structure is unaffected by transfer direction but that the movement dynamics are influenced by transfer direction is consistent with hierarchical models of sequence production.
Remote experimental site concept development
NASA Astrophysics Data System (ADS)
Casper, Thomas A.; Meyer, William; Butner, David
1995-01-01
Scientific research is now often conducted on large and expensive experiments that utilize collaborative efforts on a national or international scale to explore physics and engineering issues. This is particularly true for the current US magnetic fusion energy program, where collaboration on existing facilities has increased in importance and will form the basis for future efforts. As fusion energy research approaches reactor conditions, the trend is towards fewer large and expensive experimental facilities, leaving many major institutions without local experiments. Since the expertise of various groups is a valuable resource, it is important to integrate these teams into an overall scientific program. To sustain continued involvement in experiments, scientists are now often required to travel frequently, or to move their families, to the new large facilities. This problem is common to many other fields of scientific research. The next-generation tokamaks, such as the Tokamak Physics Experiment (TPX) or the International Thermonuclear Experimental Reactor (ITER), will operate in steady-state or long pulse mode and produce fluxes of fusion reaction products sufficient to activate the surrounding structures. As a direct consequence, remote operation requiring robotics and video monitoring will become necessary, with only brief and limited access to the vessel area allowed. Even the on-site control room, data acquisition facilities, and work areas will be remotely located from the experiment, isolated by large biological barriers, and connected with fiber-optics. Current planning for the ITER experiment includes a network of control room facilities to be located in the countries of the four major international partners: the USA, the Russian Federation, Japan, and the European Community.
USDA-ARS?s Scientific Manuscript database
Insect diets are often complex mixtures of vitamins, salts, preservatives, and nutrients (carbohydrates, lipids and proteins). To determine the effect of varying the doses of multiple components, the traditional approach requires large factorial experiments resulting in very large numbers of treat...
NASA Astrophysics Data System (ADS)
Agnew, Donald L.; Vinkey, Victor F.; Runge, Fritz C.
1989-04-01
A study was conducted to determine how the Large Deployable Reflector (LDR) might benefit from the use of the space station for assembly, checkout, deployment, servicing, refurbishment, and technology development. Requirements that must be met by the space station to supply benefits for a selected scenario are summarized. Quantitative and qualitative data are supplied. Space station requirements for LDR which may be utilized by other missions are identified. A technology development mission for LDR is outlined and requirements summarized. A preliminary experiment plan is included. Space Station Data Base SAA 0020 and TDM 2411 are updated.
NASA Technical Reports Server (NTRS)
Agnew, Donald L.; Vinkey, Victor F.; Runge, Fritz C.
1989-01-01
A study was conducted to determine how the Large Deployable Reflector (LDR) might benefit from the use of the space station for assembly, checkout, deployment, servicing, refurbishment, and technology development. Requirements that must be met by the space station to supply benefits for a selected scenario are summarized. Quantitative and qualitative data are supplied. Space station requirements for LDR which may be utilized by other missions are identified. A technology development mission for LDR is outlined and requirements summarized. A preliminary experiment plan is included. Space Station Data Base SAA 0020 and TDM 2411 are updated.
Role of substrate quality on IC performance and yields
NASA Technical Reports Server (NTRS)
Thomas, R. N.
1981-01-01
The development of silicon and gallium arsenide crystal growth for the production of large diameter substrates is discussed. Large area substrates of significantly improved compositional purity, dopant distribution, and structural perfection on a microscopic as well as macroscopic scale are important requirements. The exploratory use of magnetic fields to suppress convection effects in Czochralski crystal growth is addressed. The growth of large crystals in space appears impractical at present; however, the efforts to improve substrate quality could benefit from the experience gained in smaller scale growth experiments conducted in the zero gravity environment of space.
Optimal diet for production of normative adults of the Diaprepes root weevil, Diaprepes abbreviatus
USDA-ARS?s Scientific Manuscript database
Insect diets are complex mixtures of vitamins, salts, preservatives, and nutrients (carbohydrates, lipids and proteins). To determine the effect of varying the doses of multiple components, the traditional approach requires large factorial experiments resulting in very large numbers of treatment com...
LSST system analysis and integration task for an advanced science and application space platform
NASA Technical Reports Server (NTRS)
1980-01-01
To support the development of an advanced science and application space platform (ASASP), requirements were defined for a representative set of payloads requiring large separation distances, selected from the Science and Applications Space Platform data base. These payloads were a 100 meter diameter atmospheric gravity wave antenna, a 100 meter by 100 meter particle beam injection experiment, a 2 meter diameter, 18 meter long astrometric telescope, and a 15 meter diameter, 35 meter long large ambient deployable IR telescope. A low earth orbit at 500 km altitude and 56 deg inclination was selected as being the best compromise for meeting payload requirements. Platform subsystems were defined which would support the payload requirements, and a physical platform concept was developed. Structural system requirements, which included utilities accommodation, interface requirements, and platform strength and stiffness requirements, were developed. An attitude control system concept was also described. The resultant ASASP concept was analyzed, and technological developments deemed necessary in the area of large space systems were recommended.
Multiple-rotor-cycle 2D PASS experiments with applications to (207)Pb NMR spectroscopy.
Vogt, F G; Gibson, J M; Aurentz, D J; Mueller, K T; Benesi, A J
2000-03-01
The two-dimensional phase-adjusted spinning sidebands (2D PASS) experiment is a useful technique for simplifying magic-angle spinning (MAS) NMR spectra that contain overlapping or complicated spinning sideband manifolds. The pulse sequence separates spinning sidebands by their order in a two-dimensional experiment. The result is an isotropic/anisotropic correlation experiment, in which a sheared projection of the 2D spectrum effectively yields an isotropic spectrum with no sidebands. The original 2D PASS experiment works best at lower MAS speeds (1-5 kHz). At higher spinning speeds (8-12 kHz) the experiment requires higher RF power levels so that the pulses do not overlap. In the case of nuclei such as (207)Pb, a large chemical shift anisotropy often yields too many spinning sidebands to be handled by a reasonable 2D PASS experiment unless higher spinning speeds are used. Performing the experiment at these speeds requires fewer 2D rows and a correspondingly shorter experimental time. Therefore, we have implemented PASS pulse sequences that occupy multiple MAS rotor cycles, thereby avoiding pulse overlap. These multiple-rotor-cycle 2D PASS sequences are intended for use in high-speed MAS situations such as those required by (207)Pb. A version of the multiple-rotor-cycle 2D PASS sequence that uses composite pulses to suppress spectral artifacts is also presented. These sequences are demonstrated on (207)Pb test samples, including lead zirconate, a perovskite-phase compound that is representative of a large class of interesting materials. Copyright 2000 Academic Press.
Design of General-purpose Industrial signal acquisition system in a large scientific device
NASA Astrophysics Data System (ADS)
Ren, Bin; Yang, Lei
2018-02-01
In order to measure the industrial signals of a large scientific device experiment, a general-purpose industrial data acquisition system has been designed. It can collect 4-20 mA current signals and 0-10 V voltage signals. Practical experiments show that the system is flexible, reliable, convenient, and economical, and that it has high resolution and strong anti-interference ability. Thus, the system fully meets the design requirements.
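As an illustration of the signal scaling such a system performs, the sketch below converts a 4-20 mA loop reading or a 0-10 V reading into engineering units; it is a generic example with invented sensor ranges, not code from the described system:

# Minimal sketch (generic scaling, not the paper's acquisition code): convert a
# 4-20 mA loop reading or a 0-10 V reading to engineering units.
def scale_current_loop(i_mA, lo, hi):
    """Map a 4-20 mA transmitter reading onto the range [lo, hi]."""
    if not 3.8 <= i_mA <= 20.5:          # simple out-of-range / broken-loop check
        raise ValueError(f"loop current {i_mA} mA outside plausible range")
    return lo + (i_mA - 4.0) / 16.0 * (hi - lo)

def scale_voltage(v, lo, hi):
    """Map a 0-10 V signal onto the range [lo, hi]."""
    return lo + max(0.0, min(v, 10.0)) / 10.0 * (hi - lo)

# Hypothetical examples: a 0-2 MPa transmitter reading 12 mA, a 0-100 degC sensor at 5 V
print(scale_current_loop(12.0, 0.0, 2.0))   # -> 1.0 MPa
print(scale_voltage(5.0, 0.0, 100.0))       # -> 50.0 degC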
Control system design for the large space systems technology reference platform
NASA Technical Reports Server (NTRS)
Edmunds, R. S.
1982-01-01
Structural models and classical frequency domain control system designs were developed for the large space systems technology (LSST) reference platform, which consists of a central bus structure, solar panels, and platform arms on which a variety of experiments may be mounted. It is shown that operation of multiple independently articulated payloads on a single platform presents major problems when sub-arc-second pointing stability is required. Experiment compatibility will be an important operational consideration for systems of this type.
Solar array flight dynamic experiment
NASA Technical Reports Server (NTRS)
Schock, R. W.
1986-01-01
The purpose of the Solar Array Flight Dynamic Experiment (SAFDE) is to demonstrate the feasibility of on-orbit measurement and ground processing of large space structures dynamic characteristics. Test definition or verification provides the dynamic characteristic accuracy required for control systems use. An illumination/measurement system was developed to fly on space shuttle flight STS-41D. The system was designed to dynamically evaluate a large solar array called the Solar Array Flight Experiment (SAFE) that had been scheduled for this flight. The SAFDE system consisted of a set of laser diode illuminators, retroreflective targets, an intelligent star tracker receiver and the associated equipment to power, condition, and record the results. In six tests on STS-41D, data was successfully acquired from 18 retroreflector targets and ground processed, post flight, to define the solar array's dynamic characteristic. The flight experiment proved the viability of on-orbit test definition of large space structures dynamic characteristics. Future large space structures controllability should be greatly enhanced by this capability.
Solar array flight dynamic experiment
NASA Technical Reports Server (NTRS)
Schock, Richard W.
1986-01-01
The purpose of the Solar Array Flight Dynamic Experiment (SAFDE) is to demonstrate the feasibility of on-orbit measurement and ground processing of large space structures dynamic characteristics. Test definition or verification provides the dynamic characteristic accuracy required for control systems use. An illumination/measurement system was developed to fly on Space Shuttle flight STS-41D. The system was designed to dynamically evaluate a large solar array called the Solar Array Flight Experiment (SAFE) that had been scheduled for this flight. The SAFDE system consisted of a set of laser diode illuminators, retroreflective targets, an intelligent star tracker receiver and the associated equipment to power, condition, and record the results. In six tests on STS-41D, data was successfully acquired from 18 retroreflector targets and ground processed, post flight, to define the solar array's dynamic characteristic. The flight experiment proved the viability of on-orbit test definition of large space structures dynamic characteristics. Future large space structures controllability should be greatly enhanced by this capability.
Solar array flight dynamic experiment
NASA Technical Reports Server (NTRS)
Schock, Richard W.
1987-01-01
The purpose of the Solar Array Flight Dynamic Experiment (SAFDE) is to demonstrate the feasibility of on-orbit measurement and ground processing of large space structures' dynamic characteristics. Test definition or verification provides the dynamic characteristic accuracy required for control systems use. An illumination/measurement system was developed to fly on space shuttle flight STS-41D. The system was designed to dynamically evaluate a large solar array called the Solar Array Flight Experiment (SAFE) that had been scheduled for this flight. The SAFDE system consisted of a set of laser diode illuminators, retroreflective targets, an intelligent star tracker receiver and the associated equipment to power, condition, and record the results. In six tests on STS-41D, data was successfully acquired from 18 retroreflector targets and ground processed, post flight, to define the solar array's dynamic characteristic. The flight experiment proved the viability of on-orbit test definition of large space structures dynamic characteristics. Future large space structures controllability should be greatly enhanced by this capability.
Recent experience with design and manufacture of cine lenses
NASA Astrophysics Data System (ADS)
Thorpe, Michael D.; Dalzell, Kristen E.
2015-09-01
Modern cine lenses require a high degree of aberration correction over a large and ever expanding image size. At low to medium volume production levels, these highly corrected designs also require a workable tolerance set and compensation scheme for successful manufacture. In this paper we discuss the design and manufacture of cine lenses with reference to current designs both internal and in the patent literature and some experience in design, tolerancing and manufacturing these lenses in medium volume production.
Draghici, Sorin; Tarca, Adi L; Yu, Longfei; Ethier, Stephen; Romero, Roberto
2008-03-01
The BioArray Software Environment (BASE) is a very popular MIAME-compliant, web-based microarray data repository. However, in BASE, as in most other microarray data repositories, the experiment annotation and raw data uploading can be very time-consuming, especially for large microarray experiments. We developed KUTE (Karmanos Universal daTabase for microarray Experiments) as a plug-in for BASE 2.0 that addresses these issues. KUTE provides an automatic experiment annotation feature and a completely redesigned data workflow that dramatically reduce the human-computer interaction time. For instance, in BASE 2.0 a typical Affymetrix experiment involving 100 arrays required 4 h 30 min of user interaction time for experiment annotation, and 45 min for data upload/download. In contrast, for the same experiment, KUTE required only 28 min of user interaction time for experiment annotation, and 3.3 min for data upload/download. http://vortex.cs.wayne.edu/kute/index.html.
User Oriented Techniques to Support Interaction and Decision Making with Large Educational Databases
ERIC Educational Resources Information Center
Hartley, Roger; Almuhaidib, Saud M. Y.
2007-01-01
Information Technology is developing rapidly and providing policy/decision makers with large amounts of information that require processing and analysis. Decision support systems (DSS) aim to provide tools that not only help such analyses, but enable the decision maker to experiment and simulate the effects of different policies and selection…
NASA Astrophysics Data System (ADS)
Fiore, Sandro; Płóciennik, Marcin; Doutriaux, Charles; Blanquer, Ignacio; Barbera, Roberto; Donvito, Giacinto; Williams, Dean N.; Anantharaj, Valentine; Salomoni, Davide D.; Aloisio, Giovanni
2017-04-01
In many scientific domains such as climate, data is often n-dimensional and requires tools that support specialized data types and primitives to be properly stored, accessed, analysed and visualized. Moreover, new challenges arise in large-scale scenarios and ecosystems where petabytes (PB) of data can be available and data can be distributed and/or replicated, such as the Earth System Grid Federation (ESGF) serving the Coupled Model Intercomparison Project, Phase 5 (CMIP5) experiment, providing access to 2.5 PB of data for the Intergovernmental Panel on Climate Change (IPCC) Fifth Assessment Report (AR5). A case study on climate model intercomparison data analysis addressing several classes of multi-model experiments is being implemented in the context of the EU H2020 INDIGO-DataCloud project. Such experiments require the availability of large amounts of data (of multi-terabyte order) related to the output of several climate model simulations, as well as the exploitation of scientific data management tools for large-scale data analytics. More specifically, the talk discusses in detail a use case on precipitation trend analysis in terms of requirements, architectural design solution, and infrastructural implementation. The experiment has been tested and validated on CMIP5 datasets, in the context of a large-scale distributed testbed across the EU and US involving three ESGF sites (LLNL, ORNL, and CMCC) and one central orchestrator site (PSNC). The general "environment" of the case study relates to (i) multi-model data analysis intercomparison challenges, (ii) addressed on CMIP5 data, (iii) which are made available through the IS-ENES/ESGF infrastructure. The added value of the solution proposed in the INDIGO-DataCloud project is summarized in the following: (i) it implements a different paradigm (from client-side to server-side); (ii) it intrinsically reduces data movement; (iii) it makes the end-user setup lightweight; (iv) it fosters re-usability (of data, final/intermediate products, workflows, sessions, etc.) since everything is managed on the server side; (v) it complements, extends and interoperates with the ESGF stack; (vi) it provides a "tool" for scientists to run multi-model experiments; and finally (vii) it can drastically reduce the time-to-solution for these experiments from weeks to hours. At the time of writing, the proposed testbed represents the first concrete implementation of a distributed multi-model experiment in the ESGF/CMIP context joining server-side and parallel processing, end-to-end workflow management and cloud computing. As opposed to the current scenario based on search & discovery, data download, and client-based data analysis, the INDIGO-DataCloud architectural solution described in this contribution addresses the scientific computing & analytics requirements by providing a paradigm shift based on server-side and high-performance big data frameworks jointly with two-level workflow management systems realized at the PaaS level via a cloud infrastructure.
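As a toy illustration of the per-grid-point reduction that such a server-side multi-model analysis performs close to the data, the following plain-NumPy sketch computes a linear precipitation trend over a synthetic (time, lat, lon) cube; it is not the INDIGO-DataCloud or ESGF server-side API, and all numbers are invented:

# Toy illustration only (plain NumPy, synthetic data): a per-grid-point linear
# precipitation trend over a (time, lat, lon) cube - the kind of reduction the
# server-side approach performs without moving the data to the client.
import numpy as np

n_years, n_lat, n_lon = 30, 18, 36
rng = np.random.default_rng(0)
years = np.arange(n_years)
# synthetic annual-mean precipitation field (mm/day) with a weak imposed trend
precip = 3.0 + 0.01 * years[:, None, None] + rng.normal(0.0, 0.5, (n_years, n_lat, n_lon))

# least-squares slope at every grid point in one vectorised step
t = years - years.mean()
slope = np.tensordot(t, precip - precip.mean(axis=0), axes=(0, 0)) / (t @ t)  # mm/day per year

print("median trend:", np.median(slope), "mm/day per year")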
Inquiry-Based Experiments for Large-Scale Introduction to PCR and Restriction Enzyme Digests
ERIC Educational Resources Information Center
Johanson, Kelly E.; Watt, Terry J.
2015-01-01
Polymerase chain reaction and restriction endonuclease digest are important techniques that should be included in all Biochemistry and Molecular Biology laboratory curriculums. These techniques are frequently taught at an advanced level, requiring many hours of student and faculty time. Here we present two inquiry-based experiments that are…
The Long Duration Exposure Facility (LDEF). Mission 1 Experiments.
ERIC Educational Resources Information Center
Clark, Lenwood G., Ed.; And Others
The Long Duration Exposure Facility (LDEF) has been designed to take advantage of the two-way transportation capability of the space shuttle by providing a large number of economical opportunities for science and technology experiments that require modest electrical power and data processing while in space and which benefit from postflight…
Tools and Setups for Experiments with AC and Rotating Magnetic Fields
ERIC Educational Resources Information Center
Ponikvar, D.
2010-01-01
A rotating magnetic field is the basis for the transformation of electrical energy to mechanical energy. School experiments on the rotating magnetic field are rare since they require the use of specially prepared mechanical setups and/or relatively large, three-phase power supplies to achieve strong magnetic fields. This paper proposes several…
ERIC Educational Resources Information Center
Arlinghaus, Barry P.
2002-01-01
Responses from 276 of 1,128 faculty at Association to Advance Collegiate Schools of Business-accredited schools indicated that 231 were certified; only 96 served in professional associations; large numbers received financial support for professional activities, but only small numbers felt involvement or relevant experience (which are required for…
1981-06-01
Keywords: annealing; fusion sealed mirrors; ULED mirrors; boule; large lightweight mirror core; low expansion glass; coremaker; mirror blanks; forming furnace; grinder procurement. Glossary: Alpha - coefficient of thermal expansion. Boule - the disc of glass formed in the furnace. Flowout - method used to produce large diameter plates from small diameter boules.
Classical boson sampling algorithms with superior performance to near-term experiments
NASA Astrophysics Data System (ADS)
Neville, Alex; Sparrow, Chris; Clifford, Raphaël; Johnston, Eric; Birchall, Patrick M.; Montanaro, Ashley; Laing, Anthony
2017-12-01
It is predicted that quantum computers will dramatically outperform their conventional counterparts. However, large-scale universal quantum computers are yet to be built. Boson sampling is a rudimentary quantum algorithm tailored to the platform of linear optics, which has sparked interest as a rapid way to demonstrate such quantum supremacy. Photon statistics are governed by intractable matrix functions, which suggests that sampling from the distribution obtained by injecting photons into a linear optical network could be solved more quickly by a photonic experiment than by a classical computer. The apparently low resource requirements for large boson sampling experiments have raised expectations of a near-term demonstration of quantum supremacy by boson sampling. Here we present classical boson sampling algorithms and theoretical analyses of prospects for scaling boson sampling experiments, showing that near-term quantum supremacy via boson sampling is unlikely. Our classical algorithm, based on Metropolised independence sampling, allowed the boson sampling problem to be solved for 30 photons with standard computing hardware. Compared to current experiments, a demonstration of quantum supremacy over a successful implementation of these classical methods on a supercomputer would require the number of photons and experimental components to increase by orders of magnitude, while tackling exponentially scaling photon loss.
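A minimal sketch of the two computational ingredients named above is given below: the matrix permanent evaluated with Ryser's formula, and a Metropolised independence sampler whose proposal is the distinguishable-photon distribution. It is a toy restricted to collision-free outcomes (a common simplification when the number of modes greatly exceeds the square of the photon number), not the authors' optimised implementation:

# Toy sketch: Ryser permanent + Metropolised independence sampling for boson sampling.
import numpy as np
from itertools import combinations

def permanent(M):
    """Ryser's formula, O(2^n * n^2) for an n x n matrix."""
    n = M.shape[0]
    total = 0.0
    for r in range(1, n + 1):
        sign = (-1) ** r
        for cols in combinations(range(n), r):
            total += sign * np.prod(M[:, list(cols)].sum(axis=1))
    return (-1) ** n * total

def boson_prob(U, modes, n):
    """Unnormalised target: |Perm|^2 of the submatrix for photons in the first n inputs."""
    return abs(permanent(U[np.ix_(modes, range(n))])) ** 2

def distinguishable_prob(U, modes, n):
    """Unnormalised proposal: permanent of the element-wise |U_ij|^2 submatrix."""
    return permanent(np.abs(U[np.ix_(modes, range(n))]) ** 2).real

def sample_distinguishable(U, n, rng):
    """Draw each photon's output mode independently; reject collision outcomes."""
    m = U.shape[0]
    while True:
        modes = tuple(sorted(rng.choice(m, p=np.abs(U[:, k]) ** 2) for k in range(n)))
        if len(set(modes)) == n:
            return modes

def mis_boson_sampler(U, n, n_samples, burn=100, rng=None):
    """Metropolised independence sampling over collision-free output patterns."""
    if rng is None:
        rng = np.random.default_rng()
    x = sample_distinguishable(U, n, rng)
    px, qx = boson_prob(U, x, n), distinguishable_prob(U, x, n)
    out = []
    for step in range(burn + n_samples):
        y = sample_distinguishable(U, n, rng)
        py, qy = boson_prob(U, y, n), distinguishable_prob(U, y, n)
        if rng.random() < min(1.0, (py * qx) / (px * qy)):  # MH acceptance, independence proposal
            x, px, qx = y, py, qy
        if step >= burn:
            out.append(x)
    return out

# tiny demo: 3 photons in a random 9-mode interferometer (unitary via QR)
rng = np.random.default_rng(1)
A = rng.normal(size=(9, 9)) + 1j * rng.normal(size=(9, 9))
U, _ = np.linalg.qr(A)
print(mis_boson_sampler(U, 3, 5, rng=rng))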
NASA's Microgravity Fluid Physics Program: Tolerability to Residual Accelerations
NASA Technical Reports Server (NTRS)
Skarda, J. Raymond
1998-01-01
An overview of the NASA microgravity fluid physics program is presented. The necessary quality of a reduced-gravity environment in terms of tolerable residual acceleration or g levels is a concern that is inevitably raised for each new microgravity experiment. Methodologies have been reported in the literature that provide guidance in obtaining reasonable estimates of residual acceleration sensitivity for a broad range of fluid physics phenomena. Furthermore, a relatively large and growing database of microgravity experiments that have successfully been performed in terrestrial reduced gravity facilities and orbiting platforms exists. Similarity of experimental conditions and hardware, in some cases, leads to new experiments adopting prior experiments' g-requirements. Rationale applied to other experiments can, in principle, be a valuable guide to assist new Principal Investigators (PIs) in determining the residual acceleration tolerability of their flight experiments. The availability of g-requirements rationale from prior microgravity experiments is discussed. An example of establishing g tolerability requirements is demonstrated, using a current microgravity fluid physics flight experiment. The Fluids and Combustion Facility (FCF), which is currently manifested on the US Laboratory of the International Space Station (ISS), will provide opportunities for fluid physics and combustion experiments throughout the life of the ISS. Although the FCF is not intended to accommodate all fluid physics experiments, it is expected to meet the science requirements of approximately 80% of the new PIs that enter the microgravity fluid physics program. The residual acceleration requirements for the FCF fluid physics experiments are based on a set of fourteen reference fluid physics experiments which are discussed.
Space Construction Experiment Definition Study (SCEDS), part 3. Volume 2: Study results
NASA Technical Reports Server (NTRS)
1983-01-01
The essential needs of the controls and dynamics community for large space structures are addressed by the basic Space Construction Experiments (SCE)/MAST configuration and by enhanced configurations for follow-on flights. The SCE/MAST can be integrated on a single structures technology experiments platform (STEP). The experiment objectives can be accomplished without the need for EVA, and it is anticipated that further design refinements will eliminate the requirement to use the remote manipulator system.
Using Internet-Based Language Testing Capacity to the Private Sector
ERIC Educational Resources Information Center
Garcia Laborda, Jesus
2009-01-01
Language testing has a large number of commercial applications in both the institutional and the private sectors. Some jobs in the health services sector or the public services sector require foreign language skills and these skills require continuous and efficient language assessments. Based on an experience developed through the cooperation of…
Dynamic behavior of particles in spacecraft
NASA Technical Reports Server (NTRS)
Perrine, B. S.
1981-01-01
The behavior of particles relative to a spacecraft frame of reference was examined. Significant spatial excursions of particles can occur relative to the spacecraft frame of reference as a result of drag deceleration of the vehicle. These excursions tend to be large as time increases. Thus, if a particle is required to remain in a specified volume, constraints may be required. For example, in levitation experiments it may be extremely difficult to turn off the forces of constraint which keep the particles in a specified region. This means experiments which are sensitive to disturbances may be very difficult to perform if perturbation forces are required to be absent.
Microscope-Based Fluid Physics Experiments in the Fluids and Combustion Facility on ISS
NASA Technical Reports Server (NTRS)
Doherty, Michael P.; Motil, Susan M.; Snead, John H.; Malarik, Diane C.
2000-01-01
At the NASA Glenn Research Center, the Microgravity Science Program is planning to conduct a large number of experiments on the International Space Station in both the Fluid Physics and Combustion Science disciplines, and is developing flight experiment hardware for use within the International Space Station's Fluids and Combustion Facility. Four fluids physics experiments that require an optical microscope will be sequentially conducted within a subrack payload to the Fluids Integrated Rack of the Fluids and Combustion Facility called the Light Microscopy Module, which will provide the containment, changeout, and diagnostic capabilities to perform the experiments. The Light Microscopy Module is planned as a fully remotely controllable on-orbit microscope facility, allowing flexible scheduling and control of experiments within International Space Station resources. This paper will focus on the four microscope-based experiments, specifically, their objectives and the sample cell and instrument hardware to accommodate their requirements.
A third-order silicon racetrack add-drop filter with a moderate feature size
NASA Astrophysics Data System (ADS)
Wang, Ying; Zhou, Xin; Chen, Qian; Shao, Yue; Chen, Xiangning; Huang, Qingzhong; Jiang, Wei
2018-01-01
In this work, we design and fabricate a highly compact third-order racetrack add-drop filter consisting of silicon waveguides with modified widths on a silicon-on-insulator (SOI) wafer. Compared to the previous approach, which requires an exceedingly narrow coupling gap of less than 100 nm, we propose a new approach that enlarges the minimum feature size of the whole device to 300 nm to reduce the fabrication process requirements. The three-dimensional finite-difference time-domain (3D-FDTD) method is used for simulation. Experimental results show good agreement with the simulations. In the experiment, the filter shows a nearly box-like channel dropping response, with a large flat 3-dB bandwidth (~3 nm), a relatively large FSR (~13.3 nm), and out-of-band rejection larger than 14 dB at the drop port, within a footprint of 0.0006 mm^2. The device is small and simple enough to have a wide range of applications in large-scale on-chip photonic integrated circuits.
Shock tube study of dissociation and relaxation in 1,1-difluoroethane and vinyl fluoride.
Xu, Hui; Kiefer, John H; Sivaramakrishnan, Raghu; Giri, Binod R; Tranter, Robert S
2007-08-21
This paper reports measurements of the thermal dissociation of 1,1-difluoroethane in the shock tube. The experiments employ laser-schlieren measurements of rate for the dominant HF elimination using 10% 1,1-difluoroethane in Kr over 1500-2000 K and 43 < P < 424 torr. The vinyl fluoride product of this process then dissociates, affecting the late observations. We thus include a laser-schlieren study (1717-2332 K, 75 < P < 482 torr in 10 and 4% vinyl fluoride in Kr) of this dissociation. This latter work also includes a set of experiments using shock-tube time-of-flight mass spectrometry (4% vinyl fluoride in neon, 1500-1980 K, 500 < P < 1300 torr). These time-of-flight experiments confirm the theoretical expectation that the only reaction in vinyl fluoride is HF elimination. The dissociation experiments are augmented by laser-schlieren measurements of vibrational relaxation (1-20% C2H3F in Kr, 415-1975 K, 5 < P < 50 torr, and 2 and 5% C2H4F2 in Kr, 700-1350 K, 6 < P < 22 torr). These experiments exhibit very rapid relaxation, and incubation delays should be negligible in dissociation. An RRKM model of dissociation in 1,1-difluoroethane based on a G3B3 calculation of the barrier and other properties fits the experiments but requires a very large ΔE_down of 1600 cm^-1, similar to that found in a previous examination of 1,1,1-trifluoroethane. Dissociation of vinyl fluoride is complicated by the presence of two parallel HF eliminations, both three-center and four-center. Structure calculations find nearly equal barriers for these, and TST calculations show almost identical k_infinity. An RRKM fit to the observed falloff again requires an unusually large ΔE_down, and the experiments actually support a slightly reduced barrier. These large energy-transfer parameters now seem routine in these large fluorinated species. It is perhaps a surprising result for which there is as yet no explanation.
Simulation of FRET dyes allows quantitative comparison against experimental data
NASA Astrophysics Data System (ADS)
Reinartz, Ines; Sinner, Claude; Nettels, Daniel; Stucki-Buchli, Brigitte; Stockmar, Florian; Panek, Pawel T.; Jacob, Christoph R.; Nienhaus, Gerd Ulrich; Schuler, Benjamin; Schug, Alexander
2018-03-01
Fully understanding biomolecular function requires detailed insight into the systems' structural dynamics. Powerful experimental techniques such as single molecule Förster Resonance Energy Transfer (FRET) provide access to such dynamic information yet have to be carefully interpreted. Molecular simulations can complement these experiments but typically face limits in accessing slow time scales and large or unstructured systems. Here, we introduce a coarse-grained simulation technique that tackles these challenges. While requiring only a few parameters, we maintain full protein flexibility and include all heavy atoms of proteins, linkers, and dyes. We are able to sufficiently reduce computational demands to simulate large or heterogeneous structural dynamics and ensembles on slow time scales found in, e.g., protein folding. The simulations allow for calculating FRET efficiencies which quantitatively agree with experimentally determined values. By providing atomically resolved trajectories, this work supports the planning and microscopic interpretation of experiments. Overall, these results highlight how simulations and experiments can complement each other leading to new insights into biomolecular dynamics and function.
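The quantitative comparison rests on the standard Förster relation E = 1 / (1 + (r/R0)^6) applied to the simulated dye separations. A minimal sketch is given below; the Förster radius and the synthetic distance trajectory are assumptions standing in for actual simulation output.

```python
import numpy as np

# Minimal sketch: converting simulated dye-dye distances into a mean FRET efficiency
# via the Foerster relation E = 1 / (1 + (r/R0)^6). R0 and the distance trajectory
# below are placeholders, not values from the paper.
R0 = 5.4e-9                                   # assumed Foerster radius [m]
r = np.random.normal(5.0e-9, 0.5e-9, 10000)   # stand-in for simulated inter-dye distances [m]

efficiency = 1.0 / (1.0 + (r / R0) ** 6)      # per-frame transfer efficiency
print(f"<E> = {efficiency.mean():.3f}")       # trajectory-averaged efficiency for comparison with experiment
```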
NASA Technical Reports Server (NTRS)
McQuillen, John; Green, Robert D.; Henrie, Ben; Miller, Teresa; Chiaramonte, Fran
2014-01-01
The Physical Science Informatics (PSI) system is the next step in an ongoing effort to make NASA-sponsored flight data available to the scientific and engineering community, along with the general public. The experimental data, drawn from six overall disciplines including Combustion Science, Fluid Physics, Complex Fluids, Fundamental Physics, and Materials Science, will present some unique challenges. Besides data in textual or numerical format, large portions of both the raw and analyzed data for many of these experiments are digital images and video, which impose large data storage requirements. In addition, the accessible data will include experiment design and engineering data (including applicable drawings), any analytical or numerical models, publications, reports, and patents, and any commercial products developed as a result of the research. The objectives of this paper are the following: present the preliminary layout (Figure 2) of MABE data within the PSI database; obtain feedback on the layout; and present the procedure for obtaining access to this database.
NASA Astrophysics Data System (ADS)
Koestner, Stefan
2009-09-01
With the increasing size and complexity of today's experiments in high energy physics, the amount of work required to integrate a complete subdetector into an experiment control system is often underestimated. We report here on the layered software structure and protocols used by the LHCb experiment to control its detectors and readout boards. The experiment control system of LHCb is based on the commercial SCADA system PVSS II. Readout boards outside the radiation area are accessed via embedded credit-card-sized PCs connected to a large local area network. The SPECS protocol is used for control of the front-end electronics. Finite state machines are introduced to facilitate the control of a large number of electronic devices and to model the whole experiment at the level of an expert system.
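The following is a generic finite-state-machine sketch of the hierarchical device-control idea described above; the state names, commands, and board names are illustrative assumptions and do not reproduce the actual LHCb/PVSS state model.

```python
# Generic finite-state-machine sketch for a readout board; the states and commands
# are illustrative only and do not reproduce the actual LHCb state model.
TRANSITIONS = {
    ("NOT_READY", "configure"): "READY",
    ("READY", "start"): "RUNNING",
    ("RUNNING", "stop"): "READY",
    ("READY", "reset"): "NOT_READY",
}

class Device:
    def __init__(self, name):
        self.name = name
        self.state = "NOT_READY"

    def command(self, cmd):
        # Unknown (state, command) pairs fall through to an ERROR state.
        self.state = TRANSITIONS.get((self.state, cmd), "ERROR")
        return self.state

# A parent node sends the same command to many children and summarizes their states,
# which is how hierarchical FSMs keep a large number of boards manageable.
boards = [Device(f"BOARD_{i:03d}") for i in range(4)]
for cmd in ("configure", "start"):
    states = {b.command(cmd) for b in boards}
    print(cmd, "->", states)
```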
Participation and Collaborative Learning in Large Class Sizes: Wiki, Can You Help Me?
ERIC Educational Resources Information Center
de Arriba, Raúl
2017-01-01
Collaborative learning has a long tradition within higher education. However, its application in classes with a large number of students is complicated, since it is a teaching method that requires a high level of participation from the students and careful monitoring of the process by the educator. This article presents an experience in…
NASA Technical Reports Server (NTRS)
1982-01-01
The four astronomical objectives addressed include: the measurement and mapping of extended low surface brightness infrared emission from the galaxy; the measurement of diffuse emission from intergalactic material and/or galaxies and quasi-stellar objects; the measurement of the zodiacal dust emission; and the measurement of a large number of discrete infrared sources.
Use of shuttle for life sciences
NASA Technical Reports Server (NTRS)
Mcgaughy, R. E.
1972-01-01
The use of the space shuttle in carrying out biological and medical research programs, with emphasis on the sortie module, is examined. Detailed descriptions are given of the goals of space life science disciplines, how the sortie can meet these goals, and what shuttle design features are necessary for a viable biological and medical experiment program. Conclusions show that the space shuttle sortie module is capable of accommodating all biological experiments contemplated at this time except for those involving large specimens or large populations of small animals; however, these experiments can be done with a specially designed module. It was also found that at least two weeks is required to do a meaningful survey of biological effects.
Large Bore Powder Gun Qualification (U)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rabern, Donald A.; Valdiviez, Robert
A Large Bore Powder Gun (LBPG) is being designed to enable experimentalists to characterize material behavior outside the capabilities of the NNSS JASPER and LANL TA-55 PF-4 guns. The combination of these three guns will create a capability to conduct impact experiments over a wide range of pressures and shock profiles. The Large Bore Powder Gun will be fielded at the Nevada National Security Site (NNSS) U1a Complex. The Complex is nearly 1000 ft below ground with dedicated drifts for testing, instrumentation, and post-shot entombment. To ensure the reliability, safety, and performance of the LBPG, a qualification plan has been established and documented here. Requirements for the LBPG have been established and documented in WE-14-TR-0065 U A, Large Bore Powder Gun Customer Requirements. The document includes the requirements for the physics experiments, the gun and confinement systems, and operations at NNSS. A detailed description of the requirements is established in that document and is referred to and quoted throughout this document. Two Gun and Confinement Systems will be fielded. The Prototype Gun will be used primarily to characterize the gun and confinement performance and will be the primary platform for qualification actions. This gun will also be used to investigate and qualify target and diagnostic modifications through the life of the program (U1a.104 Drift). An identical gun, the Physics Gun, will be fielded for confirmatory and Pu experiments (U1a.102D Drift). Both guns will be qualified for operation. The Gun and Confinement System design will be qualified through analysis, inspection, and testing, using the Prototype Gun for the majority of the process. The Physics Gun will be qualified through inspection and a limited number of qualification tests to ensure performance and behavior equivalent to the Prototype Gun. Figure 1.1 shows the partial configuration of U1a and the locations of the Prototype and Physics Gun/Confinement Systems.
LDR structural experiment definition
NASA Technical Reports Server (NTRS)
Russell, Richard A.; Gates, Richard M.
1988-01-01
A study was performed to develop the definition of a structural flight experiment for a large precision segmented reflector that would utilize the Space Station. The objective of the study was to use the Large Deployable Reflector (LDR) baseline configuration as the focus of an experiment definition activity that would identify the Space Station accommodation requirements and interface constraints. Results of the study defined three Space Station based experiments to demonstrate the technologies needed for an LDR-type structure. The basic experiment configurations are the same as the JPL baseline except that the primary mirror truss is 10 meters in diameter instead of 20. The primary objectives of the first experiment are to construct the primary mirror support truss and to determine its structural and thermal characteristics. Addition of the optical bench, thermal shield, and primary mirror segments and alignment of the optical components occur during the second experiment. The structure will then be moved to the payload pointing system for pointing, optical control, and scientific optical measurement during the third experiment.
Overview of large scale experiments performed within the LBB project in the Czech Republic
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kadecka, P.; Lauerova, D.
1997-04-01
During several recent years NRI Rez has been performing the LBB analyses of safety significant primary circuit pipings of NPPs in the Czech and Slovak Republics. The analyses covered the NPPs with reactors WWER 440 Type 230 and 213 and WWER 1000 Type 320. Within the relevant LBB projects undertaken with the aim to prove the fulfilling of the requirements of LBB, a series of large scale experiments were performed. The goal of these experiments was to verify the properties of the components selected, and to prove the quality and/or conservatism of assessments used in the LBB analyses. In this poster, a brief overview of experiments performed in the Czech Republic under the guidance of NRI Rez is presented.
ERIC Educational Resources Information Center
Lown, Nick; Davies, Ioan; Cordingley, Lis; Bundy, Chris; Braidman, Isobel
2009-01-01
Personal and Professional Development (PPD) is now key to the undergraduate medical curriculum and requires provision of appropriate learning experiences. In order to achieve this, it is essential that we ascertain students' perceptions of what is important in their PPD. We required a methodological approach suitable for a large medical school,…
Monitoring and Hardware Management for Critical Fusion Plasma Instrumentation
NASA Astrophysics Data System (ADS)
Carvalho, Paulo F.; Santos, Bruno; Correia, Miguel; Combo, Álvaro M.; Rodrigues, AntÓnio P.; Pereira, Rita C.; Fernandes, Ana; Cruz, Nuno; Sousa, Jorge; Carvalho, Bernardo B.; Batista, AntÓnio J. N.; Correia, Carlos M. B. A.; Gonçalves, Bruno
2018-01-01
Controlled nuclear fusion aims to obtain energy from particle collisions confined inside a nuclear reactor (tokamak). These ionized particles, heavier isotopes of hydrogen, are the main constituents of the plasma, which is kept at high temperatures (millions of degrees Celsius). Due to the high temperatures and magnetic confinement, the plasma is exposed to several sources of instability, which require a set of procedures by the control and data acquisition systems throughout fusion experiments. Control and data acquisition systems often used in nuclear fusion experiments are based on the Advanced Telecommunications Computing Architecture (AdvancedTCA®) standard introduced by PICMG® to meet the demands of telecommunications, which require large amounts of data (TB) to be transported at high transfer rates (Gb/s) with high availability, including features such as reliability, serviceability, and redundancy. For efficient plasma control, systems are required to collect large amounts of data, process them, store them for later analysis, make critical decisions in real time, and provide status reports on either the experiment itself or the electronic instrumentation involved. Moreover, systems should also ensure the correct handling of detected anomalies and identified faults, and notify the system operator of events that have occurred, decisions taken, acknowledgements, and implemented changes. Therefore, for everything to work in compliance with specifications, the instrumentation must include hardware management and monitoring mechanisms for both hardware and software. These mechanisms should check the system status by reading sensors, manage events, update inventory databases with the hardware system components in use and under maintenance, store the collected information, update firmware and installed software modules, and configure and handle alarms to detect possible system failures and prevent emergency scenarios. The goal is to ensure high availability of the system and provide safe operation, experiment security, and data validation for the fusion experiment. This work aims to contribute to the joint effort of the IPFN control and data acquisition group to develop a hardware management and monitoring application for control and data acquisition instrumentation especially designed for large-scale tokamaks like ITER.
NASA Technical Reports Server (NTRS)
Hollinden, A. B.; Eaton, L. R.; Vaughan, W. W.
1972-01-01
The first results of an ongoing preliminary-concept and detailed-feasibility study of a zero-gravity earth-orbital cloud physics research facility are reviewed. Current planning and thinking are being shaped by two major conclusions of this study: (1) there is a strong requirement for, and it is feasible to achieve, important and significant research in a zero-gravity cloud physics facility; and (2) some very important experiments can be accomplished with 'off-the-shelf' type hardware by astronauts who have no cloud-physics background; the most complicated experiments may require sophisticated observation and motion subsystems, and the astronaut may need graduate-level cloud physics training; there is a large number of experiments whose complexity varies between these two extremes.
Statistical issues in the design and planning of proteomic profiling experiments.
Cairns, David A
2015-01-01
The statistical design of a clinical proteomics experiment is a critical part of a well-undertaken investigation. Standard concepts from experimental design such as randomization, replication and blocking should be applied in all experiments, and this is possible when the experimental conditions are well understood by the investigator. The large number of proteins simultaneously considered in proteomic discovery experiments means that determining the number of replicates required for a powerful experiment is more complicated than in simple experiments. However, by using information about the nature of an experiment and making simple assumptions, this is achievable for a variety of experiments useful for biomarker discovery and initial validation.
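One simple way the required number of replicates can be estimated is a standard two-group power calculation with the significance level corrected for the number of proteins tested. The sketch below assumes a normal approximation and illustrative values for the effect size, variability, and protein count; it is not the procedure of the paper.

```python
from scipy.stats import norm

# Rough per-group sample-size estimate for a two-group comparison of one protein,
# with a Bonferroni correction across m simultaneously tested proteins.
# Effect size and variability are illustrative assumptions.
m = 500                      # number of proteins tested
alpha, power = 0.05, 0.80
delta, sigma = 1.0, 1.5      # assumed mean difference and common SD (log-intensity units)

alpha_adj = alpha / m        # Bonferroni-adjusted significance level
z_a = norm.ppf(1 - alpha_adj / 2)
z_b = norm.ppf(power)
n_per_group = 2 * ((z_a + z_b) * sigma / delta) ** 2
print(f"~{n_per_group:.0f} replicates per group")
```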
Simulations of the modified gap experiment
NASA Astrophysics Data System (ADS)
Sutherland, Gerrit T.; Benjamin, Richard; Kooker, Douglas
2017-01-01
Modified gap experiment (test) hydrocode simulations predict the trends seen in experimental excess free surface velocity versus input pressure curves for explosives with both large and modest failure diameters. Simulations were conducted for explosive "A", an explosive with a large failure diameter, and for cast TNT, which has a modest failure diameter. Using the best available reactive rate models, the simulations predicted sustained ignition thresholds similar to experiment. This is a threshold where detonation is likely given a long enough run distance. For input pressures greater than the sustained ignition threshold pressure, the simulations predicted too little velocity for explosive "A" and too much velocity for TNT. It was found that a better comparison of experiment and simulation requires additional experimental data for both explosives. It was observed that the choice of reactive rate model for cast TNT can lead to large differences in the predicted modified gap experiment result. The cause of the difference is that the same data was not used to parameterize both models; one set of data was more shock reactive than the other.
Xenon Purification Research and Development for the LZ Dark Matter Experiment
NASA Astrophysics Data System (ADS)
Pech, Katherin
2013-04-01
The LZ Experiment is a next generation dark matter detector based on the current LUX detector design, with a 7-ton active volume. Although many research and development breakthroughs were achieved for the 350 kg LUX detector, the large volume scaling required for LZ presents a new set of design challenges that need to be overcome. Because the search for WIMP-like dark matter requires ultra low background experiments, the xenon target material in the LZ detector must meet purity specifications beyond what is commercially available. This challenge is two-fold. The xenon must contain extremely low amounts of electronegative impurities such as oxygen, which attenuate the charge signal. Additionally, it must also have very little of the inert isotope Kr-85, a beta-emitter that can obscure the dark matter signal in the detector volume. The purity requirements for the LUX experiment have been achieved, but the factor of 20 scaling in volume for LZ and increased demands for sensitivity mean that new research and development work must be done to increase our xenon purification capabilities. This talk will focus on the efforts being done at Case Western Reserve University to meet these strict purity requirements for the LZ Experiment.
ERIC Educational Resources Information Center
Dobkin, Carlos; Gil, Ricard; Marion, Justin
2010-01-01
In this paper we estimate the effect of class attendance on exam performance by implementing a policy in three large economics classes that required students scoring below the median on the midterm exam to attend class. This policy generated a large discontinuity in the rate of post-midterm attendance at the median of the midterm score. We…
Drawert, Brian; Trogdon, Michael; Toor, Salman; Petzold, Linda; Hellander, Andreas
2016-01-01
Computational experiments using spatial stochastic simulations have led to important new biological insights, but they require specialized tools and a complex software stack, as well as large and scalable compute and data analysis resources due to the large computational cost associated with Monte Carlo computational workflows. The complexity of setting up and managing a large-scale distributed computation environment to support productive and reproducible modeling can be prohibitive for practitioners in systems biology. This results in a barrier to the adoption of spatial stochastic simulation tools, effectively limiting the type of biological questions addressed by quantitative modeling. In this paper, we present PyURDME, a new, user-friendly spatial modeling and simulation package, and MOLNs, a cloud computing appliance for distributed simulation of stochastic reaction-diffusion models. MOLNs is based on IPython and provides an interactive programming platform for development of sharable and reproducible distributed parallel computational experiments.
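At the core of such computational experiments is a stochastic simulation (Gillespie-type) kernel executed many times in Monte Carlo fashion, which is what drives the need for distributed resources. The sketch below is a generic well-mixed direct-method example for a single decay reaction; it is not the PyURDME or MOLNs API.

```python
import numpy as np

# Well-mixed Gillespie (SSA) sketch for a single degradation reaction A -> 0.
# A generic illustration of the Monte Carlo kernel underlying such tools,
# not the PyURDME or MOLNs interface.
def ssa_decay(n0=100, k=0.1, t_end=50.0, rng=np.random.default_rng(0)):
    t, n, history = 0.0, n0, [(0.0, n0)]
    while t < t_end and n > 0:
        a = k * n                         # propensity of the decay reaction
        t += rng.exponential(1.0 / a)     # time to next reaction event
        n -= 1                            # fire the reaction
        history.append((t, n))
    return history

print(ssa_decay()[-1])   # final (time, copy number) of one stochastic trajectory
```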
Using respondent uncertainty to mitigate hypothetical bias in a stated choice experiment
Richard C. Ready; Patricia A. Champ; Jennifer L. Lawton
2010-01-01
In a choice experiment study, willingness to pay for a public good estimated from hypothetical choices was three times as large as willingness to pay estimated from choices requiring actual payment. This hypothetical bias was related to the stated level of certainty of respondents. We develop protocols to measure respondent certainty in the context of a choice...
Big questions, big science: meeting the challenges of global ecology
David Schimel; Michael Keller
2015-01-01
Ecologists are increasingly tackling questions that require significant infrastructure, large experiments, networks of observations, and complex data and computation. Key hypotheses in ecology increasingly require more investment, and larger data sets to be tested, than can be collected by a single investigator's or group of investigators' labs, sustained for longer...
High Energy Physics and Nuclear Physics Network Requirements
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dart, Eli; Bauerdick, Lothar; Bell, Greg
The Energy Sciences Network (ESnet) is the primary provider of network connectivity for the U.S. Department of Energy (DOE) Office of Science (SC), the single largest supporter of basic research in the physical sciences in the United States. In support of SC programs, ESnet regularly updates and refreshes its understanding of the networking requirements needed by instruments, facilities, scientists, and science programs that it serves. This focus has helped ESnet to be a highly successful enabler of scientific discovery for over 25 years. In August 2013, ESnet and the DOE SC Offices of High Energy Physics (HEP) and Nuclear Physics (NP) organized a review to characterize the networking requirements of the programs funded by the HEP and NP program offices. Several key findings resulted from the review. Among them: 1. The Large Hadron Collider's ATLAS (A Toroidal LHC Apparatus) and CMS (Compact Muon Solenoid) experiments are adopting remote input/output (I/O) as a core component of their data analysis infrastructure. This will significantly increase their demands on the network from both a reliability perspective and a performance perspective. 2. The Large Hadron Collider (LHC) experiments (particularly ATLAS and CMS) are working to integrate network awareness into the workflow systems that manage the large number of daily analysis jobs (1 million analysis jobs per day for ATLAS), which are an integral part of the experiments. Collaboration with networking organizations such as ESnet, and the consumption of performance data (e.g., from perfSONAR [PERformance Service Oriented Network monitoring Architecture]) are critical to the success of these efforts. 3. The international aspects of HEP and NP collaborations continue to expand. This includes the LHC experiments, the Relativistic Heavy Ion Collider (RHIC) experiments, the Belle II Collaboration, the Large Synoptic Survey Telescope (LSST), and others. The international nature of these collaborations makes them heavily reliant on transoceanic connectivity, which is subject to longer term service disruptions than terrestrial connectivity. The network engineering aspects of undersea connectivity will continue to be a significant part of the planning, deployment, and operation of the data analysis infrastructure for HEP and NP experiments for the foreseeable future. Given their critical dependency on networking services, the experiments have expressed the need for tight integration (both technically and operationally) of the domestic and the transoceanic parts of the network infrastructure that supports the experiments. 4. The datasets associated with simulations continue to increase in size, and the need to move these datasets between analysis centers is placing ever-increasing demands on networks and on data management systems at the supercomputing centers. In addition, there is a need to harmonize cybersecurity practice with the data transfer performance requirements of the science. This report expands on these points, and addresses others as well. The report contains a findings section in addition to the text of the case studies discussed during the review.
LLRF System for the Fermilab Muon g-2 and Mu2e Projects
DOE Office of Scientific and Technical Information (OSTI.GOV)
Varghese, P.; Chase, B.
The Mu2e experiment measures the conversion rate of muons into electrons and the Muon g-2 experiment measures the muon magnetic moment. Both experiments require 53 MHz batches of 8 GeV protons to be re-bunched into 150 ns, 2.5 MHz pulses for extraction to the g-2 target for Muon g-2 and to a delivery ring with a single RF cavity running at 2.36 MHz for Mu2e. The LLRF system for both experiments is implemented in a SoC FPGA board integrated into the existing 53 MHz LLRF system in a VXI crate. The tight timing requirements, the large frequency difference, and the non-harmonic relationship between the two RF systems provide unique challenges to the LLRF system design to achieve the required phase alignment specifications for beam formation, transfers, and beam extinction between pulses. The new LLRF system design for both projects is described and the results of the initial beam commissioning tests for the Muon g-2 experiment are presented.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Michal, Carl A.; Hastings, Simon P.; Lee, Lik Hang
2008-02-07
We present NMR signals from a strongly coupled homonuclear spin system, ¹H nuclei in adamantane, acquired with simultaneous two-photon excitation under conditions of the Lee-Goldburg experiment. Small coils, having inside diameters of 0.36 mm, are used to achieve two-photon nutation frequencies of ~20 kHz. The very large rf field strengths required give rise to large Bloch-Siegert shifts that cannot be neglected. These experiments are found to be extremely sensitive to inhomogeneity of the applied rf field, and due to the Bloch-Siegert shift, exhibit a large asymmetry in response between the upper and lower Lee-Goldburg offsets. Two-photon excitation has the potential to enhance both the sensitivity and performance of homonuclear dipolar decoupling, but is made challenging by the high rf power required and the difficulties introduced by the inhomogeneous Bloch-Siegert shift. We briefly discuss a variation of the frequency-switched Lee-Goldburg technique, called four-quadrant Lee-Goldburg (4QLG), that produces net precession in the x-y plane, with a reduced chemical shift scaling factor of 1/3.
Factors modulating the effect of divided attention during retrieval of words.
Fernandes, Myra A; Moscovitch, Morris
2002-07-01
In this study, we examined variables modulating interference effects on episodic memory under divided attention conditions during retrieval for a list of unrelated words. In Experiment 1, we found that distracting tasks that required animacy or syllable decisions to visually presented words, without a memory load, produced large interference on free recall performance. In Experiment 2, a distracting task requiring phonemic decisions about nonsense words produced a far larger interference effect than one that required semantic decisions about pictures. In Experiment 3, we replicated the effect of the nonsense-word distracting task on memory and showed that an equally resource-demanding picture-based task produced significant interference with memory retrieval, although the effect was smaller in magnitude. Taken together, the results suggest that free recall is disrupted by competition for phonological or word-form representations during retrieval and, to a lesser extent, by competition for semantic representations.
AAFE large deployable antenna development program: Executive summary
NASA Technical Reports Server (NTRS)
1977-01-01
The large deployable antenna development program sponsored by the Advanced Applications Flight Experiments of the Langley Research Center is summarized. Projected user requirements for large diameter deployable reflector antennas were reviewed. Trade-off studies for the selection of a design concept for 10-meter diameter reflectors were made. A hoop/column concept was selected as the baseline concept. Parametric data are presented for 15-meter, 30-meter, and 100-meter diameters. A 1.82-meter diameter engineering model which demonstrated the feasibility of the concept is described.
ERIC Educational Resources Information Center
Albright, Joyce L.; Safer, L. Arthur; Sims, Paul A.; Tagaris, Angela; Glasgow, Denise; Sekulich, Kim M.; Zaharis, Mary C.
2017-01-01
This research investigated the experiences of new teachers employed in urban school districts and how these novice teachers perceived the school district and school administrators' support required to retain them, as well as the teachers' perceptions of their pre-service experiences and/or induction programs necessary to prepare them for an urban…
Description of the Spacecraft Control Laboratory Experiment (SCOLE) facility
NASA Technical Reports Server (NTRS)
Williams, Jeffrey P.; Rallo, Rosemary A.
1987-01-01
A laboratory facility for the study of control laws for large flexible spacecraft is described. The facility fulfills the requirements of the Spacecraft Control Laboratory Experiment (SCOLE) design challenge for a laboratory experiment, which will allow slew maneuvers and pointing operations. The structural apparatus is described in detail sufficient for modelling purposes. The sensor and actuator types and characteristics are described so that identification and control algorithms may be designed. The control implementation computer and real-time subroutines are also described.
Exposure setup for animal experiments using a parabolic reflector.
Schelkshorn, S; Tejero, S; Detlefsen, J
2007-01-01
The exposure setup presented is intended for a controlled, long-term and continuous exposure (20 months, 24 h/day) of a large number of animals (100 rats at minimum) to standard GSM and UMTS signals, at 900 MHz and 1966 MHz, respectively. To obtain a homogeneous field within a large volume, the setup is based on the 'compact range' principle well known from antenna measurement facilities, which produces a plane wave at relatively short ranges from the reflector. All requirements imposed by the in vivo nature of the experiment, i.e. air-conditioning and easy access to the cages, can be fulfilled.
On the Large-Scaling Issues of Cloud-based Applications for Earth Science Data
NASA Astrophysics Data System (ADS)
Hua, H.
2016-12-01
Next generation science data systems are needed to address the incoming flood of data from new missions such as NASA's SWOT and NISAR, whose SAR data volumes and data throughput rates are an order of magnitude larger than those of present-day missions. Existing missions, such as OCO-2, may also require rapid turn-around for processing different science scenarios, where on-premise and even traditional HPC computing environments may not meet the high processing needs. Additionally, traditional means of procuring hardware on-premise are already limited due to facilities capacity constraints for these new missions. Experience has shown that embracing efficient cloud computing approaches for large-scale science data systems requires more than just moving existing code to cloud environments. At large cloud scales, we need to deal with scaling and cost issues. We present our experiences deploying multiple instances of our hybrid-cloud computing science data system (HySDS) to support large-scale processing of Earth Science data products. We will explore optimization approaches to getting the best performance out of hybrid-cloud computing, as well as common issues that arise when dealing with large-scale computing. Novel approaches were utilized to do processing on Amazon's spot market, which can potentially offer 75%-90% cost savings but with an unpredictable computing environment driven by market forces.
NASA Technical Reports Server (NTRS)
1985-01-01
The goal of defining a CO2 laser transmitter approach suited to Shuttle Coherent Atmospheric Lidar Experiment (SCALE) requirements is discussed. The adaptation of the existing WINDVAN system to the shuttle environment is addressed. The size, weight, reliability, and efficiency of the existing WINDVAN system are largely compatible with SCALE requirements. Repackaging is needed for compatibility with the vacuum and thermal environments. Changes are required to ensure survival through launch and landing mechanical, vibration, and acoustic loads. Existing WINDVAN thermal management approaches that depend on convection need to be upgraded for zero-gravity operation.
Neutron detection with plastic scintillators coupled to solid state photomultiplier detectors
NASA Astrophysics Data System (ADS)
Christian, James F.; Johnson, Erik B.; Fernandez, Daniel E.; Vogel, Samuel; Frank, Rebecca; Stoddard, Graham; Stapels, Christopher; Pereira, Jorge; Zegers, Remco
2017-09-01
The recent reduction of dark current in silicon solid-state photomultipliers (SiSSPMs) makes them an attractive alternative to conventional photomultiplier tubes (PMTs) for scintillation detection applications. Nuclear Physics experiments often require large detector volumes made using scintillation materials, which require sensitive photodetectors such as PMTs. PMTs add to the size, fragility, and high-voltage requirements, as well as the distance requirements for experiments using magnetic fields. This work compares RMD's latest detector module, denoted the "year 2 prototype", in which plastic scintillators that discriminate gamma and high-energy particle events from neutron events using pulse shape discrimination (PSD) are coupled to a SiSSPM, with the following two detector modules: a similar "year 1 prototype" and a scintillator coupled to a PMT module. It characterizes the noise floor, relative signal-to-noise ratio (SNR), timing performance, PSD figure-of-merit (FOM), and neutron detection efficiency of RMD's detectors. This work also evaluates the scaling of SiSSPM detector modules to accommodate the volumes needed for many Nuclear Physics experiments. The SiSSPM detector module provides a clear advantage in Nuclear Physics experiments that require the following attributes: discrimination of neutron and gamma-ray events, operation in or near strong magnetic fields, and segmentation of the detector.
Deployable antenna phase A study
NASA Technical Reports Server (NTRS)
Schultz, J.; Bernstein, J.; Fischer, G.; Jacobson, G.; Kadar, I.; Marshall, R.; Pflugel, G.; Valentine, J.
1979-01-01
Applications for large deployable antennas were re-examined, flight demonstration objectives were defined, the flight article (antenna) was preliminarily designed, and the flight program and ground development program, including the support equipment, were defined for a proposed space transportation system flight experiment to demonstrate a large (50 to 200 meter) deployable antenna system. Tasks described include: (1) performance requirements analysis; (2) system design and definition; (3) orbital operations analysis; and (4) programmatic analysis.
Tailoring a software production environment for a large project
NASA Technical Reports Server (NTRS)
Levine, D. R.
1984-01-01
A software production environment was constructed to meet the specific goals of a particular large programming project. These goals, the specific solutions as implemented, and the experiences on a project of over 100,000 lines of source code are discussed. The base development environment for this project was an ordinary PWB Unix (tm) system. Several important aspects of the development process required support not available in the existing tool set.
NASA Technical Reports Server (NTRS)
Collins, Emmanuel G., Jr.; Phillips, Douglas J.; Hyland, David C.
1990-01-01
Many large space system concepts will require active vibration control to satisfy critical performance requirements such as line-of-sight accuracy. In order for these concepts to become operational it is imperative that the benefits of active vibration control be practically demonstrated in ground based experiments. The results of the experiment successfully demonstrate active vibration control for a flexible structure. The testbed is the Active Control Technique Evaluation for Spacecraft (ACES) structure at NASA Marshall Space Flight Center. The ACES structure is dynamically traceable to future space systems and especially allows the study of line-of-sight control issues.
An imperative need for global change research in tropical forests.
Zhou, Xuhui; Fu, Yuling; Zhou, Lingyan; Li, Bo; Luo, Yiqi
2013-09-01
Tropical forests play a crucial role in regulating regional and global climate dynamics, and model projections suggest that rapid climate change may result in forest dieback or savannization. However, these predictions are largely based on results from leaf-level studies. How tropical forests respond and feedback to climate change is largely unknown at the ecosystem level. Several complementary approaches have been used to evaluate the effects of climate change on tropical forests, but the results are conflicting, largely due to confounding effects of multiple factors. Although altered precipitation and nitrogen deposition experiments have been conducted in tropical forests, large-scale warming and elevated carbon dioxide (CO2) manipulations are completely lacking, leaving many hypotheses and model predictions untested. Ecosystem-scale experiments to manipulate temperature and CO2 concentration individually or in combination are thus urgently needed to examine their main and interactive effects on tropical forests. Such experiments will provide indispensable data and help gain essential knowledge on biogeochemical, hydrological and biophysical responses and feedbacks of tropical forests to climate change. These datasets can also inform regional and global models for predicting future states of tropical forests and climate systems. The success of such large-scale experiments in natural tropical forests will require an international framework to coordinate collaboration so as to meet the challenges in cost, technological infrastructure and scientific endeavor.
Analysis and Ground Testing for Validation of the Inflatable Sunshield in Space (ISIS) Experiment
NASA Technical Reports Server (NTRS)
Lienard, Sebastien; Johnston, John; Adams, Mike; Stanley, Diane; Alfano, Jean-Pierre; Romanacci, Paolo
2000-01-01
The Next Generation Space Telescope (NGST) design requires a large sunshield to protect the large aperture mirror and instrument module from constant solar exposure at its L2 orbit. The structural dynamics of the sunshield must be modeled in order to predict disturbances to the observatory attitude control system and gauge effects on line-of-sight jitter. Models of large, non-linear membrane systems are not well understood and have not been successfully demonstrated. To answer questions about sunshield dynamic behavior and demonstrate controlled deployment, the NGST project is flying a Pathfinder experiment, the Inflatable Sunshield in Space (ISIS). This paper discusses in detail the modeling and ground-testing efforts performed at the Goddard Space Flight Center to: validate analytical tools for characterizing the dynamic behavior of the deployed sunshield, qualify the experiment for the Space Shuttle, and verify the functionality of the system. Included in the discussion will be test parameters, test setups, problems encountered, and test results.
A Chemical Containment Model for the General Purpose Work Station
NASA Technical Reports Server (NTRS)
Flippen, Alexis A.; Schmidt, Gregory K.
1994-01-01
Contamination control is a critical safety requirement imposed on experiments flying on board the Spacelab. The General Purpose Work Station, a Spacelab support facility used for life sciences space flight experiments, is designed to remove volatile compounds from its internal airpath and thereby minimize contamination of the Spacelab. This is accomplished through the use of a large, multi-stage filter known as the Trace Contaminant Control System. Many experiments planned for the Spacelab require the use of toxic, volatile fixatives in order to preserve specimens prior to postflight analysis. The NASA-Ames Research Center SLS-2 payload, in particular, necessitated the use of several toxic, volatile compounds in order to accomplish the many inflight experiment objectives of this mission. A model was developed based on earlier theories and calculations which provides conservative predictions of the resultant concentrations of these compounds given various spill scenarios. This paper describes the development and application of this model.
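A minimal version of such a conservative spill prediction can be framed as a single well-mixed volume with first-order removal by the filter airflow, giving C(t) = (m/V) * exp(-eta*Q*t/V). The sketch below uses this simplification with illustrative numbers; neither the values nor the model form is taken from the actual GPWS/TCCS analysis.

```python
import numpy as np

# Minimal sketch of a conservative spill estimate, assuming the released fixative mixes
# instantly into a single well-mixed cabinet volume and is removed at a first-order rate
# by the filter airflow. All numbers below are illustrative, not GPWS/TCCS parameters.
mass_released = 5.0e-3      # assumed spill mass [kg]
volume = 0.5                # assumed cabinet free volume [m^3]
airflow = 1.0e-2            # assumed airflow through the filter [m^3/s]
efficiency = 0.99           # assumed single-pass filter removal efficiency

c0 = mass_released / volume                           # instantaneous peak concentration [kg/m^3]
t = np.linspace(0.0, 600.0, 7)                        # time after the spill [s]
c = c0 * np.exp(-efficiency * airflow / volume * t)   # exponential clean-up
for ti, ci in zip(t, c):
    print(f"t = {ti:5.0f} s   C = {ci * 1e3:.3f} g/m^3")
```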
Friedman, Jannice; Willis, John H
2013-07-01
Species with extensive ranges experience highly variable environments with respect to temperature, light and soil moisture. Synchronizing the transition from vegetative to floral growth is important for exploiting favorable conditions for reproduction. Optimal timing of this transition might be different for semelparous annual plants and iteroparous perennial plants. We studied variation in the critical photoperiod necessary for floral induction and the requirement for a period of cold-chilling (vernalization) in 46 populations of annuals and perennials in the Mimulus guttatus species complex. We then examined critical photoperiod and vernalization QTLs in growth chambers using F2 progeny from annual and perennial parents that differed in their requirements for flowering. We identify extensive variation in critical photoperiod, with most annual populations requiring substantially shorter day lengths to initiate flowering than perennial populations. We discover a novel type of vernalization requirement in perennial populations that is contingent on plants experiencing short days first. QTL analyses identify two large-effect QTLs which influence critical photoperiod. In two separate vernalization experiments we discover that each set of crosses contains different large-effect QTLs for vernalization. Mimulus guttatus harbors extensive variation in critical photoperiod and vernalization that may be a consequence of local adaptation. © 2013 The Authors. New Phytologist © 2013 New Phytologist Trust.
Performance and results of the LBNE 35 ton membrane cryostat prototype
Montanari, David; Adamowski, Mark; Hahn, Alan; ...
2015-07-15
We report on the performance and commissioning of the first membrane cryostat to be used for scientific application. The Long Baseline Neutrino Experiment (LBNE) has designed and fabricated a membrane cryostat prototype in collaboration with Ishikawajima-Harima Heavy Industries Co., Ltd. (IHI). LBNE has designed and fabricated the supporting cryogenic system infrastructure and successfully commissioned and operated the first membrane cryostat. The original goals of the prototype are: to demonstrate the membrane cryostat technology in terms of thermal performance, feasibility for liquid argon, and leak tightness; to demonstrate that we can remove all the impurities from the vessel and achieve the purity requirements in a membrane cryostat without evacuation; and to demonstrate that we can achieve and maintain the purity requirements of the liquid argon using molecular sieve and copper filters. The purity requirements of a large liquid argon detector such as LBNE are contaminants below 200 parts per trillion (ppt) oxygen equivalent. LBNE is planning the design and construction of a large liquid argon detector. This paper presents the requirements, design, and construction of the LBNE 35 ton membrane cryostat prototype and details its commissioning and performance. The experience and results of this prototype are extremely important for the development of the LBNE detector.
A theory of requirements definition in engineering design
NASA Astrophysics Data System (ADS)
Eodice, Michael Thomas
2000-10-01
Traditional requirements-definition activities begin with the engineer or design team performing a needs-analysis to identify user requirements. Needs-analysis is generally subjective, and varies according to the composition and experience of the design team. Systematic procedures for defining and ranking requirements are necessary to consolidate the foundation on which the design process is predicated, and to enhance its outcome by providing the designer with a consistent, reliable approach to product development. As a first step towards developing such procedures, research was conducted at Stanford University using empirical data from a NASA spaceflight experiment that flew aboard Space Shuttle mission STS-90 (April 1998). This research was accomplished using ex post facto data analysis. This researcher served in the central role of Experiment Manager for the spaceflight experiment, and acted as a participant-observer while conducting formal research. To better understand requirement structure and evolution, individual requirements were decomposed using AND/OR graphs. The AND/OR graph data structure illustrates requirements evolution, and reveals that the original requirement is often met by fulfilling a series of sub-requirements that are easier to implement. Early in the product life cycle, many hundreds of potential needs were identified; however, it was a smaller subset of these initial needs that were realized in the final product. Based on analysis of a large group of requirements, it was observed that two critical components for any individual requirement were: (1) a stated need, and (2) an advocate to implement the need. Identification of need, or needs-analysis, although a necessary condition, is insufficient to ensure that a stated need evolves into a formal requirement. Equally important to the concept of requirements-definition is the notion of advocacy. Early in the product development cycle of the NASA experiment, many potential needs were identified; however, it was only through need-advocate pairing that a stated need became a requirement. Empirical data revealed that when needs were accompanied by strong advocates, they became clear requirements. Conversely, needs without advocates did not become requirements. Hence, need-advocate pairing is useful for predicting needs which will become requirements, and more importantly, needs at risk of not becoming requirements.
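To make the AND/OR decomposition concrete, the sketch below represents a requirement as a node whose AND children must all be satisfied and whose OR children offer alternative ways to satisfy it. The example requirement tree is hypothetical and is not drawn from the STS-90 experiment data.

```python
# Sketch of an AND/OR decomposition of a requirement: an AND node is satisfied only if
# all sub-requirements are satisfied, an OR node if any alternative is. The example
# requirement tree is hypothetical, not taken from the STS-90 experiment data.
from dataclasses import dataclass, field

@dataclass
class Node:
    name: str
    kind: str = "LEAF"                 # "AND", "OR", or "LEAF"
    children: list = field(default_factory=list)
    satisfied: bool = False            # only meaningful for leaves

    def is_satisfied(self) -> bool:
        if self.kind == "LEAF":
            return self.satisfied
        results = [c.is_satisfied() for c in self.children]
        return all(results) if self.kind == "AND" else any(results)

requirement = Node("maintain sample temperature", "AND", [
    Node("sense temperature", "LEAF", satisfied=True),
    Node("reject heat", "OR", [
        Node("cold plate", "LEAF", satisfied=False),
        Node("forced-air cooling", "LEAF", satisfied=True),
    ]),
])
print(requirement.is_satisfied())      # True: the OR branch is met by one alternative
```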
Kinetic inductance detectors for far-infrared spectroscopy
NASA Astrophysics Data System (ADS)
Barlis, Alyssa; Aguirre, James; Stevenson, Thomas
2016-07-01
The star formation mechanisms at work in the early universe remain one of the major unsolved problems of modern astrophysics. Many of the luminous galaxies present during the period of peak star formation (between redshifts 1 and 3) were heavily enshrouded in dust, which makes observing their properties difficult at optical wavelengths. However, many spectral lines exist at far-infrared wavelengths that serve as tracers of star formation during that period, in particular fine structure lines of nitrogen, carbon, and oxygen, as well as the carbon monoxide molecule. Using an observation technique known as intensity mapping, it would be possible to observe the total line intensity for a given redshift range even without detecting individual sources. Here, we describe a detector system suitable for a balloon-borne spectroscopic intensity mapping experiment at far-infrared wavelengths. The experiment requires an "integral-field" type spectrograph, with modest spectral resolution (R ~ 100) for each of a number of spatial pixels spanning several octaves in wavelength. The detector system uses lumped-element kinetic inductance detectors (LEKIDs), which have the potential to achieve the high sensitivity, low noise, and high multiplexing factor required for this experiment. We detail the design requirements and considerations, and the fabrication process for a prototype LEKID array of 1600 pixels. The pixel design is driven by the need for high responsivity, which requires a small physical volume for the LEKID inductor. In order to minimize two-level system noise, the resonators include large-area interdigitated capacitors. High quality factor resonances are required for a large frequency multiplexing factor. Detectors were fabricated using both trilayer TiN/Ti/TiN recipes and thin-film Al, and are operated at base temperatures near 250 mK.
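Each LEKID pixel is an LC resonator, so its readout frequency follows f0 = 1 / (2*pi*sqrt(L*C)), and the interdigitated capacitor can be trimmed pixel-to-pixel to spread resonances across the readout band for frequency multiplexing. The values in the sketch below are assumptions chosen only to illustrate the scaling, not parameters of the prototype array.

```python
import numpy as np

# Back-of-envelope resonance placement for a lumped-element KID, f0 = 1/(2*pi*sqrt(L*C)),
# with the capacitance trimmed pixel-to-pixel to spread resonators across the readout band.
# The inductance and capacitance values are assumptions for illustration only.
L = 10e-9                                   # assumed total (geometric + kinetic) inductance [H]
C = np.linspace(9.0e-12, 10.0e-12, 5)       # assumed interdigitated capacitances [F]

f0 = 1.0 / (2.0 * np.pi * np.sqrt(L * C))   # resonance frequency of each pixel
print(np.round(f0 / 1e6, 1), "MHz")
```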
Outdoor field experience with autonomous RPC based stations
NASA Astrophysics Data System (ADS)
Lopes, L.; Assis, P.; Blanco, A.; Carolino, N.; Cerda, M. A.; Conceição, R.; Cunha, O.; Ferreira, M.; Fonte, P.; Luz, R.; Mendes, L.; Pereira, A.; Pimenta, M.; Sarmento, R.; Tomé, B.
2016-09-01
In the last two decades Resistive Plate Chambers were employed in the cosmic ray experiments COVER-PLASTEX and ARGO/YBJ. In both experiments the detectors were housed indoors, likely owing to gas distribution requirements and the need to control environmental variables that directly affect RPC operational stability. But in experiments where Extended Air Shower (EAS) sampling is necessary, large area arrays composed of dispersed stations are deployed, rendering this kind of approach impossible. In this situation, it would be mandatory to have detectors that could be deployed in small standalone stations, with very rare opportunities for maintenance, and with good resilience to environmental conditions. Aiming to meet these requirements, we started some years ago the development of RPCs for Autonomous Stations. The results from indoor tests and measurements were very promising, both concerning performance and stability under very low gas flow rate, which is the main requirement for Autonomous Stations. In this work we update the indoor results and show the first ones concerning outdoor stable operation. In particular, a dynamic adjustment of the high voltage is applied to keep the gas gain constant.
Decadal climate prediction in the large ensemble limit
NASA Astrophysics Data System (ADS)
Yeager, S. G.; Rosenbloom, N. A.; Strand, G.; Lindsay, K. T.; Danabasoglu, G.; Karspeck, A. R.; Bates, S. C.; Meehl, G. A.
2017-12-01
In order to quantify the benefits of initialization for climate prediction on decadal timescales, two parallel sets of historical simulations are required: one "initialized" ensemble that incorporates observations of past climate states and one "uninitialized" ensemble whose internal climate variations evolve freely and without synchronicity. In the large ensemble limit, ensemble averaging isolates potentially predictable forced and internal variance components in the "initialized" set, but only the forced variance remains after averaging the "uninitialized" set. The ensemble size needed to achieve this variance decomposition, and to robustly distinguish initialized from uninitialized decadal predictions, remains poorly constrained. We examine a large ensemble (LE) of initialized decadal prediction (DP) experiments carried out using the Community Earth System Model (CESM). This 40-member CESM-DP-LE set of experiments represents the "initialized" complement to the CESM large ensemble of 20th century runs (CESM-LE) documented in Kay et al. (2015). Both simulation sets share the same model configuration, historical radiative forcings, and large ensemble sizes. The twin experiments afford an unprecedented opportunity to explore the sensitivity of DP skill assessment, and in particular the skill enhancement associated with initialization, to ensemble size. This talk will highlight the benefits of a large ensemble size for initialized predictions of seasonal climate over land in the Atlantic sector as well as predictions of shifts in the likelihood of climate extremes that have large societal impact.
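The variance decomposition described above can be illustrated in a few lines: averaging over members isolates the predictable (forced plus initialized) signal, while the spread about the ensemble mean estimates the unpredictable internal noise. The sketch below uses synthetic numbers standing in for CESM-DP-LE output.

```python
import numpy as np

# Sketch of the ensemble variance decomposition: the ensemble mean estimates the
# predictable signal, deviations from it estimate unpredictable internal noise.
# Synthetic data stand in for the CESM-DP-LE output.
rng = np.random.default_rng(0)
n_members, n_years = 40, 10
forced = np.linspace(0.0, 0.3, n_years)                         # common forced trend [K]
members = forced + rng.normal(0.0, 0.2, (n_members, n_years))   # add internal variability

ens_mean = members.mean(axis=0)                  # predictable component (signal)
noise_var = members.var(axis=0, ddof=1).mean()   # residual internal variance (noise)
print("ensemble-mean trend:", np.round(ens_mean, 2))
print("mean internal variance:", round(float(noise_var), 3))
```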
Ground test experiment for large space structures
NASA Technical Reports Server (NTRS)
Tollison, D. K.; Waites, H. B.
1985-01-01
In recent years a new body of control theory has been developed for the design of control systems for Large Space Structures (LSS). The problems of testing this theory on LSS hardware are aggravated by the expense and risk of actual in orbit tests. Ground tests on large space structures can provide a proving ground for candidate control systems, but such tests require a unique facility for their execution. The current development of such a facility at the NASA Marshall Space Flight Center (MSFC) is the subject of this report.
NASA Astrophysics Data System (ADS)
Gallagher, John A.
2016-04-01
The desired operating range of ferroelectric materials with compositions near the morphotropic phase boundary is limited by field induced phase transformations. In [001]C cut and poled relaxor ferroelectric single crystals the mechanically driven ferroelectric rhombohedral to ferroelectric orthorhombic phase transformation is hindered by antagonistic electrical loading. Instability around the phase transformation makes the current experimental technique for characterization of the large field behavior very time consuming. Characterization requires specialized equipment and involves an extensive set of measurements under combined electrical, mechanical, and thermal loads. In this work a mechanism-based model is combined with a more limited set of experiments to obtain the same results. The model utilizes a work-energy criterion that calculates the mechanical work required to induce the transformation and the required electrical work that is removed to reverse the transformation. This is done by defining energy barriers to the transformation. The results of the combined experiment and modeling approach are compared to the fully experimental approach and error is discussed. The model shows excellent predictive capability and is used to substantially reduce the total number of experiments required for characterization. This decreases the time and resources required for characterization of new compositions.
Experiences with Text Mining Large Collections of Unstructured Systems Development Artifacts at JPL
NASA Technical Reports Server (NTRS)
Port, Dan; Nikora, Allen; Hihn, Jairus; Huang, LiGuo
2011-01-01
Often repositories of systems engineering artifacts at NASA's Jet Propulsion Laboratory (JPL) are so large and poorly structured that they have outgrown our capability to effectively process their contents manually to extract useful information. Sophisticated text mining methods and tools seem a quick, low-effort approach to automating our limited manual efforts. Our experiences of exploring such methods, mainly in three areas (historical risk analysis, defect identification based on requirements analysis, and over-time analysis of system anomalies at JPL), have shown that obtaining useful results requires substantial unanticipated effort, from preprocessing the data to transforming the output for practical applications. We have not observed any quick 'wins' or realized benefit from short-term effort avoidance through automation in this area. Surprisingly, we have realized a number of unexpected long-term benefits from the process of applying text mining to our repositories. This paper elaborates some of these benefits and important lessons learned from preparing and applying text mining to large unstructured system artifacts at JPL, aiming to benefit future text mining applications in similar problem domains and, we hope, to be extended to broader areas of application.
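A typical starting point for such text mining is a bag-of-words pipeline: vectorize the artifact text with TF-IDF and cluster or classify the result. The sketch below shows this generic pattern on a few invented artifact snippets; the snippets, cluster count, and choice of scikit-learn components are assumptions, not the tooling actually used at JPL.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans

# Minimal sketch of a preprocessing + clustering pipeline applied to a few
# stand-in artifact snippets; the real effort lies in cleaning and structuring
# repository text before such off-the-shelf steps become useful.
documents = [
    "anomaly in star tracker attitude estimate during safing",
    "requirement: downlink rate shall exceed 150 Mbps in ka-band",
    "defect traced to missing requirement on thermal cycling limits",
]

tfidf = TfidfVectorizer(stop_words="english").fit_transform(documents)
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(tfidf)
print(labels)    # coarse grouping of artifacts by vocabulary
```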
A Fast Evaluation Method for Energy Building Consumption Based on the Design of Experiments
NASA Astrophysics Data System (ADS)
Belahya, Hocine; Boubekri, Abdelghani; Kriker, Abdelouahed
2017-08-01
The building sector is one of the largest energy consumers in Algeria, accounting for 42% of consumption. The need for energy has continued to grow inordinately, due to the lack of legislation on energy performance in this large consumer sector. Another reason is the continual change in users' requirements to maintain their comfort, especially in summer in the dry lands of southern Algeria, where the town of Ouargla is a typical example; this leads to a large amount of electricity consumption through the use of air conditioning. In order to achieve a high-performance building envelope, an optimization of the major envelope parameters is required; a design of experiments (DOE) can determine the most influential parameters and eliminate the less important ones. Building studies are often complex and time consuming due to the large number of parameters to consider. This study focuses on reducing the computing time and determining the major parameters of building energy consumption, such as building area, shape factor, orientation, and wall-to-window ratio, in order to propose models that minimize the seasonal energy consumption due to air conditioning needs.
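In the DOE spirit described above, a two-level full factorial design codes each envelope parameter as -1/+1 and estimates its main effect on the response from all runs. The sketch below uses a toy cooling-load function in place of a building simulation; the factor list and coefficients are assumptions for illustration only.

```python
import itertools
import numpy as np

# Two-level full-factorial sketch: each envelope parameter is coded -1/+1 and main
# effects are estimated from a stand-in response (a toy cooling-load function,
# not a building simulation).
factors = ["window_ratio", "orientation", "wall_insulation"]
design = np.array(list(itertools.product([-1, 1], repeat=len(factors))))

def cooling_load(x):                      # placeholder response surface
    w, o, i = x
    return 100 + 8 * w + 3 * o - 6 * i + np.random.default_rng(0).normal(0, 0.5)

response = np.array([cooling_load(run) for run in design])
effects = design.T @ response / (len(design) / 2)    # main effect of each factor
for name, eff in zip(factors, effects):
    print(f"{name:16s} effect ~ {eff:+.1f} kWh")
```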
The LArIAT experiment: first measurement of the inclusive total pion cross-section in Argon
NASA Astrophysics Data System (ADS)
de María Blaszczyk, Flor
2018-05-01
In light of future large neutrino experiments such as DUNE, an excellent understanding of LArTPCs is required. The Liquid Argon In A Test-beam (LArIAT) experiment, located in the Fermilab Test Beam Facility, is designed not only to characterize the performance of LArTPCs and improve the reconstruction algorithms, but also to measure the cross sections of charged particles in argon. The goals and experimental layout will be presented, as well as the world's first inclusive total pion interaction cross section on argon measured by LArIAT.
In-space research, technology and engineering experiments and Space Station
NASA Technical Reports Server (NTRS)
Tyson, Richard; Gartrell, Charles F.
1988-01-01
The NASA Space Station will serve as a technology research laboratory, a payload-servicing facility, and a large structure fabrication and assembly facility. Space structures research will encompass advanced structural concepts and their dynamics, advanced control concepts, sensors, and actuators. Experiments dealing with fluid management will gather data on such fundamentals as multiphase flow phenomena. As requirements for power systems and thermal management grow, experiments quantifying the performance of energy systems and thermal management concepts will be undertaken, together with expanded efforts in the fields of information systems, automation, and robotics.
ERIC Educational Resources Information Center
Gould, Mauri
1975-01-01
Presents complete instructions for assembling an electric motor which does not require large amounts of power to operate and which is inexpensive as well as reliable. Several open-ended experiments with the motor are included as well as information for obtaining a kit of parts and instructions. (BR)
The calorimeter system of the new muon g-2 experiment at Fermilab
Alonzi, L. P.; Anastasi, A.; Bjorkquist, R.; ...
2015-12-02
The electromagnetic calorimeter for the new muon (g-2) experiment at Fermilab will consist of arrays of PbF2 Cerenkov crystals read out by large-area silicon photomultiplier (SiPM) sensors. Here we report the requirements for this system, the solution achieved, and the results obtained from a test beam using 2.0-4.5 GeV electrons with a 28-element prototype array.
New calorimeters for space experiments: physics requirements and technological challenges
NASA Astrophysics Data System (ADS)
Marrocchesi, Pier Simone
2015-07-01
Direct measurements of charged cosmic radiation with instruments in Low Earth Orbit (LEO), or flying on balloons above the atmosphere, require the identification of the incident particle, the measurement of its energy and possibly the determination of its sign-of-charge. The latter information can be provided by a magnetic spectrometer together with a measurement of momentum. However, magnetic deflection in space experiments is at present limited to values of the Maximum Detectable Rigidity (MDR) hardly exceeding a few TV. Advanced calorimetric techniques are, at present, the only way to measure charged and neutral radiation at higher energies in the multi-TeV range. Despite their mass limitation, calorimeters may achieve a large geometric factor and provide an adequate proton background rejection factor, taking advantage of a fine granularity and imaging capabilities. In this lecture, after a brief introduction on electromagnetic and hadronic calorimetry, an innovative approach to the design of a space-borne, large acceptance, homogeneous calorimeter for the detection of high energy cosmic rays will be described.
Large-scale quantum photonic circuits in silicon
NASA Astrophysics Data System (ADS)
Harris, Nicholas C.; Bunandar, Darius; Pant, Mihir; Steinbrecher, Greg R.; Mower, Jacob; Prabhu, Mihika; Baehr-Jones, Tom; Hochberg, Michael; Englund, Dirk
2016-08-01
Quantum information science offers inherently more powerful methods for communication, computation, and precision measurement that take advantage of quantum superposition and entanglement. In recent years, theoretical and experimental advances in quantum computing and simulation with photons have spurred great interest in developing large photonic entangled states that challenge today's classical computers. As experiments have increased in complexity, there has been an increasing need to transition bulk optics experiments to integrated photonics platforms to control more spatial modes with higher fidelity and phase stability. The silicon-on-insulator (SOI) nanophotonics platform offers new possibilities for quantum optics, including the integration of bright, nonclassical light sources, based on the large third-order nonlinearity (χ(3)) of silicon, alongside quantum state manipulation circuits with thousands of optical elements, all on a single phase-stable chip. How large do these photonic systems need to be? Recent theoretical work on Boson Sampling suggests that even the problem of sampling from ~30 identical photons, having passed through an interferometer of hundreds of modes, becomes challenging for classical computers. While experiments of this size are still challenging, the SOI platform has the required component density to enable low-loss and programmable interferometers for manipulating hundreds of spatial modes. Here, we discuss the SOI nanophotonics platform for quantum photonic circuits with hundreds-to-thousands of optical elements and the associated challenges. We compare SOI to competing technologies in terms of requirements for quantum optical systems. We review recent results on large-scale quantum state evolution circuits and strategies for realizing high-fidelity heralded gates with imperfect, practical systems. Next, we review recent results on silicon photonics-based photon-pair sources and device architectures, and we discuss a path towards large-scale source integration. Finally, we review monolithic integration strategies for single-photon detectors and their essential role in on-chip feed-forward operations.
Data Crosscutting Requirements Review
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kleese van Dam, Kerstin; Shoshani, Arie; Plata, Charity
2013-04-01
In April 2013, a diverse group of researchers from the U.S. Department of Energy (DOE) scientific community assembled to assess data requirements associated with DOE-sponsored scientific facilities and large-scale experiments. Participants in the review included facilities staff, program managers, and scientific experts from the offices of Basic Energy Sciences, Biological and Environmental Research, High Energy Physics, and Advanced Scientific Computing Research. As part of the meeting, review participants discussed key issues associated with three distinct aspects of the data challenge: 1) processing, 2) management, and 3) analysis. These discussions identified commonalities and differences among the needs of varied scientific communities. They also helped to articulate gaps between current approaches and future needs, as well as the research advances that will be required to close these gaps. Moreover, the review provided a rare opportunity for experts from across the Office of Science to learn about their collective expertise, challenges, and opportunities. The "Data Crosscutting Requirements Review" generated specific findings and recommendations for addressing large-scale data crosscutting requirements.
NASA Technical Reports Server (NTRS)
1976-01-01
The six themes identified by the Workshop have many common navigation guidance and control needs. All the earth orbit themes have a strong requirement for attitude, figure and stabilization control of large space structures, a requirement not currently being supported. All but the space transportation theme have need for precision pointing of spacecraft and instruments. In addition all the themes have requirements for increasing autonomous operations for such activities as spacecraft and experiment operations, onboard mission modification, rendezvous and docking, spacecraft assembly and maintenance, navigation and guidance, and self-checkout, test and repair. Major new efforts are required to conceptualize new approaches to large space antennas and arrays that are lightweight, readily deployable, and capable of precise attitude and figure control. Conventional approaches offer little hope of meeting these requirements. Functions that can benefit from increasing automation or autonomous operations are listed.
Kelly, Debbie M; Cook, Robert G
2003-06-01
Three experiments examined the role of contextual information during line orientation and line position discriminations by pigeons (Columba livia) and humans (Homo sapiens). Experiment 1 tested pigeons' performance with these stimuli in a target localization task using texture displays. Experiments 2 and 3 tested pigeons and humans, respectively, with small and large variations of these stimuli in a same-different task. Humans showed a configural superiority effect when tested with displays constructed from large elements but not when tested with the smaller, more densely packed texture displays. The pigeons, in contrast, exhibited a configural inferiority effect when required to discriminate line orientation, regardless of stimulus size. These contrasting results suggest a species difference in the perception and use of features and contextual information in the discrimination of line information.
Deuterium results at the negative ion source test facility ELISE
NASA Astrophysics Data System (ADS)
Kraus, W.; Wünderlich, D.; Fantz, U.; Heinemann, B.; Bonomo, F.; Riedl, R.
2018-05-01
The ITER neutral beam system will be equipped with large radio frequency (RF) driven negative ion sources, with a cross section of 0.9 m × 1.9 m, which have to deliver extracted D- ion beams of 57 A at 1 MeV for 1 h. At the Extraction from a Large Ion Source Experiment (ELISE) test facility, a source of half this size has been in operation since 2013. The goal of this experiment is to demonstrate high operational reliability and to achieve the extracted current densities and beam properties required for ITER. Technical improvements of the source design and the RF system were necessary to provide reliable operation in steady state with an RF power of up to 300 kW. While in short pulses the required D- current density has almost been reached, the performance in long pulses is limited, particularly in deuterium, by inhomogeneous and unstable currents of co-extracted electrons. By applying refined caesium evaporation and distribution procedures, and by reducing and symmetrizing the electron currents, considerable progress has been made, and up to 190 A/m2 D-, corresponding to 66% of the value required for ITER, has been extracted for 45 min.
Experience report: Using formal methods for requirements analysis of critical spacecraft software
NASA Technical Reports Server (NTRS)
Lutz, Robyn R.; Ampo, Yoko
1994-01-01
Formal specification and analysis of requirements continues to gain support as a method for producing more reliable software. However, the introduction of formal methods to a large software project is difficult, due in part to the unfamiliarity of the specification languages and the lack of graphics. This paper reports results of an investigation into the effectiveness of formal methods as an aid to the requirements analysis of critical, system-level fault-protection software on a spacecraft currently under development. Our experience indicates that formal specification and analysis can enhance the accuracy of the requirements and add assurance prior to design development in this domain. The work described here is part of a larger, NASA-funded research project whose purpose is to use formal-methods techniques to improve the quality of software in space applications. The demonstration project described here is part of the effort to evaluate experimentally the effectiveness of supplementing traditional engineering approaches to requirements specification with the more rigorous specification and analysis available with formal methods.
A high performance cost-effective digital complex correlator for an X-band polarimetry survey.
Bergano, Miguel; Rocha, Armando; Cupido, Luís; Barbosa, Domingos; Villela, Thyrso; Boas, José Vilas; Rocha, Graça; Smoot, George F
2016-01-01
Detailed knowledge of the Milky Way radio emission is important for characterizing the galactic foregrounds masking extragalactic and cosmological signals. Updating the global sky models describing radio emissions over a very large spectral band requires high-sensitivity experiments capable of observing large sky areas with long integration times. Here, we present the design of a new 10 GHz (X-band) polarimeter digital back-end to map the polarization components of the galactic synchrotron radiation field of the Northern Hemisphere sky. The design follows the digital processing trends in radio astronomy and implements a large-bandwidth (1 GHz) digital complex cross-correlator to extract the Stokes parameters of the incoming synchrotron radiation field. We cover the hardware constraints, the implemented VLSI hardware description language code, and preliminary results. The implementation is based on the simultaneous digitized acquisition of the Cartesian components of the two linear receiver polarization channels. The design strategy involves double data rate acquisition of the ADC interleaved parallel bus and field programmable gate array device programming at the register transfer level. The digital core of the back-end is capable of processing 32 Gbps and is built around an Altera field programmable gate array clocked at 250 MHz, 1 GSps analog-to-digital converters, and a clock generator. Control of the field programmable gate array internal signal delays and a convenient use of its phase-locked loops provide the timing required to achieve the target bandwidths and sensitivity. This solution is convenient for radio astronomy experiments requiring large bandwidth, high functionality, high volume availability, and low cost. Of particular interest, this correlator was developed for the Galactic Emission Mapping project and is suitable for large sky area polarization continuum surveys. The solutions may also be adapted for use at signal processing subsystem levels for large projects like the Square Kilometre Array testbeds.
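The Stokes extraction that the cross-correlator performs can be sketched in software as follows; the signal model, variable names, and sign conventions are illustrative assumptions (the real design operates on digitized samples inside the FPGA), with numpy standing in for the hardware accumulators.

    # Estimate Stokes parameters from complex samples of two linear polarization channels.
    import numpy as np

    rng = np.random.default_rng(0)
    n = 100_000
    ex = rng.normal(size=n) + 1j * rng.normal(size=n)             # X-channel samples
    ey = 0.3 * ex + rng.normal(size=n) + 1j * rng.normal(size=n)  # partially correlated Y

    # Time-averaged auto- and cross-correlations (the correlator's accumulations).
    xx = np.mean(ex * np.conj(ex)).real
    yy = np.mean(ey * np.conj(ey)).real
    xy = np.mean(ex * np.conj(ey))

    stokes_i = xx + yy        # total intensity
    stokes_q = xx - yy        # linear polarization along the feed axes
    stokes_u = 2 * xy.real    # linear polarization at 45 degrees
    stokes_v = -2 * xy.imag   # circular polarization (sign convention varies)
    print(stokes_i, stokes_q, stokes_u, stokes_v)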
Interfering with free recall of words: Detrimental effects of phonological competition.
Fernandes, Myra A; Wammes, Jeffrey D; Priselac, Sandra; Moscovitch, Morris
2016-09-01
We examined the effect of different distracting tasks, performed concurrently during memory retrieval, on recall of a list of words. By manipulating the type of material and processing (semantic, orthographic, and phonological) required in the distracting task, and comparing the magnitude of memory interference produced, we aimed to infer the kind of representation upon which retrieval of words depends. In Experiment 1, identifying odd digits concurrently during free recall disrupted memory, relative to a full attention condition, when the numbers were presented orthographically (e.g. nineteen), but not numerically (e.g. 19). In Experiment 2, a distracting task that required phonological-based decisions to either word or picture material produced large, but equivalent effects on recall of words. In Experiment 3, phonological-based decisions to pictures in a distracting task disrupted recall more than when the same pictures required semantically-based size estimations. In Experiment 4, a distracting task that required syllable decisions to line drawings interfered significantly with recall, while an equally difficult semantically-based color-decision task about the same line drawings, did not. Together, these experiments demonstrate that the degree of memory interference experienced during recall of words depends primarily on whether the distracting task competes for phonological representations or processes, and less on competition for semantic or orthographic or material-specific representations or processes. Copyright © 2016 Elsevier Ltd. All rights reserved.
Shape Tracking of a Dexterous Continuum Manipulator Utilizing Two Large Deflection Shape Sensors
Farvardin, Amirhossein; Grupp, Robert; Murphy, Ryan J.; Taylor, Russell H.; Iordachita, Iulian
2016-01-01
Dexterous continuum manipulators (DCMs) can largely increase the reachable region and steerability for minimally and less invasive surgery. Many such procedures require the DCM to be capable of producing large deflections. Real-time control of the DCM shape requires sensors that accurately detect and report large deflections. We propose a novel, large-deflection shape sensor to track the shape of a 35 mm DCM designed for a less invasive treatment of osteolysis. Two shape sensors, each with three fiber Bragg grating sensing nodes, are embedded within the DCM, with the sensors' distal ends fixed to the DCM. The DCM centerline is computed using the centerlines of each sensor curve. An experimental platform was built and different groups of experiments were carried out, including free bending and three cases of bending with obstacles. For each experiment, the DCM drive cable was pulled with a precise linear slide stage, the DCM centerline was calculated, and a 2D camera image was captured for verification. The reconstructed shape created with the shape sensors is compared with the ground truth generated by executing a 2D-3D registration between the camera image and the 3D DCM model. Results show that the distal tip tracking accuracy is 0.40 ± 0.30 mm for free bending and 0.61 ± 0.15 mm, 0.93 ± 0.05 mm and 0.23 ± 0.10 mm for the three cases of bending with obstacles. The data suggest FBG arrays can accurately characterize the shape of large-deflection DCMs. PMID:27761103
Purification for the XENONnT dark matter experiment
NASA Astrophysics Data System (ADS)
Brown, Ethan; Xenon Collaboration
2017-01-01
The XENON1T experiment uses 3.5 tons of liquid xenon in a cryogenic detector to search for dark matter. Its upgrade, XENONnT, will similarly house 7.5 tons of liquid xenon. Operation of these large detectors requires continual purification of the xenon in an external purifier, and the need for less than part per billion level oxygen in the xenon, coupled with the large quantity of xenon to be purified, places high demands on the rate of flow through this purification system. Building on the success of the XENON10 and XENON100 experiments, XENON1T circulates gaseous xenon through heated getters at a rate of up to 100 SLPM, pushing commercial pumps to their limits moving this large quantity of gas without interruption for several years. Two upgrades are considered for XENONnT. A custom high-capacity magnetic piston pump based on the one developed for the EXO200 experiment has been scaled up to support the high demands of this much larger experiment. Additionally, a liquid phase circulation and purification system that purifies the cryogenic liquid directly is being developed, which takes advantage of the much smaller volumetric flow demands of liquid relative to gas. The implementation of both upgrades will be presented. Supported by the National Science Foundation.
Attitude control challenges for earth orbiters of the 1980's
NASA Technical Reports Server (NTRS)
Hibbard, W.
1980-01-01
Experience gained in designing attitude control systems for orbiting spacecraft of the late 1980's is related. Implications for satellite attitude control design of the guidance capabilities, rendezvous and recovery requirements, use of multiple-use spacecraft and the development of large spacecraft associated with the advent of the Space Shuttle are considered. Attention is then given to satellite attitude control requirements posed by the Tracking and Data Relay Satellite System, the Global Positioning System, the NASA End-to-End Data System, and Shuttle-associated subsatellites. The anticipated completion and launch of the Space Telescope, which will provide one of the first experiences with the new generation of attitude control, is also pointed out.
Commissioning and initial experience with the ALICE on-line
NASA Astrophysics Data System (ADS)
Altini, V.; Anticic, T.; Carena, F.; Carena, W.; Chapeland, S.; Chibante Barroso, V.; Costa, F.; Dénes, E.; Divià, R.; Fuchs, U.; Kiss, T.; Makhlyueva, I.; Roukoutakis, F.; Schossmaier, K.; Soós, C.; Vande Vyvre, P.; von Haller, B.; ALICE Collaboration
2010-04-01
ALICE (A Large Ion Collider Experiment) is the heavy-ion detector designed to study the physics of strongly interacting matter and the quark-gluon plasma at the CERN Large Hadron Collider (LHC). A large bandwidth and flexible Data Acquisition System (DAQ) has been designed and deployed to collect sufficient statistics in the short running time available per year for heavy ions and to accommodate very different requirements originated from the 18 sub-detectors. This paper will present the large scale tests conducted to assess the standalone DAQ performances, the interfaces with the other online systems and the extensive commissioning performed in order to be fully prepared for physics data taking. It will review the experience accumulated since May 2007 during the standalone commissioning of the main detectors and the global cosmic runs and the lessons learned from this exposure on the "battle field". It will also discuss the test protocol followed to integrate and validate each sub-detector with the online systems and it will conclude with the first results of the LHC injection tests and startup in September 2008. Several papers of the same conference present in more details some elements of the ALICE DAQ system.
An Illustrative Guide to the Minerva Framework
NASA Astrophysics Data System (ADS)
Flom, Erik; Leonard, Patrick; Hoeffel, Udo; Kwak, Sehyun; Pavone, Andrea; Svensson, Jakob; Krychowiak, Maciej; Wendelstein 7-X Team Collaboration
2017-10-01
Modern physics experiments require tracking and modelling data and their associated uncertainties on a large scale, as well as the combined implementation of multiple independent data streams for sophisticated modelling and analysis. The Minerva Framework offers a centralized, user-friendly method of large-scale physics modelling and scientific inference. Currently used by teams at multiple large-scale fusion experiments including the Joint European Torus (JET) and Wendelstein 7-X (W7-X), the Minerva framework provides a forward-model-friendly architecture for developing and implementing models for large-scale experiments. One aspect of the framework involves so-called data sources, which are nodes in the graphical model. These nodes are supplied with engineering and physics parameters. When end-user level code calls a node, it is checked network-wide against its dependent nodes for changes since its last implementation and returns version-specific data. Here, a filterscope data node is used as an illustrative example of the Minerva Framework's data management structure and its further application to Bayesian modelling of complex systems. This work has been carried out within the framework of the EUROfusion Consortium and has received funding from the Euratom research and training programme 2014-2018 under Grant Agreement No. 633053.
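A rough sketch of the version-checked node behaviour described above is given below, written in Python rather than the framework's own implementation language; the class, method names, and filterscope example values are invented for illustration and are not Minerva's API.

    # Toy graph node that recomputes its value only when a dependency has changed.
    class Node:
        def __init__(self, name, deps=(), compute=None):
            self.name, self.deps, self.compute = name, list(deps), compute
            self.version, self._seen_versions, self._value = 0, {}, None

        def set(self, value):
            self._value, self.version = value, self.version + 1

        def get(self):
            dep_versions = {d.name: d.version for d in self.deps}
            if self.compute is not None and dep_versions != self._seen_versions:
                self._value = self.compute(*[d.get() for d in self.deps])
                self._seen_versions = dep_versions
                self.version += 1
            return self._value

    gain = Node("filterscope_gain"); gain.set(2.0)
    raw = Node("raw_signal"); raw.set([1.0, 2.0, 3.0])
    calibrated = Node("calibrated_signal", deps=[gain, raw],
                      compute=lambda g, xs: [g * x for x in xs])
    print(calibrated.get())  # recomputed because its dependencies changed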
NASA Astrophysics Data System (ADS)
Vecsey, Luděk; Plomerová, Jaroslava; Jedlička, Petr; Munzarová, Helena; Babuška, Vladislav; AlpArray Working Group
2017-12-01
This paper focuses on major issues related to the data reliability and network performance of 20 broadband (BB) stations of the Czech (CZ) MOBNET (MOBile NETwork) seismic pool within the AlpArray seismic experiments. Currently used high-resolution seismological applications require high-quality data recorded for a sufficiently long time interval at seismological observatories and during the entire time of operation of the temporary stations. In this paper we present new hardware and software tools we have been developing during the last two decades while analysing data from several international passive experiments. The new tools help to assure the high-quality standard of broadband seismic data and eliminate potential errors before supplying data to seismological centres. Special attention is paid to crucial issues like the detection of sensor misorientation, timing problems, interchange of record components and/or their polarity reversal, sensor mass centring, or anomalous channel amplitudes due to, for example, imperfect gain. Thorough data quality control should represent an integral constituent of seismic data recording, preprocessing, and archiving, especially for data from temporary stations in passive seismic experiments. Large international seismic experiments require enormous efforts from scientists from different countries and institutions to gather hundreds of stations to be deployed in the field during a limited time period. In this paper, we demonstrate the beneficial effects of the procedures we have developed for acquiring a reliable large set of high-quality data from each group participating in field experiments. The presented tools can be applied manually or automatically on data from any seismic network.
Millimeter radiometer system technology
NASA Technical Reports Server (NTRS)
Wilson, W. J.; Swanson, P. N.
1989-01-01
JPL has had a large amount of experience with spaceborne microwave/millimeter wave radiometers for remote sensing. All of the instruments use filled aperture antenna systems from 5 cm diameter for the microwave Sounder Units (MSU), 16 m for the microwave limb sounder (MLS) to 20 m for the large deployable reflector (LDR). The advantages of filled aperture antenna systems are presented. The requirements of the 10 m Geoplat antenna system, 10 m multifeed antenna, and the MLS are briefly discussed.
NASTRAN users' experience of Avco Aerostructures Division
NASA Technical Reports Server (NTRS)
Blackburn, C. L.; Wilhelm, C. A.
1973-01-01
The NASTRAN experiences of a major structural design and fabrication subcontractor that has fewer engineering personnel and computer facilities than are available to large prime contractors are discussed. Efforts to obtain sufficient computer capacity and the development and implementation of auxiliary programs to reduce manpower requirements are described. Applications of the NASTRAN program for training users, checking out auxiliary programs, performing in-house research and development, and structurally analyzing an Avco designed and manufactured missile case are presented.
2012-06-01
Czech Republic, Estonia, Hungary, Latvia, Lithuania, Poland, Romania, Slovakia, and Slovenia. The seven PfP Eastern European countries, as... The Soviet experience has left an indelible mark in Ukrainian "identity, politics, economics and even religion" and this experience looms large in... from Tsarist Russia to the Soviet Union from 1918 to 1921. The Russian and Soviet influences, along with previous Persian and Ottoman cultures
Ontology-Driven Provenance Management in eScience: An Application in Parasite Research
NASA Astrophysics Data System (ADS)
Sahoo, Satya S.; Weatherly, D. Brent; Mutharaju, Raghava; Anantharam, Pramod; Sheth, Amit; Tarleton, Rick L.
Provenance, from the French word "provenir", describes the lineage or history of a data entity. Provenance is critical information in scientific applications to verify experiment process, validate data quality and associate trust values with scientific results. Current industrial scale eScience projects require an end-to-end provenance management infrastructure. This infrastructure needs to be underpinned by formal semantics to enable analysis of large scale provenance information by software applications. Further, effective analysis of provenance information requires well-defined query mechanisms to support complex queries over large datasets. This paper introduces an ontology-driven provenance management infrastructure for biology experiment data, as part of the Semantic Problem Solving Environment (SPSE) for Trypanosoma cruzi (T.cruzi). This provenance infrastructure, called T.cruzi Provenance Management System (PMS), is underpinned by (a) a domain-specific provenance ontology called Parasite Experiment ontology, (b) specialized query operators for provenance analysis, and (c) a provenance query engine. The query engine uses a novel optimization technique based on materialized views called materialized provenance views (MPV) to scale with increasing data size and query complexity. This comprehensive ontology-driven provenance infrastructure not only allows effective tracking and management of ongoing experiments in the Tarleton Research Group at the Center for Tropical and Emerging Global Diseases (CTEGD), but also enables researchers to retrieve the complete provenance information of scientific results for publication in literature.
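A small sketch of the kind of provenance triples and lineage query such an infrastructure supports is shown below, assuming the rdflib library; the namespace, class, and property names are invented placeholders and not terms from the Parasite Experiment ontology.

    # Record which sample and run produced a result, then query the result's lineage.
    from rdflib import Graph, Namespace, Literal

    EX = Namespace("http://example.org/provenance#")  # hypothetical namespace
    g = Graph()
    g.add((EX.result42, EX.derivedFrom, EX.sample7))
    g.add((EX.result42, EX.generatedBy, EX.sequencingRun3))
    g.add((EX.sequencingRun3, EX.usedInstrument, EX.sequencerA))
    g.add((EX.sample7, EX.collectedOn, Literal("2008-11-02")))

    lineage_query = """
        PREFIX ex: <http://example.org/provenance#>
        SELECT ?property ?value WHERE { ex:result42 ?property ?value . }
    """
    for prop, value in g.query(lineage_query):
        print(prop, value)

Materialized provenance views, as described above, would amount to precomputing and storing the results of frequent lineage queries of this kind so they need not be re-evaluated over the full graph.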
Using Amazon's Elastic Compute Cloud to dynamically scale CMS computational resources
NASA Astrophysics Data System (ADS)
Evans, D.; Fisk, I.; Holzman, B.; Melo, A.; Metson, S.; Pordes, R.; Sheldon, P.; Tiradani, A.
2011-12-01
Large international scientific collaborations such as the Compact Muon Solenoid (CMS) experiment at the Large Hadron Collider have traditionally addressed their data reduction and analysis needs by building and maintaining dedicated computational infrastructure. Emerging cloud computing services such as Amazon's Elastic Compute Cloud (EC2) offer short-term CPU and storage resources with costs based on usage. These services allow experiments to purchase computing resources as needed, without significant prior planning and without long-term investments in facilities and their management. We have demonstrated that services such as EC2 can successfully be integrated into the production-computing model of CMS, and find that they work very well as worker nodes. The cost structure and transient nature of EC2 services make them inappropriate for some CMS production services and functions. We also found that the resources are not truly "on-demand", as limits and caps on usage are imposed. Our trial workflows allow us to make a cost comparison between EC2 resources and dedicated CMS resources at a university, and we conclude that it is most cost-effective to purchase dedicated resources for the "base-line" needs of experiments such as CMS. However, if the ability to use cloud computing resources is built into an experiment's software framework before demand requires their use, cloud computing resources make sense for bursting during times when spikes in usage occur.
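The base-line-versus-burst argument can be illustrated with a toy cost model; all prices, capacities, and durations below are made-up placeholders rather than figures from the paper.

    # Dedicated capacity is paid for whether or not it is used; cloud is paid per use.
    hours_per_year = 8760
    dedicated_cost = 0.03   # hypothetical $ per provisioned core-hour (amortized)
    cloud_cost = 0.10       # hypothetical $ per used core-hour (on-demand)

    baseline_cores = 1000   # steady demand, well utilized all year
    peak_cores = 1500       # demand during short spikes
    spike_hours = 500       # hours per year spent at peak demand

    # Option A: buy enough dedicated capacity for the peak (idle most of the year).
    option_a = peak_cores * hours_per_year * dedicated_cost
    # Option B: buy dedicated capacity for the baseline, burst the excess to the cloud.
    option_b = (baseline_cores * hours_per_year * dedicated_cost
                + (peak_cores - baseline_cores) * spike_hours * cloud_cost)
    print(f"peak-sized dedicated: ${option_a:,.0f}   dedicated + cloud burst: ${option_b:,.0f}")

With numbers like these the hybrid option wins because the extra dedicated capacity would sit idle outside the spikes, which is the intuition behind reserving cloud resources for bursting.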
The Goal-Based Scenario Builder: Experiences with Novice Instructional Designers.
ERIC Educational Resources Information Center
Bell, Benjamin; Korcuska, Michael
Creating educational software generally requires a great deal of computer expertise, and as a result, educators lacking such knowledge have largely been excluded from the design process. Recently, researchers have been designing tools for automating some aspects of building instructional applications. These tools typically aim for generality,…
ERIC Educational Resources Information Center
Comstock, George
Television is a large part of growing up in America, and a part that meshes in various ways with other influences. Teachers should understand it, and as the occasion requires, confront, correct, or take advantage of it. Research on television viewing yields five lessons. Television experience is an individual one, although there are definite…
Service Blueprinting: Transforming the Student Experience
ERIC Educational Resources Information Center
Bitner, Mary Jo; Ostrom, Amy L.; Burkhard, Kevin A.
2012-01-01
There is much discussion today about the need to transform higher education for the benefit of students, employers, and society at large. Experts and researchers list the numerous challenges: low student retention and graduation rates, the increasing cost of higher education, and concerns that graduates don't possess the skills required to compete…
Stuck at the bottom rung: occupational characteristics of workers with disabilities.
Kaye, H Stephen
2009-06-01
The proportion of workers reporting disabilities varies tremendously across occupations. Although differences in the occupational distributions may partly explain the large disparities in earnings and job security between workers with and without disabilities, little is known about the reasons that workers with disabilities are underrepresented in certain occupations and overrepresented in others. Using a large, national survey of the US population combined with official data on the skill and experience requirements and occupational risks of 269 occupations, a multilevel regression analysis was performed to identify occupational and individual factors that influence the representation of workers with disabilities across occupations. Models of overall, sensory, mobility, and cognitive disability were constructed for working-age labor force participants, as were models of overall disability for younger, in-between, and older workers. At the occupational level, reported disability is negatively associated with occupational requirements for information and communication skills and with the amount of prior work experience that is required, after controlling for individual factors such as age and educational attainment. Little relationship is found between disability status and a set of occupational risk factors. These findings generally hold true across disability types and age groups. Even after taking into account their lower average educational attainment, workers with disabilities appear to be disproportionately relegated to entry-level occupations that do not emphasize the better-remunerated job skills. Underemployment results in lower wages and less job security and stability. Possible reasons include employer discrimination, low expectations, deficits in relevant skills or experience, and work disincentives.
1990-05-01
of static and dynamic resource allocation. * Develop a wide-spectrum requirements engineering language that meets the objectives defined in this... within the next few years. The TrCP Panel will closely monitor future developments in this area, and will fully consider this suggestion. Chairman... experience has shown that, especially for large and complex system developments, it is rare that the true needs of all stakeholders are fully stated
Parallel-Processing Test Bed For Simulation Software
NASA Technical Reports Server (NTRS)
Blech, Richard; Cole, Gary; Townsend, Scott
1996-01-01
Second-generation Hypercluster computing system is multiprocessor test bed for research on parallel algorithms for simulation in fluid dynamics, electromagnetics, chemistry, and other fields with large computational requirements but relatively low input/output requirements. Built from standard, off-the-shelf hardware readily upgraded as improved technology becomes available. System used for experiments with such parallel-processing concepts as message-passing algorithms, debugging software tools, and computational steering. First-generation Hypercluster system described in "Hypercluster Parallel Processor" (LEW-15283).
Experimental demonstration of the control of flexible structures
NASA Technical Reports Server (NTRS)
Schaechter, D. B.; Eldred, D. B.
1984-01-01
The Large Space Structure Technology Flexible Beam Experiment employs a pinned-free flexible beam to demonstrate such required methods as dynamic and adaptive control, as well as various control law design approaches and hardware requirements. An attempt is made to define the mechanization difficulties that may inhere in flexible structures. Attention is presently given to analytical work performed in support of the test facility's development, the final design's specifications, the control laws' synthesis, and experimental results obtained.
Quiroga, Maria del Mar; Price, Nicholas SC
2016-01-01
Lecture content and practical laboratory classes are ideally complementary. However, the types of experiments that have led to our detailed understanding of sensory neuroscience are often not amenable to classroom experimentation as they require expensive equipment, time-consuming surgeries, specialized experimental techniques, and the use of animals. While sometimes feasible in small group teaching, these experiments are not suitable for large cohorts of students. Previous attempts to expose students to sensory neuroscience experiments include: the use of electrophysiology preparations in invertebrates, data-driven simulations that do not replicate the experience of conducting an experiment, or simply observing an experiment in a research laboratory. We developed an online simulation of a visual neuroscience experiment in which extracellular recordings are made from a motion sensitive neuron. Students have control over stimulation parameters (direction and contrast) and can see and hear the action potential responses to stimuli as they are presented. The simulation provides an intuitive way for students to gain insight into neurophysiology, including experimental design, data collection and data analysis. Our simulation allows large cohorts of students to cost-effectively “experience” the results of animal research without ethical concerns, to be exposed to realistic data variability, and to develop their understanding of how sensory neuroscience experiments are conducted. PMID:27980465
Crater size estimates for large-body terrestrial impact
NASA Technical Reports Server (NTRS)
Schmidt, Robert M.; Housen, Kevin R.
1988-01-01
Calculating the effects of impacts leading to global catastrophes requires knowledge of the impact process at very large size scales. This information cannot be obtained directly but must be inferred from subscale physical simulations, numerical simulations, and scaling laws. Schmidt and Holsapple presented scaling laws based upon laboratory-scale impact experiments performed on a centrifuge (Schmidt, 1980 and Schmidt and Holsapple, 1980). These experiments were used to develop scaling laws which were among the first to include the gravity dependence associated with increasing event size. At that time, using the results of experiments in dry sand and in water to provide bounds on crater size, they recognized that more precise bounds on large-body impact crater formation could be obtained with additional centrifuge experiments conducted in other geological media. In that previous work, simple power-law formulae were developed to relate final crater diameter to impactor size and velocity. In addition, Schmidt (1980) and Holsapple and Schmidt (1982) recognized that the energy scaling exponent is not a universal constant but depends upon the target media. More recent work (Holsapple and Schmidt, 1987) includes results for non-porous materials and provides a basis for estimating crater formation kinematics and final crater size. A revised set of scaling relationships for all crater parameters of interest is presented. These include results for various target media and include the kinematics of formation. Particular attention is given to possible limits brought about by very large impactors.
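For orientation, gravity-regime point-source scaling of the kind developed in this line of work is usually expressed in non-dimensional (pi-group) form; the expression below is a generic illustration of that form, with the coefficient and exponent left symbolic because they are fit to experiments and depend on the target material:

    \pi_V \;=\; \frac{\rho V}{m} \;=\; K \, \pi_2^{-\beta},
    \qquad
    \pi_2 \;=\; \frac{g a}{U^2},

where V is the crater volume, m, a, and U are the impactor mass, radius, and velocity, ρ is the target density, g is gravity, and K and β are material-dependent constants.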
Lien, Mei-Ching; Ruthruff, Eric
2004-05-01
This study examined how task switching is affected by hierarchical task organization. Traditional task-switching studies, which use a constant temporal and spatial distance between each task element (defined as a stimulus requiring a response), promote a flat task structure. Using this approach, Experiment 1 revealed a large switch cost of 238 ms. In Experiments 2-5, adjacent task elements were grouped temporally and/or spatially (forming an ensemble) to create a hierarchical task organization. Results indicate that the effect of switching at the ensemble level dominated the effect of switching at the element level. Experiments 6 and 7, using an ensemble of 3 task elements, revealed that the element-level switch cost was virtually absent between ensembles but was large within an ensemble. The authors conclude that the element-level task repetition benefit is fragile and can be eliminated in a hierarchical task organization.
NASA Technical Reports Server (NTRS)
Malcipa, Carlos; Decker, William A.; Theodore, Colin R.; Blanken, Christopher L.; Berger, Tom
2010-01-01
A piloted simulation investigation was conducted using the NASA Ames Vertical Motion Simulator to study the impact of pitch, roll and yaw attitude bandwidth and phase delay on handling qualities of large tilt-rotor aircraft. Multiple bandwidth and phase delay pairs were investigated for each axis. The simulation also investigated the effect that the pilot offset from the center of gravity has on handling qualities. While pilot offset does not change the dynamics of the vehicle, it does affect the proprioceptive and visual cues and it can have an impact on handling qualities. The experiment concentrated on two primary evaluation tasks: a precision hover task and a simple hover pedal turn. Six pilots flew over 1400 data runs with evaluation comments and objective performance data recorded. The paper will describe the experiment design and methodology, discuss the results of the experiment and summarize the findings.
Measurement of Critical Contact Angle in a Microgravity Space Experiment
NASA Technical Reports Server (NTRS)
Concus, P.; Finn, R.; Weislogel, M.
1998-01-01
Mathematical theory predicts that small changes in container shape or in contact angle can give rise to large shifts of liquid in a microgravity environment. This phenomenon was investigated in the Interface Configuration Experiment on board the USML-2 Space Shuttle flight. The experiment's "double proboscis" containers were designed to strike a balance between conflicting requirements of sizable volume of liquid shift (for ease of observation) and abruptness of the shift (for accurate determination of critical contact angle). The experimental results support the classical concept of macroscopic contact angle and demonstrate the role of hysteresis in impeding orientation toward equilibrium.
Microwave Remote Sensing and the Cold Land Processes Field Experiment
NASA Technical Reports Server (NTRS)
Kim, Edward J.; Cline, Don; Davis, Bert; Hildebrand, Peter H. (Technical Monitor)
2001-01-01
The Cold Land Processes Field Experiment (CLPX) has been designed to advance our understanding of the terrestrial cryosphere. Developing a more complete understanding of fluxes, storage, and transformations of water and energy in cold land areas is a critical focus of the NASA Earth Science Enterprise Research Strategy, the NASA Global Water and Energy Cycle (GWEC) Initiative, the Global Energy and Water Cycle Experiment (GEWEX), and the GEWEX Americas Prediction Project (GAPP). The movement of water and energy through cold regions in turn plays a large role in ecological activity and biogeochemical cycles. Quantitative understanding of cold land processes over large areas will require synergistic advancements in 1) understanding how cold land processes, most comprehensively understood at local or hillslope scales, extend to larger scales, 2) improved representation of cold land processes in coupled and uncoupled land-surface models, and 3) a breakthrough in large-scale observation of hydrologic properties, including snow characteristics, soil moisture, the extent of frozen soils, and the transition between frozen and thawed soil conditions. The CLPX Plan has been developed through the efforts of over 60 interested scientists that have participated in the NASA Cold Land Processes Working Group (CLPWG). This group is charged with the task of assessing, planning and implementing the required background science, technology, and application infrastructure to support successful land surface hydrology remote sensing space missions. A major product of the experiment will be a comprehensive, legacy data set that will energize many aspects of cold land processes research. The CLPX will focus on developing the quantitative understanding, models, and measurements necessary to extend our local-scale understanding of water fluxes, storage, and transformations to regional and global scales. The experiment will particularly emphasize developing a strong synergism between process-oriented understanding, land surface models and microwave remote sensing. The experimental design is a multi-sensor, multi-scale (1 ha to 160,000 km^2) approach to providing the comprehensive data set necessary to address several experiment objectives. A description focusing on the microwave remote sensing components (ground, airborne, and spaceborne) of the experiment will be presented.
NASA Technical Reports Server (NTRS)
Pohner, John A.; Dempsey, Brian P.; Herold, Leroy M.
1990-01-01
Space Station elements and advanced military spacecraft will require rejection of tens of kilowatts of waste heat. Large space radiators and two-phase heat transport loops will be required. To minimize radiator size and weight, it is critical to minimize the temperature drop between the heat source and sink. Under an Air Force contract, a unique, high-performance heat exchanger is developed for coupling the radiator to the transport loop. Since fluid flow through the heat exchanger is driven by capillary forces which are easily dominated by gravity forces in ground testing, it is necessary to perform microgravity thermal testing to verify the design. This contract consists of an experiment definition phase leading to a preliminary design and cost estimate for a shuttle-based flight experiment of this heat exchanger design. This program will utilize modified hardware from a ground test program for the heat exchanger.
Evaluation of the cognitive effects of travel technique in complex real and virtual environments.
Suma, Evan A; Finkelstein, Samantha L; Reid, Myra; V Babu, Sabarish; Ulinski, Amy C; Hodges, Larry F
2010-01-01
We report a series of experiments conducted to investigate the effects of travel technique on information gathering and cognition in complex virtual environments. In the first experiment, participants completed a non-branching multilevel 3D maze at their own pace using either real walking or one of two virtual travel techniques. In the second experiment, we constructed a real-world maze with branching pathways and modeled an identical virtual environment. Participants explored either the real or virtual maze for a predetermined amount of time using real walking or a virtual travel technique. Our results across experiments suggest that for complex environments requiring a large number of turns, virtual travel is an acceptable substitute for real walking if the goal of the application involves learning or reasoning based on information presented in the virtual world. However, for applications that require fast, efficient navigation or travel that closely resembles real-world behavior, real walking has advantages over common joystick-based virtual travel techniques.
Fiacconi, Chris M; Milliken, Bruce
2011-12-01
In a series of four experiments, we examine the hypothesis that selective attention is crucial for the generation of conscious knowledge of contingency information. We investigated this question using a spatial priming task in which participants were required to localize a target letter in a probe display. In Experiment 1, participants kept track of the frequency with which the predictive letter in the prime appeared in various locations. This manipulation had a negligible impact on contingency awareness. Subsequent experiments requiring participants to attend to features (color, location) of the predictive letter increased contingency awareness somewhat, but there remained a large proportion of individuals who remained unaware of the strong contingency. Together the results of our experiments suggest that the construct of attention does not fully capture the processes that lead to contingency awareness, and suggest a critical role for bottom-up feature integration in explicit contingency learning. Copyright © 2011 Elsevier Inc. All rights reserved.
Radiopurity assessment of the energy readout for the NEXT double beta decay experiment
NASA Astrophysics Data System (ADS)
Cebrián, S.; Pérez, J.; Bandac, I.; Labarga, L.; Álvarez, V.; Azevedo, C. D. R.; Benlloch-Rodríguez, J. M.; Borges, F. I. G. M.; Botas, A.; Cárcel, S.; Carrión, J. V.; Conde, C. A. N.; Díaz, J.; Diesburg, M.; Escada, J.; Esteve, R.; Felkai, R.; Fernandes, L. M. P.; Ferrario, P.; Ferreira, A. L.; Freitas, E. D. C.; Goldschmidt, A.; Gómez-Cadenas, J. J.; González-Díaz, D.; Gutiérrez, R. M.; Hauptman, J.; Henriques, C. A. O.; Hernandez, A. I.; Hernando Morata, J. A.; Herrero, V.; Jones, B. J. P.; Laing, A.; Lebrun, P.; Liubarsky, I.; López-March, N.; Losada, M.; Martín-Albo, J.; Martínez-Lema, G.; Martínez, A.; McDonald, A. D.; Monrabal, F.; Monteiro, C. M. B.; Mora, F. J.; Moutinho, L. M.; Muñoz Vidal, J.; Musti, M.; Nebot-Guinot, M.; Novella, P.; Nygren, D. R.; Palmeiro, B.; Para, A.; Querol, M.; Renner, J.; Ripoll, L.; Rodríguez, J.; Rogers, L.; Santos, F. P.; dos Santos, J. M. F.; Simón, A.; Sofka, C.; Sorel, M.; Stiegler, T.; Toledo, J. F.; Torrent, J.; Tsamalaidze, Z.; Veloso, J. F. C. A.; Villar, J. A.; Webb, R.; White, J. T.; Yahlali, N.
2017-08-01
The "Neutrino Experiment with a Xenon Time-Projection Chamber" (NEXT) experiment intends to investigate the neutrinoless double beta decay of 136Xe, and therefore requires a severe suppression of potential backgrounds. An extensive material screening and selection process was undertaken to quantify the radioactivity of the materials used in the experiment. Separate energy and tracking readout planes using different sensors allow us to combine the measurement of the topological signature of the event for background discrimination with the energy resolution optimization. The design of radiopure readout planes, in direct contact with the gas detector medium, was especially challenging since the required components typically have activities too large for experiments demanding ultra-low background conditions. After studying the tracking plane, here the radiopurity control of the energy plane is presented, mainly based on gamma-ray spectroscopy using ultra-low background germanium detectors at the Laboratorio Subterr&aposaneo de Canfranc (Spain). All the available units of the selected model of photomultiplier have been screened together with most of the components for the bases, enclosures and windows. According to these results for the activity of the relevant radioisotopes, the selected components of the energy plane would give a contribution to the overall background level in the region of interest of at most 2.4×10-4 counts keV-1 kg-1 y-1, satisfying the sensitivity requirements of the NEXT experiment.
Improving Design Efficiency for Large-Scale Heterogeneous Circuits
NASA Astrophysics Data System (ADS)
Gregerson, Anthony
Despite increases in logic density, many Big Data applications must still be partitioned across multiple computing devices in order to meet their strict performance requirements. Among the most demanding of these applications is high-energy physics (HEP), which uses complex computing systems consisting of thousands of FPGAs and ASICs to process the sensor data created by experiments at particle accelerators such as the Large Hadron Collider (LHC). Designing such computing systems is challenging due to the scale of the systems, the exceptionally high-throughput and low-latency performance constraints that necessitate application-specific hardware implementations, the requirement that algorithms are efficiently partitioned across many devices, and the possible need to update the implemented algorithms during the lifetime of the system. In this work, we describe our research to develop flexible architectures for implementing such large-scale circuits on FPGAs. In particular, this work is motivated by (but not limited in scope to) high-energy physics algorithms for the Compact Muon Solenoid (CMS) experiment at the LHC. To make efficient use of logic resources in multi-FPGA systems, we introduce Multi-Personality Partitioning, a novel form of the graph partitioning problem, and present partitioning algorithms that can significantly improve resource utilization on heterogeneous devices while also reducing inter-chip connections. To reduce the high communication costs of Big Data applications, we also introduce Information-Aware Partitioning, a partitioning method that analyzes the data content of application-specific circuits, characterizes their entropy, and selects circuit partitions that enable efficient compression of data between chips. We employ our information-aware partitioning method to improve the performance of the hardware validation platform for evaluating new algorithms for the CMS experiment. Together, these research efforts help to improve the efficiency and decrease the cost of developing the large-scale, heterogeneous circuits needed to enable large-scale applications in high-energy physics and other important areas.
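A simplified sketch of the information-aware idea is shown below: estimate the entropy of the data that would cross each candidate cut and prefer cuts whose traffic compresses well. The candidate cuts, traffic samples, and scoring are invented placeholders, not the partitioning algorithm described above.

    # Score candidate partitions by the estimated entropy of the data crossing each cut.
    import math
    from collections import Counter

    def entropy_bits_per_symbol(samples):
        counts = Counter(samples)
        total = sum(counts.values())
        return -sum((c / total) * math.log2(c / total) for c in counts.values())

    # Hypothetical traffic observed on the signals that would cross each candidate cut.
    candidate_cuts = {
        "cut_A": [0, 0, 0, 1, 0, 0, 2, 0, 0, 0],   # low entropy: compresses well
        "cut_B": [7, 1, 4, 9, 2, 8, 3, 6, 0, 5],   # high entropy: expensive to move
    }

    for name, traffic in candidate_cuts.items():
        h = entropy_bits_per_symbol(traffic)
        print(f"{name}: ~{h:.2f} bits/symbol after ideal entropy coding")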
Development of a flash, bang, and smoke simulation of a shell burst
NASA Technical Reports Server (NTRS)
Williamson, F. R.; Kinney, J. F.; Wallace, T. V.
1982-01-01
A large number of experiments (cue test firings) were performed in the definition of the cue concepts and packaging configurations. A total of 344 of these experiments were recorded with instrumentation photography to allow a quantitative analysis of the smoke cloud to be made as a function of time. These analyses were predominantly made using a short test site. Supplementary long range visibility tests were conducted to insure the required 3 kilometer visibility of the smoke signature.
The NASA/Baltimore Applications Project: An experiment in technology transfer
NASA Technical Reports Server (NTRS)
Golden, T. S.
1981-01-01
Conclusions drawn from the experiment thus far are presented. The problems of a large city most often do not require highly sophisticated solutions; the simpler the solution, the better. A problem focused approach is a greater help to the city than a product focused approach. Most problem situations involve several individuals or organized groups within the city. Mutual trust and good interpersonal relationships between the technologist and the administrator is as important for solving problems as technological know-how.
Trace: a high-throughput tomographic reconstruction engine for large-scale datasets
Bicer, Tekin; Gursoy, Doga; Andrade, Vincent De; ...
2017-01-28
Here, synchrotron light source and detector technologies enable scientists to perform advanced experiments. These scientific instruments and experiments produce data at such scale and complexity that large-scale computation is required to unleash their full power. One of the most widely used data acquisition techniques at light sources is computed tomography, which can generate tens of GB/s depending on the x-ray range. A large-scale tomographic dataset, such as a mouse brain, may require hours of computation time with a medium-sized workstation. In this paper, we present Trace, a data-intensive computing middleware we developed for the implementation and parallelization of iterative tomographic reconstruction algorithms. Trace provides fine-grained reconstruction of tomography datasets using both (thread-level) shared memory and (process-level) distributed memory parallelization. Trace utilizes a special data structure called the replicated reconstruction object to maximize application performance. We also present the optimizations we have done on the replicated reconstruction objects and evaluate them using a shale and a mouse brain sinogram. Our experimental evaluations show that the applied optimizations and parallelization techniques can provide a 158x speedup (using 32 compute nodes) over a single-core configuration, which decreases the reconstruction time of a sinogram (with 4501 projections and a detector resolution of 22400) from 12.5 hours to less than 5 minutes per iteration.
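A minimal sketch of the replicated-reconstruction-object pattern described above, assuming a toy unfiltered backprojection rather than Trace's actual kernels: each worker accumulates its share of projections into a private copy of the grid, and the copies are reduced once per pass, so no fine-grained locking is needed. Function names and array sizes are invented for illustration.

import numpy as np

def backproject_chunk(sinogram_rows, angles, grid_shape):
    # One worker's private replica of the reconstruction grid (no locks needed).
    replica = np.zeros(grid_shape)
    ny, nx = grid_shape
    ys, xs = np.mgrid[0:ny, 0:nx]
    ys, xs = ys - ny / 2.0, xs - nx / 2.0
    for row, theta in zip(sinogram_rows, angles):
        # Project every pixel onto the detector for this angle and smear the
        # detector values back across the grid.
        t = xs * np.cos(theta) + ys * np.sin(theta)
        idx = np.clip((t + len(row) / 2.0).astype(int), 0, len(row) - 1)
        replica += row[idx]
    return replica

def reconstruct(sinogram, angles, grid_shape, n_workers=4):
    chunks = np.array_split(np.arange(len(angles)), n_workers)
    replicas = [backproject_chunk(sinogram[c], angles[c], grid_shape)
                for c in chunks]           # each call stands in for one worker
    return sum(replicas) / len(angles)     # reduction of the per-worker replicas

sino = np.random.rand(180, 64)             # 180 toy projections, 64 detector bins
image = reconstruct(sino, np.deg2rad(np.arange(180)), (64, 64))
print(image.shape)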
Investigation of Recombination Processes In A Magnetized Plasma
NASA Technical Reports Server (NTRS)
Chavers, Greg; Chang-Diaz, Franklin; Rodgers, Stephen L. (Technical Monitor)
2002-01-01
Interplanetary travel requires propulsion systems that can provide high specific impulse (Isp), while also having sufficient thrust to rapidly accelerate large payloads. One such propulsion system is the Variable Specific Impulse Magneto-plasma Rocket (VASIMR), which creates, heats, and exhausts plasma to provide variable thrust and Isp, optimally meeting the mission requirements. A large fraction of the energy to create the plasma is frozen in the exhaust in the form of ionization energy. This loss mechanism is common to all electromagnetic plasma thrusters and has an impact on their efficiency. When the device operates at high Isp, where the exhaust kinetic energy is high compared to the ionization energy, the frozen flow component is of little consequence; however, at low Isp, the effect of the frozen flow may be important. If some of this energy could be recovered through recombination processes, and re-injected as neutral kinetic energy, the efficiency of VASIMR, in its low Isp/high thrust mode may be improved. In this operating regime, the ionization energy is a large portion of the total plasma energy. An experiment is being conducted to investigate the possibility of recovering some of the energy used to create the plasma. This presentation will cover the progress and status of the experiment involving surface recombination of the plasma.
Software package for modeling spin-orbit motion in storage rings
NASA Astrophysics Data System (ADS)
Zyuzin, D. V.
2015-12-01
A software package providing a graphical user interface for computer experiments on the motion of charged particle beams in accelerators, as well as analysis of obtained data, is presented. The software package was tested in the framework of the international project on electric dipole moment measurement JEDI (Jülich Electric Dipole moment Investigations). The specific features of particle spin motion imply the requirement to use a cyclic accelerator (storage ring) consisting of electrostatic elements, which makes it possible to preserve horizontal polarization for a long time. Computer experiments study the dynamics of 10^6-10^9 particles in a beam during 10^9 turns in an accelerator (about 10^12-10^15 integration steps for the equations of motion). For designing an optimal accelerator structure, a large number of computer experiments on polarized beam dynamics are required. The numerical core of the package is COSY Infinity, a program for modeling spin-orbit dynamics.
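The arithmetic behind the quoted problem size, plus a toy tracking loop, may help make the scale concrete; the one-turn map below is a plain linear rotation, not COSY Infinity's spin-orbit transfer maps, and all numbers are illustrative.

import numpy as np

# For example, 10^6 particles tracked over 10^9 turns already implies 10^15
# particle-turn updates, the upper end of the range quoted above.
n_particles, n_turns = 10**6, 10**9
print(n_particles * n_turns)

def track(x, xp, mu, turns):
    # Toy one-turn map: a linear rotation by the phase advance mu per turn.
    c, s = np.cos(mu), np.sin(mu)
    for _ in range(turns):
        x, xp = c * x + s * xp, -s * x + c * xp
    return x, xp

x, xp = track(np.random.randn(1000), np.random.randn(1000), mu=0.31, turns=10_000)
print(round(x.std(), 3), round(xp.std(), 3))   # the rotation preserves the beam size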
NASA Technical Reports Server (NTRS)
Harris, Charles D.; Harvey, William D.; Brooks, Cuyler W., Jr.
1988-01-01
A large-chord, swept, supercritical, laminar-flow-control (LFC) airfoil was designed and constructed and is currently undergoing tests in the Langley 8 ft Transonic Pressure Tunnel. The experiment was directed toward evaluating the compatibility of LFC and supercritical airfoils, validating prediction techniques, and generating a data base for future transport airfoil design as part of NASA's ongoing research program to significantly reduce drag and increase aircraft efficiency. Unique features of the airfoil included a high design Mach number with shock free flow and boundary layer control by suction. Special requirements for the experiment included modifications to the wind tunnel to achieve the necessary flow quality and contouring of the test section walls to simulate free air flow about a swept model at transonic speeds. Design of the airfoil with a slotted suction surface, the suction system, and modifications to the tunnel to meet test requirements are discussed.
Building for the future: essential infrastructure for rodent ageing studies.
Wells, Sara E; Bellantuono, Ilaria
2016-08-01
When planning ageing research using rodent models, the logistics of supply, long-term housing and infrastructure provision are important factors to take into consideration. These issues need to be prioritised to ensure they meet the requirements of experiments which potentially will not be completed for several years. Although these issues are not unique to this discipline, the longevity of the experiments, and indeed of the animals, requires a high level of consistency and sustainability to be maintained throughout lengthy periods of time. Moreover, the need to access aged stock or material for more immediate experiments poses many issues for the completion of pilot studies and/or short-term intervention studies on older models. In this article, we highlight the increasing demand for ageing research, the resources and infrastructure involved, and the need for large-scale collaborative programmes to advance studies in both a timely and a cost-effective way.
Characterization of the VEGA ASIC coupled to large area position-sensitive Silicon Drift Detectors
NASA Astrophysics Data System (ADS)
Campana, R.; Evangelista, Y.; Fuschino, F.; Ahangarianabhari, M.; Macera, D.; Bertuccio, G.; Grassi, M.; Labanti, C.; Marisaldi, M.; Malcovati, P.; Rachevski, A.; Zampa, G.; Zampa, N.; Andreani, L.; Baldazzi, G.; Del Monte, E.; Favre, Y.; Feroci, M.; Muleri, F.; Rashevskaya, I.; Vacchi, A.; Ficorella, F.; Giacomini, G.; Picciotto, A.; Zuffa, M.
2014-08-01
Low-noise, position-sensitive Silicon Drift Detectors (SDDs) are particularly useful for experiments in which a good energy resolution combined with a large sensitive area is required, as in the case of X-ray astronomy space missions and medical applications. This paper presents the experimental characterization of VEGA, a custom Application Specific Integrated Circuit (ASIC) used as the front-end electronics for XDXL-2, a large-area (30.5 cm^2) SDD prototype. The ASICs were integrated on a specifically developed PCB hosting also the detector. Results on the ASIC noise performance, both stand-alone and bonded to the large-area SDD, are presented and discussed.
Software Engineering for Scientific Computer Simulations
NASA Astrophysics Data System (ADS)
Post, Douglass E.; Henderson, Dale B.; Kendall, Richard P.; Whitney, Earl M.
2004-11-01
Computer simulation is becoming a very powerful tool for analyzing and predicting the performance of fusion experiments. Simulation efforts are evolving from including only a few effects to many effects, from small teams with a few people to large teams, and from workstations and small processor count parallel computers to massively parallel platforms. Successfully making this transition requires attention to software engineering issues. We report on the conclusions drawn from a number of case studies of large scale scientific computing projects within DOE, academia and the DoD. The major lessons learned include attention to sound project management including setting reasonable and achievable requirements, building a good code team, enforcing customer focus, carrying out verification and validation and selecting the optimum computational mathematics approaches.
NASA Astrophysics Data System (ADS)
McKinstry, Chris
The present article describes a possible method for the automatic discovery of a universal human semantic-affective hyperspatial approximation of the human subcognitive substrate - the associative network which French (1990) asserts is the ultimate foundation of the human ability to pass the Turing Test - that does not require a machine to have direct human experience or a physical human body. This method involves automatic programming - such as Koza's genetic programming (1992) - guided in the discovery of the proposed universal hypergeometry by feedback from a Minimum Intelligent Signal Test or MIST (McKinstry, 1997) constructed from a very large number of human validated probabilistic propositions collected from a large population of Internet users. It will be argued that though a lifetime of human experience is required to pass a rigorous Turing Test, a probabilistic propositional approximation of this experience can be constructed via public participation on the Internet, and then used as a fitness function to direct the artificial evolution of a universal hypergeometry capable of classifying arbitrary propositions. A model of this hypergeometry will be presented; it predicts Miller's "Magical Number Seven" (1956) as the size of human short-term memory from fundamental hypergeometric properties. A system that can lead to the generation of novel propositions or "artificial thoughts" will also be described.
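As a purely hypothetical illustration of the fitness-function idea described above (scoring a candidate program by its agreement with human yes/no judgments on MIST-style propositions), the sketch below uses a handful of invented propositions and a trivially hand-written candidate; it is not McKinstry's corpus and not an actual genetic-programming run.

propositions = [
    ("water is wet", True),
    ("rocks are edible", False),
    ("fire is cold", False),
]

def fitness(candidate, items):
    # Fraction of propositions the candidate answers as humans did.
    return sum(candidate(text) == truth for text, truth in items) / len(items)

def trivial_candidate(text):
    # A deliberately naive stand-in for an evolved classifier.
    return not any(word in text for word in ("edible", "cold"))

print(fitness(trivial_candidate, propositions))   # 1.0 on this toy corpus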
Evaluation of ultra-low background materials for uranium and thorium using ICP-MS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hoppe, E. W.; Overman, N. R.; LaFerriere, B. D.
2013-08-08
An increasing number of physics experiments require low background materials for their construction. The presence of Uranium and Thorium and their progeny in these materials presents a variety of unwanted background sources for these experiments. The sensitivity of the experiments continues to drive the necessary levels of detection ever lower as well. This requirement for greater sensitivity has rendered direct radioassay impractical in many cases, requiring large quantities of material, frequently many kilograms, and prolonged counting times, often months. Other assay techniques have been employed, such as Neutron Activation Analysis, but this requires access to expensive facilities and instrumentation and can be further complicated and delayed by the formation of unwanted radionuclides. Inductively Coupled Plasma Mass Spectrometry (ICP-MS) is a useful tool, and recent advancements have increased its sensitivity, particularly in the elemental high mass range of U and Th. Unlike direct radioassay, ICP-MS is a destructive technique, since it requires the sample to be in liquid form, which is aspirated into a high-temperature plasma. But it benefits in that it usually requires a very small sample, typically about a gram. This paper discusses how a variety of low background materials such as copper, polymers, and fused silica are made amenable to ICP-MS assay and how the arduous task of maintaining low backgrounds of U and Th is achieved.
NASA Astrophysics Data System (ADS)
Fiore, S.; Płóciennik, M.; Doutriaux, C.; Blanquer, I.; Barbera, R.; Williams, D. N.; Anantharaj, V. G.; Evans, B. J. K.; Salomoni, D.; Aloisio, G.
2017-12-01
The increasing resolution of comprehensive Earth System Models is rapidly leading to very large climate simulation output that poses significant scientific data management challenges in terms of data sharing, processing, analysis, visualization, preservation, curation, and archiving. Large-scale global experiments for Climate Model Intercomparison Projects (CMIP) have led to the development of the Earth System Grid Federation (ESGF), a federated data infrastructure which has been serving the CMIP5 experiment, providing access to 2 PB of data for the IPCC Assessment Reports. In such a context, running a multi-model data analysis experiment is very challenging, as it requires the availability of a large amount of data related to multiple climate model simulations and scientific data management tools for large-scale data analytics. To address these challenges, a case study on climate model intercomparison data analysis has been defined and implemented in the context of the EU H2020 INDIGO-DataCloud project. The case study has been tested and validated on CMIP5 datasets, in the context of a large-scale, international testbed involving several ESGF sites (LLNL, ORNL and CMCC), one orchestrator site (PSNC) and one more hosting INDIGO PaaS services (UPV). Additional ESGF sites, such as NCI (Australia) and a couple more in Europe, are also joining the testbed. The added value of the proposed solution is summarized in the following: it implements a server-side paradigm which limits data movement; it relies on a High-Performance Data Analytics (HPDA) stack to address performance; it exploits the INDIGO PaaS layer to support flexible, dynamic and automated deployment of software components; it provides user-friendly web access based on the INDIGO Future Gateway; and finally it integrates, complements and extends the support currently available through ESGF. Overall it provides a new "tool" for climate scientists to run multi-model experiments. At the time this contribution is being written, the proposed testbed represents the first implementation of a distributed large-scale, multi-model experiment in the ESGF/CMIP context, joining together server-side approaches for scientific data analysis, HPDA frameworks, end-to-end workflow management, and cloud computing.
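A sketch of what the server-side paradigm looks like from a client's perspective, under the assumption of a hypothetical REST endpoint and payload; the host name, fields, and operation names below are invented for illustration and are not the actual ESGF or INDIGO-DataCloud API.

import json
import urllib.request

analysis_request = {
    # The reduction runs next to the data; only the small result travels back.
    "datasets": ["CMIP5.model-A.tas.mon", "CMIP5.model-B.tas.mon"],
    "operation": "time_mean",
    "region": {"lat": [30, 60], "lon": [-10, 40]},
}
req = urllib.request.Request(
    "https://analytics.example.org/submit",      # hypothetical orchestrator URL
    data=json.dumps(analysis_request).encode(),
    headers={"Content-Type": "application/json"},
)
# result = urllib.request.urlopen(req).read()    # would return only the reduced field
print(json.dumps(analysis_request, indent=2))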
GEM detector performance and efficiency in Proton Charge Radius (PRad) Experiment
NASA Astrophysics Data System (ADS)
Bai, Xinzhan; PRad Collaboration
2017-09-01
The PRad experiment (E12-11-106) was performed in 2016 at Jefferson Lab in Hall B. It aims to investigate the proton charge radius puzzle through the electron-proton elastic scattering process. The experiment used a non-magnetic spectrometer method and reached a very small ep scattering angle, and thus an unprecedentedly small four-momentum transfer squared region, Q^2 from 2 × 10^-4 to 0.06 (GeV/c)^2. The PRad experiment was designed to measure the proton charge radius with sub-percent precision. Gas Electron Multiplier (GEM) detectors have contributed to reaching this experimental goal. A pair of large-area GEM detectors and a large-acceptance, high-resolution calorimeter (HyCal) were utilized in the experiment to detect the scattered electrons. The precision requirements of the experiment demand a highly accurate understanding of the efficiency and stability of the GEM detectors. In this talk, we will present the preliminary results on the performance and efficiency of the GEM detectors. This work is supported in part by NSF MRI award PHY-1229153, the U.S. Department of Energy under Contract No. DE-FG02-07ER41528, No. DE-FG02-03ER41240 and Thomas Jefferson National Laboratory.
Optimization of cDNA-AFLP experiments using genomic sequence data.
Kivioja, Teemu; Arvas, Mikko; Saloheimo, Markku; Penttilä, Merja; Ukkonen, Esko
2005-06-01
cDNA amplified fragment length polymorphism (cDNA-AFLP) is one of the few genome-wide level expression profiling methods capable of finding genes that have not yet been cloned or even predicted from sequence but have interesting expression patterns under the studied conditions. In cDNA-AFLP, a complex cDNA mixture is divided into small subsets using restriction enzymes and selective PCR. A large cDNA-AFLP experiment can require a substantial amount of resources, such as hundreds of PCR amplifications and gel electrophoresis runs, followed by manual cutting of a large number of bands from the gels. Our aim was to test whether this workload can be reduced by rational design of the experiment. We used the available genomic sequence information to optimize cDNA-AFLP experiments beforehand so that as many transcripts as possible could be profiled with a given amount of resources. Optimization of the selection of both restriction enzymes and selective primers for cDNA-AFLP experiments has not been performed previously. The in silico tests performed suggest that substantial amounts of resources can be saved by the optimization of cDNA-AFLP experiments.
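To make the resource-optimization idea concrete, here is a small greedy sketch, assuming invented enzyme/primer combinations and transcript sets: it repeatedly picks the combination that profiles the most not-yet-covered transcripts until the budget of PCR/gel runs is spent. It illustrates the flavor of the optimization, not the paper's actual method.

def choose_combinations(coverage, budget):
    """coverage: {combination: set of transcript ids it profiles}."""
    chosen, covered = [], set()
    for _ in range(budget):
        best = max(coverage, key=lambda c: len(coverage[c] - covered))
        gain = coverage[best] - covered
        if not gain:
            break                      # nothing new to gain, stop early
        chosen.append(best)
        covered |= gain
    return chosen, covered

# Hypothetical enzyme/selective-primer combinations and the transcripts they hit.
coverage = {
    ("TaqI", "+AC"): {"t1", "t2", "t3"},
    ("TaqI", "+AG"): {"t3", "t4"},
    ("MseI", "+CT"): {"t4", "t5", "t6"},
}
combos, profiled = choose_combinations(coverage, budget=2)
print(combos, len(profiled))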
But I'm an engineer—not a contracts lawyer!
NASA Astrophysics Data System (ADS)
Warner, Mark; Bass, Harvey
2012-09-01
Industrial partners, commercial vendors, and subsystem contractors play a large role in the design and construction of modern telescopes. Because many telescope projects carry relatively small staffs, engineers are often required to perform the additional functions of technical writing, cost estimating, and contract bidding and negotiating. The skills required to carry out these tasks are not normally taught in traditional engineering programs. As a result, engineers often learn to write Request for Proposals (RFPs), select vendors, and negotiate contracts by trial-and-error and/or by adapting previous project documents to match their own requirements. Typically, this means that at the end of a contract the engineer has a large list of do's, don'ts, and lessons learned for the next RFP he or she must generate. This paper will present one such engineer's experience writing and bidding proposal packages for large telescope components and subsystems. Included are: thoughts on structuring SOWs, Specs, ICDs, and other RFP documents; modern methods for bidding the work; and systematic means for selecting and negotiating with a contractor to arrive at the best value for the project.
The Tapered Hybrid Undulator (THUNDER) of the visible free-electron laser oscillator experiment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Robinson, K.E.; Quimby, D.C.; Slater, J.M.
A 5 m tapered hybrid undulator (THUNDER) has been designed and built as part of the Boeing Aerospace Company and Spectra Technology, Inc. visible free-electron laser (FEL) oscillator experiment. The performance goals required of an undulator for a visible oscillator with large extraction are ambitious. They require the establishment of stringent magnetic field quality tolerances which impact design and fabrication techniques. The performance goals of THUNDER are presented. The tolerances resulting from the FEL interaction are contrasted and compared to those of a synchrotron radiation source. The design, fabrication, and field measurements are discussed. The performance of THUNDER serves as a benchmark for future wiggler/undulator design for advanced FEL's and synchrotron radiation sources.
NASA Astrophysics Data System (ADS)
Henkel, Daniela; Eisenhauer, Anton
2017-04-01
During the last decades, the number of large research projects has increased, and with it the requirement for multidisciplinary, multisectoral collaboration. Such complex, large-scale projects demand new competencies for forming, managing, and using large, diverse teams as a competitive advantage. For complex projects the effort is magnified: large international research consortia involve academic and non-academic partners, including big industries, NGOs, and private and public bodies, all with cultural differences, differing expectations of teamwork, and differences in collaboration between national and multinational administrations and research organisations, all of which challenge the organisation and management of such multi-partner consortia. How many partners are needed to establish and conduct collaboration with a multidisciplinary and multisectoral approach? How much personnel effort and what kinds of management techniques are required for such projects? This presentation identifies advantages and challenges of large research projects based on experience gained in the context of an Innovative Training Network (ITN) project within the Marie Skłodowska-Curie Actions of the European HORIZON 2020 program. Possible strategies are discussed to circumvent and avoid conflicts from the very beginning of the project.
USDA-ARS's Scientific Manuscript database
Screening large populations for carriers of known or de novo rare SNPs is required both in Targeting induced local lesions IN genomes (TILLING) experiments in plants and analogously in screening human populations. We formerly suggested an approach that combines the celebrated mathematical field of c...
Challenges in Teaching "Colloid and Surface Chemistry"--A Danish Experience
ERIC Educational Resources Information Center
Kontogeorgis, Georgios M.; Vigild, Martin E.
2009-01-01
Seven years ago we were asked, as one of our first teaching duties at the Technical University of Denmark (DTU), to teach a 5 ECTS point course on "Colloid and Surface Chemistry". The topic is itself at the same time exciting and demanding, largely due to its multidisciplinary nature. Several "local" requirements posed…
ERIC Educational Resources Information Center
Zablotsky, Benjamin; Colpe, Lisa J.; Pringle, Beverly A.; Kogan, Michael D.; Rice, Catherine; Blumberg, Stephen J.
2017-01-01
Children with autism spectrum disorder (ASD) require substantial support to address the core symptoms of ASD and co-occurring behavioral/developmental conditions. This study explores the early diagnostic experiences of school-aged children with ASD using survey data from a large probability-based national sample. Multivariate linear regressions…
ERIC Educational Resources Information Center
Caldarella, Paul; Williams, Leslie; Jolstead, Krystine A.; Wills, Howard P.
2017-01-01
Classroom management is a common concern for teachers. Music teachers in particular experience unique behavior challenges because of large class sizes, uncommon pacing requirements, and performance-based outcomes. Positive behavior support is an evidence-based framework for preventing or eliminating challenging behaviors by teaching and…
Bringing Text Display Digital Radio to Consumers with Hearing Loss
ERIC Educational Resources Information Center
Sheffield, Ellyn G.; Starling, Michael; Schwab, Daniel
2011-01-01
Radio is migrating to digital transmission, expanding its offerings to include captioning for individuals with hearing loss. Text display radio requires a large amount of word throughput with minimal screen display area, making good user interface design crucial to its success. In two experiments, we presented hearing, hard-of-hearing, and deaf…
Affective Experiences of International and Home Students during the Information Search Process
ERIC Educational Resources Information Center
Haley, Adele Nicole; Clough, Paul
2017-01-01
An increasing number of students are studying abroad requiring that they interact with information in languages other than their mother tongue. The UK in particular has seen a large growth in international students within Higher Education. These nonnative English speaking students present a distinct user group for university information services,…
Student Thoughts and Perceptions on Curriculum Reform
ERIC Educational Resources Information Center
VanderJagt, Douglas D.
2013-01-01
The purpose of this qualitative case study was to examine how students experience and respond to Michigan's increased graduation requirements. The study was conducted in a large, suburban high school that instituted a change to a trimester system in response to the state mandate. A criterion-based sample of 16 students, both college bound and…
Don't Try to Bridge the Literacy Gap Alone
ERIC Educational Resources Information Center
González, Ramón M.
2015-01-01
The author is a middle school principal who has spent a decade working on improving literacy among his largely socio-economically disadvantaged student body. Experience and research have shown, the author says, that a successful effort to bridge the literacy gap between children who live in poverty and middle-class students requires a concerted…
Telecommunication market research processing
NASA Astrophysics Data System (ADS)
Dupont, J. F.
1983-06-01
The data processing in two telecommunication market investigations is described. One of the studies concerns the office applications of communication and the other the experiences with a videotex terminal. Statistical factorial analysis was performed on a large mass of data. A comparison between utilization intentions and effective utilization is made. Extensive rewriting of statistical analysis computer programs was required.
Arctic Boreal Vulnerability Experiment (ABoVE) Science Cloud
NASA Astrophysics Data System (ADS)
Duffy, D.; Schnase, J. L.; McInerney, M.; Webster, W. P.; Sinno, S.; Thompson, J. H.; Griffith, P. C.; Hoy, E.; Carroll, M.
2014-12-01
The effects of climate change are being revealed at alarming rates in the Arctic and Boreal regions of the planet. NASA's Terrestrial Ecology Program has launched a major field campaign to study these effects over the next 5 to 8 years. The Arctic Boreal Vulnerability Experiment (ABoVE) will challenge scientists to take measurements in the field, study remote observations, and even run models to better understand the impacts of a rapidly changing climate for areas of Alaska and western Canada. The NASA Center for Climate Simulation (NCCS) at the Goddard Space Flight Center (GSFC) has partnered with the Terrestrial Ecology Program to create a science cloud designed for this field campaign - the ABoVE Science Cloud. The cloud combines traditional high performance computing with emerging technologies to create an environment specifically designed for large-scale climate analytics. The ABoVE Science Cloud utilizes (1) virtualized high-speed InfiniBand networks, (2) a combination of high-performance file systems and object storage, and (3) virtual system environments tailored for data intensive, science applications. At the center of the architecture is a large object storage environment, much like a traditional high-performance file system, that supports data proximal processing using technologies like MapReduce on a Hadoop Distributed File System (HDFS). Surrounding the storage is a cloud of high performance compute resources with many processing cores and large memory coupled to the storage through an InfiniBand network. Virtual systems can be tailored to a specific scientist and provisioned on the compute resources with extremely high-speed network connectivity to the storage and to other virtual systems. In this talk, we will present the architectural components of the science cloud and examples of how it is being used to meet the needs of the ABoVE campaign. In our experience, the science cloud approach significantly lowers the barriers and risks to organizations that require high performance computing solutions and provides the NCCS with the agility required to meet our customers' rapidly increasing and evolving requirements.
A Conditions Data Management System for HEP Experiments
DOE Office of Scientific and Technical Information (OSTI.GOV)
Laycock, P. J.; Dykstra, D.; Formica, A.
The conditions data infrastructures for both ATLAS and CMS have to deal with the management of several terabytes of data. Distributed computing access to this data requires particular care and attention to manage request rates of up to several tens of kHz. Thanks to the large overlap in use cases and requirements, ATLAS and CMS have worked towards a common solution for conditions data management with the aim of using this design for data-taking in Run 3. In the meantime other experiments, including NA62, have expressed an interest in this cross-experiment initiative. For experiments with a smaller payload volume and complexity, there is particular interest in simplifying the payload storage. The conditions data management model is implemented in a small set of relational database tables. A prototype access toolkit consisting of an intermediate web server has been implemented, using standard technologies available in the Java community. Access is provided through a set of REST services for which the API has been described in a generic way using standard OpenAPI specifications, implemented in Swagger. Such a solution allows the automatic generation of client code and server stubs and further allows the backend technology to be changed transparently. An important advantage of using a REST API for conditions access is the possibility of caching identical URLs, addressing one of the biggest challenges that large distributed computing solutions impose on conditions data access, avoiding direct DB access by means of standard web proxy solutions.
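The caching argument can be illustrated with a tiny client sketch, assuming a hypothetical conditions service: if every job builds the same canonical GET URL for a given tag and run, a standard web proxy can answer repeat requests without hitting the database. The host, path, and query parameters below are invented, not the ATLAS/CMS API.

import urllib.request

def conditions_url(tag, run):
    # Canonical, parameter-ordered URL, so identical requests hash identically
    # and can be served from a web proxy cache.
    return f"https://conditions.example.org/api/iovs?tag={tag}&run={run}"

url = conditions_url("beamspot-v3", 312000)
req = urllib.request.Request(url, headers={"Accept": "application/json"})
# payload = urllib.request.urlopen(req).read()   # repeats hit the proxy, not the DB
print(url)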
Wiener, J M; Ehbauer, N N; Mallot, H A
2009-09-01
For large numbers of targets, path planning is a complex and computationally expensive task. Humans, however, usually solve such tasks quickly and efficiently. We present experiments studying human path planning performance and the cognitive processes and heuristics involved. Twenty-five places were arranged on a regular grid in a large room. Participants were repeatedly asked to solve traveling salesman problems (TSP), i.e., to find the shortest closed loop connecting a start location with multiple target locations. In Experiment 1, we tested whether humans employed the nearest neighbor (NN) strategy when solving the TSP. Results showed that subjects outperform the NN-strategy, suggesting that it is not sufficient to explain human route planning behavior. As a second possible strategy we tested a hierarchical planning heuristic in Experiment 2, demonstrating that participants first plan a coarse route on the region level that is refined during navigation. To test for the relevance of spatial working memory (SWM) and spatial long-term memory (LTM) for planning performance and the planning heuristics applied, we varied the memory demands between conditions in Experiment 2. In one condition the target locations were directly marked, such that no memory was required; a second condition required participants to memorize the target locations during path planning (SWM); in a third condition, additionally, the locations of targets had to be retrieved from LTM (SWM and LTM). Results showed that navigation performance decreased with increasing memory demands while the dependence on the hierarchical planning heuristic increased.
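For reference, the nearest neighbor (NN) strategy tested in Experiment 1 can be stated in a few lines: from the current location, always move to the closest unvisited target, then close the loop back to the start. The coordinates below are arbitrary illustrative points, not the experimental layout.

import math

def nearest_neighbor_tour(start, targets):
    tour, current, remaining = [start], start, list(targets)
    while remaining:
        nxt = min(remaining, key=lambda p: math.dist(current, p))
        remaining.remove(nxt)
        tour.append(nxt)
        current = nxt
    tour.append(start)                      # closed loop back to the start
    return tour

def tour_length(tour):
    return sum(math.dist(a, b) for a, b in zip(tour, tour[1:]))

targets = [(1, 4), (3, 1), (4, 4), (0, 2)]
tour = nearest_neighbor_tour((0, 0), targets)
print(tour, round(tour_length(tour), 2))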
Barrett, Lisa Feldman; Barsalou, Lawrence W.
2015-01-01
The tremendous variability within categories of human emotional experience receives little empirical attention. We hypothesized that atypical instances of emotion categories (e.g. pleasant fear of thrill-seeking) would be processed less efficiently than typical instances of emotion categories (e.g. unpleasant fear of violent threat) in large-scale brain networks. During a novel fMRI paradigm, participants immersed themselves in scenarios designed to induce atypical and typical experiences of fear, sadness or happiness (scenario immersion), and then focused on and rated the pleasant or unpleasant feeling that emerged (valence focus) in most trials. As predicted, reliably greater activity in the ‘default mode’ network (including medial prefrontal cortex and posterior cingulate) was observed for atypical (vs typical) emotional experiences during scenario immersion, suggesting atypical instances require greater conceptual processing to situate the socio-emotional experience. During valence focus, reliably greater activity was observed for atypical (vs typical) emotional experiences in the ‘salience’ network (including anterior insula and anterior cingulate), suggesting atypical instances place greater demands on integrating shifting body signals with the sensory and social context. Consistent with emerging psychological construction approaches to emotion, these findings demonstrate that it is important to study the variability within common categories of emotional experience. PMID:24563528
An epidemiological perspective of personalized medicine: the Estonian experience
Milani, L; Leitsalu, L; Metspalu, A
2015-01-01
Milani L, Leitsalu L, Metspalu A (University of Tartu). An epidemiological perspective of personalized medicine: the Estonian experience (Review). J Intern Med 2015; 277: 188–200. The Estonian Biobank and several other biobanks established over a decade ago are now starting to yield valuable longitudinal follow-up data for large numbers of individuals. These samples have been used in hundreds of different genome-wide association studies, resulting in the identification of reliable disease-associated variants. The focus of genomic research has started to shift from identifying genetic and nongenetic risk factors associated with common complex diseases to understanding the underlying mechanisms of the diseases and suggesting novel targets for therapy. However, translation of findings from genomic research into medical practice is still lagging, mainly due to insufficient evidence of clinical validity and utility. In this review, we examine the different elements required for the implementation of personalized medicine based on genomic information. First, biobanks and genome centres are required and have been established for the high-throughput genomic screening of large numbers of samples. Secondly, the combination of susceptibility alleles into polygenic risk scores has improved risk prediction of cardiovascular disease, breast cancer and several other diseases. Finally, national health information systems are being developed internationally, to combine data from electronic medical records from different sources, and also to gradually incorporate genomic information. We focus on the experience in Estonia, one of several countries with national goals towards more personalized health care based on genomic information, where the unique combination of elements required to accomplish this goal are already in place. PMID:25339628
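For orientation, a polygenic risk score of the kind mentioned above is, in its generic form, a weighted allele count; the expression below is the standard textbook formulation, not the specific weighting pipeline used by the Estonian Biobank:

\mathrm{PRS}_i \;=\; \sum_{j=1}^{M} \hat{\beta}_j \, g_{ij},

where g_{ij} \in \{0, 1, 2\} is the risk-allele dosage of individual i at variant j and \hat{\beta}_j is the per-allele effect size estimated in genome-wide association studies.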
Locus and persistence of capacity limitations in visual information processing.
Kleiss, J A; Lane, D M
1986-05-01
Although there is considerable evidence that stimuli such as digits and letters are extensively processed in parallel and without capacity limitations, recent data suggest that only the features of stimuli are processed in parallel. In an attempt to reconcile this discrepancy, we used the simultaneous/successive detection paradigm with stimuli from experiments indicating parallel processing and with stimuli from experiments indicating that only features can be processed in parallel. In Experiment 1, large differences between simultaneous and successive presentations were obtained with an R target among P and Q distractors and among P and B distractors, but not with digit targets among letter distractors. As predicted by the feature integration theory of attention, false-alarm rates in the simultaneous condition were much higher than in the successive condition with the R/PQ stimuli. In Experiment 2, the possibility that attention is required for any difficult discrimination was ruled out as an explanation of the discrepancy between the digit/letter results and the R/PQ and R/PB results. Experiment 3A replicated the R/PQ and R/PB results of Experiment 1, and Experiment 3B extended these findings to a new set of stimuli. In Experiment 4, we found that large amounts of consistent practice did not generally eliminate capacity limitations. From this series of experiments we strongly conclude that the notion of capacity-free letter perception has limited generality.
NASA Technical Reports Server (NTRS)
Frantz, Brian D.; Ivancic, William D.
2001-01-01
Asynchronous Transfer Mode (ATM) Quality of Service (QoS) experiments using the Transmission Control Protocol/Internet Protocol (TCP/IP) were performed for various link delays. The link delay was set to emulate a Wide Area Network (WAN) and a Satellite Link. The purpose of these experiments was to evaluate the ATM QoS requirements for applications that utilize advance TCP/IP protocols implemented with large windows and Selective ACKnowledgements (SACK). The effects of cell error, cell loss, and random bit errors on throughput were reported. The detailed test plan and test results are presented herein.
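The need for large TCP windows over a satellite link comes down to the bandwidth-delay product; the figures below are illustrative (an OC-3-class line rate and a roughly 500 ms geostationary round trip), not the values from the test plan.

# The window must cover the bandwidth-delay product to keep the pipe full.
link_rate_bps = 155_520_000          # OC-3-class line rate, illustrative only
rtt_s = 0.5                          # ~500 ms round trip via a GEO satellite
bdp_bytes = link_rate_bps / 8 * rtt_s
print(f"{bdp_bytes / 2**20:.1f} MiB in flight")   # ~9.3 MiB, far above a 64 KiB default window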
NASA Technical Reports Server (NTRS)
Fowlis, W. W. (Editor); Davis, M. H. (Editor)
1981-01-01
The atmospheric general circulation experiment (AGCE) numerical design for Spacelab flights was studied. A spherical baroclinic flow experiment which models the large scale circulations of the Earth's atmosphere was proposed. Gravity is simulated by a radial dielectric body force. The major objective of the AGCE is to study nonlinear baroclinic wave flows in spherical geometry. Numerical models must be developed which accurately predict the basic axisymmetric states and the stability of nonlinear baroclinic wave flows. A three dimensional, fully nonlinear, numerical model and the AGCE based on the complete set of equations is required. Progress in the AGCE numerical design studies program is reported.
Space Construction Experiment Definition Study (SCEDS), part 2. Volume 2: Study results
NASA Technical Reports Server (NTRS)
1982-01-01
The Space Construction Experiment (SCE) was defined for integration into the Space Shuttle. This included development of flight assignment data, revision and update of preliminary mission timelines and test plans, analysis of flight safety issues, and definition of ground operations scenarios. New requirements for the flight experiment and changes for a large space antenna feed mask test article were incorporated. The program plan and cost estimates were updated. Revised SCE structural dynamics characteristics were provided for simulation and analysis of experimental tests to define and verify control limits and interactions effects between the SCE and the Orbiter digital automatic pilot.
Large-Scale Distributed Computational Fluid Dynamics on the Information Power Grid Using Globus
NASA Technical Reports Server (NTRS)
Barnard, Stephen; Biswas, Rupak; Saini, Subhash; VanderWijngaart, Robertus; Yarrow, Maurice; Zechtzer, Lou; Foster, Ian; Larsson, Olle
1999-01-01
This paper describes an experiment in which a large-scale scientific application developed for tightly-coupled parallel machines is adapted to the distributed execution environment of the Information Power Grid (IPG). A brief overview of the IPG and a description of the computational fluid dynamics (CFD) algorithm are given. The Globus metacomputing toolkit is used as the enabling device for the geographically-distributed computation. Modifications related to latency hiding and load balancing were required for an efficient implementation of the CFD application in the IPG environment. Performance results on a pair of SGI Origin 2000 machines indicate that real scientific applications can be effectively implemented on the IPG; however, a significant amount of continued effort is required to make such an environment useful and accessible to scientists and engineers.
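One way to picture the latency-hiding modification is the overlap pattern sketched below, where a slow boundary exchange (here just a sleep standing in for a wide-area transfer) proceeds while the interior of the grid is updated; it is a generic illustration, not the paper's code, and no Globus or MPI calls are shown.

import time
from concurrent.futures import ThreadPoolExecutor
import numpy as np

def exchange_halo(boundary):
    time.sleep(0.2)                  # stands in for wide-area network latency
    return boundary[::-1].copy()     # pretend the neighbor's data arrives

grid = np.random.rand(1000, 1000)
with ThreadPoolExecutor(max_workers=1) as pool:
    fut = pool.submit(exchange_halo, grid[0])     # start the exchange, overlap begins
    # Interior update proceeds while the boundary exchange is in flight.
    grid[1:-1] = 0.25 * (grid[:-2] + grid[2:]
                         + np.roll(grid, 1, 1)[1:-1]
                         + np.roll(grid, -1, 1)[1:-1])
    grid[0] = fut.result()                        # boundary cells finished last
print(grid.mean())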
NASA Technical Reports Server (NTRS)
1974-01-01
It is shown in this report that a comprehensive in-situ study of all aspects of the entire disturbance zone caused by a body in a flowing plasma resulted in a large number of requirements on the shuttle-PPEPL facility. A large amount of the necessary in-situ observation can be obtained by adopting appropriate modes of performing the experiments. Requirements are indicated for worthwhile studies, of some aspects of the problems, which can be carried out effectively while imposing relatively few constraints on the early missions. Considerations for the desired growth and improvement of the PPEPL to facilitate more complete studies in later missions are also discussed. For Vol. 2, see N74-28170; for Vol. 3, see N74-28171.
Shaffer, Christopher D; Alvarez, Consuelo J; Bednarski, April E; Dunbar, David; Goodman, Anya L; Reinke, Catherine; Rosenwald, Anne G; Wolyniak, Michael J; Bailey, Cheryl; Barnard, Daron; Bazinet, Christopher; Beach, Dale L; Bedard, James E J; Bhalla, Satish; Braverman, John; Burg, Martin; Chandrasekaran, Vidya; Chung, Hui-Min; Clase, Kari; Dejong, Randall J; Diangelo, Justin R; Du, Chunguang; Eckdahl, Todd T; Eisler, Heather; Emerson, Julia A; Frary, Amy; Frohlich, Donald; Gosser, Yuying; Govind, Shubha; Haberman, Adam; Hark, Amy T; Hauser, Charles; Hoogewerf, Arlene; Hoopes, Laura L M; Howell, Carina E; Johnson, Diana; Jones, Christopher J; Kadlec, Lisa; Kaehler, Marian; Silver Key, S Catherine; Kleinschmit, Adam; Kokan, Nighat P; Kopp, Olga; Kuleck, Gary; Leatherman, Judith; Lopilato, Jane; Mackinnon, Christy; Martinez-Cruzado, Juan Carlos; McNeil, Gerard; Mel, Stephanie; Mistry, Hemlata; Nagengast, Alexis; Overvoorde, Paul; Paetkau, Don W; Parrish, Susan; Peterson, Celeste N; Preuss, Mary; Reed, Laura K; Revie, Dennis; Robic, Srebrenka; Roecklein-Canfield, Jennifer; Rubin, Michael R; Saville, Kenneth; Schroeder, Stephanie; Sharif, Karim; Shaw, Mary; Skuse, Gary; Smith, Christopher D; Smith, Mary A; Smith, Sheryl T; Spana, Eric; Spratt, Mary; Sreenivasan, Aparna; Stamm, Joyce; Szauter, Paul; Thompson, Jeffrey S; Wawersik, Matthew; Youngblom, James; Zhou, Leming; Mardis, Elaine R; Buhler, Jeremy; Leung, Wilson; Lopatto, David; Elgin, Sarah C R
2014-01-01
There is widespread agreement that science, technology, engineering, and mathematics programs should provide undergraduates with research experience. Practical issues and limited resources, however, make this a challenge. We have developed a bioinformatics project that provides a course-based research experience for students at a diverse group of schools and offers the opportunity to tailor this experience to local curriculum and institution-specific student needs. We assessed both attitude and knowledge gains, looking for insights into how students respond given this wide range of curricular and institutional variables. While different approaches all appear to result in learning gains, we find that a significant investment of course time is required to enable students to show gains commensurate to a summer research experience. An alumni survey revealed that time spent on a research project is also a significant factor in the value former students assign to the experience one or more years later. We conclude: 1) implementation of a bioinformatics project within the biology curriculum provides a mechanism for successfully engaging large numbers of students in undergraduate research; 2) benefits to students are achievable at a wide variety of academic institutions; and 3) successful implementation of course-based research experiences requires significant investment of instructional time for students to gain full benefit.
Expanding Computer Science Education in Schools: Understanding Teacher Experiences and Challenges
ERIC Educational Resources Information Center
Yadav, Aman; Gretter, Sarah; Hambrusch, Susanne; Sands, Phil
2017-01-01
The increased push for teaching computer science (CS) in schools in the United States requires training a large number of new K-12 teachers. The current efforts to increase the number of CS teachers have predominantly focused on training teachers from other content areas. In order to support these beginning CS teachers, we need to better…
USDA-ARS's Scientific Manuscript database
Beef cow efficiency is a century-old debate over what the criteria, certain phenotypic traits, and definition of an “efficient” cow really should be. However, we do know that energy utilization by the cow herd is proportionally large compared to the rest of the sector. This requirement accounts up to...
Research Guided Practice: Student Online Experiences during Mathematics Class in the Middle School
ERIC Educational Resources Information Center
Mojica-Casey, Maria; Dekkers, John; Thrupp, Rose-Marie
2014-01-01
The approaches to new technologies available to schools, teachers and students largely concern computers and engagement. This requires adoption of alternate and new teaching practices to engage students in the teaching and learning process. This research integrates youth voice about the use of technology. A major motivation for this research is to…
ERIC Educational Resources Information Center
Plavnick, Joshua B.; Ferreri, Summer J.
2013-01-01
Current legislation requires educational practices be informed by science. The effort to establish educational practices supported by science has, to date, emphasized experiments with large numbers of participants who are randomly assigned to an intervention or control condition. A potential limitation of such an emphasis at the expense of other…
Payload isolation and stabilization by a Suspended Experiment Mount (SEM)
NASA Technical Reports Server (NTRS)
Bailey, Wayne L.; Desanctis, Carmine E.; Nicaise, Placide D.; Schultz, David N.
1992-01-01
Many Space Shuttle and Space Station payloads can benefit from isolation from crew or attitude control system disturbances. Preliminary studies have been performed for a Suspended Experiment Mount (SEM) system that will provide isolation from accelerations and stabilize the viewing direction of a payload. The concept consists of a flexible suspension system and payload-mounted control moment gyros. The suspension system, which is rigidly locked for ascent and descent, isolates the payload from high frequency disturbances. The control moment gyros stabilize the payload orientation. The SEM will be useful for payloads that require a lower-g environment than a manned vehicle can provide, such as materials processing, and for payloads that require stabilization of pointing direction, but not large angle slewing, such as nadir-viewing earth observation or solar viewing payloads.
Solving the critical thermal bowing in 3C-SiC/Si(111) by a tilting Si pillar architecture
NASA Astrophysics Data System (ADS)
Albani, Marco; Marzegalli, Anna; Bergamaschini, Roberto; Mauceri, Marco; Crippa, Danilo; La Via, Francesco; von Känel, Hans; Miglio, Leo
2018-05-01
The exceptionally large thermal strain in few-micrometer-thick 3C-SiC films on Si(111), which causes severe wafer bending and cracking, is demonstrated, both by simulations and by preliminary experiments, to be elastically quenched by patterning the substrate into finite arrays of Si micro-pillars with aspect ratios sufficiently large to allow lateral pillar tilting. In suspended SiC patches, the mechanical problem is addressed by the finite element method: both the strain relaxation and the wafer curvature are calculated for different pillar heights, array sizes, and film thicknesses. Patches as large as required by power electronic devices (500-1000 μm in size) show a remarkable residual strain in the central area, unless the pillar aspect ratio is made sufficiently large to allow the peripheral pillars to accommodate the full film retraction. A sublinear relationship between the pillar aspect ratio and the patch size, guaranteeing a minimal curvature radius as required for wafer processing and micro-crack prevention, is shown to be valid for any heteroepitaxial system.
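For background, the thermal strain that drives the bowing is, in the standard thin-film picture, the thermal-expansion mismatch integrated over the cool-down from growth to room temperature; this is textbook context, not the paper's finite element model:

\varepsilon_{\mathrm{th}} \simeq \int_{T_{\mathrm{room}}}^{T_{\mathrm{growth}}} \left[ \alpha_{\mathrm{SiC}}(T) - \alpha_{\mathrm{Si}}(T) \right] \mathrm{d}T,

with \alpha the linear thermal expansion coefficients of film and substrate; the compliant pillars relieve this misfit elastically instead of letting it bow and crack the wafer.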
Noise-Reduction Benefits Analyzed for Over-the-Wing-Mounted Advanced Turbofan Engines
NASA Technical Reports Server (NTRS)
Berton, Jeffrey J.
2000-01-01
As we look to the future, increasingly stringent civilian aviation noise regulations will require the design and manufacture of extremely quiet commercial aircraft. Also, the large fan diameters of modern engines with increasingly higher bypass ratios pose significant packaging and aircraft installation challenges. One design approach that addresses both of these challenges is to mount the engines above the wing. In addition to allowing the performance trend towards large diameters and high bypass ratio cycles to continue, this approach allows the wing to shield much of the engine noise from people on the ground. The Propulsion Systems Analysis Office at the NASA Glenn Research Center at Lewis Field conducted independent analytical research to estimate the noise reduction potential of mounting advanced turbofan engines above the wing. Certification noise predictions were made for a notional long-haul commercial quadjet transport. A large quad was chosen because, even under current regulations, such aircraft sometimes experience difficulty in complying with certification noise requirements with a substantial margin. Also, because of its long wing chords, a large airplane would receive the greatest advantage of any noise-shielding benefit.
Iron chelation therapy for transfusional iron overload: a swift evolution.
Musallam, Khaled M; Taher, Ali T
2011-01-01
Chronic transfusional iron overload leads to significant morbidity and mortality. While deferoxamine (DFO) is an effective iron chelator with over four decades of experience, it requires tedious subcutaneous infusions that reflect negatively on patient compliance. The novel oral iron chelators deferiprone (L1) and deferasirox (DFRA) opened new horizons for the management of transfusional siderosis. A large body of evidence is now available regarding their efficacy and safety in various populations and settings. Nevertheless, experience with both drugs witnessed some drawbacks, and the search for an ideal and cost-effective iron chelator continues.
Trends in NASA communication satellites.
NASA Technical Reports Server (NTRS)
Sivo, J. N.; Robbins, W. H.; Stretchberry, D. M.
1972-01-01
Discussion of the potential applications of satellite communications technology in meeting the national needs in education, health care, culture, and data transfer techniques. Experiments with the NASA ATS 1, 3 and 5 spacecraft, which are conducted in an attempt to satisfy such needs, are reviewed. The future needs are also considered, covering the requirements of multiple region coverage, communications between regions, large numbers of ground terminals, multichannel capability and high quality TV pictures. The ATS F and CTS spacecraft are expected to be available in the near future to expand experiments in this field.
NASA Technical Reports Server (NTRS)
Glasgow, J. C.; Birchenough, A. G.
1978-01-01
The experimental wind turbine was designed and fabricated to assess the technology requirements and engineering problems of large wind turbines. The machine has demonstrated successful operation in all of its design modes and served as a prototype developmental test bed for the Mod-0A operational wind turbines which are currently used on utility networks. The mechanical and control systems are described as they evolved during operational tests, and some of the experience with various systems in the downwind rotor configurations is elaborated.
AGIS: The ATLAS Grid Information System
NASA Astrophysics Data System (ADS)
Anisenkov, Alexey; Belov, Sergey; Di Girolamo, Alessandro; Gayazov, Stavro; Klimentov, Alexei; Oleynik, Danila; Senchenko, Alexander
2012-12-01
ATLAS is a particle physics experiment at the Large Hadron Collider at CERN. The experiment produces petabytes of data annually through simulation production and tens of petabytes of data per year from the detector itself. The ATLAS computing model embraces the Grid paradigm with a high degree of decentralization, relying on computing resources able to meet the ATLAS requirements for petabyte-scale data operations. In this paper we present the ATLAS Grid Information System (AGIS), designed to integrate configuration and status information about the resources, services and topology of the whole ATLAS Grid needed by ATLAS Distributed Computing applications and services.
Results from the HARP Experiment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Catanesi, M. G.
2008-02-21
Hadron production is a key ingredient in many aspects of ν physics. Precise prediction of atmospheric ν fluxes, characterization of accelerator ν beams, quantification of π production and capture for ν-factory designs, all of these would profit from hadron production measurements. HARP at the CERN PS was the first hadron production experiment designed on purpose to match all these requirements. It combines a large, full phase space acceptance with low systematic errors and high statistics. HARP was operated in the range from 3 GeV to 15 GeV. We briefly describe here the most recent results.
Late time neutrino masses, the LSND experiment, and the cosmic microwave background.
Chacko, Z; Hall, Lawrence J; Oliver, Steven J; Perelstein, Maxim
2005-03-25
Models with low-scale breaking of global symmetries in the neutrino sector provide an alternative to the seesaw mechanism for understanding why neutrinos are light. Such models can easily incorporate light sterile neutrinos required by the Liquid Scintillator Neutrino Detector experiment. Furthermore, the constraints on the sterile neutrino properties from nucleosynthesis and large-scale structure can be removed due to the nonconventional cosmological evolution of neutrino masses and densities. We present explicit, fully realistic supersymmetric models, and discuss the characteristic signatures predicted in the angular distributions of the cosmic microwave background.
Characterization of the electromechanical properties of EAP materials
NASA Technical Reports Server (NTRS)
Bar-Cohen, Yoseph; Sherrita, Stewart; Bhattachary, Kaushik; Lih, Shyh-Shiuh
2001-01-01
Electroactive polymers (EAP) are an emerging class of actuation materials. Their large electrically induced strains (longitudinal or bending), low density, mechanical flexibility, and ease of processing offer advantages over traditional electroactive materials. However, before the capability of these materials can be exploited, their electrical and mechanical behavior must be properly quantified. Two general types of EAP can be identified. The first type is ionic EAP, which requires relatively low voltages (<10 V) to achieve large bending deflections. This class usually needs to be hydrated, and electrochemical reactions may occur. The second type is electronic EAP, which involves electrostrictive and/or Maxwell stresses. This type of material requires large electric fields (>100 MV/m) to achieve longitudinal deformations in the range of 4-360%. Some of the difficulties in characterizing EAP include nonlinear properties, large compliance (a large mismatch with metal electrodes), nonhomogeneity resulting from processing, etc. To support the need for reliable data, the authors are developing characterization techniques to quantify the electroactive responses and material properties of EAP materials. The emphasis of the current study is on addressing electromechanical issues related to the ion-exchange type of EAP, also known as IPMC. The analysis, experiments and test results are discussed in this paper.
NASA Astrophysics Data System (ADS)
Gabrielli, Alessandro; Loddo, Flavio; Ranieri, Antonio; De Robertis, Giuseppe
2008-10-01
This work is aimed at defining the architecture of a new digital ASIC, namely Slow-Control Adapter (SCA), which will be designed in a commercial 130-nm CMOS technology. This chip will be embedded within a high-speed data acquisition optical link (GBT) to control and monitor the front-end electronics in future high-energy physics experiments. The GBT link provides a transparent transport layer between the SCA and control electronics in the counting room. The proposed SCA supports a variety of common bus protocols to interface with end-user general-purpose electronics. Between the GBT and the SCA a standard 100 Mb/s IEEE-802.3 compatible protocol will be implemented. This standard protocol allows off-line tests of the prototypes using commercial components that support the same standard. The project is justified because embedded applications in modern large HEP experiments require particular care to assure the lowest possible power consumption, still offering the highest reliability demanded by very large particle detectors.
Surface plasmon microscopy with low-cost metallic nanostructures for biosensing I
NASA Astrophysics Data System (ADS)
Lindquist, Nathan; Oh, Sang-Hyun; Otto, Lauren
2012-02-01
The field of plasmonics aims to manipulate light over dimensions smaller than the optical wavelength by exploiting surface plasmon resonances in metallic films. Typically, surface plasmons are excited by illuminating metallic nanostructures. For meaningful research in this exciting area, the fabrication of high-quality nanostructures is critical, and in an undergraduate setting, low-cost methods are desirable. Careful optical characterization of the metallic nanostructures is also required. Here, we present the use of novel, inexpensive nanofabrication techniques and the development of a customized surface plasmon microscopy setup for interdisciplinary undergraduate experiments in biosensing, surface-enhanced Raman spectroscopy, and surface plasmon imaging. A Bethel undergraduate student performs the nanofabrication in collaboration with the University of Minnesota. The rewards of mentoring undergraduate students in cooperation with a large research university are numerous, exposing them to a wide variety of opportunities. This research also interacts with upper-level, open-ended laboratory projects, summer research, a semester-long senior research experience, and will enable a large range of experiments into the future.
Inquiry-based experiments for large-scale introduction to PCR and restriction enzyme digests.
Johanson, Kelly E; Watt, Terry J
2015-01-01
Polymerase chain reaction and restriction endonuclease digest are important techniques that should be included in all Biochemistry and Molecular Biology laboratory curriculums. These techniques are frequently taught at an advanced level, requiring many hours of student and faculty time. Here we present two inquiry-based experiments that are designed for introductory laboratory courses and combine both techniques. In both approaches, students must determine the identity of an unknown DNA sequence, either a gene sequence or a primer sequence, based on a combination of PCR product size and restriction digest pattern. The experimental design is flexible, and can be adapted based on available instructor preparation time and resources, and both approaches can accommodate large numbers of students. We implemented these experiments in our courses with a combined total of 584 students and have an 85% success rate. Overall, students demonstrated an increase in their understanding of the experimental topics, ability to interpret the resulting data, and proficiency in general laboratory skills. © 2015 The International Union of Biochemistry and Molecular Biology.
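To make the size-based identification step concrete, here is a minimal Python sketch (not taken from the paper; the sequences, the EcoRI-style recognition site, and the candidate names are hypothetical) showing how predicted restriction fragment sizes can discriminate between candidate identities of a PCR product.

```python
# Minimal sketch (not from the paper): predict restriction fragment sizes for
# candidate PCR products and compare them against an observed digest pattern.
# The sequences, recognition site, and candidate names are hypothetical.

def fragment_sizes(seq, site):
    """Return sorted fragment lengths after cutting seq at every site occurrence."""
    cuts, start = [], 0
    while True:
        i = seq.find(site, start)
        if i == -1:
            break
        cuts.append(i)            # simplified: cut at the start of the site
        start = i + 1
    edges = [0] + cuts + [len(seq)]
    return sorted(edges[k + 1] - edges[k] for k in range(len(edges) - 1))

candidates = {
    "gene_A": "ATGGAATTCC" * 30,          # 300 bp with a site every 10 bp
    "gene_B": "ATG" + "GCTA" * 74 + "A",  # 300 bp with no site
}
observed = fragment_sizes(candidates["gene_A"], "GAATTC")  # stand-in gel result

for name, seq in candidates.items():
    match = fragment_sizes(seq, "GAATTC") == observed
    print(f"{name}: {len(seq)} bp product, digest matches observed pattern: {match}")
```

In the classroom version the same comparison is made by eye from band positions on a gel rather than in code; the sketch only spells out the underlying reasoning.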
A High Performance Pulsatile Pump for Aortic Flow Experiments in 3-Dimensional Models.
Chaudhury, Rafeed A; Atlasman, Victor; Pathangey, Girish; Pracht, Nicholas; Adrian, Ronald J; Frakes, David H
2016-06-01
Aortic pathologies such as coarctation, dissection, and aneurysm represent a particularly emergent class of cardiovascular diseases. Computational simulations of aortic flows are growing increasingly important as tools for gaining understanding of these pathologies, as well as for planning their surgical repair. In vitro experiments are required to validate the simulations against real world data, and the experiments require a pulsatile flow pump system that can provide physiologic flow conditions characteristic of the aorta. We designed a newly capable piston-based pulsatile flow pump system that can generate high volume flow rates (850 mL/s), replicate physiologic waveforms, and pump high viscosity fluids against large impedances. The system is also compatible with a broad range of fluid types, and is operable in magnetic resonance imaging environments. Performance of the system was validated using image processing-based analysis of piston motion as well as particle image velocimetry. The new system represents a more capable pumping solution for aortic flow experiments than other available designs, and can be manufactured at a relatively low cost.
Role of the ATLAS Grid Information System (AGIS) in Distributed Data Analysis and Simulation
NASA Astrophysics Data System (ADS)
Anisenkov, A. V.
2018-03-01
In modern high-energy physics experiments, particular attention is paid to the global integration of information and computing resources into a unified system for efficient storage and processing of experimental data. Annually, the ATLAS experiment performed at the Large Hadron Collider at the European Organization for Nuclear Research (CERN) produces tens of petabytes of raw data from the recording electronics and several petabytes of data from the simulation system. For the processing and storage of such super-large volumes of data, the computing model of the ATLAS experiment is based on a heterogeneous, geographically distributed computing environment, which includes the worldwide LHC computing grid (WLCG) infrastructure and is able to meet the requirements of the experiment for processing huge data sets and provide a high degree of their accessibility (hundreds of petabytes). The paper considers the ATLAS grid information system (AGIS) used by the ATLAS collaboration to describe the topology and resources of the computing infrastructure, to configure and connect the high-level software systems of computer centers, and to describe and store all possible parameters, control, configuration, and other auxiliary information required for the effective operation of the ATLAS distributed computing applications and services. The role of the AGIS system in developing a unified description of the computing resources provided by grid sites, supercomputer centers, and cloud computing platforms into a consistent information model for the ATLAS experiment is outlined. This approach has allowed the collaboration to extend the computing capabilities of the WLCG project and integrate supercomputers and cloud computing platforms into the software components of the production and distributed analysis workload management system (PanDA, ATLAS).
DIRAC in Large Particle Physics Experiments
NASA Astrophysics Data System (ADS)
Stagni, F.; Tsaregorodtsev, A.; Arrabito, L.; Sailer, A.; Hara, T.; Zhang, X.; Consortium, DIRAC
2017-10-01
The DIRAC project is developing interware to build and operate distributed computing systems. It provides a development framework and a rich set of services for both Workload and Data Management tasks of large scientific communities. A number of High Energy Physics and Astrophysics collaborations have adopted DIRAC as the base for their computing models. DIRAC was initially developed for the LHCb experiment at LHC, CERN. Later, the Belle II, BES III and CTA experiments as well as the linear collider detector collaborations started using DIRAC for their computing systems. Some of the experiments built their DIRAC-based systems from scratch, others migrated from previous solutions, ad-hoc or based on different middlewares. Adaptation of DIRAC for a particular experiment was enabled through the creation of extensions to meet their specific requirements. Each experiment has a heterogeneous set of computing and storage resources at their disposal that were aggregated through DIRAC into a coherent pool. Users from different experiments can interact with the system in different ways depending on their specific tasks, expertise level and previous experience using command line tools, python APIs or Web Portals. In this contribution we will summarize the experience of using DIRAC in particle physics collaborations. The problems of migration to DIRAC from previous systems and their solutions will be presented. An overview of specific DIRAC extensions will be given. We hope that this review will be useful for experiments considering an update, or for those designing their computing models.
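As a rough illustration of the python APIs mentioned above, the sketch below follows the publicly documented DIRAC job-submission interface; module paths and method names can differ between DIRAC versions and experiment-specific extensions, and the job payload here is purely hypothetical.

```python
# Hedged sketch of user-job submission through the DIRAC python API, following
# the publicly documented interface; module paths and methods may differ across
# DIRAC versions and experiment extensions, and the payload is hypothetical.
from DIRAC.Core.Base import Script
Script.parseCommandLine()                      # initialize the DIRAC client environment

from DIRAC.Interfaces.API.Job import Job
from DIRAC.Interfaces.API.Dirac import Dirac

job = Job()
job.setName("toy_analysis_job")
job.setExecutable("/bin/echo", arguments="hello from DIRAC")   # hypothetical payload
job.setCPUTime(3600)                           # requested CPU time, in seconds

dirac = Dirac()
result = dirac.submitJob(job)                  # returns an S_OK/S_ERROR-style dict
print(result)
```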
NASA Technical Reports Server (NTRS)
1980-01-01
Twenty-four functional requirements were prepared under six categories and serve to indicate how to integrate dispersed storage and generation (DSG) systems with the distribution and other portions of the electric utility system. Results indicate that there are no fundamental technical obstacles to prevent the connection of dispersed storage and generation to the distribution system. However, a communication system of some sophistication is required to integrate the distribution system and the dispersed generation sources for effective control. The large size span of generators, from 10 kW to 30 MW, means that a variety of remote monitoring and control may be required. Increased effort is required to develop demonstration equipment to perform the DSG monitoring and control functions and to acquire experience with this equipment in the utility distribution environment.
Choice with a fixed requirement for food, and the generality of the matching relation
Stubbs, D. Alan; Dreyfus, Leon R.; Fetterman, J. Gregor; Dorman, Lana G.
1986-01-01
Pigeons were trained on choice procedures in which responses on each of two keys were reinforced probabilistically, but only after a schedule requirement had been met. Under one arrangement, a fixed-interval choice procedure was used in which responses were not reinforced until the interval was over; then a response on one key would be reinforced, with the effective key changing irregularly from interval to interval. Under a second, fixed-ratio choice procedure, responses on either key counted towards completion of the ratio and then, once the ratio had been completed, a response on the probabilistically selected key would produce food. In one experiment, the schedule requirements were varied for both fixed-interval and fixed-ratio schedules. In the second experiment, relative reinforcement rate was varied. And in a third experiment, the duration of an intertrial interval separating choices was varied. The results for 11 pigeons across all three experiments indicate that there were often large deviations between relative response rates and relative reinforcement rates. Overall performance measures were characterized by a great deal of variability across conditions. More detailed measures of choice across the schedule requirement were also quite variable across conditions. In spite of this variability, performance was consistent across conditions in its efficiency of producing food. The absence of matching of behavior allocation to reinforcement rate indicates an important difference between the present procedures and other choice procedures; that difference raises questions about the specific conditions that lead to matching as an outcome. PMID:16812452
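For reference, the matching relation discussed here is conventionally written as follows (the standard strict-matching formulation, not an equation quoted from the paper):

```latex
% Strict matching: relative response allocation equals relative reinforcement.
\[
  \frac{B_1}{B_1 + B_2} \;=\; \frac{R_1}{R_1 + R_2}
  \qquad\Longleftrightarrow\qquad
  \frac{B_1}{B_2} \;=\; \frac{R_1}{R_2},
\]
% where B_1 and B_2 are the response rates on the two keys and R_1 and R_2 are
% the obtained reinforcement rates; the deviations reported above are
% departures from this equality.
```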
Large gamma-ray detector arrays and electromagnetic separators
NASA Astrophysics Data System (ADS)
Lee, I.-Yang
2013-12-01
The use of large gamma-ray detector arrays with electromagnetic separators is a powerful combination. Various types of gamma-ray detectors have been used; some, such as scintillation detector arrays, provide high detection efficiency, others use Ge detectors for good energy resolution, and recently developed Ge energy-tracking arrays give both a high peak-to-background ratio and position resolution. Similarly, different types of separators were used to optimize the performance under different experimental requirements and conditions. For example, gas-filled separators were used in heavy element studies for their large efficiency and beam rejection factor. Vacuum separators with good isotope resolution were used in transfer and fragmentation reactions for the study of nuclei far from stability. This paper presents results from recent experiments using gamma-ray detector arrays in combination with electromagnetic separators, and discusses the physics opportunities provided by these instruments. In particular, we review the performance of the instruments currently in use, and discuss the requirements of instruments for future radioactive beam accelerator facilities.
Safety and efficacy of percutaneous nephrolithotomy for the treatment of paediatric urolithiasis
Veeratterapillay, R; Shaw, MBK; Williams, R; Haslam, P; Lall, A; De la Hunt, M; Hasan, ST; Thomas, DJ
2012-01-01
INTRODUCTION Paediatric percutaneous nephrolithotomy (PCNL) has revolutionised the treatment of paediatric nephrolithiasis. Paediatric PCNL has been performed using both adult and paediatric instruments. Stone clearance rates and complications vary according to the technique used and surgeon experience. We present our experience with PCNL using adult instruments and a 28Fr access tract for large renal calculi in children under 18 years. METHODS All patients undergoing PCNL at our institution between 2000 and 2009 were reviewed. Demographics, surgical details and post-operative follow-up information were obtained to identify stone clearance rates and complications. RESULTS PCNL was performed in 32 renal units in 31 patients (mean age: 10.8 years). The mean stone diameter was 19mm (range: 5–40mm). Twenty-six cases required single puncture and six required multiple tracts. Overall, 11 staghorn stones, 10 multiple calyceal stones and 11 single stones were treated. Twenty-seven patients (84%) were completely stone free following initial PCNL. Two cases had extracorporeal shock wave lithotripsy for residual fragments, giving an overall stone free rate of 91% following treatment. There was no significant bleeding or sepsis encountered either during the operation or in the post-operative setting. No patient required or received a blood transfusion. CONCLUSIONS Paediatric PCNL can be performed safely with minimal morbidity using adult instruments for large stone burden, enabling rapid and complete stone clearance. PMID:23131231
Estimation of the rain signal in the presence of large surface clutter
NASA Technical Reports Server (NTRS)
Ahamad, Atiq; Moore, Richard K.
1994-01-01
The principal limitation for the use of a spaceborne imaging SAR as a rain radar is the surface-clutter problem. Signals may be estimated in the presence of noise by averaging large numbers of independent samples. This method was applied to obtain an estimate of the rain echo by averaging a set of N_c samples of the clutter in a separate measurement and subtracting the clutter estimate from the combined estimate. The number of samples required for successful estimation (within 10-20%) for off-vertical angles of incidence appears to be prohibitively large. However, by appropriately degrading the resolution in both range and azimuth, the required number of samples can be obtained. For vertical incidence, the number of samples required for successful estimation is reasonable. In estimating the clutter it was assumed that the surface echo is the same outside the rain volume as it is within the rain volume. This may be true for the forest echo, but for convective storms over the ocean the surface echo outside the rain volume is very different from that within. It is suggested that the experiment be performed with vertical incidence over forest to overcome this limitation.
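A small numerical sketch of the estimation idea described above (not the paper's actual processing chain; the power levels and sample count are invented): the combined rain-plus-clutter return and a separate clutter-only return are each averaged over many independent samples, and the clutter estimate is subtracted.

```python
# Illustrative sketch of estimating a weak rain echo under strong surface
# clutter by averaging independent samples and subtracting a separate
# clutter-only estimate. Power levels and sample counts are made up.
import numpy as np

rng = np.random.default_rng(0)
P_rain, P_clutter = 1.0, 50.0        # mean echo powers (arbitrary units)
N = 20000                            # independent samples averaged

# Exponentially distributed power samples model single-look speckle.
combined = rng.exponential(P_rain + P_clutter, N)   # inside the rain cell
clutter_only = rng.exponential(P_clutter, N)        # outside the rain cell

rain_estimate = combined.mean() - clutter_only.mean()
rel_error = abs(rain_estimate - P_rain) / P_rain
print(f"estimated rain power: {rain_estimate:.3f}, relative error: {rel_error:.1%}")
```

Because the variance of the difference is set by the much larger clutter power, the relative error on the weak rain term shrinks only as 1/sqrt(N), which is why the off-vertical case demands so many samples.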
First scientific application of the membrane cryostat technology
DOE Office of Scientific and Technical Information (OSTI.GOV)
Montanari, David; Adamowski, Mark; Baller, Bruce R.
2014-01-29
We report on the design, fabrication, performance and commissioning of the first membrane cryostat to be used for scientific application. The Long Baseline Neutrino Experiment (LBNE) has designed and fabricated a membrane cryostat prototype in collaboration with IHI Corporation (IHI). Original goals of the prototype are: to demonstrate the membrane cryostat technology in terms of thermal performance, feasibility for liquid argon, and leak tightness; to demonstrate that we can remove all the impurities from the vessel and achieve the purity requirements in a membrane cryostat without evacuation and using only a controlled gaseous argon purge; to demonstrate that we can achieve and maintain the purity requirements of the liquid argon during filling, purification, and maintenance mode using mole sieve and copper filters from the Liquid Argon Purity Demonstrator (LAPD) R&D project. The purity requirements of a large liquid argon detector such as LBNE are contaminants below 200 parts per trillion oxygen equivalent. This paper gives the requirements, design, construction, and performance of the LBNE membrane cryostat prototype, with experience and results important to the development of the LBNE detector.
Summary of tower designs for large horizontal axis wind turbines
NASA Technical Reports Server (NTRS)
Frederick, G. R.; Savino, J. M.
1986-01-01
Towers for large horizontal axis wind turbines, machines with a rotor axis height above 30 meters and rated at more than 500 kW, have varied in configuration, materials of construction, type of construction, height, and stiffness. For example, the U.S. large HAWTs have utilized steel truss type towers and free-standing steel cylindrical towers. In Europe, the trend has been to use only free-standing and guyed cylindrical towers, but both steel and reinforced concrete have been used as materials of construction. These variations in materials of construction and type of construction reflect different engineering approaches to the design of cost effective towers for large HAWTs. Tower designs are reviewed, including that of the NASA/DOE Mod-5B presently being fabricated. Design goals and requirements that influence tower configuration, height and materials are discussed. In particular, experiences with United States large wind turbine towers are elucidated. Finally, current trends in tower designs for large HAWTs are highlighted.
Anisotropies of the cosmic microwave background in nonstandard cold dark matter models
NASA Technical Reports Server (NTRS)
Vittorio, Nicola; Silk, Joseph
1992-01-01
Small angular scale cosmic microwave anisotropies in flat, vacuum-dominated, cold dark matter cosmological models which fit large-scale structure observations and are consistent with a high value for the Hubble constant are reexamined. New predictions for CDM models in which the large-scale power is boosted via a high baryon content and low H(0) are presented. Both classes of models are consistent with current limits: an improvement in sensitivity by a factor of about 3 for experiments which probe angular scales between 7 arcmin and 1 deg is required, in the absence of very early reionization, to test boosted CDM models for large-scale structure formation.
Software package for modeling spin–orbit motion in storage rings
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zyuzin, D. V., E-mail: d.zyuzin@fz-juelich.de
2015-12-15
A software package providing a graphical user interface for computer experiments on the motion of charged particle beams in accelerators, as well as analysis of obtained data, is presented. The software package was tested in the framework of the international project on electric dipole moment measurement JEDI (Jülich Electric Dipole moment Investigations). The specific features of particle spin motion imply the requirement to use a cyclic accelerator (storage ring) consisting of electrostatic elements, which makes it possible to preserve horizontal polarization for a long time. Computer experiments study the dynamics of 10^6-10^9 particles in a beam during 10^9 turns in an accelerator (about 10^12-10^15 integration steps for the equations of motion). For designing an optimal accelerator structure, a large number of computer experiments on polarized beam dynamics are required. The numerical core of the package is COSY Infinity, a program for modeling spin-orbit dynamics.
Rube, I F
1989-01-01
Experiences in a large-scale interlaboratory rescreening of Papanicolaou smears are detailed, and the pros and cons of measuring proficiency in cytology are discussed. Despite the additional work of the rescreening project and some psychological and technical problems, it proved to be a useful measure of the laboratory's performance as a whole. One problem to be avoided in future similar studies is the creation of too many diagnostic categories. Individual testing and certification have been shown to be accurate predictors of proficiency. For cytology, such tests require a strong visual component to test interpretation and judgment skills, such as by the use of glass slides or photomicrographs. The potential of interactive videodisc technology for facilitating cytopathologic teaching and assessment is discussed.
Surface settling in partially filled containers upon step reduction in gravity
NASA Technical Reports Server (NTRS)
Weislogel, Marl M.; Ross, Howard D.
1990-01-01
A large literature exists concerning the equilibrium configurations of free liquid/gas surfaces in reduced gravity environments. Such conditions generally yield surfaces of constant curvature meeting the container wall at a particular (contact) angle. The time required to reach and stabilize about this configuration is less studied for the case of sudden changes in gravity level, e.g. from normal- to low-gravity, as can occur in many drop tower experiments. The particular interest here was to determine the total reorientation time for such surfaces in cylinders (mainly), as a function primarily of contact angle and kinematic viscosity, in order to aid in the development of drop tower experiment design. A large parametric range of tests were performed and, based on an accompanying scale analysis, the complete data set was correlated. The results of other investigations are included for comparison.
The topology of metabolic isotope labeling networks.
Weitzel, Michael; Wiechert, Wolfgang; Nöh, Katharina
2007-08-29
Metabolic Flux Analysis (MFA) based on isotope labeling experiments (ILEs) is a widely established tool for determining fluxes in metabolic pathways. Isotope labeling networks (ILNs) contain all essential information required to describe the flow of labeled material in an ILE. Whereas recent experimental progress paves the way for high-throughput MFA, large network investigations and exact statistical methods, these developments are still limited by the poor performance of computational routines used for the evaluation and design of ILEs. In this context, the global analysis of ILN topology turns out to be a clue for realizing large speedup factors in all required computational procedures. With a strong focus on the speedup of algorithms, the topology of ILNs is investigated using graph theoretic concepts and algorithms. A rigorous determination of all cyclic and isomorphic subnetworks, accompanied by a global analysis of ILN connectivity, is performed. Particularly, it is proven that ILNs always break up into a large number of small strongly connected components (SCCs) and, moreover, there are natural isomorphisms between many of these SCCs. All presented techniques are universal, i.e. they do not require special assumptions on the network structure, bidirectionality of fluxes, measurement configuration, or label input. The general results are exemplified with a practically relevant metabolic network which describes the central metabolism of E. coli comprising 10390 isotopomer pools. Exploiting the topological features of ILNs leads to a significant speedup of all universal algorithms for ILE evaluation. It is proven in theory and exemplified with the E. coli example that a speedup factor of about 1000 compared to standard algorithms is achieved. This opens the door to new high-performance algorithms suitable for high throughput applications and large ILNs. Moreover, for the first time the global topological analysis of ILNs makes it possible to comprehensively describe and understand the general patterns of label flow in complex networks. This is an invaluable tool for the structural design of new experiments and the interpretation of measured data.
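The strongly connected component (SCC) structure exploited above can be illustrated with a generic graph library; the toy directed graph below is not an actual isotope labeling network, it simply shows how an SCC decomposition and its condensation break a network into pieces that can be evaluated one at a time.

```python
# Toy illustration (not a real isotope labeling network): decompose a directed
# graph into strongly connected components (SCCs), the structure the paper
# exploits to speed up ILE evaluation.
import networkx as nx

G = nx.DiGraph()
G.add_edges_from([
    ("A", "B"), ("B", "C"), ("C", "A"),   # a 3-node cycle -> one SCC
    ("C", "D"), ("D", "E"), ("E", "D"),   # a 2-node cycle -> another SCC
    ("E", "F"),                           # F is a trivial, single-node SCC
])

sccs = list(nx.strongly_connected_components(G))
print("SCCs:", sccs)

# The condensation collapses each SCC into a single node, yielding an acyclic
# "component graph" that can be processed component by component.
C = nx.condensation(G)
print("condensation nodes:", C.nodes(data="members"))
```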
Scientific Visualization Tools for Enhancement of Undergraduate Research
NASA Astrophysics Data System (ADS)
Rodriguez, W. J.; Chaudhury, S. R.
2001-05-01
Undergraduate research projects that utilize remote sensing satellite instrument data to investigate atmospheric phenomena pose many challenges. A significant challenge is processing large amounts of multi-dimensional data. Remote sensing data initially requires mining; filtering of undesirable spectral, instrumental, or environmental features; and subsequently sorting and reformatting to files for easy and quick access. The data must then be transformed according to the needs of the investigation(s) and displayed for interpretation. These multidimensional datasets require views that can range from two-dimensional plots to multivariable-multidimensional scientific visualizations with animations. Science undergraduate students generally find these data processing tasks daunting. Generally, researchers are required to fully understand the intricacies of the dataset and write computer programs or rely on commercially available software, which may not be trivial to use. In the time that undergraduate researchers have available for their research projects, learning the data formats, programming languages, and/or visualization packages is impractical. When dealing with large multi-dimensional data sets appropriate Scientific Visualization tools are imperative in allowing students to have a meaningful and pleasant research experience, while producing valuable scientific research results. The BEST Lab at Norfolk State University has been creating tools for multivariable-multidimensional analysis of Earth Science data. EzSAGE and SAGE4D have been developed to sort, analyze and visualize SAGE II (Stratospheric Aerosol and Gas Experiment) data with ease. Three- and four-dimensional visualizations in interactive environments can be produced. EzSAGE provides atmospheric slices in three-dimensions where the researcher can change the scales in the three-dimensions, color tables and degree of smoothing interactively to focus on particular phenomena. SAGE4D provides a navigable four-dimensional interactive environment. These tools allow students to make higher order decisions based on large multidimensional sets of data while diminishing the level of frustration that results from dealing with the details of processing large data sets.
The Intelligent Control System and Experiments for an Unmanned Wave Glider.
Liao, Yulei; Wang, Leifeng; Li, Yiming; Li, Ye; Jiang, Quanquan
2016-01-01
Designing the control system of an Unmanned Wave Glider (UWG) is challenging because the vehicle is weakly maneuverable, subject to large time lags and large disturbances, and difficult to capture in an accurate mathematical model. Meanwhile, to carry out marine environment monitoring autonomously over long time scales and large spatial scales, the UWG places high demands on intelligence and reliability. This paper focuses on the "Ocean Rambler" UWG. First, the intelligent control system architecture is designed based on the cerebrum basic function combination zone theory and a hierarchical control method. The hardware and software design of the embedded motion control system is discussed in detail, and a motion control system based on a four-layer rational behavior model is proposed. Then, combined with the line-of-sight (LOS) method, a self-adapting PID guidance law is proposed to compensate for the steady-state error in path following of the UWG caused by marine environment disturbances, especially current. Based on the S-surface control method, an improved S-surface heading controller is proposed to solve the heading control problem of the weakly maneuverable vehicle under large disturbances. Finally, simulation experiments were carried out and the UWG completed autonomous path following and marine environment monitoring in sea trials. The simulation experiments and sea trial results show that the proposed intelligent control system, guidance law, and controller have favorable control performance, and the feasibility and reliability of the designed intelligent control system of the UWG are verified.
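To make the guidance idea concrete, here is a highly simplified sketch of LOS path following with a PID heading loop (not the paper's controller: the kinematic model, gains, lookahead distance, and current disturbance are invented for illustration).

```python
# Simplified sketch of line-of-sight (LOS) path following with a PID heading
# loop, in the spirit of the approach described above; the kinematics, gains,
# lookahead distance, and current disturbance are invented for illustration.
import math

def los_heading(x, y, wp_prev, wp_next, lookahead=5.0):
    """Desired heading that steers back toward the straight path segment."""
    px, py = wp_prev
    qx, qy = wp_next
    path_ang = math.atan2(qy - py, qx - px)
    # signed cross-track error relative to the path
    e = -(x - px) * math.sin(path_ang) + (y - py) * math.cos(path_ang)
    return path_ang + math.atan2(-e, lookahead)

kp, ki, kd, dt = 1.2, 0.05, 0.3, 0.1      # PID gains and time step
integ, prev_err = 0.0, 0.0
x, y, psi, speed = 0.0, 8.0, 0.0, 1.0     # start 8 m off a path along the x-axis

for _ in range(600):                       # 60 s of simulated time
    psi_d = los_heading(x, y, (0.0, 0.0), (100.0, 0.0))
    err = math.atan2(math.sin(psi_d - psi), math.cos(psi_d - psi))  # wrap angle
    integ += err * dt
    r_cmd = kp * err + ki * integ + kd * (err - prev_err) / dt      # yaw-rate command
    prev_err = err
    psi += r_cmd * dt                      # first-order heading response
    x += speed * math.cos(psi) * dt
    y += speed * math.sin(psi) * dt + 0.02 * dt   # weak constant "current" off the path

print(f"cross-track error after 60 s: {y:.2f} m")
```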
Rucio, the next-generation Data Management system in ATLAS
NASA Astrophysics Data System (ADS)
Serfon, C.; Barisits, M.; Beermann, T.; Garonne, V.; Goossens, L.; Lassnig, M.; Nairz, A.; Vigne, R.; ATLAS Collaboration
2016-04-01
Rucio is the next-generation Distributed Data Management (DDM) system, benefiting from recent advances in cloud and "Big Data" computing to address the scaling requirements of HEP experiments. Rucio is an evolution of the ATLAS DDM system Don Quixote 2 (DQ2), which has demonstrated very large scale data management capabilities with more than 160 petabytes spread worldwide across 130 sites, and accesses from 1,000 active users. However, DQ2 is reaching its limits in terms of scalability, requiring a large number of support staff to operate and being hard to extend with new technologies. Rucio addresses these issues by relying on new technologies to ensure system scalability, covering new user requirements and employing a new automation framework to reduce operational overheads. This paper presents the key concepts of Rucio, details the Rucio design and the technology it employs, the tests that were conducted to validate it, and finally describes the migration steps that were conducted to move from DQ2 to Rucio.
Maximizing Macromolecule Crystal Size for Neutron Diffraction Experiments
NASA Technical Reports Server (NTRS)
Judge, R. A.; Kephart, R.; Leardi, R.; Myles, D. A.; Snell, E. H.; vanderWoerd, M.; Curreri, Peter A. (Technical Monitor)
2002-01-01
A challenge in neutron diffraction experiments is growing large (greater than 1 cu mm) macromolecule crystals. In taking up this challenge we have used statistical experiment design techniques to quickly identify crystallization conditions under which the largest crystals grow. These techniques provide the maximum information for minimal experimental effort, allowing optimal screening of crystallization variables in a simple experimental matrix, using the minimum amount of sample. Analysis of the results quickly tells the investigator what conditions are the most important for the crystallization. These can then be used to maximize the crystallization results in terms of reducing crystal numbers and providing large crystals of suitable habit. We have used these techniques to grow large crystals of Glucose isomerase. Glucose isomerase is an industrial enzyme used extensively in the food industry for the conversion of glucose to fructose. The aim of this study is the elucidation of the enzymatic mechanism at the molecular level. The accurate determination of hydrogen positions, which is critical for this, is a requirement that neutron diffraction is uniquely suited for. Preliminary neutron diffraction experiments with these crystals conducted at the Institute Laue-Langevin (Grenoble, France) reveal diffraction to beyond 2.5 angstrom. Macromolecular crystal growth is a process involving many parameters, and statistical experimental design is naturally suited to this field. These techniques are sample independent and provide an experimental strategy to maximize crystal volume and habit for neutron diffraction studies.
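For illustration, the screening matrix that a two-level full factorial design produces can be generated in a few lines; the factor names and levels below are hypothetical and are not the conditions used for glucose isomerase.

```python
# Generic two-level full factorial screening matrix of the kind used to screen
# crystallization variables; factor names and levels here are hypothetical.
from itertools import product

factors = {
    "protein_mg_per_ml": (10, 30),
    "precipitant_pct":   (15, 25),
    "pH":                (6.5, 8.0),
    "temperature_C":     (4, 20),
}

runs = [dict(zip(factors, levels)) for levels in product(*factors.values())]
print(f"{len(runs)} runs for {len(factors)} factors at two levels each")
for i, run in enumerate(runs, 1):
    print(i, run)
```

When sample volume is the limiting resource, a fractional factorial keeps only a balanced subset of these runs while still resolving the main effects, which is the sense in which such designs give maximum information for minimal experimental effort.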
Space Construction Automated Fabrication Experiment Definition Study (SCAFEDS), part 2
NASA Technical Reports Server (NTRS)
1978-01-01
The techniques, processes, and equipment required for the automatic fabrication and assembly of structural elements in space, using the Shuttle as a launch vehicle and construction base, were defined. Additional construction system operational techniques, processes, and equipment which can be developed and demonstrated in the same program to provide further risk reduction benefits to future large space systems were identified and examined.
Microbial desulfurization of coal
NASA Technical Reports Server (NTRS)
Dastoor, M. N.; Kalvinskas, J. J.
1978-01-01
Experiments indicate that several sulfur-oxidizing bacteria strains have been very efficient in desulfurizing coal. Process occurs at room temperature and does not require large capital investments or high energy inputs. Process may expand use of abundant reserves of high-sulfur bituminous coal, which is currently restricted due to environmental pollution. On a practical scale, process may be integrated with modern coal-slurry transportation lines.
ERIC Educational Resources Information Center
Schockaert, Frederik
2014-01-01
School districts at times need to implement structural and programmatic changes requiring students to attend a different school, which tends to elicit strong parental emotions. This qualitative study analyzes how parents in one suburban Rhode Island district responded to a large-scale redistricting at the elementary level in order to (a) attain a…
ERIC Educational Resources Information Center
Lien, Mei-Ching; Ruthruff, Eric
2004-01-01
This study examined how task switching is affected by hierarchical task organization. Traditional task-switching studies, which use a constant temporal and spatial distance between each task element (defined as a stimulus requiring a response), promote a flat task structure. Using this approach, Experiment 1 revealed a large switch cost of 238 ms.…
Meridional overturning and large-scale circulation of the Indian Ocean
NASA Astrophysics Data System (ADS)
Ganachaud, Alexandre; Wunsch, Carl; Marotzke, Jochem; Toole, John
2000-11-01
The large scale Indian Ocean circulation is estimated from a global hydrographic inverse geostrophic box model with a focus on the meridional overturning circulation (MOC). The global model is based on selected recent World Ocean Circulation Experiment (WOCE) sections which in the Indian Basin consist of zonal sections at 32°S, 20°S and 8°S, and a section between Bali and Australia from the Java-Australia Dynamic Experiment (JADE). The circulation is required to conserve mass, salinity, heat, silica and "PO" (170·PO4 + O2). Near-conservation is imposed within layers bounded by neutral surfaces, while permitting advective and diffusive exchanges between the layers. Conceptually, the derived circulation is an estimate of the average circulation for the period 1987-1995. A deep inflow into the Indian Basin of 11±4 Sv is found, which is in the lower range of previous estimates, but consistent with conservation requirements and the global data set. The Indonesian Throughflow (ITF) is estimated at 15±5 Sv. The flow in the Mozambique Channel is of the same magnitude, implying a weak net flow between Madagascar and Australia. A net evaporation of -0.6±0.4 Sv is found between 32°S and 8°S, consistent with independent estimates. No net heat gain is found over the Indian Basin (0.1±0.2 PW north of 32°S) as a consequence of the large warm water influx from the ITF. Through the use of anomaly equations, the average dianeutral upwelling and diffusion between the sections are required and resolved, with values in the range 1-3×10^-5 cm s^-1 for the upwelling and 2-10 cm^2 s^-1 for the diffusivity.
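As a toy schematic of how such an inverse box model is posed numerically (invented numbers, not the actual WOCE/JADE calculation): each conservation statement becomes a row of a linear system whose unknowns are section-mean reference velocities, solved in a least-squares sense.

```python
# Toy schematic of an inverse box model (invented numbers, not the actual
# WOCE/JADE calculation): each row enforces near-conservation of a property
# in one box; the unknowns are section-mean reference velocities.
import numpy as np

# Columns: three hydrographic sections; rows: mass conservation in two boxes.
# Entries are effective section areas (1e6 m^2), so A @ b gives transports in
# Sv when b is in m/s.
A = np.array([
    [25.0, -20.0,   0.0],
    [ 0.0,  20.0, -22.0],
])
# Transport imbalance (Sv) left by the relative (thermal-wind) velocities alone.
d = np.array([-3.0, 1.5])

# Smallest-norm correction that closes the budgets (underdetermined system).
b, *_ = np.linalg.lstsq(A, d, rcond=None)
print("reference velocities b (m/s):", np.round(b, 4))
print("residual imbalance (Sv):    ", np.round(A @ b - d, 6))
```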
NASA Technical Reports Server (NTRS)
1986-01-01
Emerging satellite designs require increasing amounts of electrical power to operate spacecraft instruments and to provide environments suitable for human habitation. In the past, electrical power was generated by covering rigid honeycomb panels with solar cells. This technology results in unacceptable weight and volume penalties when large amounts of power are required. To fill the need for large-area, lightweight solar arrays, a fabrication technique in which solar cells are attached to a copper printed circuit laminated to a plastic sheet was developed. The result is a flexible solar array with one-tenth the stowed volume and one-third the weight of comparably sized rigid arrays. An automated welding process developed to attach the cells to the printed circuit guarantees repeatable welds that are more tolerant of severe environments than conventional soldered connections. To demonstrate the flight readiness of this technology, the Solar Array Flight Experiment (SAFE) was developed and flown on the space shuttle Discovery in September 1984. The tests showed the modes and frequencies of the array to be very close to preflight predictions. Structural damping, however, was higher than anticipated. Electrical performance of the active solar panel was also tested. The flight performance and postflight data evaluation are described.
An intelligent tool for activity data collection.
Sarkar, A M Jehad
2011-01-01
Activity recognition systems using simple and ubiquitous sensors require a large variety of real-world sensor data for not only evaluating their performance but also training the systems for better functioning. However, a tremendous amount of effort is required to setup an environment for collecting such data. For example, expertise and resources are needed to design and install the sensors, controllers, network components, and middleware just to perform basic data collections. It is therefore desirable to have a data collection method that is inexpensive, flexible, user-friendly, and capable of providing large and diverse activity datasets. In this paper, we propose an intelligent activity data collection tool which has the ability to provide such datasets inexpensively without physically deploying the testbeds. It can be used as an inexpensive and alternative technique to collect human activity data. The tool provides a set of web interfaces to create a web-based activity data collection environment. It also provides a web-based experience sampling tool to take the user's activity input. The tool generates an activity log using its activity knowledge and the user-given inputs. The activity knowledge is mined from the web. We have performed two experiments to validate the tool's performance in producing reliable datasets.
Conducting real-time multiplayer experiments on the web.
Hawkins, Robert X D
2015-12-01
Group behavior experiments require potentially large numbers of participants to interact in real time with perfect information about one another. In this paper, we address the methodological challenge of developing and conducting such experiments on the web, thereby broadening access to online labor markets as well as allowing for participation through mobile devices. In particular, we combine a set of recent web development technologies, including Node.js with the Socket.io module, HTML5 canvas, and jQuery, to provide a secure platform for pedagogical demonstrations and scalable, unsupervised experiment administration. Template code is provided for an example real-time behavioral game theory experiment which automatically pairs participants into dyads and places them into a virtual world. In total, this treatment is intended to allow those with a background in non-web-based programming to modify the template, which handles the technical server-client networking details, for their own experiments.
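The system described above is built on Node.js and Socket.io; purely to illustrate the dyad-pairing step such a server performs when participants connect, here is a language-agnostic sketch in Python with hypothetical names.

```python
# Illustration of the dyad-pairing step a real-time experiment server performs
# when participants connect; the actual system described above uses Node.js
# and Socket.io, and all names here are hypothetical.
from collections import deque

waiting = deque()          # connected participants not yet in a game
active_dyads = {}          # dyad id -> (participant_a, participant_b)

def on_connect(participant_id):
    """Pair the new arrival with a waiting participant, or queue them."""
    if waiting:
        partner = waiting.popleft()
        dyad_id = f"dyad-{len(active_dyads) + 1}"
        active_dyads[dyad_id] = (partner, participant_id)
        print(f"{dyad_id}: paired {partner} with {participant_id}, start game")
    else:
        waiting.append(participant_id)
        print(f"{participant_id} is waiting for a partner")

for pid in ["p1", "p2", "p3", "p4", "p5"]:
    on_connect(pid)
```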
Analyzing the security of an existing computer system
NASA Technical Reports Server (NTRS)
Bishop, M.
1986-01-01
Most work concerning secure computer systems has dealt with the design, verification, and implementation of provably secure computer systems, or has explored ways of making existing computer systems more secure. The problem of locating security holes in existing systems has received considerably less attention; methods generally rely on thought experiments as a critical step in the procedure. The difficulty is that such experiments require that a large amount of information be available in a format that makes correlating the details of various programs straightforward. This paper describes a method of providing such a basis for the thought experiment by writing a special manual for parts of the operating system, system programs, and library subroutines.
System design and animal experiment study of a novel minimally invasive surgical robot.
Wang, Wei; Li, Jianmin; Wang, Shuxin; Su, He; Jiang, Xueming
2016-03-01
Robot-assisted minimally invasive surgery has shown tremendous advances over the traditional technique. However, currently commercialized systems are large and complicated, which vastly raises the system cost and operation room requirements. A MIS robot named 'MicroHand' was developed over the past few years. The basic principle and the key technologies are analyzed in this paper. Comparison between the proposed robot and the da Vinci system is also presented. Finally, animal experiments were carried out to test the performance of MicroHand. Fifteen animal experiments were carried out from July 2013 to December 2013. All animal experiments were finished successfully. The proposed design method is an effective way to resolve the drawbacks of previous generations of the da Vinci surgical system. The animal experiment results confirmed the feasibility of the design. Copyright © 2015 John Wiley & Sons, Ltd.
The Microgravity Vibration Isolation Mount: A Dynamic Model for Optimal Controller Design
NASA Technical Reports Server (NTRS)
Hampton, R. David; Tryggvason, Bjarni V.; DeCarufel, Jean; Townsend, Miles A.; Wagar, William O.
1997-01-01
Vibration acceleration levels on large space platforms exceed the requirements of many space experiments. The Microgravity Vibration Isolation Mount (MIM) was built by the Canadian Space Agency to attenuate these disturbances to acceptable levels, and has been operational on the Russian Space Station Mir since May 1996. It has demonstrated good isolation performance and has supported several materials science experiments. The MIM uses Lorentz (voice-coil) magnetic actuators to levitate and isolate payloads at the individual experiment/sub-experiment (versus rack) level. Payload acceleration, relative position, and relative orientation (Euler-parameter) measurements are fed to a state-space controller. The controller, in turn, determines the actuator currents needed for effective experiment isolation. This paper presents the development of an algebraic, state-space model of the MIM, in a form suitable for optimal controller design.
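As a generic illustration of the kind of optimal state-feedback design such a state-space model supports (not the MIM's actual plant, weights, or controller; the one-axis parameters below are invented), an LQR gain can be computed for a simple levitated-mass model.

```python
# Generic one-axis sketch of a state-space isolation problem of the kind the
# MIM model supports: a levitated payload mass driven by a voice-coil force,
# with an LQR gain balancing drift from the rack against actuator effort.
# Parameters are invented for illustration; this is not the MIM design.
import numpy as np
from scipy.linalg import solve_continuous_are

m = 20.0                       # payload mass, kg (hypothetical)
A = np.array([[0.0, 1.0],      # state: [relative position, relative velocity]
              [0.0, 0.0]])
B = np.array([[0.0],
              [1.0 / m]])      # force input from the Lorentz actuator

Q = np.diag([100.0, 1.0])      # penalize position drift, then velocity
R = np.array([[0.01]])         # penalize actuator force

P = solve_continuous_are(A, B, Q, R)
K = np.linalg.solve(R, B.T @ P)        # optimal state-feedback gain, u = -K x
print("LQR gain K:", np.round(K, 3))

# Closed-loop poles show the achieved, well-damped isolation dynamics.
poles = np.linalg.eigvals(A - B @ K)
print("closed-loop poles:", np.round(poles, 3))
```

The actual MIM controller additionally handles attitude (Euler-parameter) states and acceleration feedback, but the quadratic trade-off between disturbance rejection and actuator effort is the same in spirit.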
Wilson-Mendenhall, Christine D; Barrett, Lisa Feldman; Barsalou, Lawrence W
2015-01-01
The tremendous variability within categories of human emotional experience receives little empirical attention. We hypothesized that atypical instances of emotion categories (e.g. pleasant fear of thrill-seeking) would be processed less efficiently than typical instances of emotion categories (e.g. unpleasant fear of violent threat) in large-scale brain networks. During a novel fMRI paradigm, participants immersed themselves in scenarios designed to induce atypical and typical experiences of fear, sadness or happiness (scenario immersion), and then focused on and rated the pleasant or unpleasant feeling that emerged (valence focus) in most trials. As predicted, reliably greater activity in the 'default mode' network (including medial prefrontal cortex and posterior cingulate) was observed for atypical (vs typical) emotional experiences during scenario immersion, suggesting atypical instances require greater conceptual processing to situate the socio-emotional experience. During valence focus, reliably greater activity was observed for atypical (vs typical) emotional experiences in the 'salience' network (including anterior insula and anterior cingulate), suggesting atypical instances place greater demands on integrating shifting body signals with the sensory and social context. Consistent with emerging psychological construction approaches to emotion, these findings demonstrate that it is important to study the variability within common categories of emotional experience. © The Author (2014). Published by Oxford University Press. For Permissions, please email: journals.permissions@oup.com.
Development of a Portable Motor Learning Laboratory (PoMLab).
Takiyama, Ken; Shinya, Masahiro
2016-01-01
Most motor learning experiments have been conducted in a laboratory setting. In this type of setting, a huge and expensive manipulandum is frequently used, requiring a large budget and wide open space. Subjects also need to travel to the laboratory, which is a burden for them. This burden is particularly severe for patients with neurological disorders. Here, we describe the development of a novel application based on Unity3D and smart devices, e.g., smartphones or tablet devices, that can be used to conduct motor learning experiments at any time and in any place, without requiring a large budget and wide open space and without the burden of travel on subjects. We refer to our application as POrtable Motor learning LABoratory, or PoMLab. PoMLab is a multiplatform application that is available and sharable for free. We investigated whether PoMLab could be an alternative to the laboratory setting using a visuomotor rotation paradigm that causes sensory prediction error, enabling the investigation of how subjects minimize the error. In the first experiment, subjects could adapt to a constant visuomotor rotation that was abruptly applied at a specific trial. The learning curve for the first experiment could be modeled well using a state space model, a mathematical model that describes the motor leaning process. In the second experiment, subjects could adapt to a visuomotor rotation that gradually increased each trial. The subjects adapted to the gradually increasing visuomotor rotation without being aware of the visuomotor rotation. These experimental results have been reported for conventional experiments conducted in a laboratory setting, and our PoMLab application could reproduce these results. PoMLab can thus be considered an alternative to the laboratory setting. We also conducted follow-up experiments in university physical education classes. A state space model that was fit to the data obtained in the laboratory experiments could predict the learning curves obtained in the follow-up experiments. Further, we investigated the influence of vibration function, weight, and screen size on learning curves. Finally, we compared the learning curves obtained in the PoMLab experiments to those obtained in the conventional reaching experiments. The results of the in-class experiments show that PoMLab can be used to conduct motor learning experiments at any time and place.
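A minimal sketch of the kind of single-state trial-by-trial model referred to above; the retention and learning-rate values are invented for illustration and are not the values fitted in the study.

```python
# Minimal single-state trial-by-trial model of visuomotor adaptation of the
# kind described above: x[t+1] = a*x[t] + b*e[t], where e is the visual error.
# Retention (a) and learning-rate (b) values are invented, not fitted values.
import numpy as np

a, b = 0.99, 0.15            # retention factor and learning rate (hypothetical)
n_trials = 80
rotation = np.zeros(n_trials)
rotation[20:] = 30.0         # abrupt 30-degree visuomotor rotation at trial 20

x = np.zeros(n_trials + 1)   # internal compensation (degrees)
errors = np.zeros(n_trials)
for t in range(n_trials):
    errors[t] = rotation[t] - x[t]       # error experienced on this trial
    x[t + 1] = a * x[t] + b * errors[t]  # trial-by-trial update

print("compensation after 10 rotated trials:", round(x[30], 1))
print("compensation after 60 rotated trials:", round(x[80], 1))
```

The same update rule, with a rotation that ramps up slowly instead of stepping, reproduces the gradual-adaptation condition in which subjects compensate without becoming aware of the perturbation.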
GPU accelerated fuzzy connected image segmentation by using CUDA.
Zhuge, Ying; Cao, Yong; Miller, Robert W
2009-01-01
Image segmentation techniques using fuzzy connectedness principles have shown their effectiveness in segmenting a variety of objects in several large applications in recent years. However, one problem of these algorithms has been their excessive computational requirements when processing large image datasets. Commodity graphics hardware now provides high parallel computing power. In this paper, we present a parallel fuzzy connected image segmentation algorithm on Nvidia's Compute Unified Device Architecture (CUDA) platform for segmenting large medical image data sets. Our experiments based on three data sets of small, medium, and large size demonstrate the efficiency of the parallel algorithm, which achieves speed-up factors of 7.2x, 7.3x, and 14.4x, respectively, for the three data sets over the sequential CPU implementation of the fuzzy connected image segmentation algorithm.
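Independent of the GPU port described above, the core of fuzzy connected segmentation is a best-path computation: a voxel's connectedness to the seeds is the strength of the strongest path, where a path is only as strong as its weakest affinity. The sketch below is a plain CPU illustration with a simple intensity-similarity affinity; the function names, affinity form, and the parameter sigma are illustrative assumptions, not the paper's CUDA kernel.

    # Illustrative CPU sketch of fuzzy connectedness (not the paper's CUDA kernel).
    import heapq
    import numpy as np

    def fuzzy_connectedness(image, seeds, sigma=10.0):
        conn = np.zeros(image.shape, dtype=float)
        heap = []
        for s in seeds:
            conn[s] = 1.0
            heapq.heappush(heap, (-1.0, s))
        while heap:
            strength, (i, j) = heapq.heappop(heap)
            strength = -strength
            if strength < conn[i, j]:
                continue                      # stale queue entry
            for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                n = (i + di, j + dj)
                if 0 <= n[0] < image.shape[0] and 0 <= n[1] < image.shape[1]:
                    # affinity: simple intensity similarity between neighbors
                    affinity = float(np.exp(-((image[i, j] - image[n]) ** 2) / (2 * sigma ** 2)))
                    cand = min(strength, affinity)  # path strength = weakest link
                    if cand > conn[n]:
                        conn[n] = cand
                        heapq.heappush(heap, (-cand, n))
        return conn

    seg = fuzzy_connectedness(np.array([[10., 12., 200.], [11., 13., 210.]]), seeds=[(0, 0)])
    print(seg)

The GPU version parallelizes the propagation step over many voxels at once, which is where the reported speed-ups come from.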
Size matters: large objects capture attention in visual search.
Proulx, Michael J
2010-12-23
Can objects or events ever capture one's attention in a purely stimulus-driven manner? A recent review of the literature set out the criteria required to find stimulus-driven attentional capture independent of goal-directed influences, and concluded that no published study has satisfied those criteria. Here, visual search experiments assessed whether an irrelevantly large object can capture attention. Capture of attention by this static visual feature was found. The results suggest that a large object can indeed capture attention in a stimulus-driven manner, independent of displaywide features of the task that might encourage a goal-directed bias for large items. It is concluded that these results are either consistent with the stimulus-driven criteria published previously or alternatively consistent with a flexible, goal-directed mechanism of saliency detection.
Mating competitiveness of sterile male Anopheles coluzzii in large cages.
Maïga, Hamidou; Damiens, David; Niang, Abdoulaye; Sawadogo, Simon P; Fatherhaman, Omnia; Lees, Rosemary S; Roux, Olivier; Dabiré, Roch K; Ouédraogo, Georges A; Tripet, Fréderic; Diabaté, Abdoulaye; Gilles, Jeremie R L
2014-11-26
Understanding the factors that account for male mating competitiveness is critical to the development of the sterile insect technique (SIT). Here, the effects of partial sterilization with 90 Gy of radiation on the sexual competitiveness of Anopheles coluzzii males allowed to mate at different ratios of sterile to untreated males have been assessed. Moreover, competitiveness was compared between males allowed one versus two days of contact with females. Sterile and untreated males four to six days of age were released in large cages (~1.75 sq m) with females of similar age at the following ratios of sterile males : untreated males : untreated virgin females: 100:100:100, 300:100:100, 500:100:100 (three replicates of each) and left for two days. Competitiveness was determined by assessing the egg hatch rate and the insemination rate, determined by dissecting recaptured females. An additional experiment was conducted with a ratio of 500:100:100 and a mating period of either one or two days. Two controls, 0:100:100 (untreated control) and 100:0:100 (sterile control), were used in each experiment. When males and females consorted for two days at different ratios, a significant difference in insemination rate was observed between ratio treatments. The competitiveness index (C) of sterile males compared to controls was 0.53. The number of days of exposure to mates significantly increased the insemination rate, as did the increased number of males present in the untreated : sterile male ratio treatments, but the number of days of exposure did not have any effect on the hatch rate. The comparability of the hatch rates between experiments suggests that An. coluzzii mating competitiveness experiments in large cages could be run for one day instead of two, shortening the required length of the experiment. Sterilized males were half as competitive as untreated males, but an effective release ratio of at least five sterile males for one untreated male has the potential to impact the fertility of a wild female population. However, further trials in field conditions with wild males and females should be undertaken to estimate the ratio of sterile males to wild males required to produce an effect on wild populations.
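The competitiveness index C quoted above is conventionally computed with Fried's formula in SIT studies; whether the authors used exactly this form is an assumption here, and the numbers below are purely hypothetical.

    # Fried's competitiveness index (assumed formula; details may differ from the paper):
    #   C = ((Hn - Ho) / (Ho - Hs)) * (N / S)
    # Hn: hatch rate of the untreated control (0:100:100 cage)
    # Hs: hatch rate of the sterile control   (100:0:100 cage)
    # Ho: observed hatch rate in a mixed cage with S sterile and N untreated males
    def fried_index(Hn, Hs, Ho, S, N):
        return ((Hn - Ho) / (Ho - Hs)) * (N / S)

    # Hypothetical illustration for a 500:100:100 cage:
    print(round(fried_index(Hn=0.90, Hs=0.05, Ho=0.28, S=500, N=100), 2))  # ~0.54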
The impact of supercomputers on experimentation: A view from a national laboratory
NASA Technical Reports Server (NTRS)
Peterson, V. L.; Arnold, J. O.
1985-01-01
The relative roles of large scale scientific computers and physical experiments in several science and engineering disciplines are discussed. Increasing dependence on computers is shown to be motivated both by the rapid growth in computer speed and memory, which permits accurate numerical simulation of complex physical phenomena, and by the rapid reduction in the cost of performing a calculation, which makes computation an increasingly attractive complement to experimentation. Computer speed and memory requirements are presented for selected areas of such disciplines as fluid dynamics, aerodynamics, aerothermodynamics, chemistry, atmospheric sciences, astronomy, and astrophysics, together with some examples of the complementary nature of computation and experiment. Finally, the impact of the emerging role of computers in the technical disciplines is discussed in terms of both the requirements for experimentation and the attainment of previously inaccessible information on physical processes.
Gas mixture studies for streamer operated Resistive Plate Chambers
NASA Astrophysics Data System (ADS)
Paoloni, A.; Longhin, A.; Mengucci, A.; Pupilli, F.; Ventura, M.
2016-06-01
Resistive Plate Chambers operated in streamer mode are interesting detectors for neutrino and astro-particle physics applications (like the OPERA and ARGO experiments). Such experiments are typically characterized by large area apparatuses with no stringent requirements on detector aging and rate capabilities. In this paper, results of cosmic ray tests performed on an RPC prototype using different gas mixtures are presented, the principal aim being the optimization of the TetraFluoroPropene concentration in Argon-based mixtures. The introduction of TetraFluoroPropene, besides its low Global Warming Potential, is helpful because it simplifies safety requirements, allowing isobutane to be removed from the mixture as well. Results obtained with mixtures containing SF6, CF4, CO2, N2 and He are also shown, presented both in terms of detector properties (efficiency, multiple-streamer probability and time resolution) and in terms of streamer characteristics.
Prevention of mental disorders requires action on adverse childhood experiences.
Jorm, Anthony F; Mulder, Roger T
2018-04-01
The increased availability of treatment has not reduced the prevalence of mental disorders, suggesting a need for a greater emphasis on prevention. With chronic physical diseases, successful prevention efforts have focused on reducing the big risk factors. If this approach is applied to mental disorders, the big risk factors are adverse childhood experiences, which have major effects on most classes of mental disorder across the lifespan. While the evidence base is limited, there is support for a number of interventions to reduce adverse childhood experiences, including an important role for mental health professionals. Taking action on adverse childhood experiences may be our best chance of emulating the success of public health action to prevent chronic physical diseases and thereby reduce the large global burden of mental disorders.
Validation of a unique concept for a low-cost, lightweight space-deployable antenna structure
NASA Technical Reports Server (NTRS)
Freeland, R. E.; Bilyeu, G. D.; Veal, G. R.
1993-01-01
An experiment conducted in the framework of a NASA In-Space Technology Experiments Program based on a concept of inflatable deployable structures is described. The concept utilizes very low inflation pressure to maintain the required geometry on orbit; gravity-induced deflection of the structure precludes any meaningful ground-based demonstration of functional performance. The experiment is aimed at validating and characterizing the mechanical functional performance of a 14-m-diameter inflatable deployable reflector antenna structure in the orbital operational environment. Results of the experiment are expected to significantly reduce the user risk associated with using large space-deployable antennas by demonstrating the functional performance of a concept that meets the criteria for low-cost, lightweight, and highly reliable space-deployable structures.
Large optical glass blanks for the ELT generation
NASA Astrophysics Data System (ADS)
Jedamzik, Ralf; Petzold, Uwe; Dietrich, Volker; Wittmer, Volker; Rexius, Olga
2016-07-01
The upcoming extremely large telescope projects like the E-ELT, TMT or GMT telescopes require not only large numbers of mirror blank substrates but also sophisticated instrument setups. Common instrument components are atmospheric dispersion correctors that compensate for the varying atmospheric path length depending on the telescope inclination angle. These elements usually consist of optical glass blanks that have to be large due to the increased size of the focal beam of the extremely large telescopes. SCHOTT has long experience in producing and delivering large optical glass blanks for astronomical applications, up to 1 m in size and in homogeneity grades up to H3 quality. The most common optical glass available in large formats is SCHOTT N-BK7, but other glass types like F2 or LLF1 can also be produced in formats up to 1 m. The extremely large telescope projects partly demand atmospheric dispersion components in sizes beyond 1 m, up to about 1.5 m in diameter. The production of such large homogeneous optical glass blanks requires tight control of all process steps. To cover this future demand, SCHOTT initiated a research project to improve the large optical blank production process steps from melting to annealing and measurement. Large optical glass blanks are measured in several sub-apertures that cover the total clear aperture of the application. With SCHOTT's new stitching software it is now possible to combine individual sub-aperture measurements into a total homogeneity map of the blank. First results are presented.
A Roadmap for HEP Software and Computing R&D for the 2020s
DOE Office of Scientific and Technical Information (OSTI.GOV)
Alves, Antonio Augusto, Jr; et al.
Particle physics has an ambitious and broad experimental programme for the coming decades. This programme requires large investments in detector hardware, either to build new facilities and experiments, or to upgrade existing ones. Similarly, it requires commensurate investment in the R&D of software to acquire, manage, process, and analyse the sheer amounts of data to be recorded. In planning for the HL-LHC in particular, it is critical that all of the collaborating stakeholders agree on the software goals and priorities, and that the efforts complement each other. In this spirit, this white paper describes the R&D activities required to prepare for this software upgrade.
NASA Technical Reports Server (NTRS)
1979-01-01
The development of large space structure (LSS) technology is discussed, with emphasis on space fabricated structures which are automatically manufactured in space from sheet-strip materials and assembled on-orbit. It is concluded that an LSS flight demonstration using an Automated Beam Builder and the orbiter as a construction base, could be performed in the 1983-1984 time period. The estimated cost is $24 million exclusive of shuttle launch costs. During the mission, a simple space platform could be constructed in-orbit to accommodate user requirements associated with earth viewing and materials exposure experiments needs.
Pulsed beam of extremely large helium droplets
NASA Astrophysics Data System (ADS)
Kuma, Susumu; Azuma, Toshiyuki
2017-12-01
We generated a pulsed helium droplet beam with average droplet diameters of up to 2 μm using a solenoid pulsed valve operated at temperatures as low as 7 K. The droplet diameter was controllable over two orders of magnitude, or six orders of magnitude in the number of atoms per droplet, by lowering the valve temperature from 21 to 7 K. A sudden droplet size change attributed to the so-called "supercritical expansion" was observed for the first time in pulsed mode, which is necessary to obtain the micrometer-scale droplets. This beam source is beneficial for experiments that require extremely large helium droplets in intense, pulsed form.
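A rough back-of-the-envelope estimate (assuming the bulk liquid-helium number density, about 2.2 x 10^22 cm^-3) shows why two orders of magnitude in diameter correspond to six orders of magnitude in atoms per droplet:

\[
N = \frac{\pi}{6}\, d^{3} n, \qquad N \propto d^{3} \;\Rightarrow\; (10^{2})^{3} = 10^{6},
\]
\[
d = 2\ \mu\mathrm{m}: \quad N \approx \frac{\pi}{6}\,(2\times10^{-4}\ \mathrm{cm})^{3}\,(2.2\times10^{22}\ \mathrm{cm}^{-3}) \approx 9\times10^{10}\ \text{atoms}.
\]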
NASA Technical Reports Server (NTRS)
Schwan, Karsten
1994-01-01
Atmospheric modeling is a grand challenge problem for several reasons, including its inordinate computational requirements and its generation of large amounts of data concurrent with its use of very large data sets derived from measurement instruments like satellites. In addition, atmospheric models are typically run several times, on new data sets or to reprocess existing data sets, to investigate or reinvestigate specific chemical or physical processes occurring in the earth's atmosphere, to understand model fidelity with respect to observational data, or simply to experiment with specific model parameters or components.
Dispersal Mutualism Incorporated into Large-Scale, Infrequent Disturbances
Parker, V. Thomas
2015-01-01
Because of their influence on succession and other community interactions, large-scale, infrequent natural disturbances also should play a major role in mutualistic interactions. Using field data and experiments, I test whether mutualisms have been incorporated into large-scale wildfire dynamics by asking whether the outcomes of a mutualism depend on disturbance. In this study a seed dispersal mutualism is shown to depend on infrequent, large-scale disturbances. A dominant shrubland plant (Arctostaphylos species) produces seeds that make up a persistent soil seed bank and requires fire to germinate. In post-fire stands, I show that seedlings emerging from rodent caches dominate sites experiencing higher fire intensity. Field experiments show that rodents (Peromyscus californicus, P. boylii) do cache Arctostaphylos fruit and bury most seed caches to a sufficient depth to survive a killing heat pulse that a fire might drive into the soil. While the rodent dispersal and caching behavior itself has not changed compared to other habitats, the environmental transformation caused by wildfire converts the caching burial of seed from a dispersal process to a plant fire adaptive trait, and provides the context for stimulating subsequent life history evolution in the plant host. PMID:26151560
NASA Astrophysics Data System (ADS)
Johnston, William; Ernst, M.; Dart, E.; Tierney, B.
2014-04-01
Today's large-scale science projects involve world-wide collaborations that depend on moving massive amounts of data from an instrument to potentially thousands of computing and storage systems at hundreds of collaborating institutions to accomplish their science. This is true for ATLAS and CMS at the LHC, and it is true for the climate sciences, Belle-II at the KEK collider, genome sciences, the SKA radio telescope, and ITER, the international fusion energy experiment. DOE's Office of Science has been collecting science discipline and instrument requirements for network based data management and analysis for more than a decade. As a result, certain key issues are seen across essentially all science disciplines that rely on the network for significant data transfer, even if the data quantities are modest compared to projects like the LHC experiments. These issues are what this talk will address; to wit: 1. Optical signal transport advances enabling 100 Gb/s circuits that span the globe on optical fiber, with each fiber carrying 100 such channels; 2. Network router and switch requirements to support high-speed international data transfer; 3. Data transport (TCP is still the norm) requirements to support high-speed international data transfer (e.g. error-free transmission); 4. Network monitoring and testing techniques and infrastructure to maintain the required error-free operation of the many R&E networks involved in international collaborations; 5. Operating system evolution to support very high-speed network I/O; 6. New network architectures and services in the LAN (campus) and WAN networks to support data-intensive science; 7. Data movement and management techniques and software that can maximize the throughput on the network connections between distributed data handling systems; and 8. New approaches to widely distributed workflow systems that can support the data movement and analysis required by the science. All of these areas must be addressed to enable large-scale, widely distributed data analysis systems, and the experience of the LHC can be applied to other scientific disciplines. In particular, specific analogies to the SKA will be cited in the talk.
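As a worked illustration of point 3 above, the bandwidth-delay product sets the amount of data that must be in flight (and buffered) to keep a long international 100 Gb/s path full; the round-trip time below is an assumed figure, not one from the talk.

    # Illustrative arithmetic (assumed round-trip time):
    # the bandwidth-delay product is the data "in flight" needed to keep a long
    # path busy, hence the minimum TCP window/buffer size for a single stream.
    rate_bps = 100e9          # 100 Gb/s circuit
    rtt_s    = 0.150          # ~150 ms trans-oceanic round trip (assumption)

    bdp_bytes = rate_bps * rtt_s / 8
    print(f"Bandwidth-delay product: {bdp_bytes / 1e9:.2f} GB in flight")
    # ~1.9 GB: far larger than default socket buffers, which is one reason
    # error-free paths and carefully tuned hosts are required for such transfers.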
Experimental industrial signal acquisition board in a large scientific device
NASA Astrophysics Data System (ADS)
Zeng, Xiangzhen; Ren, Bin
2018-02-01
In order to measure the industrial signals of a neutrino experiment, a general-purpose industrial data acquisition board has been designed. It includes switch signal input and output functions as well as an analog signal input function. The main components are the signal isolation amplifier and filter circuit, the ADC circuit, the microcomputer system, and the isolated communication interface circuit. Practical experiments show that the system is flexible, reliable, convenient and economical, with high resolution and strong anti-interference ability. Thus, the system fully meets the design requirements.
Analysis of the Quality of Parabolic Flight
NASA Technical Reports Server (NTRS)
Lambot, Thomas; Ord, Stephan F.
2016-01-01
Parabolic flights allow researchers to conduct several 20 second micro-gravity experiments in the course of a single day. However, the measured acceleration can vary greatly over the course of a single parabola, requiring knowledge of the actual flight environment as a function of time. The NASA Flight Opportunities program (FO) reviewed the acceleration data of over 400 parabolic flights and investigated the quality of micro-gravity for scientific purposes. It was discovered that a parabolic flight can be segmented into multiple parts of different quality and duration, a fact to be aware of when planning an experiment.
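One way to make the segmentation idea concrete is to classify an accelerometer trace into quality bands by its residual g-level, as in the hypothetical sketch below; the thresholds, function names, and the synthetic trace are illustrative assumptions, not the Flight Opportunities program's actual criteria.

    # Hypothetical segmentation of a parabola's accelerometer trace into quality
    # bands by residual g-level (thresholds are illustrative only).
    def segment_quality(g_levels, dt=0.1, good=0.01, fair=0.05):
        def classify(g):
            return "good" if g < good else "fair" if g < fair else "poor"
        segments, start, label = [], 0, classify(abs(g_levels[0]))
        for i, g in enumerate(g_levels):
            cur = classify(abs(g))
            if cur != label:
                segments.append((label, start * dt, i * dt))
                start, label = i, cur
        segments.append((label, start * dt, len(g_levels) * dt))
        return segments

    trace = [0.004] * 50 + [0.03] * 30 + [0.004] * 100 + [0.2] * 20
    for label, t0, t1 in segment_quality(trace):
        print(f"{label:>4}: {t0:.1f}-{t1:.1f} s")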
Entanglement witnessing and quantum cryptography with nonideal ferromagnetic detectors
NASA Astrophysics Data System (ADS)
Kłobus, Waldemar; Grudka, Andrzej; Baumgartner, Andreas; Tomaszewski, Damian; Schönenberger, Christian; Martinek, Jan
2014-03-01
We investigate theoretically the use of nonideal ferromagnetic contacts as a means to detect quantum entanglement of electron spins in transport experiments. We use a designated entanglement witness and find a minimal spin polarization of η > 1/√3 ≈ 58% required to demonstrate spin entanglement. This is significantly less stringent than the ubiquitous tests of Bell's inequality, which require η > 2^(-1/4) ≈ 84%. In addition, we discuss the impact of decoherence and noise on entanglement detection and apply the presented framework to a simple quantum cryptography protocol. Our results are directly applicable to a large variety of experiments.
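A plausible reading of the two thresholds, under the assumption that each spin correlation measured with two detectors of polarization η is simply scaled by η², is the following; this is a sketch of the scaling argument only, and the paper's witness and derivation may differ in detail.

\[
\langle \sigma_a \sigma_b \rangle_{\text{meas}} = \eta^{2}\,\langle \sigma_a \sigma_b \rangle, \qquad
\text{CHSH: } 2\sqrt{2}\,\eta^{2} > 2 \;\Rightarrow\; \eta > 2^{-1/4} \approx 84\%, \qquad
\text{witness: } \eta^{2} > \tfrac{1}{3} \;\Rightarrow\; \eta > \tfrac{1}{\sqrt{3}} \approx 58\% .
\]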
NASA Astrophysics Data System (ADS)
Sinn, T.; McRobb, M.; Wujek, A.; Skogby, J.; Rogberg, F.; Wang, J.; Vasile, M.; Tibert, G.; Mao, H.
2015-09-01
The Suaineadh experiment had the purpose of deploying a 2 m x 2 m web in milli-gravity conditions by using the centrifugal forces acting on corner sections of a web spinning around a central hub. Continuous exploration of our solar system and beyond requires ever larger structures in space, but the biggest problem nowadays is the transport of these structures into space due to launch vehicle payload volume constraints. By making space structures deployable with minimal storage requirements, this constraint may be bypassed. Deployable concepts range from inflatables, foldables and electrostatic structures to spinning web deployment. The advantage of web deployment is the very low storage volume and the simple deployment mechanism. These webs can act as lightweight platforms for the construction of large structures in space without the huge expense of launching heavy structures from Earth. The Suaineadh experiment was launched onboard the sounding rocket REXUS12 in March 2012. After reaching the required altitude, the Suaineadh experiment was ejected from the rocket in order to be fully free flying. A specially designed spinning wheel in the ejected section was then used to spin up the experiment until the rate required for web deployment was achieved. Unfortunately, the probe was lost during re-entry, and a recovery mission in August 2012 was only able to find minor components of the experiment. After 18 months, in September 2013, the experiment was found in the wilderness of northern Sweden. In the following months all data from the experiment could be recovered. The images and accelerometer data that have been analysed show the deployment of the web and a very interesting three-dimensional behaviour that differs greatly from on-ground two-dimensional prototype tests. This paper gives an overview of the recovered data and presents the analysed results of the Suaineadh spinning web experiment.
Off-design Performance Analysis of Multi-Stage Transonic Axial Compressors
NASA Astrophysics Data System (ADS)
Du, W. H.; Wu, H.; Zhang, L.
Because of the complex flow fields and component interactions in modern gas turbine engines, extensive experiments are required to validate performance and stability. The experimental process can become expensive and complex. Modeling and simulation of gas turbine engines are a way to reduce experimental costs, provide fidelity and enhance the quality of essential experiments. The flow field of a transonic compressor contains all the flow aspects that are difficult to predict: boundary layer transition and separation, shock-boundary layer interactions, and large flow unsteadiness. Accurate transonic axial compressor off-design performance prediction is especially difficult, due in large part to three-dimensional blade design and the resulting flow field. Although recent advancements in computer capacity have brought computational fluid dynamics to the forefront of turbomachinery design and analysis, the grid and turbulence model still limit Reynolds-averaged Navier-Stokes (RANS) approximations in the multi-stage transonic axial compressor flow field. Streamline curvature methods are still the dominant numerical approach and an important tool for turbomachinery analysis and design, and it is generally accepted that streamline curvature solution techniques will provide satisfactory flow predictions as long as the losses, deviation and blockage are accurately predicted.
Advances in Grid Computing for the Fabric for Frontier Experiments Project at Fermilab
NASA Astrophysics Data System (ADS)
Herner, K.; Alba Hernandez, A. F.; Bhat, S.; Box, D.; Boyd, J.; Di Benedetto, V.; Ding, P.; Dykstra, D.; Fattoruso, M.; Garzoglio, G.; Kirby, M.; Kreymer, A.; Levshina, T.; Mazzacane, A.; Mengel, M.; Mhashilkar, P.; Podstavkov, V.; Retzke, K.; Sharma, N.; Teheran, J.
2017-10-01
The Fabric for Frontier Experiments (FIFE) project is a major initiative within the Fermilab Scientific Computing Division charged with leading the computing model for Fermilab experiments. Work within the FIFE project creates close collaboration between experimenters and computing professionals to serve high-energy physics experiments of differing size, scope, and physics area. The FIFE project has worked to develop common tools for job submission, certificate management, software and reference data distribution through CVMFS repositories, robust data transfer, job monitoring, and databases for project tracking. Since the project's inception the experiments under the FIFE umbrella have significantly matured, and present an increasingly complex list of requirements to service providers. To meet these requirements, the FIFE project has been involved in transitioning the Fermilab General Purpose Grid cluster to support a partitionable slot model, expanding the resources available to experiments via the Open Science Grid, assisting with commissioning dedicated high-throughput computing resources for individual experiments, supporting the efforts of the HEP Cloud projects to provision a variety of back end resources, including public clouds and high performance computers, and developing rapid onboarding procedures for new experiments and collaborations. The larger demands also require enhanced job monitoring tools, which the project has developed using such tools as ElasticSearch and Grafana, to help experiments manage their large-scale production workflows. Managing these workflows in turn requires a structured service to facilitate smooth handling of experiment requests, which FIFE provides in the form of the Production Operations Management Service (POMS). POMS is designed to track and manage requests from the FIFE experiments to run particular workflows, and to support troubleshooting and triage in case of problems. Recently a new certificate management infrastructure called Distributed Computing Access with Federated Identities (DCAFI) has been put in place that has eliminated our dependence on a Fermilab-specific third-party Certificate Authority service and better accommodates FIFE collaborators without a Fermilab Kerberos account. DCAFI integrates the existing InCommon federated identity infrastructure, CILogon Basic CA, and a MyProxy service using a new general purpose open source tool. We will discuss the general FIFE onboarding strategy, progress in expanding FIFE experiments' presence on the Open Science Grid, new tools for job monitoring, the POMS service, and the DCAFI project.
ERIC Educational Resources Information Center
Thinyane, Hannah
2010-01-01
In 2001 Marc Prensky coined the phrase "digital natives" to refer to the new generation of students who have grown up surrounded by technology. His companion papers spurred large amounts of research, debating changes that are required to curricula and pedagogical models to cater for the changes in the student population. This article…
GPU applications for data processing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vladymyrov, Mykhailo, E-mail: mykhailo.vladymyrov@cern.ch; Aleksandrov, Andrey; INFN sezione di Napoli, I-80125 Napoli
2015-12-31
Modern experiments that use nuclear photoemulsion require fast and efficient data acquisition from the emulsion. The new approaches to developing scanning systems require real-time processing of large amounts of data. Methods that use Graphical Processing Unit (GPU) computing power for emulsion data processing are presented here. It is shown how GPU-accelerated emulsion processing helped us to raise the scanning speed by a factor of nine.
ERIC Educational Resources Information Center
Loomis, Corey Campbell
2011-01-01
Comprehensive high schools have been unable to meet the needs of all students (Cotton, 2004). Students face challenges, and some have been labeled "at risk" for various reasons. These students constitute a unique group who often require more time, energy, and resources than large, comprehensive schools can offer. Consequently, they fall behind on…
Simulation of High Power Lasers (Preprint)
2010-06-01
integration, which requires communication of zonal boundary information after each inner iteration of the Gauss-Seidel or Jacobi matrix solver. Each...experiment consisting of a supersonic (M~2.2) converging-diverging nozzle section with secondary mass injection in the nozzle expansion downstream of...consists of a section of a supersonic (M~2.2) converging-diverging slit nozzle with one large and two small orifices that inject reactants into the
Reconsidering Our Domestic Violence System.
Starsoneck, Leslie; Ake, George
2018-01-01
Children's exposure to domestic violence is well established as an adverse childhood experience (ACE). Much is known about the impact of this exposure, but efforts to ameliorate its effects are too often unsuccessful. Reconsidering our response requires a candid assessment of whether convening large and disparate systems leads to the best outcome. ©2018 by the North Carolina Institute of Medicine and The Duke Endowment. All rights reserved.
Defects in Hardwood Veneer Logs: Their Frequency and Importance
E.S. Harrar
1954-01-01
Most southern hardwood veneer and plywood plants have some method of classifying logs by grade to control the purchase price paid for logs bought on the open market. Such log-grading systems have been developed by experience and are dependent to a large extent upon the ability of the grader and his knowledge of veneer grades and yields required for the specific product...
Recycling/Disposal Alternatives for Depleted Uranium Wastes
1981-01-01
could pass before new sites are available. Recent experience with attempts to dispose of wastes generated by cleanup of the Three Mile Island...commercial sector. Nonordnance uses include counterweights, ballast, shielding, and special applications machinery. Although the purity requirements...Reference 11). Since the activity of the tailings is higher than allowable for unrestricted access, large earth-dam retention systems, known as
Fluid flow and heat convection studies for actively cooled airframes
NASA Technical Reports Server (NTRS)
Mills, A. F.
1993-01-01
This report details progress made on the jet impingement - liquid crystal - digital imaging experiment. With the design phase complete, the experiment is currently in the construction phase. In order to reach this phase, two design-related issues were resolved. The first issue was to determine NASP leading edge active cooling design parameters. Meetings were arranged with personnel at SAIC International, Torrance, CA in order to obtain recent publications that characterized expected leading edge heat fluxes as well as other details of NASP operating conditions. The information in these publications was used to estimate minimum and maximum jet Reynolds numbers needed to accomplish the required leading edge cooling, and to determine the parameters of the experiment. The details of this analysis are shown in Appendix A. One of the concerns for the NASP design is that of thermal stress due to large surface temperature gradients. Using a series of circular jets to cool the leading edge will cause a non-uniform temperature distribution and potentially large thermal stresses. Therefore it was decided to explore the feasibility of using a slot jet to cool the leading edge. The literature contains many investigations into circular jet heat transfer but few investigations of slot jet heat transfer. The first experiments will be done on circular jets impinging on a flat plate and results compared to previously published data to establish the accuracy of the method. Subsequent experiments will be slot jets impinging on full scale models of the NASP leading edge. Table 1 shows the range of parameters to be explored. Next a preliminary design of the experiment was done. Previous papers which used a similar experimental technique were studied and elements of those experiments adapted to the jet impingement study. Trade-off studies were conducted to determine which design was the least expensive, easy to construct, and easy to use. Once the final design was settled, vendors were contacted to verify that equipment could be obtained to meet our specifications. Much of the equipment required to complete the construction of the experiment has been ordered or received. The material status list is shown in Appendix B.
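For context, the jet Reynolds number estimate mentioned above has the form Re = ρVD/μ. The sketch below uses assumed water-like coolant properties and an assumed nozzle diameter purely for illustration; they are not NASP design values.

    # Illustrative jet Reynolds number estimate, Re = rho * V * D / mu.
    # All values are assumptions for a water-like coolant at room temperature.
    rho = 998.0      # coolant density, kg/m^3
    mu  = 1.0e-3     # dynamic viscosity, Pa*s
    D   = 2.0e-3     # jet nozzle diameter, m
    for V in (1.0, 5.0, 20.0):          # jet exit velocities, m/s
        Re = rho * V * D / mu
        print(f"V = {V:5.1f} m/s  ->  Re = {Re:,.0f}")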
Harnessing QbD, Programming Languages, and Automation for Reproducible Biology.
Sadowski, Michael I; Grant, Chris; Fell, Tim S
2016-03-01
Building robust manufacturing processes from biological components is a task that is highly complex and requires sophisticated tools to describe processes, inputs, and measurements and to administer the management of knowledge, data, and materials. We argue that for bioengineering to fully access biological potential, it will require the application of statistically designed experiments to derive detailed empirical models of underlying systems. This requires execution of large-scale structured experimentation, for which laboratory automation is necessary. This in turn requires development of expressive, high-level languages that allow reusability of protocols, characterization of their reliability, and a change in focus from implementation details to functional properties. We review recent developments in these areas and identify what we believe is an exciting trend that promises to revolutionize biotechnology. Copyright © 2015 The Authors. Published by Elsevier Ltd. All rights reserved.
NASA Technical Reports Server (NTRS)
Pennington, D. F.; Man, T.; Persons, B.
1977-01-01
The DOT classification for transportation, the military classification for quantity distance, and hazard compatibility grouping used to regulate the transportation and storage of explosives are presented along with a discussion of tests used in determining sensitivity of propellants to an impact/shock environment in the absence of a large explosive donor. The safety procedures and requirements of a Scout launch vehicle, Western and Eastern Test Range, and the Minuteman, Delta, and Poseidon programs are reviewed and summarized. Requirements of the space transportation system safety program include safety reviews from the subsystem level to the completed payload. The Scout safety procedures will satisfy a portion of these requirements but additional procedures need to be implemented to comply with the safety requirements for Shuttle operation from the Eastern Test Range.
An internal variable constitutive model for the large deformation of metals at high temperatures
NASA Technical Reports Server (NTRS)
Brown, Stuart; Anand, Lallit
1988-01-01
The advent of large deformation finite element methodologies is beginning to permit the numerical simulation of hot working processes whose design until recently has been based on prior industrial experience. Proper application of such finite element techniques requires realistic constitutive equations which more accurately model material behavior during hot working. A simple constitutive model for hot working is the single scalar internal variable model for isotropic thermal elastoplasticity proposed by Anand. The model is recalled, and the specific scalar functions presented for the equivalent plastic strain rate and for the evolution equation of the internal variable are slight modifications of those proposed by Anand. The modified functions are better able to represent high temperature material behavior. Monotonic constant true strain rate and strain rate jump compression experiments on a 2 percent silicon iron are briefly described. The model is implemented in the general purpose finite element program ABAQUS.
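For reference, the commonly cited baseline form of Anand's single internal variable model is sketched below; the paper uses slightly modified functions, so these equations should be read as the standard published form rather than the modified one.

\[
\dot{\bar{\varepsilon}}^{p} = A \exp\!\left(-\frac{Q}{RT}\right)\left[\sinh\!\left(\xi\,\frac{\bar{\sigma}}{s}\right)\right]^{1/m},
\qquad
\dot{s} = h_{0}\left|1-\frac{s}{s^{*}}\right|^{a}\operatorname{sign}\!\left(1-\frac{s}{s^{*}}\right)\dot{\bar{\varepsilon}}^{p},
\qquad
s^{*} = \tilde{s}\left[\frac{\dot{\bar{\varepsilon}}^{p}}{A}\exp\!\left(\frac{Q}{RT}\right)\right]^{n},
\]

where s is the scalar deformation resistance (the internal variable), \(\bar{\sigma}\) the equivalent stress, and A, Q, ξ, m, h0, a, \(\tilde{s}\), n are material parameters fit to data such as the compression experiments described above.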
The web based monitoring project at the CMS experiment
NASA Astrophysics Data System (ADS)
Lopez-Perez, Juan Antonio; Badgett, William; Behrens, Ulf; Chakaberia, Irakli; Jo, Youngkwon; Maeshima, Kaori; Maruyama, Sho; Patrick, James; Rapsevicius, Valdas; Soha, Aron; Stankevicius, Mantas; Sulmanas, Balys; Toda, Sachiko; Wan, Zongru
2017-10-01
The Compact Muon Solenoid is a large and complex general-purpose experiment at the CERN Large Hadron Collider (LHC), built and maintained by many collaborators from around the world. Efficient operation of the detector requires widespread and timely access to a broad range of monitoring and status information. To that end the Web Based Monitoring (WBM) system was developed to present data to users located anywhere from many underlying heterogeneous sources, from real time messaging systems to relational databases. This system provides the power to combine and correlate data in both graphical and tabular formats of interest to the experimenters, including data such as beam conditions, luminosity, trigger rates, detector conditions, and many others, allowing for flexibility on the user's side. This paper describes the WBM system architecture and describes how the system has been used from the beginning of data taking until now (Run 1 and Run 2).
Web Based Monitoring in the CMS Experiment at CERN
DOE Office of Scientific and Technical Information (OSTI.GOV)
Badgett, William; Borrello, Laura; Chakaberia, Irakli
2014-09-03
The Compact Muon Solenoid (CMS) is a large and complex general purpose experiment at the CERN Large Hadron Collider (LHC), built and maintained by many collaborators from around the world. Efficient operation of the detector requires widespread and timely access to a broad range of monitoring and status information. To this end the Web Based Monitoring (WBM) system was developed to present data to users located anywhere from many underlying heterogeneous sources, from real time messaging systems to relational databases. This system provides the power to combine and correlate data in both graphical and tabular formats of interest to the experimenters, including data such as beam conditions, luminosity, trigger rates, detector conditions, and many others, allowing for flexibility on the user side. This paper describes the WBM system architecture and describes how the system was used during the first major data taking run of the LHC.
NASA Technical Reports Server (NTRS)
Pitts, D. E.; Badhwar, G.
1980-01-01
The development of agricultural remote sensing systems requires knowledge of agricultural field size distributions so that the sensors, sampling frames, image interpretation schemes, registration systems, and classification systems can be properly designed. Malila et al. (1976) studied the field size distribution for wheat and all other crops in two Kansas LACIE (Large Area Crop Inventory Experiment) intensive test sites using ground observations of the crops and measurements of their field areas based on current year rectified aerial photomaps. The field area and size distributions reported in the present investigation are derived from a representative subset of a stratified random sample of LACIE sample segments. In contrast to previous work, the obtained results indicate that most field-size distributions are not log-normally distributed. The most common field size observed in this study was 10 acres for most crops studied.
Advantages of cryopumping with liquid hydrogen instead of helium refrigerators
NASA Technical Reports Server (NTRS)
Anderson, J. W.; Tueller, J. E.
1972-01-01
Open loop hydrogen vaporizers and helium refrigerators are compared for operational complexity, installation and operating cost, and safety requirements. Data from two vacuum chambers using helium refrigerators are used to provide comparative data. In general, the use of hydrogen is attractive in the larger systems, even when extra safety precautions are taken. Emotional resistance to the use of hydrogen because of safety requirements is considered great. However, the experience gained in the handling of large quantities of cryogenics, particularly hydrogen and liquefied natural gases, should be considered in the design of open loop hydrogen cooling systems.
Tolerance Studies of the Mu2e Solenoid System
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lopes, M. L.; Ambrosio, G.; Buehler, M.
2014-01-01
The muon-to-electron conversion experiment at Fermilab is designed to explore charged lepton flavor violation. It is composed of three large superconducting solenoids, namely, the production solenoid, the transport solenoid, and the detector solenoid. Each subsystem has a set of field requirements. Tolerance sensitivity studies of the magnet system were performed with the objective of demonstrating that the present magnet design meets all the field requirements. Systematic and random errors were considered in the position and alignment of the coils. The study helps to identify the critical sources of error, which are then translated into coil manufacturing and mechanical support tolerances.
Studying three-phase supply in school
NASA Astrophysics Data System (ADS)
Singhal, Amit Kumar; Arun, P.
2009-07-01
The power distribution systems of nearly all major countries have adopted three-phase distribution as a standard. With the increasing power requirements of instrumentation today, even a small physics laboratory requires a three-phase supply. While physics students are given an introduction to this in passing, no experimental work is done with three-phase supply because of the possibility of accidents while working with such large power. We believe a conceptual understanding of three-phase supply would be useful for physics students, with hands-on experience using a simple circuit that can be assembled even in a high school laboratory.
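The textbook relations behind a balanced three-phase supply, included here for context (standard results, not specific to the paper's demonstration circuit):

\[
v_k(t) = V_m \sin\!\left(\omega t - \frac{2\pi k}{3}\right),\quad k = 0,1,2, \qquad
\sum_{k=0}^{2} v_k(t) = 0, \qquad
V_{\text{line}} = \sqrt{3}\,V_{\text{phase}} .
\]

For a 230 V phase voltage, for example, the line-to-line voltage is about 400 V.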
Can tonne-scale direct detection experiments discover nuclear dark matter?
NASA Astrophysics Data System (ADS)
Butcher, Alistair; Kirk, Russell; Monroe, Jocelyn; West, Stephen M.
2017-10-01
Models of nuclear dark matter propose that the dark sector contains large composite states consisting of dark nucleons in analogy to Standard Model nuclei. We examine the direct detection phenomenology of a particular class of nuclear dark matter model at the current generation of tonne-scale liquid noble experiments, in particular DEAP-3600 and XENON1T. In our chosen nuclear dark matter scenario distinctive features arise in the recoil energy spectra due to the non-point-like nature of the composite dark matter state. We calculate the number of events required to distinguish these spectra from those of a standard point-like WIMP state with a decaying exponential recoil spectrum. In the most favourable regions of nuclear dark matter parameter space, we find that a few tens of events are needed to distinguish nuclear dark matter from WIMPs at the 3 σ level in a single experiment. Given the total exposure time of DEAP-3600 and XENON1T we find that at best a 2 σ distinction is possible by these experiments individually, while 3 σ sensitivity is reached for a range of parameters by the combination of the two experiments. We show that future upgrades of these experiments have potential to distinguish a large range of nuclear dark matter models from that of a WIMP at greater than 3 σ.
NASA Technical Reports Server (NTRS)
Vetter, A. A.; Maxwell, C. D.; Swean, T. F., Jr.; Demetriades, S. T.; Oliver, D. A.; Bangerter, C. D.
1981-01-01
Data from sufficiently well-instrumented, short-duration experiments at AEDC/HPDE, Reynolds Metal Co., and Hercules, Inc., are compared to analyses with multidimensional and time-dependent simulations with the STD/MHD computer codes. These analyses reveal detailed features of major transient events, severe loss mechanisms, and anomalous MHD behavior. In particular, these analyses predicted higher-than-design voltage drops, Hall voltage overshoots, and asymmetric voltage drops before the experimental data were available. The predictions obtained with these analyses are in excellent agreement with the experimental data and the failure predictions are consistent with the experiments. The design of large, high-interaction or advanced MHD experiments will require application of sophisticated, detailed and comprehensive computational procedures in order to account for the critical mechanisms which led to the observed behavior in these experiments.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Paguio, R. R.; Smith, G. E.; Taylor, J. L.; ...
2017-12-04
Z-Beamlet (ZBL) experiments conducted at the PECOS test facility at Sandia National Laboratories (SNL) investigated the nonlinear processes in laser plasma interaction (or laser-plasma instabilities, LPI) that complicate the deposition of laser energy by enhanced absorption, backscatter, filamentation and beam-spray that can occur in large-scale laser-heated gas cell targets. These targets and experiments were designed to provide better insight into the physics of the laser preheat stage of the Magnetized Liner Inertial Fusion (MagLIF) scheme being tested on the SNL Z-machine. The experiments aim to understand the tradeoffs between laser spot size, laser pulse shape, laser entrance hole (LEH) window thickness, and fuel density for laser preheat. Gas cell target design evolution and fabrication adaptations to accommodate the evolving experiment and scientific requirements are also described in this paper.
Possession experiences in dissociative identity disorder: a preliminary study.
Ross, Colin A
2011-01-01
Dissociative trance disorder, which includes possession experiences, was introduced as a provisional diagnosis requiring further study in the Diagnostic and Statistical Manual of Mental Disorders (4th ed.). Consideration is now being given to including possession experiences within dissociative identity disorder (DID) in the Diagnostic and Statistical Manual of Mental Disorders (5th ed.), which is due to be published in 2013. In order to provide empirical data relevant to the relationship between DID and possession states, I analyzed data on the prevalence of trance, possession states, sleepwalking, and paranormal experiences in 3 large samples: patients with DID from North America; psychiatric outpatients from Shanghai, China; and a general population sample from Winnipeg, Canada. Trance, sleepwalking, paranormal, and possession experiences were much more common in the DID patients than in the 2 comparison samples. The study is preliminary and exploratory in nature because the samples were not matched in any way.
Balanced Branching in Transcription Termination
NASA Technical Reports Server (NTRS)
Harrington, K. J.; Laughlin, R. B.; Liang, S.
2001-01-01
The theory of stochastic transcription termination based on free-energy competition requires two or more reaction rates to be delicately balanced over a wide range of physical conditions. A large body of work on glasses and large molecules suggests that this should be impossible in such a large system in the absence of a new organizing principle of matter. We review the experimental literature of termination and find no evidence for such a principle but many troubling inconsistencies, most notably anomalous memory effects. These suggest that termination has a deterministic component and may conceivably be not stochastic at all. We find that a key experiment by Wilson and von Hippel allegedly refuting deterministic termination was an incorrectly analyzed regulatory effect of Mg(2+) binding.
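In the kinetic-competition picture that the free-energy theory implies, termination efficiency at a site is a branching ratio; the notation below is assumed for illustration and is not taken from the paper.

\[
P_{\text{term}} = \frac{k_{t}}{k_{t} + k_{e}},
\]

where k_t is the rate of the termination pathway and k_e the rate of elongation past the site. Holding P_term at an intermediate value over a wide range of conditions requires k_t and k_e to track each other, which is the delicate balance questioned above.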
Support of an Active Science Project by a Large Information System: Lessons for the EOS Era
NASA Technical Reports Server (NTRS)
Angelici, Gary L.; Skiles, J. W.; Popovici, Lidia Z.
1993-01-01
The ability of large information systems to support the changing data requirements of active science projects is being tested in a NASA collaborative study. This paper briefly profiles both the active science project and the large information system involved in this effort and offers some observations about the effectiveness of the project support. This is followed by lessons that are important for those participating in large information systems that need to support active science projects or that make available the valuable data produced by these projects. We learned in this work that it is difficult for a large information system focused on long term data management to satisfy the requirements of an on-going science project. For example, in order to provide the best service, it is important for all information system staff to keep focused on the needs and constraints of the scientists in the development of appropriate services. If the lessons learned in this and other science support experiences are not applied by those involved with large information systems of the EOS (Earth Observing System) era, then the final data products produced by future science projects may not be robust or of high quality, thereby making the conduct of the project science less efficacious and reducing the value of these unique suites of data for future research.
Roark, Dana A; O'Toole, Alice J; Abdi, Hervé; Barrett, Susan E
2006-01-01
Familiarity with a face or person can support recognition in tasks that require generalization to novel viewing contexts. Using naturalistic viewing conditions requiring recognition of people from face or whole body gait stimuli, we investigated the effects of familiarity, facial motion, and direction of learning/test transfer on person recognition. Participants were familiarized with previously unknown people from gait videos and were tested on faces (experiment 1a) or were familiarized with faces and were tested with gait videos (experiment 1b). Recognition was more accurate when learning from the face and testing with the gait videos, than when learning from the gait videos and testing with the face. The repetition of a single stimulus, either the face or gait, produced strong recognition gains across transfer conditions. Also, the presentation of moving faces resulted in better performance than that of static faces. In experiment 2, we investigated the role of facial motion further by testing recognition with static profile images. Motion provided no benefit for recognition, indicating that structure-from-motion is an unlikely source of the motion advantage found in the first set of experiments.
Solenoid Magnet System for the Fermilab Mu2e Experiment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lamm, M. J.; Andreev, N.; Ambrosio, G.
2011-12-14
The Fermilab Mu2e experiment seeks to measure the rare process of direct muon to electron conversion in the field of a nucleus. Key to the design of the experiment is a system of three superconducting solenoids: a muon production solenoid (PS), which is a 1.8 m aperture axially graded solenoid with a peak field of 5 T used to focus secondary pions and muons from a production target located in the solenoid aperture; an 'S-shaped' transport solenoid (TS), which selects and transports the subsequent muons towards a stopping target; and a detector solenoid (DS), which is an axially graded solenoid at the upstream end to focus transported muons onto a stopping target, with a spectrometer solenoid at the downstream end to accurately measure the momentum of the outgoing conversion electrons. The magnetic field requirements, the significant magnetic coupling between the solenoids, the curved muon transport geometry and the large beam-induced energy deposition into the superconducting coils pose significant challenges to the magnetic, mechanical, and thermal design of this system. In this paper a conceptual design for the magnet system which meets the Mu2e experiment requirements is presented.
NASA Astrophysics Data System (ADS)
Abu-Alqumsan, Mohammad; Kapeller, Christoph; Hintermüller, Christoph; Guger, Christoph; Peer, Angelika
2017-12-01
Objective. This paper discusses the invariance and variability in interaction error-related potentials (ErrPs), with particular focus on three factors: (1) the human mental processing required to assess interface actions, (2) time, and (3) subjects. Approach. Three different experiments were designed so as to vary primarily with respect to the mental processes that are necessary to assess whether an interface error has occurred or not. The three experiments were carried out with 11 subjects in a repeated-measures experimental design. To study the effect of time, a subset of the recruited subjects additionally performed the same experiments on different days. Main results. The ErrP variability across the different experiments for the same subjects was found to be largely attributable to the different mental processing required to assess interface actions. Nonetheless, we found that interaction ErrPs are empirically invariant over time (for the same subject and same interface) and to a lesser extent across subjects (for the same interface). Significance. The obtained results may be used to explain across-study variability of ErrPs, as well as to define guidelines for approaches to the ErrP classifier transferability problem.
Novel diamond cells for neutron diffraction using multi-carat CVD anvils.
Boehler, R; Molaison, J J; Haberl, B
2017-08-01
Traditionally, neutron diffraction at high pressure has been severely limited in pressure because low neutron flux required large sample volumes and therefore large volume presses. At the high-flux Spallation Neutron Source at the Oak Ridge National Laboratory, we have developed new, large-volume diamond anvil cells for neutron diffraction. The main features of these cells are multi-carat, single crystal chemical vapor deposition diamonds, very large diffraction apertures, and gas membranes to accommodate pressure stability, especially upon cooling. A new cell has been tested for diffraction up to 40 GPa with an unprecedented sample volume of ∼0.15 mm³. High quality spectra were obtained in 1 h for crystalline Ni and in ∼8 h for disordered glassy carbon. These new techniques will open the way for routine megabar neutron diffraction experiments.
Stick slip, charge separation and decay
Lockner, D.A.; Byerlee, J.D.; Kuksenko, V.S.; Ponomarev, A.V.
1986-01-01
Measurements of charge separation in rock during stable and unstable deformation give unexpectedly large decay times of 50 sec. Time-domain induced polarization experiments on wet and dry rocks give similar decay times and suggest that the same decay mechanisms operate in the induced polarization response as in the relaxation of charge generated by mechanical deformation. These large decay times are attributed to electrochemical processes in the rocks, and they require low-frequency relative permittivity to be very large, in excess of 10⁵. One consequence of large permittivity, and therefore long decay times, is that a significant portion of any electrical charge generated during an earthquake can persist for tens or hundreds of seconds. As a result, electrical disturbances associated with earthquakes should be observable for these lengths of time rather than for the milliseconds previously suggested. © 1986 Birkhäuser Verlag.
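The link between long decay times and very large permittivity can be illustrated with the standard dielectric (charge) relaxation relation; the resistivity value below is illustrative only, not taken from the study:

    \tau = \varepsilon_r \varepsilon_0 \rho
    \quad\Longrightarrow\quad
    \varepsilon_r = \frac{\tau}{\varepsilon_0\,\rho}
    \approx \frac{50\ \mathrm{s}}{(8.85\times 10^{-12}\ \mathrm{F/m})\,(10^{4}\ \mathrm{\Omega\,m})}
    \approx 5.6\times 10^{8}.

For an assumed rock resistivity of order 10⁴ Ω·m, a 50 s decay therefore corresponds to a relative permittivity well in excess of 10⁵, consistent with the requirement stated above.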
Copying of holograms by spot scanning approach.
Okui, Makoto; Wakunami, Koki; Oi, Ryutaro; Ichihashi, Yasuyuki; Jackin, Boaz Jessie; Yamamoto, Kenji
2018-05-20
To replicate holograms, contact copying has conventionally been used. In this approach, a photosensitive material is fixed together with a master hologram and illuminated with a coherent beam. This method is simple and enables high-quality copies; however, it requires a large optical setup for large-area holograms. In this paper, we present a new method of replicating holograms that uses a relatively compact optical system even for the replication of large holograms. A small laser spot that irradiates only part of the hologram is used to reproduce the hologram by scanning the spot over the whole area of the hologram. We report on the results of experiments carried out to confirm the copy quality, along with a guide to design scanning conditions. The results show the potential effectiveness of the large-area hologram replication technology using a relatively compact apparatus.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cutler, Dylan; Frank, Stephen; Slovensky, Michelle
Rich, well-organized building performance and energy consumption data enable a host of analytic capabilities for building owners and operators, from basic energy benchmarking to detailed fault detection and system optimization. Unfortunately, data integration for building control systems is challenging and costly in any setting. Large portfolios of buildings--campuses, cities, and corporate portfolios--experience these integration challenges most acutely. These large portfolios often have a wide array of control systems, including multiple vendors and nonstandard communication protocols. They typically have complex information technology (IT) networks and cybersecurity requirements and may integrate distributed energy resources into their infrastructure. Although the challenges are significant, the integration of control system data has the potential to provide proportionally greater value for these organizations through portfolio-scale analytics, comprehensive demand management, and asset performance visibility. As a large research campus, the National Renewable Energy Laboratory (NREL) experiences significant data integration challenges. To meet them, NREL has developed an architecture for effective data collection, integration, and analysis, providing a comprehensive view of data integration based on functional layers. The architecture is being evaluated on the NREL campus through deployment of three pilot implementations.
NIFTE: The Near Infrared Faint-Object Telescope Experiment
NASA Technical Reports Server (NTRS)
Bock, James J.; Lange, Andrew E.; Matsumoto, T.; Eisenhardt, Peter B.; Hacking, Perry B.; Schember, Helene R.
1994-01-01
The high sensitivity of large format InSb arrays can be used to obtain deep images of the sky at 3-5 micrometers. In this spectral range cool or highly redshifted objects (e.g. brown dwarfs and protogalaxies) which are not visible at shorter wavelengths may be observed. Sensitivity at these wavelengths in ground-based observations is severely limited by the thermal flux from the telescope and from the earth's atmosphere. The Near Infrared Faint-Object Telescope Experiment (NIFTE), a 50 cm cooled rocket-borne telescope combined with large format, high performance InSb arrays, can reach a limiting flux less than 1 micro-Jy (1-sigma) over a large field-of-view in a single flight. In comparison, the Infrared Space Observatory (ISO) will require days of observation to reach a sensitivity more than one order of magnitude worse over a similar area of the sky. The deep 3-5 micrometer images obtained by the rocket-borne telescope will assist in determining the nature of faint red objects detected by ground-based telescopes at 2 micrometers, and by ISO at wavelengths longer than 5 micrometers.
Fast Readout Architectures for Large Arrays of Digital Pixels: Examples and Applications
Gabrielli, A.
2014-01-01
Modern pixel detectors, particularly those designed and constructed for applications and experiments in high-energy physics, are commonly built implementing general readout architectures that are not specifically optimized in terms of speed. High-energy physics experiments use bidimensional matrices of sensitive elements located on a silicon die. Sensors are read out via other integrated circuits bump bonded over the sensor dies. The speed of the readout electronics can significantly increase the overall performance of the system, and so here novel forms of readout architectures are studied and described. These circuits have been investigated in terms of speed and are particularly suited for large monolithic, low-pitch pixel detectors. The idea is to have a small, simple structure that may be expanded to fit large matrices without affecting the layout complexity of the chip, while maintaining a reasonably high readout speed. The solutions might be applied to devices for applications not only in physics but also to general-purpose pixel detectors whenever online fast data sparsification is required. The paper also presents simulations of the system efficiencies as a proof of concept for the proposed ideas. PMID:24778588
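As a purely conceptual illustration of the data sparsification mentioned above (a small Python sketch, not the readout circuitry described in the paper), zero-suppressed readout reduces a frame to the addresses and values of hit pixels only:

    # Conceptual sketch only (not the paper's circuit): zero-suppressed readout
    # of a pixel matrix, emitting (row, column, value) for hit pixels only.
    def sparsify(frame, threshold):
        """Return hit records for pixels whose response exceeds `threshold`."""
        hits = []
        for row, pixels in enumerate(frame):
            for col, value in enumerate(pixels):
                if value > threshold:          # suppress empty pixels
                    hits.append((row, col, value))
        return hits

    frame = [[0, 0, 7], [0, 12, 0], [3, 0, 0]]
    print(sparsify(frame, threshold=5))        # -> [(0, 2, 7), (1, 1, 12)]

Hardware readout architectures implement the same idea with address encoders and logic such as token-passing or column-drain schemes, so that only hit pixels consume readout bandwidth.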
Sabatino, Denise E.; Nichols, Timothy C.; Merricks, Elizabeth; Bellinger, Dwight A.; Herzog, Roland W.; Monahan, Paul E.
2013-01-01
The X-linked bleeding disorder hemophilia is caused by mutations in coagulation factor VIII (hemophilia A) or factor IX (hemophilia B). Unless prophylactic treatment is provided, patients with severe disease (less than 1% clotting activity) typically experience frequent spontaneous bleeds. Current treatment is largely based on intravenous infusion of recombinant or plasma-derived coagulation factor concentrate. More effective factor products are being developed. Moreover, gene therapies for sustained correction of hemophilia are showing much promise in pre-clinical studies and in clinical trials. These advances in molecular medicine heavily depend on the availability of well-characterized small and large animal models of hemophilia, primarily hemophilia mice and dogs. Experiments in these animals represent important early and intermediate steps of translational research aimed at development of better and safer treatments for hemophilia, such as protein and gene therapies or immune tolerance protocols. While murine models are excellent for studies of large groups of animals using genetically defined strains, canine models are important for testing scale-up and for longer-term follow-up as well as for studies that require larger blood volumes. PMID:22137432
Using flatbed scanners in the undergraduate optics laboratory—An example of frugal science
NASA Astrophysics Data System (ADS)
Koopman, Thomas; Gopal, Venkatesh
2017-05-01
We describe the use of a low-cost commercial flatbed scanner in the undergraduate teaching laboratory to image large (˜25 cm) interference and diffraction patterns in two dimensions. Such scanners usually have an 8-bit linear photosensor array that can scan large areas (˜28 cm × 22 cm) at very high spatial resolutions (≥100 Megapixels), which makes them versatile large-format imaging devices. We describe how the scanner can be used to image interference and diffraction from rectangular single-slit, double-slit, and circular apertures. The experiments are very simple to set up and require no specialized components besides a small laser and a flatbed scanner. Due to the presence of Automatic Gain Control in the scanner, which we were not able to override, we were unable to get an excellent fit to the data. Interestingly, we found that the less-than-ideal data were actually pedagogically superior, as they forced the students to think about the process of data acquisition in much greater detail instead of simply performing the experiment mechanically.
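For reference, the patterns imaged in such an experiment follow the standard Fraunhofer expressions (quoted here as textbook results, not values taken from the paper). For a single slit of width a and a double slit with slit separation d,

    I_{\text{single}}(\theta) = I_0 \left(\frac{\sin\beta}{\beta}\right)^{2},
    \qquad
    I_{\text{double}}(\theta) = I_0 \cos^{2}\!\delta \left(\frac{\sin\beta}{\beta}\right)^{2},
    \qquad
    \beta = \frac{\pi a \sin\theta}{\lambda},
    \quad
    \delta = \frac{\pi d \sin\theta}{\lambda}.

Fitting the recorded intensity profiles to these forms is where the scanner's automatic gain control, mentioned above, limits the achievable agreement.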
Numerical Simulations of Hypersonic Boundary Layer Transition
NASA Astrophysics Data System (ADS)
Bartkowicz, Matthew David
Numerical schemes for supersonic flows tend to use large amounts of artificial viscosity for stability. This tends to damp out the small scale structures in the flow. Recently some low-dissipation methods have been proposed which selectively eliminate the artificial viscosity in regions which do not require it. This work builds upon the low-dissipation method of Subbareddy and Candler, which uses the flux vector splitting method of Steger and Warming but identifies the dissipation portion in order to eliminate it. Computing accurate fluxes typically relies on large grid stencils or coupled linear systems that become computationally expensive to solve. Unstructured grids allow CFD solutions to be obtained on complex geometries; unfortunately, it then becomes difficult to create a large stencil or the coupled linear system. Accurate solutions require grids that quickly become too large to be feasible. In this thesis a method is proposed to obtain more accurate solutions using relatively local data, making it suitable for unstructured grids composed of hexahedral elements. Fluxes are reconstructed using local gradients to extend the range of data used. The method is then validated on several test problems. Simulations of boundary layer transition are then performed. An elliptic cone at Mach 8 is simulated based on an experiment at the Princeton Gasdynamics Laboratory. A simulated acoustic noise boundary condition is imposed to model the noisy conditions of the wind tunnel, and the transitioning boundary layer is observed. A computation of an isolated roughness element is done based on an experiment in Purdue's Mach 6 quiet wind tunnel. The mechanism for transition is identified as an instability in the upstream separation region and a comparison is made to experimental data. In the CFD a fully turbulent boundary layer is observed downstream.
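For context, the Steger-Warming flux vector splitting referred to above can be stated in its standard textbook form (the generic formulation, not a detail reproduced from the thesis). For a perfect gas the Euler flux is homogeneous of degree one in the conserved variables, F(U) = A(U)U with A = ∂F/∂U, and the split fluxes follow from splitting the eigenvalues of A:

    A = R \Lambda R^{-1}, \qquad
    \Lambda^{\pm} = \tfrac{1}{2}\left(\Lambda \pm |\Lambda|\right), \qquad
    F^{\pm} = R\,\Lambda^{\pm} R^{-1} U, \qquad
    F = F^{+} + F^{-}.

The resulting upwind interface flux can be rearranged into a central part plus a dissipative part,

    F_{i+1/2} = F^{+}(U_L) + F^{-}(U_R)
              = \tfrac{1}{2}\left(F_L + F_R\right)
              + \tfrac{1}{2}\left(|A_L|\,U_L - |A_R|\,U_R\right),

and it is the second, dissipative term that low-dissipation variants of the scheme identify and selectively remove in smooth regions.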
NASA Technical Reports Server (NTRS)
Snyder, C. T.; Fry, E. B.; Drinkwater, F. J., III; Forrest, R. D.; Scott, B. C.; Benefield, T. D.
1972-01-01
A ground-based simulator investigation was conducted in preparation for and correlation with an in-flight simulator program. The objective of these studies was to define minimum acceptable levels of static longitudinal stability for landing approach following stability augmentation system failures. The airworthiness authorities are presently attempting to establish the requirements for civil transports with only the backup flight control system operating. Using a baseline configuration representative of a large delta wing transport, 20 different configurations, many representing negative static margins, were assessed by three research test pilots in 33 hours of piloted operation. Verification of the baseline model to be used in the TIFS experiment was provided by computed and piloted comparisons with a well-validated reference airplane simulation. Pilot comments and ratings are included, as well as preliminary tracking performance and workload data.
Extreme disorder in an ultrahigh-affinity protein complex
NASA Astrophysics Data System (ADS)
Borgia, Alessandro; Borgia, Madeleine B.; Bugge, Katrine; Kissling, Vera M.; Heidarsson, Pétur O.; Fernandes, Catarina B.; Sottini, Andrea; Soranno, Andrea; Buholzer, Karin J.; Nettels, Daniel; Kragelund, Birthe B.; Best, Robert B.; Schuler, Benjamin
2018-03-01
Molecular communication in biology is mediated by protein interactions. According to the current paradigm, the specificity and affinity required for these interactions are encoded in the precise complementarity of binding interfaces. Even proteins that are disordered under physiological conditions or that contain large unstructured regions commonly interact with well-structured binding sites on other biomolecules. Here we demonstrate the existence of an unexpected interaction mechanism: the two intrinsically disordered human proteins histone H1 and its nuclear chaperone prothymosin-α associate in a complex with picomolar affinity, but fully retain their structural disorder, long-range flexibility and highly dynamic character. On the basis of closely integrated experiments and molecular simulations, we show that the interaction can be explained by the large opposite net charge of the two proteins, without requiring defined binding sites or interactions between specific individual residues. Proteome-wide sequence analysis suggests that this interaction mechanism may be abundant in eukaryotes.
Mapping CMMI Level 2 to Scrum Practices: An Experience Report
NASA Astrophysics Data System (ADS)
Diaz, Jessica; Garbajosa, Juan; Calvo-Manzano, Jose A.
CMMI has been adopted advantageously in large companies for improvements in software quality, budget compliance, and customer satisfaction. However, SPI strategies based on CMMI-DEV require heavy software development processes and large investments of cost and time that medium and small companies cannot afford. The so-called light software development processes, such as Agile Software Development (ASD), address these challenges. ASD welcomes changing requirements and stresses the importance of adaptive planning, simplicity, and continuous delivery of valuable software in short time-framed iterations. ASD is becoming convenient in an increasingly global and changing software market. It would be greatly useful to be able to introduce agile methods such as Scrum in compliance with the CMMI process model. This paper intends to increase the understanding of the relationship between ASD and CMMI-DEV, reporting empirical results that confirm theoretical comparisons between ASD practices and CMMI level 2.
Spacecraft Dynamics and Control Program at AFRPL
NASA Technical Reports Server (NTRS)
Das, A.; Slimak, L. K. S.; Schloegel, W. T.
1986-01-01
A number of future DOD and NASA spacecraft such as the space based radar will be not only an order of magnitude larger in dimension than the current spacecraft, but will exhibit extreme structural flexibility with very low structural vibration frequencies. Another class of spacecraft (such as the space defense platforms) will combine large physical size with extremely precise pointing requirement. Such problems require a total departure from the traditional methods of modeling and control system design of spacecraft where structural flexibility is treated as a secondary effect. With these problems in mind, the Air Force Rocket Propulsion Laboratory (AFRPL) initiated research to develop dynamics and control technology so as to enable the future large space structures (LSS). AFRPL's effort in this area can be subdivided into the following three overlapping areas: (1) ground experiments, (2) spacecraft modeling and control, and (3) sensors and actuators. Both the in-house and contractual efforts of the AFRPL in LSS are summarized.
Deciphering landslide behavior using large-scale flume experiments
Reid, Mark E.; Iverson, Richard M.; Iverson, Neal R.; LaHusen, Richard G.; Brien, Dianne L.; Logan, Matthew
2008-01-01
Landslides can be triggered by a variety of hydrologic events and they can exhibit a wide range of movement dynamics. Effective prediction requires understanding these diverse behaviors. Precise evaluation in the field is difficult; as an alternative we performed a series of landslide initiation experiments in the large-scale, USGS debris-flow flume. We systematically investigated the effects of three different hydrologic triggering mechanisms, including groundwater exfiltration from bedrock, prolonged rainfall infiltration, and intense bursts of rain. We also examined the effects of initial soil porosity (loose or dense) relative to the soil’s critical-state porosity. Results show that all three hydrologic mechanisms can instigate landsliding, but water pathways, sensor response patterns, and times to failure differ. Initial soil porosity has a profound influence on landslide movement behavior. Experiments using loose soil show rapid soil contraction during failure, with elevated pore pressures liquefying the sediment and creating fast-moving debris flows. In contrast, dense soil dilated upon shearing, resulting in slow, gradual, and episodic motion. These results have fundamental implications for forecasting landslide behavior and developing effective warning systems.
Long term performance studies of large oil-free bakelite resistive plate chamber
NASA Astrophysics Data System (ADS)
Ganai, R.; Roy, A.; Shiroya, M. K.; Agarwal, K.; Ahammed, Z.; Choudhury, S.; Chattopadhyay, S.
2016-09-01
Several high energy physics and neutrino physics experiments worldwide require large-size RPCs to cover wide acceptances. The muon tracking systems in the Iron Calorimeter (ICAL) experiment in the India-based Neutrino Observatory (INO), India, and the near detector in the Deep Underground Neutrino Experiment (DUNE) at Fermilab are two such examples. A single gap bakelite RPC of dimension 240 cm × 120 cm, with a gas gap of 0.2 cm, has been built and tested at the Variable Energy Cyclotron Centre, Kolkata, using indigenous materials procured from the local market. No additional lubricant, such as oil, has been used on the electrode surfaces for smoothening. The chamber has been in operation for > 365 days. We have tested the chamber for its long term operation. The leakage current, bulk resistivity, efficiency, noise rate and time resolution of the chamber have been found to be quite stable during the testing period. It has shown an efficiency > 95% with an average time resolution of ~ 0.83 ns at the point of measurement at ~ 8700 V throughout the testing period. Details of the long term performance of the chamber are discussed.
Cuadros-Inostroza, Alvaro; Caldana, Camila; Redestig, Henning; Kusano, Miyako; Lisec, Jan; Peña-Cortés, Hugo; Willmitzer, Lothar; Hannah, Matthew A
2009-12-16
Metabolite profiling, the simultaneous quantification of multiple metabolites in an experiment, is becoming increasingly popular, particularly with the rise of systems-level biology. The workhorse in this field is gas-chromatography hyphenated with mass spectrometry (GC-MS). The high-throughput of this technology coupled with a demand for large experiments has led to data pre-processing, i.e. the quantification of metabolites across samples, becoming a major bottleneck. Existing software has several limitations, including restricted maximum sample size, systematic errors and low flexibility. However, the biggest limitation is that the resulting data usually require extensive hand-curation, which is subjective and can typically take several days to weeks. We introduce the TargetSearch package, an open source tool which is a flexible and accurate method for pre-processing even very large numbers of GC-MS samples within hours. We developed a novel strategy to iteratively correct and update retention time indices for searching and identifying metabolites. The package is written in the R programming language with computationally intensive functions written in C for speed and performance. The package includes a graphical user interface to allow easy use by those unfamiliar with R. TargetSearch allows fast and accurate data pre-processing for GC-MS experiments and overcomes the sample number limitations and manual curation requirements of existing software. We validate our method by carrying out an analysis against both a set of known chemical standard mixtures and of a biological experiment. In addition we demonstrate its capabilities and speed by comparing it with other GC-MS pre-processing tools. We believe this package will greatly ease current bottlenecks and facilitate the analysis of metabolic profiling data.
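The core idea of iterative retention-time-index correction can be sketched as follows. This is a simplified conceptual sketch written in Python for illustration; TargetSearch itself is an R package (with C internals) and its actual algorithm differs in detail:

    # Conceptual sketch (TargetSearch is an R package; its real algorithm differs):
    # map retention times to retention indices via anchor compounds, match library
    # entries within a tolerance window, then reuse the matches as new anchors.
    import numpy as np

    def rt_to_ri(rt, anchor_rts, anchor_ris):
        # Piecewise-linear mapping from retention time to retention index.
        return np.interp(rt, anchor_rts, anchor_ris)

    def iterative_ri_correction(peak_rts, library, anchor_rts, anchor_ris,
                                window=5.0, n_iter=3):
        """library maps metabolite name -> expected retention index."""
        peak_rts = np.asarray(peak_rts, dtype=float)
        matches = {}
        for _ in range(n_iter):
            peak_ris = rt_to_ri(peak_rts, anchor_rts, anchor_ris)
            matches = {}
            for name, ri in library.items():
                idx = int(np.argmin(np.abs(peak_ris - ri)))
                if abs(peak_ris[idx] - ri) <= window:
                    matches[name] = idx
            if not matches:
                break
            # Reuse matched peaks as calibration anchors (assumes elution order
            # follows retention-index order) and repeat with the refined mapping.
            pairs = sorted((library[n], peak_rts[i]) for n, i in matches.items())
            anchor_ris = np.array([ri for ri, _ in pairs])
            anchor_rts = np.array([rt for _, rt in pairs])
        return matches

In the real workflow this kind of correction runs across very large numbers of GC-MS samples, which is where the speed and automation gains over manual curation come from.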
Methods for quantifying simple gravity sensing in Drosophila melanogaster.
Inagaki, Hidehiko K; Kamikouchi, Azusa; Ito, Kei
2010-01-01
Perception of gravity is essential for animals: most animals possess specific sense organs to detect the direction of the gravitational force. Little is known, however, about the molecular and neural mechanisms underlying their behavioral responses to gravity. Drosophila melanogaster, having a rather simple nervous system and a large variety of molecular genetic tools available, serves as an ideal model for analyzing the mechanisms underlying gravity sensing. Here we describe an assay to measure simple gravity responses of flies behaviorally. This method can be applied for screening genetic mutants of gravity perception. Furthermore, in combination with recent genetic techniques to silence or activate selective sets of neurons, it serves as a powerful tool to systematically identify neural substrates required for the proper behavioral responses to gravity. The assay requires 10 min to perform, and two experiments can be performed simultaneously, enabling 12 experiments per hour.
Droplet microfluidics for synthetic biology
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gach, Philip Charles; Iwai, Kosuke; Kim, Peter Wonhee
Here, synthetic biology is an interdisciplinary field that aims to engineer biological systems for useful purposes. Organism engineering often requires the optimization of individual genes and/or entire biological pathways (consisting of multiple genes). Advances in DNA sequencing and synthesis have recently begun to enable the possibility of evaluating thousands of gene variants and hundreds of thousands of gene combinations. However, such large-scale optimization experiments remain cost-prohibitive to researchers following traditional molecular biology practices, which are frequently labor-intensive and suffer from poor reproducibility. Liquid handling robotics may reduce labor and improve reproducibility, but are themselves expensive and thus inaccessible to most researchers. Microfluidic platforms offer a lower entry price point alternative to robotics, and maintain high throughput and reproducibility while further reducing operating costs through diminished reagent volume requirements. Droplet microfluidics have shown exceptional promise for synthetic biology experiments, including DNA assembly, transformation/transfection, culturing, cell sorting, phenotypic assays, artificial cells and genetic circuits.
NASA Technical Reports Server (NTRS)
Alexander, Harold L.
1991-01-01
Human productivity was studied for extravehicular tasks performed in microgravity, particularly including in-space assembly of truss structures and other large objects. Human factors research probed the anthropometric constraints imposed on microgravity task performance and the associated workstation design requirements. Anthropometric experiments included reach envelope tests conducted using the 3-D Acoustic Positioning System (3DAPS), which permitted measuring the range of reach possible for persons using foot restraints in neutral buoyancy, both with and without space suits. Much neutral buoyancy research was conducted using the support of water to simulate the weightlessness environment of space. It became clear over time that the anticipated EVA requirement associated with the Space Station and with in-space construction of interplanetary probes would heavily burden astronauts, and remotely operated robots (teleoperators) were increasingly considered to absorb the workload. Experience in human EVA productivity led naturally to teleoperation research into the remote performance of tasks through human controlled robots.
Technical challenges in the construction of the steady-state stellarator Wendelstein 7-X
NASA Astrophysics Data System (ADS)
Bosch, H.-S.; Wolf, R. C.; Andreeva, T.; Baldzuhn, J.; Birus, D.; Bluhm, T.; Bräuer, T.; Braune, H.; Bykov, V.; Cardella, A.; Durodié, F.; Endler, M.; Erckmann, V.; Gantenbein, G.; Hartmann, D.; Hathiramani, D.; Heimann, P.; Heinemann, B.; Hennig, C.; Hirsch, M.; Holtum, D.; Jagielski, J.; Jelonnek, J.; Kasparek, W.; Klinger, T.; König, R.; Kornejew, P.; Kroiss, H.; Krom, J. G.; Kühner, G.; Laqua, H.; Laqua, H. P.; Lechte, C.; Lewerentz, M.; Maier, J.; McNeely, P.; Messiaen, A.; Michel, G.; Ongena, J.; Peacock, A.; Pedersen, T. S.; Riedl, R.; Riemann, H.; Rong, P.; Rust, N.; Schacht, J.; Schauer, F.; Schroeder, R.; Schweer, B.; Spring, A.; Stäbler, A.; Thumm, M.; Turkin, Y.; Wegener, L.; Werner, A.; Zhang, D.; Zilker, M.; Akijama, T.; Alzbutas, R.; Ascasibar, E.; Balden, M.; Banduch, M.; Baylard, Ch.; Behr, W.; Beidler, C.; Benndorf, A.; Bergmann, T.; Biedermann, C.; Bieg, B.; Biel, W.; Borchardt, M.; Borowitz, G.; Borsuk, V.; Bozhenkov, S.; Brakel, R.; Brand, H.; Brown, T.; Brucker, B.; Burhenn, R.; Buscher, K.-P.; Caldwell-Nichols, C.; Cappa, A.; Cardella, A.; Carls, A.; Carvalho, P.; Ciupiński, Ł.; Cole, M.; Collienne, J.; Czarnecka, A.; Czymek, G.; Dammertz, G.; Dhard, C. P.; Davydenko, V. I.; Dinklage, A.; Drevlak, M.; Drotziger, S.; Dudek, A.; Dumortier, P.; Dundulis, G.; Eeten, P. v.; Egorov, K.; Estrada, T.; Faugel, H.; Fellinger, J.; Feng, Y.; Fernandes, H.; Fietz, W. H.; Figacz, W.; Fischer, F.; Fontdecaba, J.; Freund, A.; Funaba, T.; Fünfgelder, H.; Galkowski, A.; Gates, D.; Giannone, L.; García Regaña, J. M.; Geiger, J.; Geißler, S.; Greuner, H.; Grahl, M.; Groß, S.; Grosman, A.; Grote, H.; Grulke, O.; Haas, M.; Haiduk, L.; Hartfuß, H.-J.; Harris, J. H.; Haus, D.; Hein, B.; Heitzenroeder, P.; Helander, P.; Heller, R.; Hidalgo, C.; Hildebrandt, D.; Höhnle, H.; Holtz, A.; Holzhauer, E.; Holzthüm, R.; Huber, A.; Hunger, H.; Hurd, F.; Ihrke, M.; Illy, S.; Ivanov, A.; Jablonski, S.; Jaksic, N.; Jakubowski, M.; Jaspers, R.; Jensen, H.; Jenzsch, H.; Kacmarczyk, J.; Kaliatk, T.; Kallmeyer, J.; Kamionka, U.; Karaleviciu, R.; Kern, S.; Keunecke, M.; Kleiber, R.; Knauer, J.; Koch, R.; Kocsis, G.; Könies, A.; Köppen, M.; Koslowski, R.; Koshurinov, J.; Krämer-Flecken, A.; Krampitz, R.; Kravtsov, Y.; Krychowiak, M.; Krzesinski, G.; Ksiazek, I.; Kubkowska, M.; Kus, A.; Langish, S.; Laube, R.; Laux, M.; Lazerson, S.; Lennartz, M.; Li, C.; Lietzow, R.; Lohs, A.; Lorenz, A.; Louche, F.; Lubyako, L.; Lumsdaine, A.; Lyssoivan, A.; Maaßberg, H.; Marek, P.; Martens, C.; Marushchenko, N.; Mayer, M.; Mendelevitch, B.; Mertens, Ph.; Mikkelsen, D.; Mishchenko, A.; Missal, B.; Mizuuchi, T.; Modrow, H.; Mönnich, T.; Morizaki, T.; Murakami, S.; Musielok, F.; Nagel, M.; Naujoks, D.; Neilson, H.; Neubauer, O.; Neuner, U.; Nocentini, R.; Noterdaeme, J.-M.; Nührenberg, C.; Obermayer, S.; Offermanns, G.; Oosterbeek, H.; Otte, M.; Panin, A.; Pap, M.; Paquay, S.; Pasch, E.; Peng, X.; Petrov, S.; Pilopp, D.; Pirsch, H.; Plaum, B.; Pompon, F.; Povilaitis, M.; Preinhaelter, J.; Prinz, O.; Purps, F.; Rajna, T.; Récsei, S.; Reiman, A.; Reiter, D.; Remmel, J.; Renard, S.; Rhode, V.; Riemann, J.; Rimkevicius, S.; Riße, K.; Rodatos, A.; Rodin, I.; Romé, M.; Roscher, H.-J.; Rummel, K.; Rummel, Th.; Runov, A.; Ryc, L.; Sachtleben, J.; Samartsev, A.; Sanchez, M.; Sano, F.; Scarabosio, A.; Schmid, M.; Schmitz, H.; Schmitz, O.; Schneider, M.; Schneider, W.; Scheibl, L.; Scholz, M.; Schröder, G.; Schröder, M.; Schruff, J.; Schumacher, H.; Shikhovtsev, I. V.; Shoji, M.; Siegl, G.; Skodzik, J.; Smirnow, M.; Speth, E.; Spong, D. 
A.; Stadler, R.; Sulek, Z.; Szabó, V.; Szabolics, T.; Szetefi, T.; Szökefalvi-Nagy, Z.; Tereshchenko, A.; Thomsen, H.; Thumm, M.; Timmermann, D.; Tittes, H.; Toi, K.; Tournianski, M.; Toussaint, U. v.; Tretter, J.; Tulipán, S.; Turba, P.; Uhlemann, R.; Urban, J.; Urbonavicius, E.; Urlings, P.; Valet, S.; Van Eester, D.; Van Schoor, M.; Vervier, M.; Viebke, H.; Vilbrandt, R.; Vrancken, M.; Wauters, T.; Weissgerber, M.; Weiß, E.; Weller, A.; Wendorf, J.; Wenzel, U.; Windisch, T.; Winkler, E.; Winkler, M.; Wolowski, J.; Wolters, J.; Wrochna, G.; Xanthopoulos, P.; Yamada, H.; Yokoyama, M.; Zacharias, D.; Zajac, J.; Zangl, G.; Zarnstorff, M.; Zeplien, H.; Zoletnik, S.; Zuin, M.
2013-12-01
The next step in the Wendelstein stellarator line is the large superconducting device Wendelstein 7-X, currently under construction in Greifswald, Germany. Steady-state operation is an intrinsic feature of stellarators, and one key element of the Wendelstein 7-X mission is to demonstrate steady-state operation under plasma conditions relevant for a fusion power plant. Steady-state operation of a fusion device, on the one hand, requires the implementation of special technologies, giving rise to technical challenges during the design, fabrication and assembly of such a device. On the other hand, the physics development of steady-state operation at high plasma performance also poses a challenge and requires careful preparation. The electron cyclotron resonance heating system, diagnostics, experiment control and data acquisition are prepared for plasma operation lasting 30 min. This requires many new technological approaches for plasma heating and diagnostics as well as new concepts for experiment control and data acquisition.
NASA Technical Reports Server (NTRS)
Braswell, F. M.
1981-01-01
An energetic particle experiment using the Z80 family of microcomputer components is described. Data collected from the experiment allowed fast and efficient postprocessing, yielding both the energy spectrum and pitch-angle distribution of energetic particles in the D and E regions. Advanced microprocessor system architecture and software concepts were used in the design to cope with the large amount of data being processed. This required the Z80 system to operate at over 80% of its total capacity. The microprocessor system was included in the payloads of three rockets launched during the Energy Budget Campaign at ESRANGE, Kiruna, Sweden in November 1980. Based on preliminary examination of the data, the performance of the experiment was satisfactory and good data were obtained on the energy spectrum and pitch-angle distribution of the particles.
Atmospheric scavenging exhaust
NASA Technical Reports Server (NTRS)
Fenton, D. L.; Purcell, R. Y.
1977-01-01
Solid propellant rocket exhaust was directly utilized to ascertain raindrop scavenging rates for hydrogen chloride. The airborne HCl concentration varied from 0.2 to 10.0 ppm and the raindrop sizes tested included 0.55 mm, 1.1 mm, and 3.0 mm. Two chambers were used to conduct the experiments. A large, rigid walled, spherical chamber stored the exhaust constituents while the smaller chamber housing all the experiments was charged as required with rocket exhaust HCl. Surface uptake experiments demonstrated an HCl concentration dependence for distilled water. Sea water and brackish water HCl uptake was below the detection limit of the chlorine-ion analysis technique employed. Plant life HCl uptake experiments were limited to corn and soybeans. Plant age effectively correlated the HCl uptake data. Metallic corrosion was not significant for single 20 minute exposures to the exhaust HCl under varying relative humidity.
Atmospheric scavenging of solid rocket exhaust effluents
NASA Technical Reports Server (NTRS)
Fenton, D. L.; Purcell, R. Y.
1978-01-01
Solid propellant rocket exhaust was directly utilized to ascertain raindrop scavenging rates for hydrogen chloride. Two chambers were used to conduct the experiments; a large, rigid walled, spherical chamber stored the exhaust constituents, while the smaller chamber housing all the experiments was charged as required with rocket exhaust HCl. Surface uptake experiments demonstrated an HCl concentration dependence for distilled water. Sea water and brackish water HCl uptake was below the detection limit of the chlorine-ion analysis technique used. Plant life HCl uptake experiments were limited to corn and soybeans. Plant age effectively correlated the HCl uptake data. Metallic corrosion was not significant for single 20 minute exposures to the exhaust HCl under varying relative humidity. Characterization of the aluminum oxide particles substantiated the similarity between the constituents of the small scale rocket and the full size vehicles.
Antenna Technology Shuttle Experiment (ATSE)
NASA Technical Reports Server (NTRS)
Freeland, R. E.; Mettler, E.; Miller, L. J.; Rahmet-Samii, Y.; Weber, W. J., III
1987-01-01
Numerous space applications of the future will require mesh deployable antennas of 15 m in diameter or greater for frequencies up to 20 GHz. These applications include mobile communications satellites, orbiting very long baseline interferometry (VLBI) astrophysics missions, and Earth remote sensing missions. A Lockheed wrap-rib antenna was used as the test article. The experiments covered a broad range of structural, control, and RF discipline objectives which, if fulfilled in total, would greatly reduce the risk of employing these antenna systems in future space applications. It was concluded that a flight experiment of a relatively large mesh deployable reflector is achievable with no major technological or cost drivers. The test articles and the instrumentation are all within the state of the art and in most cases rely on proven flight hardware. Every effort was made to design the experiments for low cost.
Feshbach Prize: New Phenomena and New Physics from Strongly-Correlated Quantum Matter
NASA Astrophysics Data System (ADS)
Carlson, Joseph A.
2017-01-01
Strongly correlated quantum matter is ubiquitous in physics, from cold atoms to nuclei to the cold dense matter found in neutron stars. Experiments ranging from table-top scale to the extremely large scale, including FRIB and LIGO, will help determine the properties of matter across an incredible range of distances and energies. Questions to be addressed include the existence of exotic states of matter in cold atoms and nuclei, the response of this correlated matter to external probes, and the behavior of matter in extreme astrophysical environments. A more complete understanding is required, both to understand these diverse phenomena and to employ this understanding to probe for new underlying physics in experiments including neutrinoless double beta decay and accelerator neutrino experiments. I will summarize some aspects of our present understanding and highlight several important prospects for the future.
NASA Astrophysics Data System (ADS)
Iwatsuki, Masami; Kato, Yoriyuki; Yonekawa, Akira
State-of-the-art Internet technologies allow us to provide advanced and interactive distance education services. In engineering education, however, students have still had to be gathered in one place for experiments and exercises because large-scale equipment and expensive software are required. On the other hand, teleoperation systems that control a robot manipulator or vehicle via the Internet have been developed in the field of robotics. By fusing these two techniques, we can realize remote experiment and exercise systems for engineering education based on the World Wide Web. This paper presents how to construct a remote environment that allows students to take courses on experiment and exercise independently of their locations. By using the proposed system, users can remotely practice controlling a manipulator and a robot vehicle and programming image processing.
Cryogenic Design of the Setup for MARE-1 in Milan
NASA Astrophysics Data System (ADS)
Schaeffer, D.; Arnaboldi, C.; Ceruti, G.; Ferri, E.; Kilbourne, C.; Kraft-Bermuth, S.; Margesin, B.; McCammon, D.; Monfardini, A.; Nucciotti, A.; Pessina, G.; Previtali, E.; Sisti, M.
2008-05-01
A large worldwide collaboration is growing around the project of Micro-calorimeter Arrays for a Rhenium Experiment (MARE) for a direct calorimetric measurement of the neutrino mass. To validate the use of cryogenic detectors by checking for the presence of unexpected systematic errors, two initial experiments are planned using the available techniques, composed of arrays of 300 detectors, to measure 10¹⁰ events in a reasonable time of 3 years (step MARE-1) and reach a sensitivity on the neutrino mass of ˜2 eV/c². Our experiment in Milan is based on compensated doped silicon implanted thermistor arrays made at NASA/GSFC and on AgReO4 crystals. We present here the design of the cryogenic system that integrates all the requirements for such an experiment (electronics for high impedances, low parasitic capacitances, low micro-phonic noise).
Glovebox Integrated Microgravity Isolation Technology (g-LIMIT): A Linearized State-Space Model
NASA Technical Reports Server (NTRS)
Hampton, R. David; Calhoun, Philip C.; Whorton, Mark S.
2001-01-01
Vibration acceleration levels on large space platforms exceed the requirements of many space experiments. The Glovebox Integrated Microgravity Isolation Technology (g-LIMIT) is being built by the NASA Marshall Space Flight Center to attenuate these disturbances to acceptable levels. G-LIMIT uses Lorentz (voice-coil) magnetic actuators to levitate and isolate payloads at the individual experiment/sub-experiment (versus rack) level. Payload acceleration, relative position, and relative orientation measurements are fed to a state-space controller. The controller, in turn, determines the actuator currents needed for effective experiment isolation. This paper presents the development of an algebraic, state-space model of g-LIMIT, in a form suitable for optimal controller design. The equations are first derived using Newton's Second Law directly, then simplified to a linear form for the purpose of controller design.
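For reference, the linearized model referred to above has the generic state-space form (shown here in standard textbook notation; the specific matrices are the subject of the paper and are not reproduced):

    \dot{x}(t) = A\,x(t) + B\,u(t), \qquad
    y(t) = C\,x(t) + D\,u(t),

where, for an isolation system of this kind, the state x would typically collect the payload's relative position/orientation and velocity states, the input u the actuator commands (coil currents), and the output y the measured accelerations and relative positions used by the controller.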
NASA Astrophysics Data System (ADS)
Chehbouni, G.; Goodrich, D.; Kustas, B.; Sorooshian, S.; Shuttleworth, J.; Richter, H.
2008-12-01
The Monsoon'90 Experiment conducted at the USDA-ARS Walnut Gulch Experimental Watershed in southeast Arizona was the start of a long arc of subsequent experiments and research that were larger, longer-term, more international, more interdisciplinary, and led to more direct integration of science for decision making and watershed management. In this era, much of our research and science must be more directly relevant to decision-makers and natural resource managers as they increasingly require sophisticated levels of expert findings and scientific results (e.g. interdisciplinary) to make informed decisions. Significant effort beyond focused, single-disciplinary research is required to conduct the interdisciplinary science typical of large scale field experiments. Even greater effort is required to effectively integrate our research across the physical and ecological sciences for direct use by policy and decision makers. This presentation will provide an overview of the evolution of this arc of experiments and long-term projects into a mature integrated science and decision making program. It will discuss the transition in project focus from science and research for understanding; through science for addressing a need; to integrated science and policy development. At each stage the research conducted became more interdisciplinary, first across abiotic disciplines (hydrology, remote sensing, atmospheric science), then by merging abiotic and biotic disciplines (adding ecology and plant physiology), and finally by further integrating economic and social sciences with policy and decision making for resource management. Lessons learned from this experience will be reviewed with the intent of providing guidance to ensure that the resulting research is socially and scientifically relevant and will not only result in cutting edge science but will also directly address the needs of policy makers and resource managers.
NASA Technical Reports Server (NTRS)
Valinia, Azita; Moe, Rud; Seery, Bernard D.; Mankins, John C.
2013-01-01
We present a concept for an ISS-based optical system assembly demonstration designed to advance technologies related to future large in-space optical facilities deployment, including space solar power collectors and large-aperture astronomy telescopes. The large solar power collector problem is not unlike the large astronomical telescope problem, but at least conceptually it should be easier in principle, given the tolerances involved. We strive in this application to leverage heavily the work done on the NASA Optical Testbed Integration on ISS Experiment (OpTIIX) effort to erect a 1.5 m imaging telescope on the International Space Station (ISS). Specifically, we examine a robotic assembly sequence for constructing a large (meter diameter) slightly aspheric or spherical primary reflector, comprised of hexagonal mirror segments affixed to a lightweight rigidizing backplane structure. This approach, together with a structured robot assembler, will be shown to be scalable to the area and areal densities required for large-scale solar concentrator arrays.
The NRAO Observing for University Classes Program
NASA Astrophysics Data System (ADS)
Cannon, John M.; Van Moorsel, Gustaaf A.
2017-01-01
The NRAO "Observing for University Classes" program is a tremendous resource for instructors of courses in observational astronomy. As a service to the astronomical and educational communities, the NRAO offers small amounts of observing time on the Very Large Array (VLA) and the Very Long Baseline Array to such instructors. The data can be used by students and faculty to demonstrate radio astronomy theory with modern data products. Further, the results may lead to publication; this is a unique opportunity for faculty members to integrate research into the classroom. Previous experience with NRAO facilities is required for instructors; individuals without radio astronomy experience can take advantage of other NRAO educational opportunities (e.g., the Synthesis Imaging Workshop) prior to using the program. No previous experience with radio astronomy data is required for students; this is the primary target audience of the program. To demonstrate concept, this poster describes three different VLA observing programs that have been completed using the "Observing for University Classes" resource at Macalester College; undergraduate students have published the results of all three of these programs. Other recent "Observing for University Classes" programs are also described.
Low-background Gamma Spectroscopy at Sanford Underground Laboratory
NASA Astrophysics Data System (ADS)
Chiller, Christopher; Alanson, Angela; Mei, Dongming
2014-03-01
Rare-event physics experiments require the use of material with unprecedented radio-purity. Low background counting assay capabilities and detectors are critical for determining the sensitivity of the planned ultra-low background experiments. A low-background counting (LBC) facility has been built at the 4850-Level Davis Campus of the Sanford Underground Research Facility to perform screening of material and detector parts. Like many rare event physics experiments, our LBC uses lead shielding to mitigate background radiation. Corrosion of lead brick shielding in subterranean installations creates radon plate-out potential as well as human risks of ingestible or respirable lead compounds. Our LBC facilities employ an exposed lead shield requiring clean, smooth surfaces. A cleaning process of low-activity silica sand blasting and borated paraffin hot-coating preservation was employed to guard against corrosion due to chemical and biological exposures. The resulting lead shield maintains low background contribution integrity while fully encapsulating the lead surface. We report the performance of the current LBC and a plan to develop a large germanium well detector for PMT screening. Support provided by the SD Governor's Research Center CUBED, NSF PHY-0758120, and Sanford Lab.
Exaggerated risk: prospect theory and probability weighting in risky choice.
Kusev, Petko; van Schaik, Paul; Ayton, Peter; Dent, John; Chater, Nick
2009-11-01
In 5 experiments, we studied precautionary decisions in which participants decided whether or not to buy insurance with specified cost against an undesirable event with specified probability and cost. We compared the risks taken for precautionary decisions with those taken for equivalent monetary gambles. Fitting these data to Tversky and Kahneman's (1992) prospect theory, we found that the weighting function required to model precautionary decisions differed from that required for monetary gambles. This result indicates a failure of the descriptive invariance axiom of expected utility theory. For precautionary decisions, people overweighted small, medium-sized, and moderately large probabilities: they exaggerated risks. This effect is not anticipated by prospect theory or experience-based decision research (Hertwig, Barron, Weber, & Erev, 2004). We found evidence that exaggerated risk is caused by the accessibility of events in memory: The weighting function varies as a function of the accessibility of events. This suggests that people's experiences of events leak into decisions even when risk information is explicitly provided. Our findings highlight a need to investigate how variation in decision content produces variation in preferences for risk.
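For context, the probability weighting function usually fitted in this framework is the one-parameter form from Tversky and Kahneman (1992), quoted here as the standard formulation rather than as the specific function estimated in this study:

    w(p) = \frac{p^{\gamma}}{\left(p^{\gamma} + (1-p)^{\gamma}\right)^{1/\gamma}},

where γ < 1 yields the familiar inverse-S shape that overweights small probabilities and underweights moderate and large ones. The finding summarized above amounts to saying that the w(p) needed to fit precautionary (insurance) decisions is more elevated than the one needed to fit equivalent monetary gambles, with probabilities overweighted across a wide range.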
Improving bed turnover time with a bed management system.
Tortorella, Frank; Ukanowicz, Donna; Douglas-Ntagha, Pamela; Ray, Robert; Triller, Maureen
2013-01-01
Efficient patient throughput requires a high degree of coordination and communication. Opportunities abound to improve the patient experience by eliminating waste from the process and improving communication among the multiple disciplines involved in facilitating patient flow. In this article, we demonstrate how an interdisciplinary team at a large tertiary cancer center implemented an electronic bed management system to improve the bed turnover component of the patient throughput process.
2006-05-01
a significant design project that requires development of a large-scale software project. A distinct shortcoming of Purdue ECE... 18-540: Rapid Prototyping of Computer Systems. This is a project-oriented course which will deal with all four aspects of project development; the... instructors, will develop specifications for a mobile computer to assist in inspection and maintenance. The application will be partitioned
Large Tunable Delays in Fiber and On-Chip Via Conversion/Dispersion
2013-05-01
Venkataraman, Pablo Londero, and Alexander L. Gaeta, School of Applied and Engineering Physics, Cornell University, Ithaca, New York 14853, USA. Physical Review A 81, 053825 (2010). ...broadening occurs when fast-moving particles rapidly traverse the light... experiments conducted in bulk vapor cells... require the two
Women in Combat: Attitudes and Experiences of U.S. Military Officers and Enlisted Personnel
2001-12-01
response rate, but it also requires personnel to choose an answer that may not be entirely accurate (Edwards, Thomas, Rosenfeld, & Booth-Kewley, 1997, p. ...) ...question. The surveys were created using guidance largely from Edwards, Thomas, Rosenfeld, and Booth-Kewley (1997), and included slightly modified... Booth-Kewley, S. (1997). How to conduct organizational surveys: A step-by-step guide. Thousand Oaks, CA: Sage Publications. Eitelberg, M
1989-02-03
known that the large majority of neurons in layers III, IV and VI receive direct monosynaptic input from the lateral geniculate nucleus (Toyama et al., 1974; Ferster and Lindstrom, 1983; Martin, 1987). The receptive fields of lateral geniculate nucleus (LGN) neurons resemble those of retinal ganglion... the lateral geniculate nucleus only. The second stage of the theoretical analysis requires that relevant intracortical connections be incorporated
Effects of motivation on car-following
NASA Technical Reports Server (NTRS)
Boesser, T.
1982-01-01
Speed and distance control by automobile drivers is described best by linear models when the leading vehicle's speed varies randomly and when the driver is motivated to keep a large distance. A car-following experiment required subjects to follow at a 'safe' or at a 'close' distance. The transfer characteristics of the driver were extended by 1 octave when following 'closely'. Nonlinear properties of drivers' control movements are assumed to reflect different motivation-dependent control strategies.
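A standard linear (stimulus-response) car-following model of the kind referred to here relates the follower's acceleration to the speed difference with the lead vehicle after a reaction delay; this is the generic textbook form, not necessarily the exact model identified in the experiment:

    a_f(t + T) = \lambda \left[ v_\ell(t) - v_f(t) \right],

where a_f is the follower's acceleration, v_ℓ and v_f are the leader's and follower's speeds, T is a reaction-time delay, and λ is a sensitivity gain. Motivation-dependent behavior such as 'close' versus 'safe' following would show up as changes in λ and T, or as departures from this linear form.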
Research on computer-aided design of modern marine power systems
NASA Astrophysics Data System (ADS)
Ding, Dongdong; Zeng, Fanming; Chen, Guojun
2004-03-01
To make the MPS (Marine Power System) design process easier and more economical, a new CAD scheme is put forward which takes advantage of VR (Virtual Reality) and AI (Artificial Intelligence) technologies. This CAD system can shorten the design period and greatly reduce the demands on designers' experience. Some key issues, such as the selection of hardware and software for such a system, are also discussed.
A Pipeline Software Architecture for NMR Spectrum Data Translation
Ellis, Heidi J.C.; Weatherby, Gerard; Nowling, Ronald J.; Vyas, Jay; Fenwick, Matthew; Gryk, Michael R.
2012-01-01
The problem of formatting data so that it conforms to the required input for scientific data processing tools pervades scientific computing. The CONNecticut Joint University Research Group (CONNJUR) has developed a data translation tool based on a pipeline architecture that partially solves this problem. The CONNJUR Spectrum Translator supports data format translation for experiments that use Nuclear Magnetic Resonance to determine the structure of large protein molecules. PMID:24634607
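The pipeline idea is that each translation step consumes a stream of records and yields a transformed stream, so stages can be composed freely. The sketch below is a minimal illustration in Python with hypothetical stage names and a toy key-value format; it shows the architectural pattern only, not the actual CONNJUR Spectrum Translator code (which handles binary NMR spectrum formats):

    # Minimal pipeline sketch (hypothetical stages and toy format; not CONNJUR code).
    def read_source(lines):
        # Parse "key = value" lines into record dictionaries.
        for line in lines:
            key, _, value = line.partition("=")
            yield {"key": key.strip(), "value": value.strip()}

    def normalize_keys(records):
        # Example transform stage: canonicalize parameter names.
        for rec in records:
            yield {**rec, "key": rec["key"].lower()}

    def write_target(records):
        # Serialize records into the target format.
        for rec in records:
            yield f'{rec["key"]}: {rec["value"]}'

    def run_pipeline(data, stages):
        for stage in stages:
            data = stage(data)
        return list(data)

    print(run_pipeline(["SweepWidth = 8000", "Nucleus = 1H"],
                       [read_source, normalize_keys, write_target]))
    # -> ['sweepwidth: 8000', 'nucleus: 1H']

Because each stage depends only on the record stream, adding support for a new input or output format means writing one new reader or writer rather than a converter for every format pair.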
Haze production in the atmospheres of super-Earths and mini-Neptunes: Insight from PHAZER lab
NASA Astrophysics Data System (ADS)
Horst, Sarah; He, Chao; Kempton, Eliza; Moses, Julianne I.; Vuitton, Veronique; Lewis, Nikole
2017-10-01
Super-Earths and mini-Neptunes (~1.2-3 Earth radii) comprise a large fraction of planets in the universe and TESS (Transiting Exoplanet Survey Satellite) will increase the number that are amenable to atmospheric characterization with observatories like JWST (James Webb Space Telescope). These atmospheres should span a large range of temperature and atmospheric composition phase space, with no solar system analogues. Interpretation of current and future atmospheric observations of super-Earths and mini-Neptunes requires additional knowledge about atmospheric chemistry and photochemical haze production. We have experimentally investigated haze formation for H2, H2O, and CO2 dominated atmospheres (100x, 1000x, and 10000x solar metallicity) for a range of temperatures (300 K, 400 K, and 600 K) using the PHAZER (Planetary Haze Research) experiment at Johns Hopkins University. This is a necessary step in understanding which, if any, super-Earths and mini-Neptunes possess the conditions required for efficient production of photochemical haze in their atmospheres. We find that the production rates vary over a few orders of magnitudes with some higher than our nominal Titan experiments. We therefore expect that planets in this temperature and atmospheric composition phase space will exhibit a range of particle concentrations and some may be as hazy as Titan.
NASA Astrophysics Data System (ADS)
Guerra, J. C.; Brusa, G.; Christou, J.; Miller, D.; Ricardi, A.; Xompero, M.; Briguglio, R.; Wagner, M.; Lefebvre, M.; Sosa, R.
2013-09-01
The Large Binocular Telescope (LBT) is unique in that it is currently the only large telescope (2 x 8.4 m primary mirrors) with permanently mounted adaptive secondary mirrors (ASMs). These ASMs have been used for regular observing since early 2010 on the right side and since late 2011 on the left side. They are currently used regularly for seeing-limited observing as well as for selective diffraction-limited observing and are required to be fully operational every observing night. By comparison, the other telescopes using ASMs, the Multiple Mirror Telescope (MMT) and more recently Magellan, use fixed secondaries for seeing-limited observing and switch in the ASMs for diffraction-limited observing. We will discuss the night-to-night operational requirements for ASMs, specifically for seeing-limited but also for diffraction-limited observations, based on the LBT experience. These will include preparation procedures for observing (mirror flattening and resting as examples); hardware failure statistics and how to deal with them, such as for the actuators; observing protocols; and current limitations of use due to the ASM technology, such as the minimum elevation limit (25 degrees) and the hysteresis of the gravity-vector induced astigmatism. We will also discuss the impact of ASM maintenance and preparation
Mohammed, Yassene; Percy, Andrew J; Chambers, Andrew G; Borchers, Christoph H
2015-02-06
Multiplexed targeted quantitative proteomics typically utilizes multiple reaction monitoring and allows the optimized quantification of a large number of proteins. One challenge, however, is the large amount of data that needs to be reviewed, analyzed, and interpreted. Different vendors provide software for their instruments, which determine the recorded responses of the heavy and endogenous peptides and perform the response-curve integration. Bringing multiplexed data together and generating standard curves is often an off-line step accomplished, for example, with spreadsheet software. This can be laborious, as it requires determining the concentration levels that meet the required accuracy and precision criteria in an iterative process. We present here a computer program, Qualis-SIS, that generates standard curves from multiplexed MRM experiments and determines analyte concentrations in biological samples. Multiple level-removal algorithms and acceptance criteria for concentration levels are implemented. When used to apply the standard curve to new samples, the software flags each measurement according to its quality. From the user's perspective, the data processing is instantaneous due to the reactivity paradigm used, and the user can download the results of the stepwise calculations for further processing, if necessary. This allows for more consistent data analysis and can dramatically accelerate the downstream data analysis.
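The standard-curve step described above can be illustrated with a short Python sketch; the data values, 1/x weighting scheme, and acceptance rule below are illustrative assumptions and do not reproduce the Qualis-SIS implementation.

```python
# Sketch: fit a weighted linear standard curve (response ratio vs. spiked
# concentration) and flag back-calculated sample concentrations by range.
import numpy as np

def fit_standard_curve(conc, ratio):
    """Weighted (1/x) linear fit of response ratio against concentration."""
    conc, ratio = np.asarray(conc, float), np.asarray(ratio, float)
    slope, intercept = np.polyfit(conc, ratio, 1, w=1.0 / conc)
    return slope, intercept

def flag_samples(ratios, slope, intercept, lloq, uloq):
    """Back-calculate concentrations and flag values outside the curve range."""
    out = []
    for r in ratios:
        c = (r - intercept) / slope
        out.append((c, "ok" if lloq <= c <= uloq else "outside_curve"))
    return out

# Illustrative calibration points (concentration, light/heavy ratio)
slope, intercept = fit_standard_curve([1, 5, 10, 50, 100],
                                      [0.02, 0.11, 0.21, 1.05, 2.02])
print(flag_samples([0.5, 1.5], slope, intercept, lloq=1.0, uloq=100.0))
```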
An Intelligent Tool for Activity Data Collection
Jehad Sarkar, A. M.
2011-01-01
Activity recognition systems using simple and ubiquitous sensors require a large variety of real-world sensor data for not only evaluating their performance but also training the systems for better functioning. However, a tremendous amount of effort is required to setup an environment for collecting such data. For example, expertise and resources are needed to design and install the sensors, controllers, network components, and middleware just to perform basic data collections. It is therefore desirable to have a data collection method that is inexpensive, flexible, user-friendly, and capable of providing large and diverse activity datasets. In this paper, we propose an intelligent activity data collection tool which has the ability to provide such datasets inexpensively without physically deploying the testbeds. It can be used as an inexpensive and alternative technique to collect human activity data. The tool provides a set of web interfaces to create a web-based activity data collection environment. It also provides a web-based experience sampling tool to take the user’s activity input. The tool generates an activity log using its activity knowledge and the user-given inputs. The activity knowledge is mined from the web. We have performed two experiments to validate the tool’s performance in producing reliable datasets. PMID:22163832
Protein Folding Using a Vortex Fluidic Device.
Britton, Joshua; Smith, Joshua N; Raston, Colin L; Weiss, Gregory A
2017-01-01
Essentially all biochemistry and most molecular biology experiments require recombinant proteins. However, large, hydrophobic proteins typically aggregate into insoluble and misfolded species, and are directed into inclusion bodies. Current techniques to fold proteins recovered from inclusion bodies rely on denaturation followed by dialysis or rapid dilution. Such approaches can be time consuming, wasteful, and inefficient. Here, we describe rapid protein folding using a vortex fluidic device (VFD). This process uses mechanical energy introduced into thin films to rapidly and efficiently fold proteins. With the VFD in continuous flow mode, large volumes of protein solution can be processed per day with 100-fold reductions in both folding times and buffer volumes.
All-dielectric three-dimensional broadband Eaton lens with large refractive index range
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yin, Ming; Yong Tian, Xiao, E-mail: leoxyt@mail.xjtu.edu.cn; Ling Wu, Ling
2014-03-03
We propose a method to realize three-dimensional (3D) gradient index (GRIN) devices requiring a large refractive index (RI) range with broadband performance. By combining a non-resonant GRIN woodpile photonic crystal structure in the metamaterial regime with a compound liquid medium, a wide RI range (1–6.32) was achieved flexibly. As a proof of principle for the low-loss and non-dispersive method, a 3D Eaton lens was designed and fabricated based on a 3D printing process. Full-wave simulation and experiment validated its omnidirectional wave-bending effects in a broad bandwidth covering the Ku band (12 GHz–18 GHz).
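For context, the large index range quoted above is what the standard Eaton-lens gradient-index profile n(r) = sqrt(2R/r - 1) demands; the short sketch below evaluates that profile on an assumed truncation grid and is not taken from the paper.

```python
# Sketch: the standard Eaton-lens index profile n(r) = sqrt(2R/r - 1) for r <= R.
# The truncation radius near the centre (where n diverges) is an assumption.
import numpy as np

R = 1.0                          # normalized lens radius
r = np.linspace(0.05, 1.0, 20)   # truncate near the centre, where n diverges
n = np.sqrt(2.0 * R / r - 1.0)
print(f"index range on this grid: {n.min():.2f} to {n.max():.2f}")
```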
Ashayeri, Ebrahim; Omogbehin, Adedamola; Sridhar, Rajagopalan; Shankar, Ravi A.
2002-01-01
More than two-thirds of the patients with osseous metastases experience debilitating bone pain, requiring some form of pain relief. Analgesics are limited in their efficacy. Palliative application of hemi-body external beam radiation therapy in the treatment of multiple osseous metastases also is limited due to toxicity associated with large treatment ports. Intravenous injections of bone-seeking radioisotopes are effective in the palliation of pain with fewer side effects. Forty-one patients with multiple osseous metastases due to prostate and breast cancer were treated with strontium chloride 89 (89Sr) at the department of radiation oncology in a university hospital. A retrospective analysis of these patients indicated that all subjects had severe pain that diminished their quality of life. Most of these patients had multiple co-morbid factors. Many were on opioids, leading to adverse effects such as nausea, constipation, and drowsiness that required additional medication. Objective findings and evaluation of the responses were not always available for all patients. Following treatment with 89Sr, over two-thirds of the patients responded favorably and required lower doses of opioids. PMID:12152927
Readout Electronics for the Central Drift Chamber of the Belle-II Detector
NASA Astrophysics Data System (ADS)
Uchida, Tomohisa; Taniguchi, Takashi; Ikeno, Masahiro; Iwasaki, Yoshihito; Saito, Masatoshi; Shimazaki, Shoichi; Tanaka, Manobu M.; Taniguchi, Nanae; Uno, Shoji
2015-08-01
We have developed readout electronics for the central drift chamber (CDC) of the Belle-II detector. The space near the endplate of the CDC for installation of the electronics was limited by the detector structure. Due to the large amounts of data generated by the CDC, a high-speed data link, with a greater than one gigabit transfer rate, was required to transfer the data to a back-end computer. A new readout module was required to satisfy these requirements. This module processes 48 signals from the CDC, converts them to digital data, and transfers the data directly to the computer. All functions that transfer digital data via the high-speed link were implemented on a single module. We have measured its electrical characteristics and confirmed that the results satisfy the requirements of the Belle-II experiment.
Synesthetic art through 3-D projection: The requirements of a computer-based supermedium
NASA Technical Reports Server (NTRS)
Mallary, Robert
1989-01-01
A computer-based form of multimedia art is proposed that uses the computer to fuse aspects of painting, sculpture, dance, music, film, and other media into a one-to-one synesthesia of image and sound for spatially synchronous 3-D projection. Called synesthetic art, this conversion of many varied media into an aesthetically unitary experience determines the character and requirements of the system and its software. During the start-up phase, computer stereographic systems are unsuitable for software development. Eventually, a new type of illusory-projective supermedium will be required to achieve the needed combination of large-format projection and convincing real-life presence, and to handle the vast amount of 3-D visual and acoustic information required. The influence of the concept on the author's research and creative work is illustrated through two examples.
NASA Astrophysics Data System (ADS)
Cummings, Bill
2004-03-01
Physicists possess many skills highly valued in industrial companies. However, with the exception of a decreasing number of positions in long-range research at large companies, job openings in industry rarely say "Physicist Required." One key to a successful industrial career is to know what subset of your physics skills is most highly valued by a given industry and to continue to build these skills while working. This combination of skills from both academic and industrial experience becomes your "Industrial Physics Toolkit" and is a transferable resource when you change positions or companies. This presentation will describe how to build and sell your own "Industrial Physics Toolkit" using concrete examples from the speaker's industrial experience.
Recent "Ground Testing" Experiences in the National Full-Scale Aerodynamics Complex
NASA Technical Reports Server (NTRS)
Zell, Peter; Stich, Phil; Sverdrup, Jacobs; George, M. W. (Technical Monitor)
2002-01-01
The large test sections of the National Full-scale Aerodynamics Complex (NFAC) wind tunnels provide ideal controlled wind environments to test ground-based objects and vehicles. Though this facility was designed and provisioned primarily for aeronautical testing requirements, several experiments have been designed to utilize existing model mount structures to support "non-flying" systems. This presentation will discuss some of the ground-based testing capabilities of the facility and provide examples of ground-based tests conducted in the facility to date. It will also address some future work envisioned and solicit input from the SATA membership on ways to improve the service that NASA makes available to customers.
Ray Meta: scalable de novo metagenome assembly and profiling
2012-01-01
Voluminous parallel sequencing datasets, especially metagenomic experiments, require distributed computing for de novo assembly and taxonomic profiling. Ray Meta is a massively distributed metagenome assembler that is coupled with Ray Communities, which profiles microbiomes based on uniquely-colored k-mers. It can accurately assemble and profile a three billion read metagenomic experiment representing 1,000 bacterial genomes of uneven proportions in 15 hours with 1,024 processor cores, using only 1.5 GB per core. The software will facilitate the processing of large and complex datasets, and will help in generating biological insights for specific environments. Ray Meta is open source and available at http://denovoassembler.sf.net. PMID:23259615
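The uniquely-colored k-mer idea used by Ray Communities can be illustrated with a small single-machine Python sketch; the genome sequences and parameter values are placeholders, and the real software performs this in a massively distributed fashion.

```python
# Sketch: k-mers from reference genomes are tagged with a colour (genome ID);
# a read is attributed to a genome only if its matching k-mers carry a single,
# unique colour. Purely illustrative, single-machine version.
from collections import defaultdict

def kmers(seq, k=21):
    return (seq[i:i + k] for i in range(len(seq) - k + 1))

def build_colored_index(genomes, k=21):
    index = defaultdict(set)                  # k-mer -> set of colours
    for genome_id, seq in genomes.items():
        for km in kmers(seq, k):
            index[km].add(genome_id)
    return index

def classify_read(read, index, k=21):
    colours = set()
    for km in kmers(read, k):
        colours |= index.get(km, set())
    return colours.pop() if len(colours) == 1 else None   # unique colour only

genomes = {"genomeA": "ACGT" * 30, "genomeB": "TTGCA" * 24}   # toy references
index = build_colored_index(genomes)
print(classify_read("ACGT" * 8, index))       # -> "genomeA"
```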
From specific examples to general knowledge in language learning.
Tamminen, Jakke; Davis, Matthew H; Rastle, Kathleen
2015-06-01
The extraction of general knowledge from individual episodes is critical if we are to learn new knowledge or abilities. Here we uncover some of the key cognitive mechanisms that characterise this process in the domain of language learning. In five experiments adult participants learned new morphological units embedded in fictitious words created by attaching new affixes (e.g., -afe) to familiar word stems (e.g., "sleepafe is a participant in a study about the effects of sleep"). Participants' ability to generalise semantic knowledge about the affixes was tested using tasks requiring the comprehension and production of novel words containing a trained affix (e.g., sailafe). We manipulated the delay between training and test (Experiment 1), the number of unique exemplars provided for each affix during training (Experiment 2), and the consistency of the form-to-meaning mapping of the affixes (Experiments 3-5). In a task where speeded online language processing is required (semantic priming), generalisation was achieved only after a memory consolidation opportunity following training, and only if the training included a sufficient number of unique exemplars. Semantic inconsistency disrupted speeded generalisation unless consolidation was allowed to operate on one of the two affix-meanings before introducing inconsistencies. In contrast, in tasks that required slow, deliberate reasoning, generalisation could be achieved largely irrespective of the above constraints. These findings point to two different mechanisms of generalisation that have different cognitive demands and rely on different types of memory representations. Copyright © 2015 The Authors. Published by Elsevier Inc. All rights reserved.
Checking a Conceptual Model for Groundwater Flow in the Fractured Rock at Äspö, Sweden
NASA Astrophysics Data System (ADS)
Kröhn, K. P.
2015-12-01
The underground Hard Rock Laboratory (HRL) at Äspö, Sweden, is located in granitic rock and dedicated to investigations concerning deep geological disposal of radioactive waste. Several in-situ experiments have been performed in the HRL, among them the recent Buffer-Rock Interaction Experiment (BRIE) and, on a much larger scale, the long-term Prototype Repository (PR) experiment. Interpretation of such experiments requires a profound understanding of the groundwater flow system. Often assumed is a conceptual model where the so-called "intact rock" is interspersed with stochastically distributed fractures. It is also a common assumption, though, that fractures in granite exist on all length-scales implying that the hydraulically relevant rock porosity is basically made up of micro fractures. The conceptual approach of GRS' groundwater flow code d3f thus appeared to be fitting where large fractures are represented discretely by lower-dimensional features while the remaining set of smaller fractures - also called "background fractures" - is assumed to act like an additional homogeneous continuum besides what is believed to be the undisturbed matrix. This approach was applied to a hydraulic model of the BRIE in a cube-like domain of 40 m side length including drifts, boreholes and three intersecting large fractures. According to observations at the underground rock laboratories Stripa and the HRL a narrow zone of reduced permeability - called "skin" - was additionally arranged around all geotechnical openings. Calibration of the model resulted in a considerable increase of matrix permeability due to adding the effect of the background fractures. To check the validity of this approach the calibrated data for the BRIE were applied to a model for the PR which is also located in the HRL but at quite some distance. The related brick-shaped model domain has a size of 200 m x 150 m x 50 m. Fitting the calculated outflow from the rock to the measured outflow distribution along the PR-tunnel and the outflow into the six "deposition boreholes" nevertheless required only a moderate modification of the initially used permeabilities. By and large the chosen approach for the BRIE can thus be considered to have been successfully transferred to the PR.
Kappler, Ulrike; Rowland, Susan L; Pedwell, Rhianna K
2017-05-01
Systems biology is frequently taught with an emphasis on mathematical modeling approaches. This focus effectively excludes most biology, biochemistry, and molecular biology students, who are not mathematics majors. The mathematical focus can also present a misleading picture of systems biology, which is a multi-disciplinary pursuit requiring collaboration between biochemists, bioinformaticians, and mathematicians. This article describes an authentic large-scale undergraduate research experience (ALURE) in systems biology that incorporates proteomics, bacterial genomics, and bioinformatics in the one exercise. This project is designed to engage students who have a basic grounding in protein chemistry and metabolism and no mathematical modeling skills. The pedagogy around the research experience is designed to help students attack complex datasets and use their emergent metabolic knowledge to make meaning from large amounts of raw data. On completing the ALURE, participants reported a significant increase in their confidence around analyzing large datasets, while the majority of the cohort reported good or great gains in a variety of skills including "analysing data for patterns" and "conducting database or internet searches." An environmental scan shows that this ALURE is the only undergraduate-level systems-biology research project offered on a large scale in Australia; this speaks to the perceived difficulty of implementing such an opportunity for students. We argue, however, that based on the student feedback, allowing undergraduate students to complete a systems-biology project is both feasible and desirable, even if the students are not maths and computing majors. © 2016 by The International Union of Biochemistry and Molecular Biology, 45(3):235-248, 2017. © 2016 The International Union of Biochemistry and Molecular Biology.
NASA Astrophysics Data System (ADS)
Kurucz, Charles N.; Waite, Thomas D.; Otaño, Suzana E.; Cooper, William J.; Nickelsen, Michael G.
2002-11-01
The effectiveness of using high energy electron beam irradiation for the removal of toxic organic chemicals from water and wastewater has been demonstrated by commercial-scale experiments conducted at the Electron Beam Research Facility (EBRF) located in Miami, Florida and elsewhere. The EBRF treats various waste and water streams up to 450 l min -1 (120 gal min -1) with doses up to 8 kilogray (kGy). Many experiments have been conducted by injecting toxic organic compounds into various plant feed streams and measuring the concentrations of compound(s) before and after exposure to the electron beam at various doses. Extensive experimentation has also been performed by dissolving selected chemicals in 22,700 l (6000 gal) tank trucks of potable water to simulate contaminated groundwater, and pumping the resulting solutions through the electron beam. These large-scale experiments, although necessary to demonstrate the commercial viability of the process, require a great deal of time and effort. This paper compares the results of large-scale electron beam irradiations to those obtained from bench-scale irradiations using gamma rays generated by a 60Co source. Dose constants from exponential contaminant removal models are found to depend on the source of radiation and initial contaminant concentration. Possible reasons for observed differences such as a dose rate effect are discussed. Models for estimating electron beam dose constants from bench-scale gamma experiments are presented. Data used to compare the removal of organic compounds using gamma irradiation and electron beam irradiation are taken from the literature and a series of experiments designed to examine the effects of pH, the presence of turbidity, and initial concentration on the removal of various organic compounds (benzene, toluene, phenol, PCE, TCE and chloroform) from simulated groundwater.
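The exponential contaminant removal model referred to above, C(D) = C0·exp(-k·D), can be fitted to dose-response data as in the following Python sketch; the dose and concentration values are illustrative, not data from the study.

```python
# Sketch: estimate a dose constant k from an exponential removal model
# C(D) = C0 * exp(-k * D). All numerical values are illustrative.
import numpy as np
from scipy.optimize import curve_fit

def removal(dose_kGy, c0, k):
    return c0 * np.exp(-k * dose_kGy)

dose = np.array([0.0, 1.0, 2.0, 4.0, 8.0])        # absorbed dose, kGy
conc = np.array([100.0, 55.0, 31.0, 9.5, 1.1])    # contaminant concentration, ug/L

(c0, k), _ = curve_fit(removal, dose, conc, p0=(conc[0], 0.5))
print(f"dose constant k = {k:.2f} 1/kGy, dose for 90% removal = {np.log(10)/k:.2f} kGy")
```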
Novel diamond cells for neutron diffraction using multi-carat CVD anvils
Boehler, R.; Molaison, J. J.; Haberl, B.
2017-08-17
Traditionally, neutron diffraction at high pressure has been severely limited in pressure because low neutron flux required large sample volumes and therefore large-volume presses. At the high-flux Spallation Neutron Source at the Oak Ridge National Laboratory, we have developed new, large-volume diamond anvil cells for neutron diffraction. The main features of these cells are multi-carat, single-crystal chemical vapor deposition diamonds, very large diffraction apertures, and gas membranes to accommodate pressure stability, especially upon cooling. A new cell has been tested for diffraction up to 40 GPa with an unprecedented sample volume of ~0.15 mm³. High-quality spectra were obtained in 1 h for crystalline Ni and in ~8 h for disordered glassy carbon. These new techniques will open the way for routine megabar neutron diffraction experiments.
Balloon-borne three-meter telescope for far-infrared and submillimeter astronomy
NASA Technical Reports Server (NTRS)
Fazio, Giovanni G.; Hoffmann, William F.; Harper, Doyal A.
1988-01-01
The scientific objectives, engineering analysis and design, results of technology development, and focal-plane instrumentation for a two-meter balloon-borne telescope for far-infrared and submillimeter astronomy are presented. The unique capabilities of balloon-borne observations are discussed. A program summary emphasizes the development of the two-meter design. The relationship of the Large Deployable Reflector (LDR) is also discussed. Detailed treatment is given to scientific objectives, gondola design, the mirror development program, experiment accommodations, ground support equipment requirements, NSBF design drivers and payload support requirements, the implementation phase summary development plan, and a comparison of three-meter and two-meter gondola concepts.
Evolution of user analysis on the grid in ATLAS
NASA Astrophysics Data System (ADS)
Dewhurst, A.; Legger, F.; ATLAS Collaboration
2017-10-01
More than one thousand physicists analyse data collected by the ATLAS experiment at the Large Hadron Collider (LHC) at CERN through 150 computing facilities around the world. Efficient distributed analysis requires optimal resource usage and the interplay of several factors: robust grid and software infrastructures, and system capability to adapt to different workloads. The continuous automatic validation of grid sites and the user support provided by a dedicated team of expert shifters have been proven to provide a solid distributed analysis system for ATLAS users. Typical user workflows on the grid, and their associated metrics, are discussed. Measurements of user job performance and typical requirements are also shown.
An Electrostatic Precipitator System for the Martian Environment
NASA Technical Reports Server (NTRS)
Calle, C. I.; Mackey, P. J.; Hogue, M. D.; Johansen, M. R.; Phillips, J. R., III; Clements, J. S.
2012-01-01
Human exploration missions to Mars will require the development of technologies for the utilization of the planet's own resources for the production of commodities. However, the Martian atmosphere contains large amounts of dust. The extraction of commodities from this atmosphere requires prior removal of this dust. We report on our development of an electrostatic precipitator able to collect Martian simulated dust particles in atmospheric conditions approaching those of Mars. Extensive experiments with an initial prototype in a simulated Martian atmosphere showed efficiencies of 99%. The design of a second prototype that uses aerosolized Martian simulated dust in a flow-through configuration is described. Keywords: Space applications, electrostatic precipitator, particle control, particle charging
Characteristic analysis and simulation for polysilicon comb micro-accelerometer
NASA Astrophysics Data System (ADS)
Liu, Fengli; Hao, Yongping
2008-10-01
A high force update rate is a key factor for achieving high-performance haptic rendering, which imposes a stringent real-time requirement on the execution environment of the haptic system. This requirement confines the haptic system to simplified environments in order to reduce the computational cost of haptic rendering algorithms. In this paper, we present a novel "hyper-threading" architecture consisting of several threads for haptic rendering. The high force update rate is achieved with a relatively large computation time interval for each haptic loop. The proposed method was tested and proved to be effective in experiments on a virtual-wall prototype haptic system via the Delta Haptic Device.
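The thread-decoupling idea behind such an architecture can be sketched as follows; the class names, loop rates, and penalty-based wall force are invented for illustration and do not reproduce the authors' system.

```python
# Sketch: a fast force-update loop runs in its own thread while a slower
# simulation thread refreshes the shared environment state.
import threading
import time

class SharedState:
    def __init__(self):
        self.lock = threading.Lock()
        self.wall_penetration = 0.0     # updated slowly by the simulation, m
        self.stiffness = 1000.0         # virtual wall stiffness, N/m

def haptic_loop(state, stop, rate_hz=1000):
    """Fast loop: compute and (conceptually) send the device force."""
    while not stop.is_set():
        with state.lock:
            force = state.stiffness * max(0.0, state.wall_penetration)
        _ = force                       # device output omitted in this sketch
        time.sleep(1.0 / rate_hz)

def simulation_loop(state, stop, rate_hz=30):
    """Slow loop: update the environment state."""
    while not stop.is_set():
        with state.lock:
            state.wall_penetration += 0.0001
        time.sleep(1.0 / rate_hz)

stop, state = threading.Event(), SharedState()
threads = [threading.Thread(target=haptic_loop, args=(state, stop)),
           threading.Thread(target=simulation_loop, args=(state, stop))]
for t in threads:
    t.start()
time.sleep(0.1)
stop.set()
for t in threads:
    t.join()
```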
An assessment of future computer system needs for large-scale computation
NASA Technical Reports Server (NTRS)
Lykos, P.; White, J.
1980-01-01
Data ranging from specific computer capability requirements to opinions about the desirability of a national computer facility are summarized. It is concluded that considerable attention should be given to improving the user-machine interface. Otherwise, increased computer power may not improve the overall effectiveness of the machine user. Significant improvement in throughput requires highly concurrent systems plus the willingness of the user community to develop problem solutions for that kind of architecture. An unanticipated result was the expression of need for an on-going cross-disciplinary users group/forum in order to share experiences and to more effectively communicate needs to the manufacturers.
Data management for interdisciplinary field experiments: OTTER project support
NASA Technical Reports Server (NTRS)
Angelici, Gary; Popovici, Lidia; Skiles, J. W.
1993-01-01
The ability of investigators of an interdisciplinary science project to properly manage the data that are collected during the experiment is critical to the effective conduct of science. When the project becomes large, possibly including several scenes of large-format remotely sensed imagery shared by many investigators requiring several services, the data management effort can involve extensive staff and computerized data inventories. The OTTER (Oregon Transect Ecosystem Research) project was supported by the PLDS (Pilot Land Data System) with several data management services, such as data inventory, certification, and publication. After a brief description of these services, experiences in providing them are compared with earlier data management efforts and some conclusions regarding data management in support of interdisciplinary science are discussed. In addition to providing these services, a major goal of this data management capability was to adopt characteristics of a pro-active attitude, such as flexibility and responsiveness, believed to be crucial for the effective conduct of active, interdisciplinary science. These are also itemized and compared with previous data management support activities. Identifying and improving these services and characteristics can lead to the design and implementation of optimal data management support capabilities, which can result in higher quality science and data products from future interdisciplinary field experiments.
The Cadarache negative ion experiments
DOE Office of Scientific and Technical Information (OSTI.GOV)
Massmann, P.; Bottereau, J.M.; Belchenko, Y.
1995-12-31
Up to energies of 140 keV, neutral beam injection (NBI) based on positive ions has proven to be a reliable and flexible plasma heating method and has provided major contributions to most of the important experiments on virtually all large tokamaks around the world. As a candidate for additional heating and current drive on next-step fusion machines (ITER ao) it is hoped that NBI can be equally successful. The ITER NBI parameters of 1 MeV, 50 MW D° demand primary D⁻ beams with current densities of at least 15 mA/cm². Although considerable progress has been made in the area of negative ion production and acceleration, the high demands still require substantial and urgent development. Regarding negative ion production, Cs-seeded plasma sources lead the way. Adding a small amount of Cs to the discharge (Cs seeding) not only increases the negative ion yield by a factor 3-5 but also has the advantage that the discharge can be run at lower pressures. This is beneficial for the reduction of stripping losses in the accelerator. Multi-ampere negative ion production in a large plasma source is studied in the MANTIS experiment. Acceleration and neutralization at ITER-relevant parameters is the objective of the 1 MV SINGAP experiment.
Ultrastable, Zerodur-based optical benches for quantum gas experiments.
Duncker, Hannes; Hellmig, Ortwin; Wenzlawski, André; Grote, Alexander; Rafipoor, Amir Jones; Rafipoor, Mona; Sengstock, Klaus; Windpassinger, Patrick
2014-07-10
Operating ultracold quantum gas experiments outside of a laboratory environment has so far been a challenging goal, largely due to the lack of sufficiently stable optical systems. In order to increase the thermal stability of free-space laser systems, the application of nonstandard materials such as glass ceramics is required. Here, we report on Zerodur-based optical systems which include single-mode fiber couplers consisting of multiple components jointed by light-curing adhesives. The thermal stability is thoroughly investigated, revealing excellent fiber-coupling efficiencies between 0.85 and 0.92 in the temperature range from 17°C to 36°C. In conjunction with successfully performed vibration tests, these findings qualify our highly compact systems for atom interferometry experiments aboard a sounding rocket as well as various other quantum information and sensing applications.
Timing, sequencing, and executive control in repetitive movement production.
Krampe, Ralf Th; Mayr, Ulrich; Kliegl, Reinhold
2005-06-01
The authors demonstrate that the timing and sequencing of target durations require low-level timing and executive control. Sixteen young (M_age = 19 years) and 16 older (M_age = 70 years) adults participated in 2 experiments. In Experiment 1, individual mean-variance functions for low-level timing (isochronous tapping) and the sequencing of multiple targets (rhythm production) revealed (a) a dissociation of low-level timing and sequencing in both age groups, (b) negligible age differences for low-level timing, and (c) large age differences for sequencing. Experiment 2 supported the distinction between low-level timing and executive functions: Selection against a dominant rhythm and switching between rhythms impaired performances in both age groups and induced pronounced perseveration of the dominant pattern in older adults. ((c) 2005 APA, all rights reserved).
Carasatorre, M; Ramírez-Amaya, V; Díaz Cintra, S
2016-10-01
Long-lasting memory formation requires that groups of neurons processing new information develop the ability to reproduce the patterns of neural activity acquired by experience. Changes in synaptic efficiency let neurons organise to form ensembles that repeat certain activity patterns again and again. Among other changes in synaptic plasticity, structural modifications tend to be long-lasting which suggests that they underlie long-term memory. There is a large body of evidence supporting that experience promotes changes in the synaptic structure, particularly in the hippocampus. Structural changes to the hippocampus may be functionally implicated in stabilising acquired memories and encoding new information. Copyright © 2012 Sociedad Española de Neurología. Publicado por Elsevier España, S.L.U. All rights reserved.
Quality Assurance on Undoped CsI Crystals for the Mu2e Experiment
NASA Astrophysics Data System (ADS)
Atanov, N.; Baranov, V.; Budagov, J.; Davydov, Yu. I.; Glagolev, V.; Tereshchenko, V.; Usubov, Z.; Cervelli, F.; Di Falco, S.; Donati, S.; Morescalchi, L.; Pedreschi, E.; Pezzullo, G.; Raffaelli, F.; Spinella, F.; Colao, F.; Cordelli, M.; Corradi, G.; Diociaiuti, E.; Donghia, R.; Giovannella, S.; Happacher, F.; Martini, M.; Miscetti, S.; Ricci, M.; Saputi, A.; Sarra, I.; Echenard, B.; Hitlin, D. G.; Hu, C.; Miyashita, T.; Porter, F.; Zhang, L.; Zhu, R.-Y.; Grancagnolo, F.; Tassielli, G.; Murat, P.
2018-02-01
The Mu2e experiment is constructing a calorimeter consisting of 1,348 undoped CsI crystals in two disks. Each crystal has dimensions of 34 x 34 x 200 mm and is read out by a large-area silicon PMT array. A series of technical specifications was defined according to physics requirements. Preproduction CsI crystals were procured from three firms: Amcrys, Saint-Gobain and Shanghai Institute of Ceramics. We report the quality assurance on the crystals' scintillation properties and their radiation hardness against ionization dose and neutrons. With a fast decay time of 30 ns and a light output of more than 100 p.e./MeV measured with a bi-alkali PMT, undoped CsI crystals provide a cost-effective solution for the Mu2e experiment.
Sacco Casamassima, Maria Grazia; Goldstein, Seth D; Salazar, Jose H; McIltrot, Kimberly H; Abdullah, Fizan; Colombani, Paul M
2014-04-01
The safety and efficacy of minimally invasive pectus excavatum repair have been demonstrated over the last twenty years. However, technical details and perioperative management strategies continue to be debated. The aim of the present study is to review a large single-institution experience with the modified Nuss procedure. A retrospective review was performed of patients who underwent primary pectus excavatum repair at a single tertiary hospital via a modified Nuss procedure that included: no thoracoscopy, retrosternal dissection achieved via a left-to-right thoracic approach, four-point stabilization of the bar, and no routine epidural analgesia. Data collected included demographics, preoperative symptoms, operative characteristics, hospital charges and postoperative outcomes. A total of 336 pediatric patients were identified. No cardiac perforations occurred and the rate of pericarditis was 0.6%. Contemporary rates of bar displacement have fallen to 1.2%. Routine use of chlorhexidine scrub reduced superficial site infections to 0.7%. Two patients (0.6%) with severe recurrence required reoperation. Bars were removed after an average period of 31.7(SD 13.2) months, with satisfactory cosmetic and functional results in 94.9% of cases. We report here a single-institution large volume experience, including modifications to the Nuss procedure that make the technique simpler and safer, improve results, and minimize hospital charges. Copyright © 2014 Elsevier Inc. All rights reserved.
Computer-intensive simulation of solid-state NMR experiments using SIMPSON.
Tošner, Zdeněk; Andersen, Rasmus; Stevensson, Baltzar; Edén, Mattias; Nielsen, Niels Chr; Vosegaard, Thomas
2014-09-01
Conducting large-scale solid-state NMR simulations requires fast computer software potentially in combination with efficient computational resources to complete within a reasonable time frame. Such simulations may involve large spin systems, multiple-parameter fitting of experimental spectra, or multiple-pulse experiment design using parameter scan, non-linear optimization, or optimal control procedures. To efficiently accommodate such simulations, we here present an improved version of the widely distributed open-source SIMPSON NMR simulation software package adapted to contemporary high performance hardware setups. The software is optimized for fast performance on standard stand-alone computers, multi-core processors, and large clusters of identical nodes. We describe the novel features for fast computation including internal matrix manipulations, propagator setups and acquisition strategies. For efficient calculation of powder averages, we implemented interpolation method of Alderman, Solum, and Grant, as well as recently introduced fast Wigner transform interpolation technique. The potential of the optimal control toolbox is greatly enhanced by higher precision gradients in combination with the efficient optimization algorithm known as limited memory Broyden-Fletcher-Goldfarb-Shanno. In addition, advanced parallelization can be used in all types of calculations, providing significant time reductions. SIMPSON is thus reflecting current knowledge in the field of numerical simulations of solid-state NMR experiments. The efficiency and novel features are demonstrated on the representative simulations. Copyright © 2014 Elsevier Inc. All rights reserved.
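The powder-averaging parallelization mentioned above can be illustrated conceptually in Python; the toy line shape and orientation grid below are assumptions, and this is not SIMPSON code.

```python
# Sketch: a powder spectrum is a weighted sum over crystallite orientations,
# which parallelizes trivially across processes. Toy line shape only.
import numpy as np
from multiprocessing import Pool

FREQ = np.linspace(-10e3, 10e3, 512)          # frequency axis, Hz

def crystallite_spectrum(args):
    """Toy axially symmetric line for one (beta, weight) orientation."""
    beta, weight = args
    shift = 5e3 * 0.5 * (3 * np.cos(beta) ** 2 - 1)
    return weight * np.exp(-((FREQ - shift) / 200.0) ** 2)

def powder_average(n_orientations=256, n_procs=4):
    betas = np.arccos(np.linspace(-1, 1, n_orientations))   # uniform in cos(beta)
    weights = np.full(n_orientations, 1.0 / n_orientations)
    with Pool(n_procs) as pool:
        spectra = pool.map(crystallite_spectrum, zip(betas, weights))
    return np.sum(spectra, axis=0)

if __name__ == "__main__":
    print(powder_average().shape)
```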
Miura, Asako; Kobayashi, Tetsuro
2016-01-01
Though survey satisficing, in which respondents skimp on the cognitive effort required to provide optimal answers in the survey response process, poses a serious threat to the validity of online experiments, a detailed explanation of the mechanism has yet to be established. Focusing on attitudes toward immigrants, we examined the mechanism by which survey satisficing distorts treatment effect estimates in online experiments. We hypothesized that satisficers would display more stereotypical responses than non-satisficers would when presented with stereotype-disconfirming information about an immigrant. Results of two experiments largely supported our hypotheses. Satisficers, whom we identified through an instructional manipulation check (IMC), processed information about immigrants' personality traits congruently with the stereotype activated by the information provided about nationality. The significantly shorter vignette reading time of satisficers corroborates their time-efficient impression formation based on stereotyping. However, the shallow information processing of satisficers can be rectified by alerting them to their inattentiveness through use of a repeated IMC. PMID:27803680
Research and the planned Space Experiment Research and Processing Laboratory
NASA Technical Reports Server (NTRS)
2000-01-01
Original photo and caption dated June 22, 1988: 'A dwarf wheat variety known as Yecoro Rojo flourishes in KSC's Biomass Production Chamber. Researchers are gathering information on the crop's ability to produce food, water and oxygen, and then remove carbon dioxide. The confined quarters associated with space travel require researchers to focus on smaller plants that yield proportionately large amounts of biomass. This wheat crop takes about 85 days to grow before harvest.' Plant experiments such as this are the type of life sciences research that will be conducted at the Space Experiment Research and Processing Laboratory (SERPL). The SERPL is a planned 100,000-square-foot laboratory that will provide expanded and upgraded facilities for hosting International Space Station experiment processing. In addition, it will provide better support for other biological and life sciences payload processing at KSC. It will serve as a magnet facility for a planned 400-acre Space Station Commerce Park.
Experiments to Distribute Map Generalization Processes
NASA Astrophysics Data System (ADS)
Berli, Justin; Touya, Guillaume; Lokhat, Imran; Regnauld, Nicolas
2018-05-01
Automatic map generalization requires the use of computationally intensive processes often unable to deal with large datasets. Distributing the generalization process is the only way to make them scalable and usable in practice. But map generalization is a highly contextual process, and the surroundings of a generalized map feature needs to be known to generalize the feature, which is a problem as distribution might partition the dataset and parallelize the processing of each part. This paper proposes experiments to evaluate the past propositions to distribute map generalization, and to identify the main remaining issues. The past propositions to distribute map generalization are first discussed, and then the experiment hypotheses and apparatus are described. The experiments confirmed that regular partitioning was the quickest strategy, but also the less effective in taking context into account. The geographical partitioning, though less effective for now, is quite promising regarding the quality of the results as it better integrates the geographical context.
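A minimal sketch of the regular-partitioning strategy discussed above is given below; the grid size and the placeholder generalization operator are illustrative assumptions, not the apparatus used in the experiments.

```python
# Sketch: bin features into regular grid cells and generalize each cell in
# parallel, ignoring cross-cell context (the weakness noted in the text).
from concurrent.futures import ProcessPoolExecutor
from collections import defaultdict

def cell_of(point, cell_size=1000.0):
    x, y = point
    return (int(x // cell_size), int(y // cell_size))

def partition(features, cell_size=1000.0):
    cells = defaultdict(list)
    for feat in features:                      # feat = (feature_id, (x, y))
        cells[cell_of(feat[1], cell_size)].append(feat)
    return list(cells.values())

def generalize_cell(cell_features):
    # Placeholder generalization operator: keep every second feature (selection).
    return cell_features[::2]

def distributed_generalization(features, workers=4):
    parts = partition(features)
    with ProcessPoolExecutor(max_workers=workers) as ex:
        results = ex.map(generalize_cell, parts)
    return [f for part in results for f in part]

if __name__ == "__main__":
    pts = [(i, (i * 37.0 % 5000, i * 91.0 % 5000)) for i in range(1000)]
    print(len(distributed_generalization(pts)))
```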
Frog: The fast & realistic OpenGL event displayer
NASA Astrophysics Data System (ADS)
Quertenmont, Loïc
2010-04-01
FROG [1] [2] is a generic framework dedicated to the visualisation of events in high energy physics experiments. It is suitable for any particular physics experiment or detector design. The code is light (< 3 MB) and fast (browsing time ~ 20 events per second for a large High Energy Physics experiment) and can run on various operating systems, as its object-oriented structure (C++) relies on the cross-platform OpenGL [3] and Glut [4] libraries. Moreover, FROG does not require installation of heavy third-party libraries for the visualisation. This document describes the features and principles of FROG version 1.106, its working scheme and numerous functionalities such as: 3D and 2D visualisation, graphical user interface, mouse interface, configuration files, production of pictures in various formats, integration of personal objects, etc. Finally, the application of FROG to physics experiments and environments such as Gastof, CMS, ILD and Delphes is presented for illustration.
ArrayNinja: An Open Source Platform for Unified Planning and Analysis of Microarray Experiments.
Dickson, B M; Cornett, E M; Ramjan, Z; Rothbart, S B
2016-01-01
Microarray-based proteomic platforms have emerged as valuable tools for studying various aspects of protein function, particularly in the field of chromatin biochemistry. Microarray technology itself is largely unrestricted in regard to printable material and platform design, and efficient multidimensional optimization of assay parameters requires fluidity in the design and analysis of custom print layouts. This motivates the need for streamlined software infrastructure that facilitates the combined planning and analysis of custom microarray experiments. To this end, we have developed ArrayNinja as a portable, open source, and interactive application that unifies the planning and visualization of microarray experiments and provides maximum flexibility to end users. Array experiments can be planned, stored to a private database, and merged with the imaged results for a level of data interaction and centralization that is not currently attainable with available microarray informatics tools. © 2016 Elsevier Inc. All rights reserved.
Signal processing of anthropometric data
NASA Astrophysics Data System (ADS)
Zimmermann, W. J.
1983-09-01
The Anthropometric Measurements Laboratory has accumulated a large body of data from a number of previous experiments. The data is very noisy and therefore requires the application of signal processing schemes. Moreover, it was not regarded as time series measurements but as positional information; hence, the data is stored as coordinate points defined by the motion of the human body. The accumulated data defines two groups or classes. Some of the data was collected from an experiment designed to measure the flexibility of the limbs, referred to as radial movement. The remaining data was collected from experiments designed to determine the surface of the reach envelope. An interactive signal processing package was designed and implemented. Since the data does not include time, this package does not include a time series element. At present, processing is restricted to data obtained from the experiments designed to measure flexibility.
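One simple smoothing scheme for such noisy coordinate data is sketched below, assuming the points are stored as an (N, 3) array of x, y, z positions; the laboratory's interactive package is more general than this single filter.

```python
# Sketch: smooth each coordinate channel of noisy positional data with a
# centred moving average. The synthetic trajectory is illustrative only.
import numpy as np

def moving_average(points, window=5):
    """Apply a centred moving average independently to each coordinate column."""
    points = np.asarray(points, float)
    kernel = np.ones(window) / window
    smoothed = np.empty_like(points)
    for col in range(points.shape[1]):
        smoothed[:, col] = np.convolve(points[:, col], kernel, mode="same")
    return smoothed

noisy = np.cumsum(np.random.randn(200, 3) * 0.1, axis=0)   # synthetic limb trajectory
print(moving_average(noisy).shape)
```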
Radioisotope experiments in physics, chemistry, and biology. Second revised edition
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dance, J.B.
It is stated that the main object of the book is to show that a large number of experiments in chemistry, physics and biology can be safely carried out with a minimal amount of equipment. No sophisticated counting equipment is required, in most cases simple geiger counters or photographic emulsions are used, but a few experiments are included for use with other forms of detectors, such as pulse electroscopes, which are often found in schools. Using naturally occurring compounds, sealed sources and some unsealed sources of low specific activity, experiments are given of typical applications in statistics, electronics, photography, health physics, botany and so on. The necessary theoretical background is presented in the introductory chapters and typical problems are given at the end of the book. The book is intended for GCE and Advanced level students. (UK)
NASA Astrophysics Data System (ADS)
Berdalovic, I.; Bates, R.; Buttar, C.; Cardella, R.; Egidos Plaja, N.; Hemperek, T.; Hiti, B.; van Hoorne, J. W.; Kugathasan, T.; Mandic, I.; Maneuski, D.; Marin Tobon, C. A.; Moustakas, K.; Musa, L.; Pernegger, H.; Riedler, P.; Riegel, C.; Schaefer, D.; Schioppa, E. J.; Sharma, A.; Snoeys, W.; Solans Sanchez, C.; Wang, T.; Wermes, N.
2018-01-01
The upgrade of the ATLAS tracking detector (ITk) for the High-Luminosity Large Hadron Collider at CERN requires the development of novel radiation hard silicon sensor technologies. Latest developments in CMOS sensor processing offer the possibility of combining high-resistivity substrates with on-chip high-voltage biasing to achieve a large depleted active sensor volume. We have characterised depleted monolithic active pixel sensors (DMAPS), which were produced in a novel modified imaging process implemented in the TowerJazz 180 nm CMOS process in the framework of the monolithic sensor development for the ALICE experiment. Sensors fabricated in this modified process feature full depletion of the sensitive layer, a sensor capacitance of only a few fF and radiation tolerance up to 10¹⁵ neq/cm². This paper summarises the measurements of charge collection properties in beam tests and in the laboratory using radioactive sources and edge TCT. The results of these measurements show significantly improved radiation hardness obtained for sensors manufactured using the modified process. This has opened the way to the design of two large scale demonstrators for the ATLAS ITk. To achieve a design compatible with the requirements of the outer pixel layers of the tracker, a charge sensitive front-end taking 500 nA from a 1.8 V supply is combined with a fast digital readout architecture. The low-power front-end with a 25 ns time resolution exploits the low sensor capacitance to reduce noise and analogue power, while the implemented readout architectures minimise power by reducing the digital activity.
Surinova, Silvia; Hüttenhain, Ruth; Chang, Ching-Yun; Espona, Lucia; Vitek, Olga; Aebersold, Ruedi
2013-08-01
Targeted proteomics based on selected reaction monitoring (SRM) mass spectrometry is commonly used for accurate and reproducible quantification of protein analytes in complex biological mixtures. Strictly hypothesis-driven, SRM assays quantify each targeted protein by collecting measurements on its peptide fragment ions, called transitions. To achieve sensitive and accurate quantitative results, experimental design and data analysis must consistently account for the variability of the quantified transitions. This consistency is especially important in large experiments, which increasingly require profiling up to hundreds of proteins over hundreds of samples. Here we describe a robust and automated workflow for the analysis of large quantitative SRM data sets that integrates data processing, statistical protein identification and quantification, and dissemination of the results. The integrated workflow combines three software tools: mProphet for peptide identification via probabilistic scoring; SRMstats for protein significance analysis with linear mixed-effect models; and PASSEL, a public repository for storage, retrieval and query of SRM data. The input requirements for the protocol are files with SRM traces in mzXML format, and a file with a list of transitions in a text tab-separated format. The protocol is especially suited for data with heavy isotope-labeled peptide internal standards. We demonstrate the protocol on a clinical data set in which the abundances of 35 biomarker candidates were profiled in 83 blood plasma samples of subjects with ovarian cancer or benign ovarian tumors. The time frame to realize the protocol is 1-2 weeks, depending on the number of replicates used in the experiment.
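The protein significance step based on linear mixed-effect models can be illustrated with statsmodels in Python; the long-format table, column names, and values below are hypothetical, and this is not the SRMstats/mProphet code.

```python
# Sketch: test a fixed group effect on log-intensity with a random intercept
# per peptide, the general form of model used for protein significance analysis.
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical long-format data: disease group, peptide, log2 peak intensity.
df = pd.DataFrame({
    "group":    ["cancer"] * 6 + ["benign"] * 6,
    "peptide":  ["p1", "p1", "p2", "p2", "p3", "p3"] * 2,
    "log2_int": [20.1, 19.8, 18.5, 18.9, 21.0, 20.7,
                 19.0, 18.7, 17.6, 17.9, 20.1, 19.8],
})

model = smf.mixedlm("log2_int ~ group", df, groups=df["peptide"])
result = model.fit()
print(result.params["group[T.cancer]"], result.pvalues["group[T.cancer]"])
```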
The Important Role of Physics in Industry and Economic Development
NASA Astrophysics Data System (ADS)
Alvarado, Igor
2012-10-01
Good physics requires good education. Good education translates into good physics professionals. The process starts early with Science, Technology, Engineering and Mathematics (STEM) education programs for middle- and high-school students. It then continues with competitive higher education programs (2-year and 4-year) at colleges and universities designed to satisfy the needs of industry and academia. The research work conducted by graduate students in physics (and engineering physics) frequently translates into new discoveries and innovations that have a direct impact on society (e.g. proton cancer therapy). Some of the major and largest scientific experiments in the world today are physics-centered (e.g. the Large Hadron Collider, LHC) and generate employment and business opportunities for thousands of scientists, academic research groups and companies from around the world. New superconducting magnets and advanced materials that have resulted from previous research in physics are commonly used in these extreme experiments. But not all physicists will end up working at these large high-energy physics experiments, universities or national laboratories (e.g. Fermilab); industry requires new generations of (industrial) physicists in sectors such as semiconductors, energy, space, life sciences, defense and advanced manufacturing. This work presents an industry perspective on the role of physics in economic development and the need for a collaborative academic-industry approach to more effective translational research. A series of examples will be presented with emphasis on the measurement, control, diagnostics and computing capabilities needed to translate the science (physics) into innovations and practical solutions that can benefit society as a whole.
Development of an electrostatic propulsion engine using sub-micron powders as the reaction mass
NASA Technical Reports Server (NTRS)
Herbert, F.; Kendall, K. R.
1991-01-01
Asteroid sample return missions would benefit from development of an improved rocket engine. Chemical rockets achieve their large thrust with a high mass consumption rate (dm/dt) but low exhaust velocity; therefore, a large fraction of their total mass is fuel. Present-day ion thrusters are characterized by high exhaust velocity but low dm/dt; thus, they are inherently low-thrust devices. However, their high exhaust velocity is poorly matched to typical mission requirements and therefore wastes energy. A better match would be intermediate between the two forms of propulsion. This could be achieved by electrostatically accelerating solid powder grains, raising the possibility that interplanetary material could be processed for use as reaction mass. An experiment to study the charging properties of sub-micron sized powder grains is described. If a suitable material can be identified, then it could be used as the reaction mass in an electrostatic propulsion engine. The experiment employs a time-of-flight measurement to determine the exhaust velocity (v) of various negatively charged powder grains that were charged and accelerated in a simple device. The purpose is to determine the charge-to-mass ratio that can be sustained for various substances. In order to be competitive with present-day ion thrusters, a specific impulse (v/g) of 3000 to 5000 seconds is required. Preliminary results are presented. More speculatively, there are some mission profiles that would benefit from collection of reaction mass at the remote asteroid site. Experiments that examine the generation of sub-micron clusters by electrostatic self-disruption of geologically derived material are planned.
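The relation between charge-to-mass ratio, accelerating voltage, exhaust velocity and specific impulse that motivates the measurement can be sketched as follows; the voltages and the target specific impulse are taken as examples, not values from the experiment.

```python
# Sketch: for a grain accelerated through voltage V, v = sqrt(2 (q/m) V) and
# Isp = v / g0; invert to find the q/m needed for a target specific impulse.
import math

G0 = 9.81  # standard gravity, m/s^2

def exhaust_velocity(q_over_m, voltage):
    """Exhaust velocity v = sqrt(2 (q/m) V) in m/s."""
    return math.sqrt(2.0 * q_over_m * voltage)

def required_charge_to_mass(isp_s, voltage):
    """Charge-to-mass ratio (C/kg) needed so that v/g0 reaches the target Isp."""
    v = isp_s * G0
    return v * v / (2.0 * voltage)

for voltage in (10e3, 50e3, 100e3):                 # example accelerating voltages
    q_m = required_charge_to_mass(3000.0, voltage)  # target Isp of 3000 s
    print(f"V = {voltage/1e3:.0f} kV -> q/m = {q_m:.0f} C/kg, "
          f"v = {exhaust_velocity(q_m, voltage):.0f} m/s")
```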
Stratospheric experiments on curing of composite materials
NASA Astrophysics Data System (ADS)
Chudinov, Viacheslav; Kondyurin, Alexey; Svistkov, Alexander L.; Efremov, Denis; Demin, Anton; Terpugov, Viktor; Rusakov, Sergey
2016-07-01
Future space exploration requires large, light-weight structures for habitats, greenhouses, space bases, space factories and other constructions. A new approach enabling large-size constructions in space relies on the use of the technology of polymerization of fiber-filled composites with a curable polymer matrix applied in the free-space environment in Earth orbit. In orbit, the material is exposed to high vacuum, dramatic temperature changes, plasma of free space due to cosmic rays, sun irradiation and atomic oxygen (in low Earth orbit), micrometeorite fluence, electric charging and microgravitation. The development of appropriate polymer matrix composites requires an understanding of the chemical processes of polymer matrix curing under the specific free-space conditions to be encountered. The goal of the stratospheric flight experiment is an investigation of the effect of the stratospheric conditions on the uncured polymer matrix of the composite material. The unique combination of low residual pressure, high-intensity UV radiation including a short-wave UV component, cosmic rays and other aspects associated with solar irradiation strongly influences the chemical processes in polymeric materials. We have performed stratospheric flight experiments with uncured composites (prepreg). A balloon with a payload equipped with a heater, temperature/pressure/irradiation sensors and a microprocessor, carrying samples of uncured prepreg, was launched to the stratosphere at 25-30 km altitude. After the flight, the samples were tested with FTIR, gel-fraction analysis, tensile tests and DMA. The effect of cosmic radiation was observed. The composite was successfully cured during the stratospheric flight. The study was supported by RFBR grants 12-08-00970 and 14-08-96011.
Design solutions for dome and main structure (mount) of giant telescopes
NASA Astrophysics Data System (ADS)
Murga, Gaizka; Bilbao, Armando; de Bilbao, Lander; Lorentz, Thomas E.
2016-07-01
During recent years, designs for several giant telescopes ranging from 20 to 40 m in diameter have been developed: the European Extremely Large Telescope (E-ELT), the Giant Magellan Telescope (GMT) and the Thirty Meter Telescope (TMT). It is evident that simple direct up-scaling of solutions that were more or less successful in the 8 to 10 m class telescopes cannot lead to viable designs for the future giant telescopes. New solutions are required to provide adequate load sharing, to cope with the large-scale derived deflections and to provide the required compliance, or to respond to structure-mechanism control interaction issues, among others. Drawing on IDOM's experience in the development of the Dome and Main Structure of the European Extremely Large Telescope and our participation in some other giant telescopes, this paper reviews several design approaches for the main mechanisms and key structural parts of enclosures and mounts/main structures for giant telescopes, analyzing the pros and cons of the different alternatives and outlining the preferred design schemes. The assessment is carried out mainly from a technical and performance-based angle, but it also considers specific logistical issues for the assembly of these large telescopes in remote and space-limited areas, together with cost and schedule related issues.
E. Hyvarinen; H. Lappalainen; P. Martikainen; J. Kouki
2003-01-01
During the 1900s, the amount of dead and decaying wood has declined drastically in boreal forests in Finland because of intensive forest management. As a result, species requiring such resources have also declined or have even gone extinct. Recently it has been observed that in addition to old-growth forests, natural, early successional phases are also important for...
NASA Astrophysics Data System (ADS)
Shelestov, Andrii; Lavreniuk, Mykola; Kussul, Nataliia; Novikov, Alexei; Skakun, Sergii
2017-02-01
Many applied problems arising in agricultural monitoring and food security require reliable crop maps at national or global scale. Large scale crop mapping requires processing and management of large amounts of heterogeneous satellite imagery acquired by various sensors, which consequently leads to a “Big Data” problem. The main objective of this study is to explore efficiency of using the Google Earth Engine (GEE) platform when classifying multi-temporal satellite imagery with potential to apply the platform for a larger scale (e.g. country level) and multiple sensors (e.g. Landsat-8 and Sentinel-2). In particular, multiple state-of-the-art classifiers available in the GEE platform are compared to produce a high resolution (30 m) crop classification map for a large territory (~28,100 km2 and 1.0 M ha of cropland). Though this study does not involve large volumes of data, it does address efficiency of the GEE platform to effectively execute complex workflows of satellite data processing required for large scale applications such as crop mapping. The study discusses strengths and weaknesses of classifiers, assesses accuracies that can be achieved with different classifiers for the Ukrainian landscape, and compares them to the benchmark classifier using a neural network approach that was developed in our previous studies. The study is carried out for the Joint Experiment of Crop Assessment and Monitoring (JECAM) test site in Ukraine covering the Kyiv region (North of Ukraine) in 2013. We found that Google Earth Engine (GEE) provides very good performance in terms of enabling access to the remote sensing products through the cloud platform and providing pre-processing; however, in terms of classification accuracy, the neural network based approach outperformed support vector machine (SVM), decision tree and random forest classifiers available in GEE.
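As a rough sketch of the kind of workflow the study describes, the Earth Engine Python API fragment below trains one of the classifiers available in GEE on a multi-temporal composite and classifies it. The area of interest, the training asset path, the 'crop_class' label property, and the choice of random forest are illustrative assumptions, not the study's actual configuration:

    import ee

    ee.Initialize()  # assumes Earth Engine authentication has already been set up

    # Illustrative area of interest (roughly the Kyiv region) and a hypothetical
    # asset of labelled training polygons carrying a 'crop_class' property.
    aoi = ee.Geometry.Rectangle([29.5, 49.8, 31.5, 51.2])
    training_fc = ee.FeatureCollection('users/example/kyiv_crop_training')

    # Median multi-temporal Landsat-8 composite over the 2013 growing season.
    image = (ee.ImageCollection('LANDSAT/LC08/C02/T1_L2')
             .filterDate('2013-04-01', '2013-10-31')
             .filterBounds(aoi)
             .median())

    # Sample the composite at the labelled polygons at 30 m resolution.
    training = image.sampleRegions(collection=training_fc,
                                   properties=['crop_class'], scale=30)

    # Train one of the classifiers available in GEE (random forest here) and classify.
    classifier = (ee.Classifier.smileRandomForest(numberOfTrees=100)
                  .train(features=training, classProperty='crop_class',
                         inputProperties=image.bandNames()))
    crop_map = image.classify(classifier)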
Wang, Jack T H; Daly, Joshua N; Willner, Dana L; Patil, Jayee; Hall, Roy A; Schembri, Mark A; Tyson, Gene W; Hugenholtz, Philip
2015-05-01
Clinical microbiology testing is crucial for the diagnosis and treatment of community and hospital-acquired infections. Laboratory scientists need to utilize technical and problem-solving skills to select from a wide array of microbial identification techniques. The inquiry-driven laboratory training required to prepare microbiology graduates for this professional environment can be difficult to replicate within undergraduate curricula, especially in courses that accommodate large student cohorts. We aimed to improve undergraduate scientific training by engaging hundreds of introductory microbiology students in an Authentic Large-Scale Undergraduate Research Experience (ALURE). The ALURE aimed to characterize the microorganisms that reside in the healthy human oral cavity-the oral microbiome-by analyzing hundreds of samples obtained from student volunteers within the course. Students were able to choose from selective and differential culture media, Gram-staining, microscopy, as well as polymerase chain reaction (PCR) and 16S rRNA gene sequencing techniques, in order to collect, analyze, and interpret novel data to determine the collective oral microbiome of the student cohort. Pre- and postsurvey analysis of student learning gains across two iterations of the course (2012-2013) revealed significantly higher student confidence in laboratory skills following the completion of the ALURE (p < 0.05 using the Mann-Whitney U-test). Learning objectives on effective scientific communication were also met through effective student performance in laboratory reports describing the research outcomes of the project. The integration of undergraduate research in clinical microbiology has the capacity to deliver authentic research experiences and improve scientific training for large cohorts of undergraduate students.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dahlberg, Jeffrey A.; Wolfrum, Edward J.
2010-09-28
The development of a robust source of renewable transportation fuel will require a large amount of biomass feedstocks. It is generally accepted that in addition to agricultural and forestry residues, we will need crops grown specifically for subsequent conversion into fuels. There has been a lot of research on several of these so-called "dedicated bioenergy crops" including switchgrass, miscanthus, sugarcane, and poplar. It is likely that all of these crops will end up playing a role as feedstocks, depending on local environmental and market conditions. Many different types of sorghum have been grown to produce syrup, grain, and animal feed for many years. It has several features that may make it as compelling as other crops mentioned above as a renewable, sustainable biomass feedstock; however, very little work has been done to investigate sorghum as a dedicated bioenergy crop. The goal of this project was to investigate the feasibility of using sorghum biomass to produce ethanol. The work performed included a detailed examination of the agronomics and composition of a large number of sorghum varieties, laboratory experiments to convert sorghum to ethanol, and economic and life-cycle analyses of the sorghum-to-ethanol process. This work showed that sorghum has a very wide range of composition, which depended on the specific sorghum cultivar as well as the growing conditions. The results of laboratory- and pilot-scale experiments indicated that a typical high-biomass sorghum variety performed very similarly to corn stover during the multi-step process required to convert biomass feedstocks to ethanol; yields of ethanol for sorghum were very similar to the corn stover used as a control in these experiments. Based on multi-year agronomic data and theoretical ethanol production, sorghum can achieve more than 1,300 gallons of ethanol per acre given the correct genetics and environment. In summary, sorghum may be a compelling dedicated bioenergy crop that could help provide a major portion of the feedstocks required to produce renewable domestic transportation fuels.
Coupled Fracture and Flow in Shale in Hydraulic Fracturing
NASA Astrophysics Data System (ADS)
Carey, J. W.; Mori, H.; Viswanathan, H.
2014-12-01
Production of hydrocarbon from shale requires creation and maintenance of fracture permeability in an otherwise impermeable shale matrix. In this study, we use a combination of triaxial coreflood experiments and x-ray tomography characterization to investigate the fracture-permeability behavior of Utica shale at in situ reservoir conditions (25-50 °C and 35-120 bars). Initially impermeable shale core was placed between flat anvils (compression) or between split anvils (pure shear) and loaded until failure in the triaxial device. Permeability was monitored continuously during this process. Significant deformation (>1%) was required to generate a transmissive fracture system. Permeability generally peaked at the point of a distinct failure event and then dropped by a factor of 2-6 when the system returned to hydrostatic conditions. Permeability was very small in compression experiments (< 1 mD), possibly because of limited fracture connectivity through the anvils. In pure shear experiments, shale with bedding planes perpendicular to shear loading developed complex fracture networks with narrow apertures and peak permeability of 30 mD. Shale with bedding planes parallel to shear loading developed simple fractures with large apertures and a peak permeability as high as 1 D. Fracture systems held at static conditions for periods of several hours showed little change in effective permeability at hydrostatic conditions as high as 140 bars. However, permeability of fractured systems was a function of hydrostatic pressure, declining in a pseudo-linear, exponential fashion as pressure increased. We also observed that permeability decreased with increasing fluid flow rate, indicating that flow did not follow Darcy's Law, possibly due to non-laminar flow conditions, and conformed to Forchheimer's law. The coupled deformation and flow behavior of Utica shale, particularly the large deformation required to initiate flow, indicates the probable importance of activation of existing fractures in hydraulic fracturing and that these fractures can have adequate permeability for the production of hydrocarbon.
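To make the non-Darcy behaviour mentioned above concrete, the sketch below fits the standard Forchheimer relation dP/L = (mu/k)*v + beta*rho*v^2 to flow-rate versus pressure-gradient data by linear least squares, recovering the Darcy permeability k and the inertial coefficient beta. The fluid properties and measurement values are invented for illustration and are not the paper's data:

    import numpy as np

    mu, rho = 1.0e-3, 1000.0  # assumed fluid viscosity (Pa*s) and density (kg/m^3)
    v = np.array([1e-4, 5e-4, 1e-3, 5e-3, 1e-2])               # superficial velocities, m/s (illustrative)
    dp_per_l = np.array([3.4e3, 1.8e4, 3.9e4, 2.6e5, 6.4e5])   # pressure gradients, Pa/m (illustrative)

    # Linear least squares in the coefficients a = mu/k and b = beta*rho.
    A = np.column_stack([v, v ** 2])
    (a, b), *_ = np.linalg.lstsq(A, dp_per_l, rcond=None)
    k = mu / a
    beta = b / rho
    print(f"k ~ {k:.3e} m^2, beta ~ {beta:.3e} 1/m")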
Holographic Airborne Rotating Lidar Instrument Experiment (HARLIE)
NASA Technical Reports Server (NTRS)
Schwemmer, Geary K.
1998-01-01
Scanning holographic lidar receivers are currently in use in two operational lidar systems, PHASERS (Prototype Holographic Atmospheric Scanner for Environmental Remote Sensing) and now HARLIE (Holographic Airborne Rotating Lidar Instrument Experiment). These systems are based on volume phase holograms made in dichromated gelatin (DCG) sandwiched between 2 layers of high quality float glass. They have demonstrated the practical application of this technology to compact scanning lidar systems at 532 and 1064 nm wavelengths, the ability to withstand moderately high laser power and energy loading, sufficient optical quality for most direct detection systems, overall efficiencies rivaling conventional receivers, and the stability to last several years under typical lidar system environments. Their size and weight are approximately half of similar performing scanning systems using reflective optics. The cost of holographic systems will eventually be lower than the reflective optical systems depending on their degree of commercialization. There are a number of applications that require or can greatly benefit from a scanning capability. Several of these are airborne systems, which either use focal plane scanning, as in the Laser Vegetation Imaging System or use primary aperture scanning, as in the Airborne Oceanographic Lidar or the Large Aperture Scanning Airborne Lidar. The latter class requires a large clear aperture opening or window in the aircraft. This type of system can greatly benefit from the use of scanning transmission holograms of the HARLIE type because the clear aperture required is only about 25% larger than the collecting aperture as opposed to 200-300% larger for scan angles of 45 degrees off nadir.
Design and Flight Testing of an Inflatable Sunshield for the NGST
NASA Technical Reports Server (NTRS)
Adams, Michael L.; Culver, Harry L.; Kaufman, David M.; Pacini, Linda K.; Sturm, James; Lienard, Sebastien
2000-01-01
The Next Generation Space Telescope (NGST) mission is scheduled to launch in 2007 and be stationed at L2 for a mission life of ten years. The large aperture mirror and optical detectors aboard NGST require shielding from the constant solar energy seen at this orbit. The government reference NGST design, called the Yardstick, baselined a sunshield using an inflation deployment system. During the formulation phase, NGST is spending approximately 25% of the overall budget to foster the development of new technology. The goal is to develop and demonstrate enabling or enhancing technology and provide innovative solutions for the design of the NGST observatory. Inflatable technology falls in the category of enhancing technology due to its advantages in weight, stowed volume and cost. The Inflatable Sunshield in Space (ISIS) flight experiment will provide a realistic space flight demonstration of an inflatable sunshield. The supporting technology development program will provide an information base for the design, manufacture, assembly and testing of large thin membranes and inflatable structural elements for space structures. The ISIS experiment will demonstrate the feasibility of using inflatable technology to passively cool optical systems for NGST and provide correlation between analytical predictions and on-orbit results. The experiment will be performed on a Hitchhiker/Space Shuttle mission in late 2001. The ISIS mission is an effort to address several major technical challenges of the NGST inflatable sunshield, namely controlled inflation deployment, planarity and separation of large stretched membranes, space rigidization of inflatable booms, and dynamic modeling and simulation. This paper will describe the design of the flight experiment and the testing to be performed on-orbit.
Giant-FOG: A new player in ground motion instrumentation
NASA Astrophysics Data System (ADS)
Guattari, F.; de Toldi, E.; Bigueur, A.; Decitre, J. B.; Ponceau, D.; Sèbe, O.; Frenois, A.; Schindelé, F.; Moluçon, C.; Gaffet, S.; Ducloux, E.; Lefèvre, H.
2017-12-01
Based on recent experience developing very low noise fiber-optic gyroscopes (FOG), first performance results on very large fiber-optic coils of up to 1 m diameter are presented. The goal of constructing large FOGs is to evaluate experimentally the physical limits of this kind of technology and to reach the lowest possible noise. While these experiments probe the fundamental limits of the FOG technology, they also serve as a first step toward a cost-effective, very low noise laboratory rotational seismometer, which could be a game changer in ground motion instrumentation. Building a Giant-FOG presents several difficulties: the first is winding the coil, the second concerns the mechanical substrate, and the third relates to the measurement. - To our knowledge, a winding machine large enough to wind a coil of 1 meter diameter does not exist, but thanks to the iXblue expertise in the manufacturing of winding machines and calibration tables, a hybrid system has been designed, merging these two technologies to fulfill the requirement of winding a large coil on an adequate rotational platform. The characterization of the wobbles of the system will be presented, since this is a critical parameter for the winding and ultimately the performance. - To achieve the highest attainable measurement sensitivity to the real ground rotation, the design of the mechanical substrate of the coil is critical to reduce as much as possible the sensor's sensitivity to environmental noise. A preliminary assessment of the global noise performance of the 1 m diameter FOG sensor will be presented. - To demonstrate the on-site performance, the low noise inter-disciplinary underground laboratory (LSBB, Rustrel, France), with a dense array of precisely oriented broad-band seismometers, provides the possibility to compare large FOG rotation records with the array-derived rotation measurement method. Results of different prototypes during the development process will be presented to underline the applicability of each technological response to the Large-FOG requirements. Finally, we conclude with a presentation of the results achieved with a 1 m diameter FOG having more than 10 km of fiber length.
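For orientation, the ideal Sagnac scale factor of such a coil follows directly from the dimensions quoted in the abstract; the sketch below uses the 1 m diameter and 10 km fiber length and assumes a 1550 nm operating wavelength (the wavelength is not stated in the abstract, and this is not iXblue's design code):

    import math

    L = 10e3        # fiber length, m (from the abstract)
    D = 1.0         # coil diameter, m (from the abstract)
    lam = 1.55e-6   # operating wavelength, m (assumed)
    c = 2.998e8     # speed of light, m/s

    # Ideal Sagnac phase per unit rotation rate: delta_phi = (2*pi*L*D / (lam*c)) * Omega
    scale_factor = 2.0 * math.pi * L * D / (lam * c)   # rad per (rad/s)
    earth_rate = 7.292e-5                              # Earth rotation rate, rad/s
    print(f"Sagnac phase for Earth rotation: {scale_factor * earth_rate:.3e} rad")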
Wright, Robin; Parrish, Mark L; Cadera, Emily; Larson, Lynnelle; Matson, Clinton K; Garrett-Engele, Philip; Armour, Chris; Lum, Pek Yee; Shoemaker, Daniel D
2003-07-30
Increased levels of HMG-CoA reductase induce cell type- and isozyme-specific proliferation of the endoplasmic reticulum. In yeast, the ER proliferations induced by Hmg1p consist of nuclear-associated stacks of smooth ER membranes known as karmellae. To identify genes required for karmellae assembly, we compared the composition of populations of homozygous diploid S. cerevisiae deletion mutants following 20 generations of growth with and without karmellae. Using an initial population of 1,557 deletion mutants, 120 potential mutants were identified as a result of three independent experiments. Each experiment produced a largely non-overlapping set of potential mutants, suggesting that differences in specific growth conditions could be used to maximize the comprehensiveness of similar parallel analysis screens. Only two genes, UBC7 and YAL011W, were identified in all three experiments. Subsequent analysis of individual mutant strains confirmed that each experiment was identifying valid mutations, based on the mutant's sensitivity to elevated HMG-CoA reductase and inability to assemble normal karmellae. The largest class of HMG-CoA reductase-sensitive mutations was a subset of genes that are involved in chromatin structure and transcriptional regulation, suggesting that karmellae assembly requires changes in transcription or that the presence of karmellae may interfere with normal transcriptional regulation. Copyright 2003 John Wiley & Sons, Ltd.
Direct solar heating for Space Station application
NASA Technical Reports Server (NTRS)
Simon, W. E.
1985-01-01
Early investigations have shown that a large percentage of the power generated on the Space Station will be needed in the form of high-temperature thermal energy. The most efficient method of satisfying this requirement is through direct utilization of available solar energy. A system concept for the direct use of solar energy on the Space Station, including its benefits to customers, technologists, and designers of the station, is described. After a brief discussion of energy requirements and some possible applications, results of selective tradeoff studies are discussed, showing area reduction benefits and some possible configurations for the practical use of direct solar heating. Following this is a description of system elements and required technologies. Finally, an assessment of available contributive technologies is presented, and a Space Shuttle Orbiter flight experiment is proposed.
The Large Synoptic Survey Telescope project management control system
NASA Astrophysics Data System (ADS)
Kantor, Jeffrey P.
2012-09-01
The Large Synoptic Survey Telescope (LSST) program is jointly funded by the NSF, the DOE, and private institutions and donors. From an NSF funding standpoint, the LSST is a Major Research Equipment and Facilities (MREFC) project. The NSF funding process requires proposals and D&D reviews to include activity-based budgets and schedules; documented basis of estimates; risk-based contingency analysis; cost escalation and categorization. "Out of the box," the commercial tool Primavera P6 contains approximately 90% of the planning and estimating capability needed to satisfy R&D phase requirements, and it is customizable/configurable for the remainder with relatively little effort. We describe the customization/configuration and use of Primavera for the LSST Project Management Control System (PMCS), assess our experience to date, and describe future directions. Examples in this paper are drawn from the LSST Data Management System (DMS), which is one of three main subsystems of the LSST and is funded by the NSF. By astronomy standards the LSST DMS is a large data management project, processing and archiving over 70 petabytes of image data, producing over 20 petabytes of catalogs annually, and generating 2 million transient alerts per night. Over the 6-year construction and commissioning phase, the DM project is estimated to require 600,000 hours of engineering effort. In total, the DMS cost is approximately 60% hardware/system software and 40% labor.
Design and manufacture of imaging time-of-propagation optics
NASA Astrophysics Data System (ADS)
Albrecht, Mike; Fast, James; Schwartz, Alan
2016-09-01
There are several challenges associated with the design and manufacture of the optics required for the imaging time-of-propagation detector constructed for the Belle II particle physics experiment. This detector uses Cherenkov light radiated in quartz bars to identify subatomic particles: pions, kaons, and protons. The optics are physically large (125 cm x 45 cm x 2 cm bars and 45 cm x 10 cm x 5 cm prisms), all surfaces are optically polished, and there is very little allowance for chamfers or surface defects. In addition to the optical challenges, there are several logistical and handling challenges associated with measuring, assembling, cleaning, packaging, and shipping these delicate precision optics. This paper describes a collaborative effort between Pacific Northwest National Laboratory, the University of Cincinnati, and ZYGO Corporation for the design and manufacture of 48 fused silica optics (30 bars and 18 prisms) for the iTOP Detector. Details of the iTOP detector design that drove the challenging optical requirements are provided, along with material selection considerations. Since the optics are so large, precise, and delicate, special care had to be given to the selection of a manufacturing process capable of achieving the challenging optical and surface defect requirements on such large and high-aspect-ratio (66:1) components. A brief update on the current status and performance of these optics is also provided.
Webb, D. R.; McNicholas, T. A.; Whitfield, H. N.; Wickham, J. E.
1985-01-01
The management and follow up of 200 consecutive patients with renal and ureteric calculi are presented. The primary treatment of 185 (92.5%) was by extracorporeal shockwave lithotripsy (ESWL), of whom three (1.6%) with large calculi underwent percutaneous nephrolithotripsy (PCNL) prior to ESWL as a planned combined procedure. Twelve (6%) were treated by PCNL or ureterorenoscopy (URS) as their definitive treatment and three (1.5%) by conventional open renal and ureteric surgery. The average in-patient stay was 3.8 days and most returned to normal activity within one day of discharge. Of the 185 patients, 102 (55%) required no analgesia after treatment by ESWL, 29 (15.6%) required parenteral analgesia and the rest were comfortable with oral non-narcotic medication. Thirty (16%) required auxiliary treatment by percutaneous nephrostomy (PCN), PCNL and URS following ESWL for obstructive complications from stone particles. Two required further ESWL and one PCNL at three months for large fragments. Overall, open surgery was required for only 1% of renal calculi and 13% of ureteric stones. These results are consistent with the extensive West German experience confirming that most urinary calculi are now best managed by ESWL and endoscopic techniques. Where these facilities are available open surgery should only be necessary for less than 5% of upper urinary tract stones. PMID:4073760
Cuvo, A J; Lerch, L J; Leurquin, D A; Gaffaney, T J; Poppen, R L
1998-01-01
The present experiments examined the effect of work requirements in combination with reinforcement schedule on the choice behavior of adults with mental retardation and preschool children. The work requirements of age-appropriate tasks (i.e., sorting silverware, jumping hurdles, tossing beanbags) were manipulated. Participants were presented with their choice of two response options for each trial that varied simultaneously on both work requirement and reinforcement schedule. Results showed that when responding to both choices occurred on the same reinforcement schedule, participants allocated most of their responses to the option with the easier work requirement. When the response option requiring less work was on a leaner reinforcement schedule, most participants shifted their choice to exert more work. There were individual differences across participants regarding their pattern of responding and when they switched from the lesser to the greater work requirement. Data showed that participants' responding was largely controlled by the reinforcement received for responding to each level of work. Various conceptualizations regarding the effects of work requirements on choice behavior are discussed. PMID:9532750
The cryogenics design of the SuperCDMS SNOLAB experiment
NASA Astrophysics Data System (ADS)
Hollister, M. I.; Bauer, D. A.; Dhuley, R. C.; Lukens, P.; Martin, L. D.; Ruschman, M. K.; Schmitt, R. L.; Tatkowski, G. L.
2017-12-01
The Super Cryogenic Dark Matter Search (SuperCDMS) experiment is a direct detection dark matter experiment intended for deployment to the SNOLAB underground facility in Ontario, Canada. With a payload of up to 186 germanium and silicon crystal detectors operating below 15 mK, the cryogenic architecture of the experiment is complex. Further, the requirement that the cryostat presents a low radioactive background to the detectors limits the materials and techniques available for construction, and heavily influences the design of the cryogenics system. The resulting thermal architecture is a closed cycle (no liquid cryogen) system, with stages at 50 and 4 K cooled with gas and fluid circulation systems and stages at 1 K, 250 mK and 15 mK cooled by the lower temperature stages of a large, cryogen-free dilution refrigerator. This paper describes the thermal design of the experiment, including details of the cooling systems, mechanical designs and expected performance of the system under operational conditions.
The Microgravity Isolation Mount: A Linearized State-Space Model a la Newton and Kane
NASA Technical Reports Server (NTRS)
Hampton, R. David; Tryggvason, Bjarni V.; DeCarufel, Jean; Townsend, Miles A.; Wagar, William O.
1999-01-01
Vibration acceleration levels on large space platforms exceed the requirements of many space experiments. The Microgravity Vibration Isolation Mount (MIM) was built by the Canadian Space Agency to attenuate these disturbances to acceptable levels, and has been operational on the Russian Space Station Mir since May 1996. It has demonstrated good isolation performance and has supported several materials science experiments. The MIM uses Lorentz (voice-coil) magnetic actuators to levitate and isolate payloads at the individual experiment/sub-experiment (versus rack) level. Payload acceleration, relative position, and relative orientation (Euler-parameter) measurements are fed to a state-space controller. The controller, in turn, determines the actuator currents needed for effective experiment isolation. This paper presents the development of an algebraic, state-space model of the MIM, in a form suitable for optimal controller design. The equations are first derived using Newton's Second Law directly; then a second derivation (i.e., validation) of the same equations is provided, using Kane's approach.
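As a schematic illustration of what such a state-space model looks like, the fragment below writes down a textbook one-axis mass-umbilical-actuator isolation model with relative position and payload acceleration as outputs. The real MIM model is multi-degree-of-freedom and includes effects not shown here, and all numerical values are assumed:

    import numpy as np

    m = 20.0   # payload mass, kg (assumed)
    k = 5.0    # umbilical stiffness, N/m (assumed)
    c = 0.5    # umbilical damping, N*s/m (assumed)

    # States x = [relative position r, payload velocity v];
    # inputs u = [actuator force F, base velocity x_b_dot (disturbance)].
    # Dynamics: m*v' = -k*r - c*(v - x_b_dot) + F, r' = v - x_b_dot.
    A = np.array([[0.0, 1.0],
                  [-k / m, -c / m]])
    B = np.array([[0.0, -1.0],
                  [1.0 / m, c / m]])
    # Outputs y = [relative position, payload acceleration], as measured on the isolator.
    C = np.array([[1.0, 0.0],
                  [-k / m, -c / m]])
    D = np.array([[0.0, 0.0],
                  [1.0 / m, c / m]])

    print(f"suspension natural frequency: {np.sqrt(k / m) / (2 * np.pi):.3f} Hz")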
NASA Astrophysics Data System (ADS)
Janzen, Kathryn Louise
Largely because of their resistance to magnetic fields, silicon photomultipliers (SiPMs) are being considered as the readout for the GlueX Barrel Calorimeter, a key component of the GlueX detector located immediately inside a 2.2 T superconducting solenoid. SiPMs with an active area of 1 x 1 mm2 have been investigated for use in other experiments, but detectors with larger active areas are required for the GlueX BCAL. This puts the GlueX collaboration in the unique position of pioneering this front-end detection technology by driving the development of larger-area sensors. SensL, a photonics research and development company in Ireland, has been collaborating with the University of Regina GlueX group to develop prototype large area SiPMs comprising sixteen 3 x 3 mm2 cells assembled in a close-packed matrix. Performance parameters of individual SensL 1 x 1 mm2 and 3 x 3 mm2 SiPMs, along with prototype SensL SiPM arrays, are tested, including current versus voltage characteristics, photon detection efficiency, and gain uniformity, in an effort to determine the suitability of these detectors for the GlueX BCAL readout.
Making the most of MBSE: pragmatic model-based engineering for the SKA Telescope Manager
NASA Astrophysics Data System (ADS)
Le Roux, Gerhard; Bridger, Alan; MacIntosh, Mike; Nicol, Mark; Schnetler, Hermine; Williams, Stewart
2016-08-01
Many large projects including major astronomy projects are adopting a Model Based Systems Engineering approach. How far is it possible to get value for the effort involved in developing a model that accurately represents a significant project such as SKA? Is it possible for such a large project to ensure that high-level requirements are traceable through the various system-engineering artifacts? Is it possible to utilize the tools available to produce meaningful measures for the impact of change? This paper shares one aspect of the experience gained on the SKA project. It explores some of the recommended and pragmatic approaches developed, to get the maximum value from the modeling activity while designing the Telescope Manager for the SKA. While it is too early to provide specific measures of success, certain areas are proving to be the most helpful and offering significant potential over the lifetime of the project. The experience described here has been on the 'Cameo Systems Modeler' tool-set, supporting a SysML based System Engineering approach; however the concepts and ideas covered would potentially be of value to any large project considering a Model based approach to their Systems Engineering.
NASA Astrophysics Data System (ADS)
Ku, Se-Ju; Yoo, Seung-Hoon; Kwak, Seung-Jun
2009-08-01
This study attempts to apply choice experiments with regard to the residential waste disposal system (RWDS) in Korea by considering various attributes that are related to RWDS. Using data from a survey conducted on 492 households, the empirical analysis yields estimates of the willingness to pay for a clean food-waste collection facility, the collection of small items (such as obsolete mobile phones and add-ons for personal computers), and a more convenient large waste disposal system. The estimation results of multinomial logit models are quite similar to those of nested logit models. The results reveal that residents have preferences for the cleanliness of facilities and the collection of small items. In Korea, residents are required to purchase and attach stickers for the disposal of large items; they want to be able to obtain stickers at not only village offices but also supermarkets. On the other hand, the frequency of waste collection is not a significant factor in the choice of the improved waste management program.
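As a generic illustration of how willingness-to-pay figures of this kind are usually recovered from a fitted logit model (this is not the study's estimation code, and the coefficient values below are invented), the marginal WTP for an attribute in a linear utility specification is the negative ratio of its coefficient to the cost coefficient:

    # Illustrative conditional-logit coefficients; not the paper's estimates.
    beta = {
        "clean_food_waste_facility": 0.62,
        "small_item_collection":     0.41,
        "convenient_large_waste":    0.28,
        "monthly_fee_krw":          -0.0009,  # cost attribute (per Korean won)
    }

    cost_coef = beta["monthly_fee_krw"]
    for attr, b in beta.items():
        if attr == "monthly_fee_krw":
            continue
        wtp = -b / cost_coef  # marginal WTP in won per month for a one-unit attribute change
        print(f"WTP for {attr}: {wtp:,.0f} KRW/month")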
Pedwell, Rhianna K; Fraser, James A; Wang, Jack T H; Clegg, Jack K; Chartres, Jy D; Rowland, Susan L
2018-01-31
Course-integrated Undergraduate Research Experiences (CUREs) involve large numbers of students in real research. We describe a late-year microbiology CURE in which students use yeast to address a research question around beer brewing or synthesizing biofuel; the interdisciplinary student-designed project incorporates genetics, bioinformatics, biochemistry, analytical chemistry, and microbiology. Students perceived significant learning gains around multiple technical and "becoming a scientist" aspects of the project. The project is demanding for both the students and the academic implementers. We examine the rich landscape of support and interaction that this CURE both encourages and requires while also considering how we can support the exercise better and more sustainably. The findings from this study provide a picture of a CURE implementation that has begun to reach the limits of both the students' and the academics' capacities to complete it. © 2018 by The International Union of Biochemistry and Molecular Biology, 2018. © 2018 The International Union of Biochemistry and Molecular Biology.
NASA Astrophysics Data System (ADS)
Myers, B.; Wiggins, H. V.; Turner-Bogren, E. J.; Warburton, J.
2017-12-01
Project Managers at the Arctic Research Consortium of the U.S. (ARCUS) lead initiatives to convene, communicate with, and connect the Arctic research community across challenging disciplinary, geographic, temporal, and cultural boundaries. They regularly serve as the organizing hubs, archivists and memory-keepers for collaborative projects comprised of many loosely affiliated partners. As leading organizers of large open science meetings and other outreach events, they also monitor the interdisciplinary landscape of community needs, concerns, opportunities, and emerging research directions. However, leveraging the ARCUS Project Manager role to strategically build out the intangible infrastructure necessary to advance Arctic research requires a unique set of knowledge, skills, and experience. Drawing on a range of lessons learned from past and ongoing experiences with collaborative science, education and outreach programming, this presentation will highlight a model of ARCUS project management that we believe works best to support and sustain our community in its long-term effort to conquer the complexities of Arctic research.
Mining algorithm for association rules in big data based on Hadoop
NASA Astrophysics Data System (ADS)
Fu, Chunhua; Wang, Xiaojing; Zhang, Lijun; Qiao, Liying
2018-04-01
In order to solve the problem that traditional association rule mining algorithms can no longer meet the mining needs of large amounts of data in terms of efficiency and scalability, FP-Growth is taken as an example and parallelized on the Hadoop framework using the MapReduce model. On this basis, it is further improved with a transaction-reduction method to enhance the algorithm's mining efficiency. Experiments on a Hadoop cluster verify the parallel mining results, compare the efficiency of the serial and parallel versions, and examine how mining time varies with the number of nodes and with the data volume. The experiments show that the parallelized FP-Growth algorithm accurately mines frequent item sets, with better performance and scalability. It can better meet the requirements of big data mining and efficiently mine frequent item sets and association rules from large datasets.
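To make the MapReduce formulation concrete, the sketch below shows only the first pass of a parallelized FP-Growth (counting per-item support) in a Hadoop-Streaming-like mapper/reducer style, simulated locally; the grouping and per-group FP-tree mining passes of the full algorithm, and the paper's transaction-reduction step, are not shown:

    from collections import defaultdict

    def mapper(lines):
        # Each input line is one transaction: comma-separated item IDs.
        for line in lines:
            for item in set(line.strip().split(',')):
                yield item, 1

    def reducer(pairs, min_support=2):
        # Sum the counts per item and keep only the frequent ones.
        counts = defaultdict(int)
        for item, n in pairs:
            counts[item] += n
        return {item: n for item, n in counts.items() if n >= min_support}

    if __name__ == "__main__":
        transactions = ["a,b,c", "a,c", "b,c,d", "a,c,d"]
        # Item supports here: a=3, b=2, c=4, d=2
        print(reducer(mapper(transactions)))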
A large ion beam device for laboratory solar wind studies
NASA Astrophysics Data System (ADS)
Ulibarri, Zach; Han, Jia; Horányi, Mihály; Munsat, Tobin; Wang, Xu; Whittall-Scherfee, Guy; Yeo, Li Hsia
2017-11-01
The Colorado Solar Wind Experiment is a new device constructed at the Institute for Modeling Plasma, Atmospheres, and Cosmic Dust at the University of Colorado. A large cross-sectional Kaufman ion source is used to create steady state plasma flow to model the solar wind in an experimental vacuum chamber. The plasma beam has a diameter of 12 cm at the source, ion energies of up to 1 keV, and ion flows of up to 0.1 mA/cm2. Chamber pressure can be reduced to 4 × 10-5 Torr under operating conditions to suppress ion-neutral collisions and create a monoenergetic ion beam. The beam profile has been characterized by a Langmuir probe and an ion energy analyzer mounted on a two-dimensional translation stage. The beam profile meets the requirements for planned experiments that will study solar wind interaction with lunar magnetic anomalies, the charging and dynamics of dust in the solar wind, plasma wakes and refilling, and the wakes of topographic features such as craters or boulders. This article describes the technical details of the device, initial operation and beam characterization, and the planned experiments.
Samaha, Jason; Postle, Bradley R
2017-11-29
Adaptive behaviour depends on the ability to introspect accurately about one's own performance. Whether this metacognitive ability is supported by the same mechanisms across different tasks is unclear. We investigated the relationship between metacognition of visual perception and metacognition of visual short-term memory (VSTM). Experiments 1 and 2 required subjects to estimate the perceived or remembered orientation of a grating stimulus and rate their confidence. We observed strong positive correlations between individual differences in metacognitive accuracy between the two tasks. This relationship was not accounted for by individual differences in task performance or average confidence, and was present across two different metrics of metacognition and in both experiments. A model-based analysis of data from a third experiment showed that a cross-domain correlation only emerged when both tasks shared the same task-relevant stimulus feature. That is, metacognition for perception and VSTM were correlated when both tasks required orientation judgements, but not when the perceptual task was switched to require contrast judgements. In contrast with previous results comparing perception and long-term memory, which have largely provided evidence for domain-specific metacognitive processes, the current findings suggest that metacognition of visual perception and VSTM is supported by a domain-general metacognitive architecture, but only when both domains share the same task-relevant stimulus feature. © 2017 The Author(s).
The ATLAS Simulation Infrastructure
Aad, G.; Abbott, B.; Abdallah, J.; ...
2010-09-25
The simulation software for the ATLAS Experiment at the Large Hadron Collider is being used for large-scale production of events on the LHC Computing Grid. This simulation requires many components, from the generators that simulate particle collisions, through packages simulating the response of the various detectors and triggers. All of these components come together under the ATLAS simulation infrastructure. In this paper, that infrastructure is discussed, including that supporting the detector description, interfacing the event generation, and combining the GEANT4 simulation of the response of the individual detectors. Also described are the tools allowing the software validation, performance testing, and the validation of the simulated output against known physics processes.
Object Transportation by Two Mobile Robots with Hand Carts.
Sakuyama, Takuya; Figueroa Heredia, Jorge David; Ogata, Taiki; Hara, Tatsunori; Ota, Jun
2014-01-01
This paper proposes a methodology by which two small mobile robots can grasp, lift, and transport large objects using hand carts. The specific problems involve generating robot actions and determining the hand cart positions to achieve the stable loading of objects onto the carts. These problems are solved using nonlinear optimization, and we propose an algorithm for generating robot actions. The proposed method was verified through simulations and experiments using actual devices in a real environment. The proposed method could reduce the number of robots required to transport large objects by 50-60%. In addition, we demonstrated the efficacy of this task in real environments where errors occur in robot sensing and movement.
LAMBDA 2M GaAs—A multi-megapixel hard X-ray detector for synchrotrons
NASA Astrophysics Data System (ADS)
Pennicard, D.; Smoljanin, S.; Pithan, F.; Sarajlic, M.; Rothkirch, A.; Yu, Y.; Liermann, H. P.; Morgenroth, W.; Winkler, B.; Jenei, Z.; Stawitz, H.; Becker, J.; Graafsma, H.
2018-01-01
Synchrotrons can provide very intense and focused X-ray beams, which can be used to study the structure of matter down to the atomic scale. In many experiments, the quality of the results depends strongly on detector performance; in particular, experiments studying dynamics of samples require fast, sensitive X-ray detectors. "LAMBDA" is a photon-counting hybrid pixel detector system for experiments at synchrotrons, based on the Medipix3 readout chip. Its main features are a combination of comparatively small pixel size (55 μm), high readout speed at up to 2000 frames per second with no time gap between images, a large tileable module design, and compatibility with high-Z sensors for efficient detection of higher X-ray energies. A large LAMBDA system for hard X-ray detection has been built using Cr-compensated GaAs as a sensor material. The system is composed of 6 GaAs tiles, each of 768 by 512 pixels, giving a system with approximately 2 megapixels and an area of 8.5 by 8.5 cm2. While the sensor uniformity of GaAs is not as high as that of silicon, its behaviour is stable over time, and it is possible to correct nonuniformities effectively by postprocessing of images. By using multiple 10 Gigabit Ethernet data links, the system can be read out at the full speed of 2000 frames per second. The system has been used in hard X-ray diffraction experiments studying the structure of samples under extreme pressure in diamond anvil cells. These experiments can provide insight into geological processes. Thanks to the combination of high speed readout, large area and high sensitivity to hard X-rays, it is possible to obtain previously unattainable information in these experiments about atomic-scale structure on a millisecond timescale during rapid changes of pressure or temperature.
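As a toy illustration of the kind of per-pixel post-processing the abstract alludes to for correcting sensor non-uniformity (the detector's actual calibration chain is not described here, and all array sizes, noise levels, and the simple zeroing of bad pixels below are assumptions), a flat-field normalization might look like:

    import numpy as np

    def flat_field_correct(raw, flat, bad_pixel_mask=None):
        # Normalize a raw frame by a flat-field frame recorded under uniform illumination.
        flat = flat / np.mean(flat[flat > 0])          # unit-mean response map
        corrected = np.where(flat > 0, raw / flat, 0.0)
        if bad_pixel_mask is not None:
            corrected[bad_pixel_mask] = 0.0            # unusable pixels zeroed (could be interpolated)
        return corrected

    # Synthetic example: a Poisson image modulated by a 10% pixel-to-pixel response spread.
    rng = np.random.default_rng(0)
    flat = np.clip(rng.normal(1.0, 0.1, size=(512, 768)), 0.2, None)
    raw = rng.poisson(1000, size=(512, 768)).astype(float) * flat
    print(flat_field_correct(raw, flat).mean())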
High-Speed Automatic Microscopy for Real Time Tracks Reconstruction in Nuclear Emulsion
NASA Astrophysics Data System (ADS)
D'Ambrosio, N.
2006-06-01
The Oscillation Project with Emulsion-tRacking Apparatus (OPERA) experiment will use a massive nuclear emulsion detector to search for νμ → ντ oscillation by identifying τ leptons through the direct detection of their decay topology. The feasibility of experiments using a large mass emulsion detector is linked to the impressive progress under way in the development of automatic emulsion analysis. A new generation of scanning systems requires the development of fast automatic microscopes for emulsion scanning and image analysis to reconstruct tracks of elementary particles. The paper presents the European Scanning System (ESS) developed in the framework of the OPERA collaboration.
Modeling and Improving Information Flows in the Development of Large Business Applications
NASA Astrophysics Data System (ADS)
Schneider, Kurt; Lübke, Daniel
Designing a good architecture for an application is a wicked problem. Therefore, experience and knowledge are considered crucial for informing work in software architecture. However, many organizations do not pay sufficient attention to experience exploitation and architectural learning. Many users of information systems are not aware of the options and the needs to report problems and requirements. They often do not have time to describe a problem encountered in sufficient detail for developers to remove it. And there may be a lengthy process for providing feedback. Hence, the knowledge about problems and potential solutions is not shared effectively. Architectural knowledge needs to include evaluative feedback as well as decisions and their reasons (rationale).
Carvalho, Rimenys J; Cruz, Thayana A
2018-01-01
High-throughput screening (HTS) systems have emerged as important tools to provide fast and low cost evaluation of several conditions at once, since they require only small quantities of material and small sample volumes. These characteristics are extremely valuable for experiments with large numbers of variables, enabling the application of design of experiments (DoE) strategies or simple experimental planning approaches. Once the capacity of HTS systems to mimic chromatographic purification steps was established, several studies were performed successfully, including scale-down purification. Here, we propose a method for studying different purification conditions that can be used for any recombinant protein, including complex and glycosylated proteins, using low binding filter microplates.
Development, implementation, and experimentation of parametric routing protocol for sensor networks
NASA Astrophysics Data System (ADS)
Nassr, Matthew S.; Jun, Jangeun; Eidenbenz, Stephan J.; Frigo, Janette R.; Hansson, Anders A.; Mielke, Angela M.; Smith, Mark C.
2006-09-01
The development of a scalable and reliable routing protocol for sensor networks is traced from a theoretical beginning to positive simulation results to the end of verification experiments in large and heavily loaded networks. Design decisions and explanations as well as implementation hurdles are presented to give a complete picture of protocol development. Additional software and hardware is required to accurately test the performance of our protocol in field experiments. In addition, the developed protocol is tested in TinyOS on Mica2 motes against well-established routing protocols frequently used in sensor networks. Our protocol proves to outperform the standard (MINTRoute) and the trivial (Gossip) in a variety of different scenarios.
Simulation studies for the PANDA experiment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kopf, B.
2005-10-26
One main component of the planned Facility for Antiproton and Ion Research (FAIR) is the High Energy Storage Ring (HESR) at GSI, Darmstadt, which will provide cooled antiprotons with momenta between 1.5 and 15 GeV/c. The PANDA experiment will investigate p-bar annihilations with internal hydrogen and nuclear targets. Due to the planned extensive physics program, a multipurpose detector with nearly complete solid angle coverage, proper particle identification over a large momentum range, and high resolution calorimetry for neutral particles is required. For the optimization of the detector design, simulation studies of several benchmark channels are in progress which cover the most relevant physics topics. Some important simulation results are discussed here.
NASA Technical Reports Server (NTRS)
Arya, L. M. (Principal Investigator)
1980-01-01
Predictive procedures for developing soil hydrologic properties (i.e., relationships of soil water pressure and hydraulic conductivity to soil water content) are presented. Three models of the soil water pressure-water content relationship and one model of the hydraulic conductivity-water content relationship are discussed. Input requirements for the models are indicated, and computational procedures are outlined. Computed hydrologic properties for Keith silt loam, a soil type near Colby, Kansas, on which the 1978 Agricultural Soil Moisture Experiment was conducted, are presented. A comparison of computed results with experimental data in the dry range shows that analytical models utilizing a few basic hydrophysical parameters can produce satisfactory data for large-scale applications.
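The report's specific pressure-water content models are not named in this abstract; purely as an illustration of the kind of analytical retention model involved, the sketch below evaluates the widely used van Genuchten function with silt-loam-like parameters (all values assumed, not fitted to Keith silt loam):

    import numpy as np

    def van_genuchten_theta(h, theta_r, theta_s, alpha, n):
        # Volumetric water content as a function of soil water pressure head h (cm, negative).
        m = 1.0 - 1.0 / n
        se = (1.0 + (alpha * np.abs(h)) ** n) ** (-m)   # effective saturation
        return theta_r + (theta_s - theta_r) * se

    # Pressure heads from near saturation to wilting point; parameters are illustrative.
    h = np.array([-10.0, -100.0, -1000.0, -15000.0])
    print(van_genuchten_theta(h, theta_r=0.07, theta_s=0.45, alpha=0.02, n=1.4))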
Inelastic Boosted Dark Matter at direct detection experiments
NASA Astrophysics Data System (ADS)
Giudice, Gian F.; Kim, Doojin; Park, Jong-Chul; Shin, Seodong
2018-05-01
We explore a novel class of multi-particle dark sectors, called Inelastic Boosted Dark Matter (iBDM). These models are constructed by combining properties of particles that scatter off matter by making transitions to heavier states (Inelastic Dark Matter) with properties of particles that are produced with a large Lorentz boost in annihilation processes in the galactic halo (Boosted Dark Matter). This combination leads to new signals that can be observed at ordinary direct detection experiments, but require unconventional searches for energetic recoil electrons in coincidence with displaced multi-track events. Related experimental strategies can also be used to probe MeV-range boosted dark matter via their interactions with electrons inside the target material.
Implementation of IT-based applications in the safeguards field
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ekenstam, G.C. af; Sallstrom, M.
1995-12-31
For many years the Swedish Nuclear Power Inspectorate, SKI, has used computers as a tool within nuclear material control and accountancy. Over the last five years a lot of effort has been put into projects related to the increasing possibilities of fast and reliable data transfer over large distances. The paper discusses related administrative and technical issues and presents experience gained in tasks of the Swedish Support Program to IAEA Safeguards and during the alternative Safeguards trials carried out by SKI. The following topics will be presented: (1) Main Safeguards purposes and data transfer; (2) Administrative systems and requirements; (3) Technical possibilities and experiences; and (4) The cost aspect.
Channeling of multikilojoule high-intensity laser beams in an inhomogeneous plasma
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ivancic, S.; Haberberger, D.; Habara, H.
Channeling experiments were performed that demonstrate the transport of high-intensity (>10¹⁸ W/cm²), multikilojoule laser light through a millimeter-sized, inhomogeneous (~300-μm density scale length) laser produced plasma up to overcritical density, which is an important step forward for the fast-ignition concept. The background plasma density and the density depression inside the channel were characterized with a novel optical probe system. The channel progression velocity was measured, which agrees well with theoretical predictions based on large scale particle-in-cell simulations, confirming scaling laws for the required channeling laser energy and laser pulse duration, which are important parameters for future integrated fast-ignition channeling experiments.
Otero, Jorge; Guerrero, Hector; Gonzalez, Laura; Puig-Vidal, Manel
2012-01-01
The time required to image large samples is an important limiting factor in SPM-based systems. In multiprobe setups, especially when working with biological samples, this drawback can make it impossible to conduct certain experiments. In this work, we present a feedforward controller based on bang-bang and adaptive controls. The controls are based on the difference between the maximum speeds that can be used for imaging depending on the flatness of the sample zone. Topographic images of Escherichia coli bacteria samples were acquired using the implemented controllers. Results show that scanning faster in the flat zones, rather than using a constant scanning speed for the whole image, speeds up the imaging process of large samples by up to a 4× factor. PMID:22368491
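A hedged sketch of the idea behind this kind of flatness-dependent (bang-bang) speed selection is shown below: scan fast where the local topography gradient is small and slow where it is not. The thresholds, speeds, and synthetic profile are illustrative and are not the authors' controller parameters:

    import numpy as np

    def select_scan_speed(height_profile, flat_threshold=2e-9,
                          fast_speed=40e-6, slow_speed=10e-6):
        # Per-point scan speed (m/s) chosen from the local height gradient (m per point).
        gradient = np.abs(np.gradient(np.asarray(height_profile, float)))
        return np.where(gradient < flat_threshold, fast_speed, slow_speed)

    # Synthetic scan line: flat substrate with a single 500 nm feature in the middle.
    profile = np.concatenate([np.zeros(50), 500e-9 * np.hanning(20), np.zeros(50)])
    speeds = select_scan_speed(profile)
    print(f"fraction of line scanned at the fast speed: {np.mean(speeds == 40e-6):.2f}")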
An overview of large wind turbine tests by electric utilities
NASA Technical Reports Server (NTRS)
Vachon, W. A.; Schiff, D.
1982-01-01
A summary of recent plans and experiences on current large wind turbine (WT) tests being conducted by electric utilities is provided. The test programs discussed do not include federal research and development (R&D) programs, many of which are also being conducted in conjunction with electric utilities. The information presented is being assembled in a project, funded by the Electric Power Research Institute (EPRI), the objective of which is to provide electric utilities with timely summaries of test performance on key large wind turbines. A summary of key tests, test instrumentation, and recent results and plans is given. During the past year, many of the utility test programs initiated have encountered test difficulties that required specific WT design changes. However, test results to date continue to indicate that long-term machine performance and cost-effectiveness are achievable.
Processing of high performance (LRE)-Ba-Cu-O large, single-grain bulk superconductors in air
NASA Astrophysics Data System (ADS)
Hari Babu, N.; Iida, K.; Shi, Y.; Cardwell, D. A.
2006-10-01
We report the fabrication of large (LRE)BCO single-grains with improved superconducting properties for LRE = Nd, Sm and Gd using a practical process via both conventional top seeded melt growth (TSMG) and seeded infiltration-growth (SIG). This process uses a new generic seed crystal that promotes heterogeneous grain nucleation in the required orientation and suppresses the formation of solid solution in a controlled manner within individual grains by the addition of excess BaO2 to the precursor powder. The spatial distribution of the superconducting properties of LRE bulk superconductors as a function of BaO2 addition for large (LRE)BCO grains fabricated in air by TSMG and SIG for LRE = Gd, Sm and Nd are compared. The optimum BaO2 content required to fabricate single-grain (LRE)BCO with high and homogeneous Tc is determined from these experiments for each LRE system. The irreversibility fields of (LRE)BCO bulk superconductors processed in air are as high as those processed in reduced pO2. Critical current densities in excess of 10⁵ A/cm2 at 77 K and higher trapped fields have been achieved in optimized (LRE)BCO superconductors fabricated in air for the first time.
The BaBar Data Reconstruction Control System
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ceseracciu, A
2005-04-20
The BaBar experiment is characterized by extremely high luminosity and very large volume of data produced and stored, with increasing computing requirements each year. To fulfill these requirements a Control System has been designed and developed for the offline distributed data reconstruction system. The control system described in this paper provides the performance and flexibility needed to manage a large number of small computing farms, and takes full benefit of OO design. The infrastructure is well isolated from the processing layer, it is generic and flexible, based on a light framework providing message passing and cooperative multitasking. The system is distributed in a hierarchical way: the top-level system is organized in farms, farms in services, and services in subservices or code modules. It provides a powerful Finite State Machine framework to describe custom processing models in a simple regular language. This paper describes the design and evolution of this control system, currently in use at SLAC and Padova on ~450 CPUs organized in 9 farms.
Small-scale dynamic confinement gap test
NASA Astrophysics Data System (ADS)
Cook, Malcolm
2011-06-01
Gap tests are routinely used to ascertain the shock sensitiveness of new explosive formulations. The tests are popular since they are easy and relatively cheap to perform. However, with modern insensitive formulations with large critical diameters, large test samples are required. This can make testing and screening of new formulations expensive since large quantities of test material are required. Thus a new test that uses significantly smaller sample quantities would be very beneficial. In this paper we describe a new small-scale test that has been designed using our CHARM ignition and growth routine in the DYNA2D hydrocode. The new test is a modified gap test and uses detonating nitromethane to provide dynamic confinement (instead of a thick metal case) whilst exposing the sample to a long duration shock wave. The long duration shock wave allows less reactive materials that are below their critical diameter more time to react. We present details on the modelling of the test together with some preliminary experiments to demonstrate the potential of the new test method.
Sigoillot, Frederic D; Huckins, Jeremy F; Li, Fuhai; Zhou, Xiaobo; Wong, Stephen T C; King, Randall W
2011-01-01
Automated time-lapse microscopy can visualize proliferation of large numbers of individual cells, enabling accurate measurement of the frequency of cell division and the duration of interphase and mitosis. However, extraction of quantitative information by manual inspection of time-lapse movies is too time-consuming to be useful for analysis of large experiments. Here we present an automated time-series approach that can measure changes in the duration of mitosis and interphase in individual cells expressing fluorescent histone 2B. The approach requires analysis of only 2 features, nuclear area and average intensity. Compared to supervised learning approaches, this method reduces processing time and does not require generation of training data sets. We demonstrate that this method is as sensitive as manual analysis in identifying small changes in interphase or mitotic duration induced by drug or siRNA treatment. This approach should facilitate automated analysis of high-throughput time-lapse data sets to identify small molecules or gene products that influence timing of cell division.
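As a toy example of how the two features could separate mitosis from interphase on a single-cell track (the authors' actual time-series classifier is not reproduced here; the thresholds, frame interval, and synthetic track are all assumptions), consider flagging frames where the segmented nucleus is both small and bright, as happens when chromatin condenses:

    import numpy as np

    def mitotic_frames(area, intensity, area_drop=0.6, intensity_rise=1.3):
        # Boolean mask of mitotic frames from per-frame nuclear area and mean intensity.
        area = np.asarray(area, float)
        intensity = np.asarray(intensity, float)
        baseline_a, baseline_i = np.median(area), np.median(intensity)
        return (area < area_drop * baseline_a) & (intensity > intensity_rise * baseline_i)

    def mitosis_duration_minutes(mask, minutes_per_frame=10.0):
        return mask.sum() * minutes_per_frame

    # Illustrative track: interphase, a 5-frame mitosis, then interphase again.
    area      = [400] * 20 + [180] * 5 + [390] * 20
    intensity = [100] * 20 + [160] * 5 + [100] * 20
    mask = mitotic_frames(area, intensity)
    print(mitosis_duration_minutes(mask))   # 50.0 with the assumed 10-min frame interval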
The BaBar Data Reconstruction Control System
NASA Astrophysics Data System (ADS)
Ceseracciu, A.; Piemontese, M.; Tehrani, F. S.; Pulliam, T. M.; Galeazzi, F.
2005-08-01
The BaBar experiment is characterized by extremely high luminosity and a very large volume of data produced and stored, with computing requirements that increase each year. To fulfill these requirements a control system has been designed and developed for the offline distributed data reconstruction system. The control system described in this paper provides the performance and flexibility needed to manage a large number of small computing farms, and takes full advantage of object-oriented (OO) design. The infrastructure is well isolated from the processing layer; it is generic and flexible, based on a light framework providing message passing and cooperative multitasking. The system is distributed in a hierarchical way: the top-level system is organized in farms, farms in services, and services in subservices or code modules. It provides a powerful finite state machine framework to describe custom processing models in a simple regular language. This paper describes the design and evolution of this control system, currently in use at SLAC and Padova on ~450 CPUs organized in nine farms.
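The hierarchy and finite-state-machine ideas can be illustrated with a small sketch; the states, events and two-level farm/service layout below are hypothetical simplifications, not the BaBar implementation:

```python
from collections import deque

class Service:
    """Leaf node of the hierarchy: a tiny finite state machine.

    The transition table plays the role of the 'simple regular language'
    mentioned in the abstract; states and events here are hypothetical.
    """
    TRANSITIONS = {
        ("idle", "configure"): "configured",
        ("configured", "start"): "running",
        ("running", "stop"): "configured",
    }

    def __init__(self, name):
        self.name = name
        self.state = "idle"

    def handle(self, event):
        self.state = self.TRANSITIONS.get((self.state, event), self.state)

class Farm:
    """Intermediate node: forwards messages to its services cooperatively."""
    def __init__(self, name, services):
        self.name = name
        self.services = services
        self.mailbox = deque()

    def post(self, event):
        self.mailbox.append(event)

    def run_once(self):
        # Cooperative multitasking: each call drains one pending message.
        if self.mailbox:
            event = self.mailbox.popleft()
            for svc in self.services:
                svc.handle(event)

# Top-level "system" made of farms, each made of services.
farms = [Farm(f"farm{i}", [Service(f"svc{i}.{j}") for j in range(3)]) for i in range(2)]
for f in farms:
    f.post("configure"); f.post("start")
while any(f.mailbox for f in farms):
    for f in farms:
        f.run_once()
print({f.name: [s.state for s in f.services] for f in farms})
```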
NASA Technical Reports Server (NTRS)
Szuszczewicz, Edward P.
1986-01-01
Large, permanently-manned space platforms can provide exciting opportunities for discoveries in basic plasma and geoplasma sciences. The potential for these discoveries will depend very critically on the properties of the platform, its subsystems, and their abilities to fulfill a spectrum of scientific requirements. With this in mind, the planning of space station research initiatives and the development of attendant platform engineering should allow for the identification of critical science and technology issues that must be clarified far in advance of space station program implementation. An attempt is made to contribute to that process, with a perspective that looks to the development of the space station as a permanently-manned Spaceborne Ionospheric Weather Station. The development of this concept requires a synergism of science and technology which leads to several critical design issues. To explore the identification of these issues, the development of the concept of an Ionospheric Weather Station will necessarily touch upon a number of diverse areas. These areas are discussed.
A Robust Adaptive Autonomous Approach to Optimal Experimental Design
NASA Astrophysics Data System (ADS)
Gu, Hairong
Experimentation is the fundamental tool of scientific inquiry for understanding the laws governing nature and human behavior. Many complex real-world experimental scenarios, particularly those in quest of prediction accuracy, encounter difficulties when conducted with existing experimental procedures, for two reasons. First, existing experimental procedures require a parametric model to serve as the proxy of the latent data structure or data-generating mechanism at the beginning of an experiment. However, for the experimental scenarios of concern, a sound model is often unavailable before an experiment. Second, those experimental scenarios usually contain a large number of design variables, which potentially leads to a lengthy and costly data collection cycle. Unfortunately, existing experimental procedures are unable to optimize large-scale experiments so as to minimize experimental length and cost. Facing these two challenges, the aim of the present study is to develop a new experimental procedure that allows an experiment to be conducted without the assumption of a parametric model while still achieving satisfactory prediction, and that optimizes experimental designs to improve the efficiency of an experiment. The new experimental procedure developed in the present study is named the robust adaptive autonomous system (RAAS). RAAS is a procedure for sequential experiments composed of multiple experimental trials, which performs function estimation, variable selection, reverse prediction and design optimization on each trial. Directly addressing the challenges in the experimental scenarios of concern, function estimation and variable selection are performed by data-driven modeling methods to generate a predictive model from data collected during the course of an experiment, thus removing the requirement for a parametric model at the beginning of an experiment; design optimization selects experimental designs on the fly, based on their usefulness, so that the fewest designs are needed to reach useful inferential conclusions. Technically, function estimation is realized by Bayesian P-splines, variable selection by a Bayesian spike-and-slab prior, reverse prediction by grid search, and design optimization by the concepts of active learning. The present study demonstrated that RAAS achieves statistical robustness by making accurate predictions without assuming a parametric model as the proxy of the latent data structure, whereas existing procedures can draw poor statistical inferences if a misspecified model is assumed; RAAS also achieves inferential efficiency by requiring fewer designs to acquire useful statistical inferences than non-optimal procedures. Thus, RAAS is expected to be a principled solution for real-world experimental scenarios pursuing robust prediction and efficient experimentation.
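The overall adaptive loop can be illustrated with a deliberately simplified sketch; RAAS itself uses Bayesian P-splines, a spike-and-slab prior and active-learning criteria, whereas the stand-in below uses a distance-to-sampled-designs uncertainty proxy, a nearest-neighbour predictor and a purely hypothetical response function:

```python
import numpy as np

rng = np.random.default_rng(0)

def true_response(x):
    # Hypothetical stand-in for the latent data-generating mechanism.
    return np.sin(3 * x) + 0.1 * rng.normal()

# Candidate designs on a grid; no parametric model of the response is assumed.
candidates = np.linspace(0.0, 2.0, 41)
X, y = [], []

for trial in range(10):
    if not X:
        pick = candidates[len(candidates) // 2]          # start mid-range
    else:
        # Uncertainty proxy: distance to the nearest already-tested design.
        dists = np.min(np.abs(candidates[:, None] - np.array(X)[None, :]), axis=1)
        pick = candidates[np.argmax(dists)]               # most informative next design
    X.append(pick)
    y.append(true_response(pick))

# Nonparametric prediction at any design: local (3-nearest-neighbour) average.
def predict(x0, k=3):
    idx = np.argsort(np.abs(np.array(X) - x0))[:k]
    return float(np.mean(np.array(y)[idx]))

print("tested designs:", np.round(X, 2))
print("prediction at x=1.0:", round(predict(1.0), 3))
```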
Effects of rotation on coolant passage heat transfer. Volume 1: Coolant passages with smooth walls
NASA Technical Reports Server (NTRS)
Hajek, T. J.; Wagner, J. H.; Johnson, B. V.; Higgins, A. W.; Steuber, G. D.
1991-01-01
An experimental program was conducted to investigate heat transfer and pressure loss characteristics of rotating multipass passages, for configurations and dimensions typical of modern turbine blades. The immediate objective was the generation of a data base of heat transfer and pressure loss data required to develop heat transfer correlations and to assess computational fluid dynamic techniques for rotating coolant passages. Experiments were conducted in a smooth wall large scale heat transfer model.
ERIC Educational Resources Information Center
Larsson, Ellinor; Larsson-Lund, Maria; Nilsson, Ingeborg
2013-01-01
The digital gap is a threat to the participation of senior citizens in society, as a large proportion of seniors are not involved in Internet based activities (IBAs). To be able to overcome this disadvantage for seniors, there is a need to both learn more about the conditions that make seniors start performing IBAs and to be able to provide them…
ERIC Educational Resources Information Center
National Association for Gifted Children (NJ1), 2009
2009-01-01
The U.S. is largely neglecting the estimated 3 million academically gifted and talented students who represent diverse experiences, skills, ethnicity, and cultural and economic backgrounds. All of them require a responsive and challenging educational system if they are to achieve to their highest potential. According to the "State of the States"…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Burrage, Clare; Copeland, Edmund J.; Hinds, E.A., E-mail: Clare.Burrage@nottingham.ac.uk, E-mail: Edmund.Copeland@nottingham.ac.uk, E-mail: Ed.Hinds@imperial.ac.uk
Theories of dark energy require a screening mechanism to explain why the associated scalar fields do not mediate observable long range fifth forces. The archetype of this is the chameleon field. Here we show that individual atoms are too small to screen the chameleon field inside a large high-vacuum chamber, and therefore can detect the field with high sensitivity. We derive new limits on the chameleon parameters from existing experiments, and show that most of the remaining chameleon parameter space is readily accessible using atom interferometry.
Niehorster, Diederick C.; Li, Li; Lappe, Markus
2017-01-01
The advent of inexpensive consumer virtual reality equipment enables many more researchers to study perception with naturally moving observers. One such system, the HTC Vive, offers a large field-of-view, high-resolution head mounted display together with a room-scale tracking system for less than a thousand U.S. dollars. If the position and orientation tracking of this system is of sufficient accuracy and precision, it could be suitable for much research that is currently done with far more expensive systems. Here we present a quantitative test of the HTC Vive’s position and orientation tracking as well as its end-to-end system latency. We report that while the precision of the Vive’s tracking measurements is high and its system latency (22 ms) is low, its position and orientation measurements are provided in a coordinate system that is tilted with respect to the physical ground plane. Because large changes in offset were found whenever tracking was briefly lost, it cannot be corrected for with a one-time calibration procedure. We conclude that the varying offset between the virtual and the physical tracking space makes the HTC Vive at present unsuitable for scientific experiments that require accurate visual stimulation of self-motion through a virtual world. It may however be suited for other experiments that do not have this requirement. PMID:28567271
Niehorster, Diederick C; Li, Li; Lappe, Markus
2017-01-01
The advent of inexpensive consumer virtual reality equipment enables many more researchers to study perception with naturally moving observers. One such system, the HTC Vive, offers a large field-of-view, high-resolution head mounted display together with a room-scale tracking system for less than a thousand U.S. dollars. If the position and orientation tracking of this system is of sufficient accuracy and precision, it could be suitable for much research that is currently done with far more expensive systems. Here we present a quantitative test of the HTC Vive's position and orientation tracking as well as its end-to-end system latency. We report that while the precision of the Vive's tracking measurements is high and its system latency (22 ms) is low, its position and orientation measurements are provided in a coordinate system that is tilted with respect to the physical ground plane. Because large changes in offset were found whenever tracking was briefly lost, it cannot be corrected for with a one-time calibration procedure. We conclude that the varying offset between the virtual and the physical tracking space makes the HTC Vive at present unsuitable for scientific experiments that require accurate visual stimulation of self-motion through a virtual world. It may however be suited for other experiments that do not have this requirement.
Morales-Conde, Salvador; Cañete-Gómez, Jesús; Gómez, Virginia; Socas Macías, María; Moreno, Antonio Barranco; Del Agua, Isaias Alarcón; Ruíz, Francisco Javier Padillo
2016-10-01
After reports on laparoendoscopic single-site (LESS) cholecystectomy, concerns have been raised over the level of difficulty and a potential increase in complications when moving away from conventional gold standard multiport laparoscopy, due to incomplete exposure and larger umbilical incisions. With continued development of technique and technology, it has now become possible to fully replicate this gold standard procedure through an LESS approach. First experiences with the newly developed technique and instrument are reported. Fifteen patients presenting with cholelithiasis without signs of inflammation were operated on using all surgical steps considered appropriate for the conventional four-port laparoscopic approach, but applied through a single access device. Operation-centered outcomes are presented. There were no peri- or postoperative complications. Mean operating time was 32.3 minutes. No conversion to regular laparoscopy was required. The critical view of safety was achieved in all cases. Mean skin incision length was 2.2 cm. The application of a standardized technique combined with the use of a four-port LESS device allows us to perform LESS cholecystectomy with correct exposure of the structures and without increasing the mean operating time, while combining the previously reported advantages of LESS. A universal trait of any new technique should be safety and reproducibility. This will enhance its applicability by a large number of surgeons and to the large number of patients requiring cholecystectomy.
Wavefront control of high-power laser beams in the National Ignition Facility (NIF)
NASA Astrophysics Data System (ADS)
Zacharias, Richard A.; Bliss, Erlan S.; Winters, Scott; Sacks, Richard A.; Feldman, Mark; Grey, Andrew; Koch, Jeffrey A.; Stolz, Christopher J.; Toeppen, John S.; Van Atta, Lewis; Woods, Bruce W.
2000-04-01
The use of lasers as the driver for inertial confinement fusion and weapons physics experiments is based on their ability to produce high-energy short pulses in a beam with low divergence. Indeed, the focusability of high quality laser beams far exceeds alternate technologies and is a major factor in the rationale for building high power lasers for such applications. The National Ignition Facility (NIF) is a large, 192-beam, high-power laser facility under construction at the Lawrence Livermore National Laboratory for fusion and weapons physics experiments. Its uncorrected minimum focal spot size is limited by laser system aberrations. The NIF includes a Wavefront Control System to correct these aberrations to yield a focal spot small enough for its applications. Sources of aberrations to be corrected include prompt pump-induced distortions in the laser amplifiers, previous-shot thermal distortions, beam off-axis effects, and gravity, mounting, and coating-induced optic distortions. Aberrations from gas density variations and optic-manufacturing figure errors are also partially corrected. This paper provides an overview of the NIF Wavefront Control System and describes the target spot size performance improvement it affords. It describes provisions made to accommodate the NIF's high fluence (laser beam and flashlamp), large wavefront correction range, wavefront temporal bandwidth, temperature and humidity variations, cleanliness requirements, and exception handling requirements (e.g. wavefront out-of-limits conditions).
Droplet microfluidics for synthetic biology
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gach, PC; Iwai, K; Kim, PW
2017-01-01
© 2017 The Royal Society of Chemistry. Synthetic biology is an interdisciplinary field that aims to engineer biological systems for useful purposes. Organism engineering often requires the optimization of individual genes and/or entire biological pathways (consisting of multiple genes). Advances in DNA sequencing and synthesis have recently begun to enable the possibility of evaluating thousands of gene variants and hundreds of thousands of gene combinations. However, such large-scale optimization experiments remain cost-prohibitive to researchers following traditional molecular biology practices, which are frequently labor-intensive and suffer from poor reproducibility. Liquid handling robotics may reduce labor and improve reproducibility, but are themselves expensive and thus inaccessible to most researchers. Microfluidic platforms offer a lower entry price point alternative to robotics, and maintain high throughput and reproducibility while further reducing operating costs through diminished reagent volume requirements. Droplet microfluidics have shown exceptional promise for synthetic biology experiments, including DNA assembly, transformation/transfection, culturing, cell sorting, phenotypic assays, artificial cells and genetic circuits.
DOE Office of Scientific and Technical Information (OSTI.GOV)
de Vega, F F; Cantu-Paz, E; Lopez, J I
The population size of genetic algorithms (GAs) affects the quality of the solutions and the time required to find them. While progress has been made in estimating the population sizes required to reach a desired solution quality for certain problems, in practice the sizing of populations is still usually performed by trial and error. These trials might lead to finding a population that is large enough to reach a satisfactory solution, but there may still be opportunities to optimize the computational cost by reducing the size of the population. This paper presents a technique called plague that periodically removes a number of individuals from the population as the GA executes. Recently, the usefulness of the plague has been demonstrated for genetic programming. The objective of this paper is to extend the study of plagues to genetic algorithms. We experiment with deceptive trap functions, a tunably difficult problem for GAs, and the experiments show that plagues can save computational time while maintaining solution quality and reliability.
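A compact sketch of the plague idea, periodic removal of the worst individuals while the GA runs; the OneMax objective, operator choices and removal schedule below are hypothetical and simpler than the deceptive trap functions studied in the paper:

```python
import random

random.seed(1)
GENES = 20

def fitness(ind):
    # Simple stand-in objective (OneMax), not the paper's deceptive trap functions.
    return sum(ind)

def make_individual():
    return [random.randint(0, 1) for _ in range(GENES)]

def evolve(pop_size=200, generations=40, plague_every=5, plague_victims=20):
    pop = [make_individual() for _ in range(pop_size)]
    for gen in range(1, generations + 1):
        # Tournament selection + one-point crossover + bit-flip mutation.
        nxt = []
        while len(nxt) < len(pop):
            a, b = (max(random.sample(pop, 3), key=fitness) for _ in range(2))
            cut = random.randrange(1, GENES)
            child = a[:cut] + b[cut:]
            if random.random() < 0.05:
                i = random.randrange(GENES)
                child[i] ^= 1
            nxt.append(child)
        pop = nxt
        # Plague: periodically drop the worst individuals, shrinking the population.
        if gen % plague_every == 0 and len(pop) > plague_victims:
            pop = sorted(pop, key=fitness, reverse=True)[:len(pop) - plague_victims]
    return max(map(fitness, pop)), len(pop)

best, final_size = evolve()
print("best fitness:", best, "final population size:", final_size)
```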
The version control service for the ATLAS data acquisition configuration files
NASA Astrophysics Data System (ADS)
Soloviev, Igor
2012-12-01
The ATLAS experiment at the LHC in Geneva uses a complex and highly distributed Trigger and Data Acquisition system, involving a very large number of computing nodes and custom modules. The configuration of the system is specified by schema and data in more than 1000 XML files, with various experts responsible for updating the files associated with their components. Maintaining an error-free and consistent set of XML files proved a major challenge. Therefore a special service was implemented: to validate any modifications; to check the authorization of anyone trying to modify a file; to record who had made changes, plus when and why; and to provide tools to compare different versions of files and to go back to earlier versions if required. This paper provides details of the implementation and exploitation experience, which may be of interest for other applications using many human-readable files maintained by different people, where consistency of the files and traceability of modifications are key requirements.
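A toy sketch of the three duties the service combines (validation, authorization, change recording); the file names, user names and in-memory storage are hypothetical and far simpler than the ATLAS implementation:

```python
import hashlib
import xml.etree.ElementTree as ET
from datetime import datetime, timezone

AUTHORIZED = {"conf/trigger.xml": {"alice"}, "conf/daq.xml": {"bob", "alice"}}
history = {}   # path -> list of (checksum, user, timestamp, reason, content)

def commit(path, content, user, reason):
    """Validate, authorize and record one modification of a configuration file."""
    ET.fromstring(content)                       # 1. reject malformed XML outright
    if user not in AUTHORIZED.get(path, set()):  # 2. check the author is allowed
        raise PermissionError(f"{user} may not modify {path}")
    checksum = hashlib.sha256(content.encode()).hexdigest()
    history.setdefault(path, []).append(         # 3. record who/when/why
        (checksum, user, datetime.now(timezone.utc).isoformat(), reason, content))

def versions_differ(path, old, new):
    """Return True if two recorded versions differ (placeholder for a real diff tool)."""
    return history[path][old][0] != history[path][new][0]

commit("conf/daq.xml", "<daq><nodes>4</nodes></daq>", "bob", "initial")
commit("conf/daq.xml", "<daq><nodes>8</nodes></daq>", "alice", "scale up")
print(versions_differ("conf/daq.xml", 0, 1), len(history["conf/daq.xml"]))
```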
Defaults, context, and knowledge: alternatives for OWL-indexed knowledge bases.
Rector, A
2004-01-01
The new Web Ontology Language (OWL) and its Description Logic compatible sublanguage (OWL-DL) explicitly exclude defaults and exceptions, as do all logic based formalisms for ontologies. However, many biomedical applications appear to require default reasoning, at least if they are to be engineered in a maintainable way. Default reasoning has always been one of the great strengths of Frame systems such as Protégé. Resolving this conflict requires analysis of the different uses for defaults and exceptions. In some cases, alternatives can be provided within the OWL framework; in others, it appears that hybrid reasoning about a knowledge base of contingent facts built around the core ontology is necessary. Trade-offs include both human factors and the scaling of computational performance. The analysis presented here is based on the OpenGALEN experience with large scale ontologies using a formalism, GRAIL, which explicitly incorporates constructs for hybrid reasoning, numerous experiments with OWL, and initial work on combining OWL and Protégé.
Gamberini, R; Del Buono, D; Lolli, F; Rimini, B
2013-11-01
The definition and utilisation of engineering indexes in the field of Municipal Solid Waste Management (MSWM) is an issue of interest for technicians and scientists and is widely discussed in the literature. Specifically, the availability of consolidated engineering indexes is useful when new waste collection services are designed, as well as when their performance is evaluated after a warm-up period. However, most published works in the field of MSWM complete their study with an analysis of isolated case studies. Conversely, decision makers require tools for information collection and exchange in order to trace the trends of these engineering indexes in large experiments. In this paper, common engineering indexes are presented and their values analysed in virtuous Italian communities, with the aim of contributing to the creation of a useful database whose data could be used during experiments, by indicating examples of MSWM demand profiles and the costs required to manage them. Copyright © 2013 Elsevier Ltd. All rights reserved.
MICROROC: MICRO-mesh gaseous structure Read-Out Chip
NASA Astrophysics Data System (ADS)
Adloff, C.; Blaha, J.; Chefdeville, M.; Dalmaz, A.; Drancourt, C.; Dulucq, F.; Espargilière, A.; Gaglione, R.; Geffroy, N.; Jacquemier, J.; Karyotakis, Y.; Martin-Chassard, G.; Prast, J.; Seguin-Moreau, N.; de La Taille, Ch; Vouters, G.
2012-01-01
MICRO MEsh GAseous Structure (MICROMEGAS) and Gas Electron Multiplier (GEM) detectors are two candidates for the active medium of a Digital Hadronic CALorimeter (DHCAL) as part of a high energy physics experiment at a future linear collider (ILC/CLIC). Physics requirements lead to a highly granular hadronic calorimeter with up to thirty million channels with probably only hit information (digital readout calorimeter). To validate the concept of digital hadronic calorimetry with such a small cell size, the construction and test of a cubic meter technological prototype, made of 40 planes of one square meter each, is necessary. This technological prototype would contain about 400 000 electronic channels, thus requiring the development of a front-end ASIC. Based on the experience gained with previous ASICs that were mounted on detectors and tested in particle beams, a new ASIC called MICROROC has been developed. This paper summarizes the characterisation campaign that was conducted on this new chip as well as its integration into a large area Micromegas chamber of one square meter.
Non-Contact Temperature Requirements (NCTM) for drop and bubble physics
NASA Technical Reports Server (NTRS)
Hmelo, Anthony B.; Wang, Taylor G.
1989-01-01
Many of the materials research experiments to be conducted in the Space Processing program require a non-contaminating method of manipulating and controlling weightless molten materials. In these experiments, the melt is positioned and formed within a container without physically contacting the container's wall. An acoustic method, which was developed by Professor Taylor G. Wang before coming to Vanderbilt University from the Jet Propulsion Laboratory, has demonstrated the capability of positioning and manipulating room temperature samples. This was accomplished in an earth-based laboratory with a zero-gravity environment of short duration. However, many important facets of high temperature containerless processing technology have not yet been established, nor can they be established from the room temperature studies, because the details of the interaction between an acoustic field and a molten sample are largely unknown. Drop dynamics, bubble dynamics, coalescence behavior of drops and bubbles, electromagnetic and acoustic levitation methods applied to molten metals, and thermal streaming are among the topics discussed.
Quality Assurance on Undoped CsI Crystals for the Mu2e Experiment
Atanov, N.; Baranov, V.; Budagov, J.; ...
2017-12-21
The Mu2e experiment is constructing a calorimeter consisting of 1,348 undoped CsI crystals in two disks. Each crystal has dimensions of 34 × 34 × 200 mm³ and is read out by a large-area silicon PMT array. A series of technical specifications was defined according to physics requirements. Preproduction CsI crystals were procured from three firms: Amcrys, Saint-Gobain and Shanghai Institute of Ceramics. We report the quality assurance on the crystals' scintillation properties and their radiation hardness against ionization dose and neutrons. With a fast decay time of 30 ns and a light output of more than 100 p.e./MeV measured with a bi-alkali PMT, undoped CsI crystals provide a cost-effective solution for the Mu2e experiment.
Quality Assurance on Undoped CsI Crystals for the Mu2e Experiment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Atanov, N.; Baranov, V.; Budagov, J.
The Mu2e experiment is constructing a calorimeter consisting of 1,348 undoped CsI crystals in two disks. Each crystal has dimensions of 34 × 34 × 200 mm³ and is read out by a large-area silicon PMT array. A series of technical specifications was defined according to physics requirements. Preproduction CsI crystals were procured from three firms: Amcrys, Saint-Gobain and Shanghai Institute of Ceramics. We report the quality assurance on the crystals' scintillation properties and their radiation hardness against ionization dose and neutrons. With a fast decay time of 30 ns and a light output of more than 100 p.e./MeV measured with a bi-alkali PMT, undoped CsI crystals provide a cost-effective solution for the Mu2e experiment.
The benefits for children's nurses of overseas placements: where is the evidence?
Standage, Richard; Randall, Duncan
2014-06-01
Overseas placements are presumed to provide students with experiences to enhance their cultural competence and to give them insights into other healthcare systems. However, the literature has not focused on what students of children's nursing might gain from an overseas placement. This paper is a report of a literature review (2003-2011) and our own student evaluation, both aimed at shedding new light on this important opportunity for learning for children's nurses. The literature review indicates that current research does not address the learning from overseas placements for children's nurses. Our student evaluation suggests children's nursing students are able to explore the position of children in the host culture and to place this in a healthcare context. Students also reported that they adhered to UK scope of student practice when delivering care to children on overseas placement. These placements provide a valuable learning experience for children's nurses. However, consideration in the shorter term is required to address issues of equity. Looking forward, further large scale studies are required to determine the long term effects of such experience on the health outcomes for children, and development of children's nurses and children's nursing globally.
Selective attention and subjective confidence calibration.
Schoenherr, Jordan R; Leth-Steensen, Craig; Petrusic, William M
2010-02-01
In the present experiments, failures of selective visual attention were invoked using the B. A. Eriksen and C. W. Eriksen (1974) flanker task. On each trial, a three-letter stimulus array was flashed briefly, followed by a mask. The identity of the two flanking letters was response congruent, neutral, or incongruent with the identity of the middle target letter. On half of the trials, confidence ratings were obtained after each response. In the first three experiments, participants were highly overconfident in the accuracy of their responding to incongruent flanker stimulus arrays. In a final experiment, presenting a prestimulus target location cue greatly reduced both selective attention failure and overconfidence. The findings demonstrate that participants are often unaware of such selective attention failures and provide support for the notion that, in these cases, decisional processing is driven largely by the identities of the incongruent flankers. In addition, responding was invariably slower and sometimes more accurate when confidence was required than when it was not required, demonstrating that the need to provide posttrial confidence reports can affect decisional processing. Moreover, there was some evidence that the presence of neutral contextual flanking information can slow responding, suggesting that such nondiagnostic information can, indeed, contribute to decisional processing.
Verification of Space Station Secondary Power System Stability Using Design of Experiment
NASA Technical Reports Server (NTRS)
Karimi, Kamiar J.; Booker, Andrew J.; Mong, Alvin C.; Manners, Bruce
1998-01-01
This paper describes analytical methods used in verification of large DC power systems with applications to the International Space Station (ISS). Large DC power systems contain many switching power converters with negative resistance characteristics. The ISS power system presents numerous challenges with respect to system stability, such as complex sources and undefined loads. The Space Station program has developed impedance specifications for sources and loads. The overall approach to system stability consists of specific hardware requirements coupled with extensive system analysis and testing. Testing of large complex distributed power systems is not practical due to the size and complexity of the system. Computer modeling has been extensively used to develop hardware specifications as well as to identify system configurations for lab testing. The statistical method of Design of Experiments (DoE) is used as an analysis tool for verification of these large systems. DoE reduces the number of computer runs that are necessary to analyze the performance of a complex power system consisting of hundreds of DC/DC converters. DoE also provides valuable information about the effect of changes in system parameters on the performance of the system. DoE provides information about various operating scenarios and identifies the ones with potential for instability. In this paper we describe how we have used computer modeling to analyze a large DC power system. A brief description of DoE is given. Examples of applying DoE to the analysis and verification of the ISS power system are provided.
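A minimal two-level factorial sketch of the DoE idea described above; the factor names, ranges and the stand-in response function are hypothetical, whereas the actual study drove detailed converter-network simulations:

```python
from itertools import product
import random

random.seed(0)

def stability_margin(load_impedance, source_impedance, filter_gain):
    # Hypothetical response surface standing in for a converter-network simulation.
    return (load_impedance / source_impedance) - 0.3 * filter_gain \
        + random.gauss(0, 0.05)

# Two-level full factorial over three coded factors (-1 = low, +1 = high).
levels = {"load_impedance": (5.0, 20.0),
          "source_impedance": (1.0, 4.0),
          "filter_gain": (0.5, 2.0)}
runs, margins = [], []
for signs in product((-1, +1), repeat=3):
    setting = {k: levels[k][0] if s < 0 else levels[k][1]
               for k, s in zip(levels, signs)}
    runs.append(signs)
    margins.append(stability_margin(**setting))

# Main effect of each factor: mean(high runs) - mean(low runs).
for i, name in enumerate(levels):
    high = [m for s, m in zip(runs, margins) if s[i] > 0]
    low = [m for s, m in zip(runs, margins) if s[i] < 0]
    print(f"{name:>17s} effect: {sum(high)/len(high) - sum(low)/len(low):+.2f}")
```

With fractional designs the same effect estimates can be screened in far fewer runs than a full factorial, which is the cost saving the abstract refers to.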
Sensor Selection and Optimization for Health Assessment of Aerospace Systems
NASA Technical Reports Server (NTRS)
Maul, William A.; Kopasakis, George; Santi, Louis M.; Sowers, Thomas S.; Chicatelli, Amy
2007-01-01
Aerospace systems are developed similarly to other large-scale systems through a series of reviews, where designs are modified as system requirements are refined. For space-based systems few are built and placed into service. These research vehicles have limited historical experience to draw from and formidable reliability and safety requirements, due to the remote and severe environment of space. Aeronautical systems have similar reliability and safety requirements, and while these systems may have historical information to access, commercial and military systems require longevity under a range of operational conditions and applied loads. Historically, the design of aerospace systems, particularly the selection of sensors, is based on the requirements for control and performance rather than on health assessment needs. Furthermore, the safety and reliability requirements are met through sensor suite augmentation in an ad hoc, heuristic manner, rather than any systematic approach. A review of the current sensor selection practice within and outside of the aerospace community was conducted and a sensor selection architecture is proposed that will provide a justifiable, dependable sensor suite to address system health assessment requirements.
Sensor Selection and Optimization for Health Assessment of Aerospace Systems
NASA Technical Reports Server (NTRS)
Maul, William A.; Kopasakis, George; Santi, Louis M.; Sowers, Thomas S.; Chicatelli, Amy
2008-01-01
Aerospace systems are developed similarly to other large-scale systems through a series of reviews, where designs are modified as system requirements are refined. For space-based systems, few are built and placed into service; these research vehicles have limited historical experience to draw from and formidable reliability and safety requirements, due to the remote and severe environment of space. Aeronautical systems have similar reliability and safety requirements, and while these systems may have historical information to access, commercial and military systems require longevity under a range of operational conditions and applied loads. Historically, the design of aerospace systems, particularly the selection of sensors, is based on the requirements for control and performance rather than on health assessment needs. Furthermore, the safety and reliability requirements are met through sensor suite augmentation in an ad hoc, heuristic manner, rather than any systematic approach. A review of the current sensor selection practice within and outside of the aerospace community was conducted and a sensor selection architecture is proposed that will provide a justifiable, defendable sensor suite to address system health assessment requirements.
Intelligent systems engineering methodology
NASA Technical Reports Server (NTRS)
Fouse, Scott
1990-01-01
An added challenge for the designers of large scale systems such as Space Station Freedom is the appropriate incorporation of intelligent system technology (artificial intelligence, expert systems, knowledge-based systems, etc.) into their requirements and design. This presentation will describe a view of systems engineering which successfully addresses several aspects of this complex problem: design of large scale systems, design with requirements that are so complex they only completely unfold during the development of a baseline system and even then continue to evolve throughout the system's life cycle, design that involves the incorporation of new technologies, and design and development that takes place with many players in a distributed manner yet can be easily integrated to meet a single view of the requirements. The first generation of this methodology was developed and evolved jointly by ISX and the Lockheed Aeronautical Systems Company over the past five years on the Defense Advanced Research Projects Agency/Air Force Pilot's Associate Program, one of the largest, most complex, and most successful intelligent systems constructed to date. As the methodology has evolved it has also been applied successfully to a number of other projects. Some of the lessons learned from this experience may be applicable to Freedom.
Comparing memory-efficient genome assemblers on stand-alone and cloud infrastructures.
Kleftogiannis, Dimitrios; Kalnis, Panos; Bajic, Vladimir B
2013-01-01
A fundamental problem in bioinformatics is genome assembly. Next-generation sequencing (NGS) technologies produce large volumes of fragmented genome reads, which require large amounts of memory to assemble the complete genome efficiently. With recent improvements in DNA sequencing technologies, it is expected that the memory footprint required for the assembly process will increase dramatically and will emerge as a limiting factor in processing widely available NGS-generated reads. In this report, we compare current memory-efficient techniques for genome assembly with respect to quality, memory consumption and execution time. Our experiments prove that it is possible to generate draft assemblies of reasonable quality on conventional multi-purpose computers with very limited available memory by choosing suitable assembly methods. Our study reveals the minimum memory requirements for different assembly programs even when data volume exceeds memory capacity by orders of magnitude. By combining existing methodologies, we propose two general assembly strategies that can improve short-read assembly approaches and result in reduction of the memory footprint. Finally, we discuss the possibility of utilizing cloud infrastructures for genome assembly and we comment on some findings regarding suitable computational resources for assembly.
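A small sketch of how one such measurement could be scripted on a Linux machine; the command below is a placeholder rather than a real assembler invocation, and the `resource` module is POSIX-only, reporting peak RSS in kilobytes on Linux:

```python
import resource
import subprocess
import time

def run_and_measure(cmd):
    """Run one assembler command and report wall time and the children's peak RSS."""
    start = time.time()
    subprocess.run(cmd, check=True)
    elapsed = time.time() - start
    # Max resident set size over all terminated child processes (kB on Linux).
    peak_kb = resource.getrusage(resource.RUSAGE_CHILDREN).ru_maxrss
    return elapsed, peak_kb

# Placeholder command line; substitute the assembler binary and read files.
elapsed, peak_kb = run_and_measure(["echo", "assembler-run-placeholder"])
print(f"wall time {elapsed:.2f} s, peak child memory {peak_kb} kB")
```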
A Preliminary Study of a Solar-Probe Mission
NASA Technical Reports Server (NTRS)
Dugan, Duane W.
1961-01-01
A preliminary study is made of some problems associated with the sending of an instrumented probe close to the Sun for the purpose of gathering and telemetering back to Earth information concerning solar phenomena and circumsolar space. The problems considered are primarily those relating to heating and to launch requirements. A nonanalytic discussion of the communications problem of a solar-probe mission is presented to obtain order-of-magnitude estimates of the output and weight of an auxiliary power supply which might be required. From the study it is believed that approaches to the Sun as close as about 4 or 5 million miles do not present insuperable difficulties insofar as heating and communications are concerned. Guidance requirements, in general, do not appear to be stringent. However, in terms of current experience, velocity requirements may be large. It is found, for example, that to achieve perihelion distances between the orbit of Mercury and the visible disc of the Sun, total burnout velocities ranging between 50,000 and 100,000 feet per second are required.
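As a rough plausibility check of the quoted burnout velocities, the sketch below estimates the speed needed for an impulsive transfer from Earth's orbit to a 4.5-million-mile perihelion; the two-body assumptions and constants are illustrative and are not the report's own analysis:

```python
import math

# Assumptions (hypothetical, for illustration): circular Earth orbit at 1 AU,
# direct transfer to a solar ellipse with perihelion of 4.5 million miles,
# single impulsive burn, gravity losses ignored except for Earth escape.
GM_SUN = 1.327e20          # m^3/s^2
GM_EARTH = 3.986e14        # m^3/s^2
AU = 1.496e11              # m
R_EARTH = 6.371e6          # m
r_p = 4.5e6 * 1609.34      # perihelion, m

v_earth_orbit = math.sqrt(GM_SUN / AU)                 # ~29.8 km/s
a = (AU + r_p) / 2.0                                   # transfer semi-major axis
v_aphelion = math.sqrt(GM_SUN * (2.0 / AU - 1.0 / a))  # speed needed at 1 AU
v_inf = v_earth_orbit - v_aphelion                     # hyperbolic excess w.r.t. Earth
v_escape = math.sqrt(2.0 * GM_EARTH / R_EARTH)
v_burnout = math.sqrt(v_escape**2 + v_inf**2)          # from Earth's surface

print(f"burnout ~ {v_burnout:,.0f} m/s = {v_burnout / 0.3048:,.0f} ft/s")
```

Under these assumptions the estimate comes out near 77,000 ft/s, consistent with the 50,000 to 100,000 ft/s range quoted above.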
Optical Properties of the DIRC Fused Silica Radiator
DOE Office of Scientific and Technical Information (OSTI.GOV)
Convery, Mark R
2003-04-15
The DIRC detector is successfully operating as the hadronic particle identification system for the BaBar experiment at SLAC. The production of its Cherenkov radiator required much effort in practice, both in manufacture and conception, which in turn required a large number of R&D measurements. One of the major outcomes of this R&D work was an understanding of methods to select radiation hard and optically uniform fused silica material. Others included measurement of the wavelength dependency of the internal reflection coefficient, and its sensitivity to the surface pollution, selection of the radiator support, selection of good optical glue, etc. This note summarizes the optical R&D test results.
Elmer-Dixon, Margaret M; Bowler, Bruce E
2018-05-19
A novel approach to quantify mixed lipid systems is described. Traditional approaches to lipid vesicle quantification are time consuming, require large amounts of material and are destructive. We extend our recently described method for quantification of pure lipid systems to mixed lipid systems. The method only requires a UV-Vis spectrometer and does not destroy the sample. Mie scattering data from absorbance measurements are used as input into a Matlab program to calculate the total vesicle concentration and the concentration of each lipid in the mixed lipid system. The technique is fast and accurate, which is essential for analytical lipid binding experiments. Copyright © 2018. Published by Elsevier Inc.
Structural considerations for fabrication and mounting of the AXAF HRMA optics
NASA Technical Reports Server (NTRS)
Cohen, Lester M.; Cernoch, Larry; Mathews, Gary; Stallcup, Michael
1990-01-01
A methodology is described which minimizes optics distortion in the fabrication, metrology, and launch configuration phases. The significance of finite element modeling and breadboard testing is described with respect to performance analyses of support structures and material effects in NASA's AXAF X-ray optics. The paper outlines the requirements for AXAF performance, optical fabrication, metrology, and glass support fixtures, as well as the specifications for mirror sensitivity and the high-resolution mirror assembly. Analytical modeling of the tools is shown to coincide with grinding and polishing experiments, and is useful for designing large-area polishing and grinding tools. Metrological subcomponents that have undergone initial testing show evidence of meeting force requirements.
The large enriched germanium experiment for neutrinoless double beta decay (LEGEND)
NASA Astrophysics Data System (ADS)
Abgrall, N.; Abramov, A.; Abrosimov, N.; Abt, I.; Agostini, M.; Agartioglu, M.; Ajjaq, A.; Alvis, S. I.; Avignone, F. T.; Bai, X.; Balata, M.; Barabanov, I.; Barabash, A. S.; Barton, P. J.; Baudis, L.; Bezrukov, L.; Bode, T.; Bolozdynya, A.; Borowicz, D.; Boston, A.; Boston, H.; Boyd, S. T. P.; Breier, R.; Brudanin, V.; Brugnera, R.; Busch, M.; Buuck, M.; Caldwell, A.; Caldwell, T. S.; Camellato, T.; Carpenter, M.; Cattadori, C.; Cederkäll, J.; Chan, Y.-D.; Chen, S.; Chernogorov, A.; Christofferson, C. D.; Chu, P.-H.; Cooper, R. J.; Cuesta, C.; Demidova, E. V.; Deng, Z.; Deniz, M.; Detwiler, J. A.; Di Marco, N.; Domula, A.; Du, Q.; Efremenko, Yu.; Egorov, V.; Elliott, S. R.; Fields, D.; Fischer, F.; Galindo-Uribarri, A.; Gangapshev, A.; Garfagnini, A.; Gilliss, T.; Giordano, M.; Giovanetti, G. K.; Gold, M.; Golubev, P.; Gooch, C.; Grabmayr, P.; Green, M. P.; Gruszko, J.; Guinn, I. S.; Guiseppe, V. E.; Gurentsov, V.; Gurov, Y.; Gusev, K.; Hakenmüeller, J.; Harkness-Brennan, L.; Harvey, Z. R.; Haufe, C. R.; Hauertmann, L.; Heglund, D.; Hehn, L.; Heinz, A.; Hiller, R.; Hinton, J.; Hodak, R.; Hofmann, W.; Howard, S.; Howe, M. A.; Hult, M.; Inzhechik, L. V.; Csáthy, J. Janicskó; Janssens, R.; Ješkovský, M.; Jochum, J.; Johansson, H. T.; Judson, D.; Junker, M.; Kaizer, J.; Kang, K.; Kazalov, V.; Kermadic, Y.; Kiessling, F.; Kirsch, A.; Kish, A.; Klimenko, A.; Knöpfle, K. T.; Kochetov, O.; Konovalov, S. I.; Kontul, I.; Kornoukhov, V. N.; Kraetzschmar, T.; Kröninger, K.; Kumar, A.; Kuzminov, V. V.; Lang, K.; Laubenstein, M.; Lazzaro, A.; Li, Y. L.; Li, Y.-Y.; Li, H. B.; Lin, S. T.; Lindner, M.; Lippi, I.; Liu, S. K.; Liu, X.; Liu, J.; Loomba, D.; Lubashevskiy, A.; Lubsandorzhiev, B.; Lutter, G.; Ma, H.; Majorovits, B.; Mamedov, F.; Martin, R. D.; Massarczyk, R.; Matthews, J. A. J.; McFadden, N.; Mei, D.-M.; Mei, H.; Meijer, S. J.; Mengoni, D.; Mertens, S.; Miller, W.; Miloradovic, M.; Mingazheva, R.; Misiaszek, M.; Moseev, P.; Myslik, J.; Nemchenok, I.; Nilsson, T.; Nolan, P.; O'Shaughnessy, C.; Othman, G.; Panas, K.; Pandola, L.; Papp, L.; Pelczar, K.; Peterson, D.; Pettus, W.; Poon, A. W. P.; Povinec, P. P.; Pullia, A.; Quintana, X. C.; Radford, D. C.; Rager, J.; Ransom, C.; Recchia, F.; Reine, A. L.; Riboldi, S.; Rielage, K.; Rozov, S.; Rouf, N. W.; Rukhadze, E.; Rumyantseva, N.; Saakyan, R.; Sala, E.; Salamida, F.; Sandukovsky, V.; Savard, G.; Schönert, S.; Schütz, A.-K.; Schulz, O.; Schuster, M.; Schwingenheuer, B.; Selivanenko, O.; Sevda, B.; Shanks, B.; Shevchik, E.; Shirchenko, M.; Simkovic, F.; Singh, L.; Singh, V.; Skorokhvatov, M.; Smolek, K.; Smolnikov, A.; Sonay, A.; Spavorova, M.; Stekl, I.; Stukov, D.; Tedeschi, D.; Thompson, J.; Van Wechel, T.; Varner, R. L.; Vasenko, A. A.; Vasilyev, S.; Veresnikova, A.; Vetter, K.; von Sturm, K.; Vorren, K.; Wagner, M.; Wang, G.-J.; Waters, D.; Wei, W.-Z.; Wester, T.; White, B. R.; Wiesinger, C.; Wilkerson, J. F.; Willers, M.; Wiseman, C.; Wojcik, M.; Wong, H. T.; Wyenberg, J.; Xu, W.; Yakushev, E.; Yang, G.; Yu, C.-H.; Yue, Q.; Yumatov, V.; Zeman, J.; Zeng, Z.; Zhitnikov, I.; Zhu, B.; Zinatulina, D.; Zschocke, A.; Zsigmond, A. J.; Zuber, K.; Zuzel, G.
2017-10-01
The observation of neutrinoless double-beta decay (0νββ) would show that lepton number is violated, reveal that neutrinos are Majorana particles, and provide information on neutrino mass. A discovery-capable experiment covering the inverted ordering region, with effective Majorana neutrino masses of 15 - 50 meV, will require a tonne-scale experiment with excellent energy resolution and extremely low backgrounds, at the level of ~0.1 count/(FWHM·t·yr) in the region of the signal. The current-generation 76Ge experiments GERDA and the Majorana Demonstrator, utilizing high-purity germanium detectors with an intrinsic energy resolution of 0.12%, have achieved the lowest backgrounds of all 0νββ experiments, by over an order of magnitude, in the 0νββ signal region. Building on this success, the LEGEND collaboration has been formed to pursue a tonne-scale 76Ge experiment. The collaboration aims to develop a phased 0νββ experimental program with discovery potential at a half-life approaching or at 10^28 years, using existing resources as appropriate to expedite physics results.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kafka, Gene
2015-05-01
The Integrable Optics Test Accelerator (IOTA) storage ring at Fermilab will serve as the backbone for a broad spectrum of Advanced Accelerator R&D (AARD) experiments, and as such, must be designed with significant flexibility in mind, but without compromising cost efficiency. The nonlinear experiments at IOTA will include: achievement of a large nonlinear tune shift/spread without degradation of dynamic aperture; suppression of strong lattice resonances; study of stability of nonlinear systems to perturbations; and studies of different variants of nonlinear magnet design. The ring optics control has challenging requirements that reach or exceed the present state of the art. The development of a complete self-consistent design of the IOTA ring optics, meeting the demands of all planned AARD experiments, is presented. Of particular interest are the precise control for nonlinear integrable optics experiments and the transverse-to-longitudinal coupling and phase stability for the Optical Stochastic Cooling Experiment (OSC). Since the beam time-of-flight must be tightly controlled in the OSC section, studies of second order corrections in this section are presented.
Experience-driven plasticity in binocular vision
Klink, P. Christiaan; Brascamp, Jan W.; Blake, Randolph; van Wezel, Richard J.A.
2010-01-01
Experience-driven neuronal plasticity allows the brain to adapt its functional connectivity to recent sensory input. Here we use binocular rivalry [1], an experimental paradigm where conflicting images are presented to the individual eyes, to demonstrate plasticity in the neuronal mechanisms that convert visual information from two separated retinas into single perceptual experiences. Perception during binocular rivalry tended to initially consist of alternations between exclusive representations of monocularly defined images, but upon prolonged exposure, mixture percepts became more prevalent. The completeness of suppression, reflected in the incidence of mixture percepts, plausibly reflects the strength of inhibition that likely plays a role in binocular rivalry [2]. Recovery of exclusivity was possible, but required highly specific binocular stimulation. Documenting the prerequisites for these observed changes in perceptual exclusivity, our experiments suggest experience-driven plasticity at interocular inhibitory synapses, driven by the (lack of) correlated activity of neurons representing the conflicting stimuli. This form of plasticity is consistent with a previously proposed, but largely untested, anti-Hebbian learning mechanism for inhibitory synapses in vision [3, 4]. Our results implicate experience-driven plasticity as one governing principle in the neuronal organization of binocular vision. PMID:20674360
NASA Astrophysics Data System (ADS)
Kafka, Gene
The Integrable Optics Test Accelerator (IOTA) storage ring at Fermilab will serve as the backbone for a broad spectrum of Advanced Accelerator R&D (AARD) experiments, and as such, must be designed with significant flexibility in mind, but without compromising cost efficiency. The nonlinear experiments at IOTA will include: achievement of a large nonlinear tune shift/spread without degradation of dynamic aperture; suppression of strong lattice resonances; study of stability of nonlinear systems to perturbations; and studies of different variants of nonlinear magnet design. The ring optics control has challenging requirements that reach or exceed the present state of the art. The development of a complete self-consistent design of the IOTA ring optics, meeting the demands of all planned AARD experiments, is presented. Of particular interest are the precise control for nonlinear integrable optics experiments and the transverse-to-longitudinal coupling and phase stability for the Optical Stochastic Cooling Experiment (OSC). Since the beam time-of-flight must be tightly controlled in the OSC section, studies of second order corrections in this section are presented.
The role of patients' explanatory models and daily-lived experience in hypertension self-management.
Bokhour, Barbara G; Cohn, Ellen S; Cortés, Dharma E; Solomon, Jeffrey L; Fix, Gemmae M; Elwy, A Rani; Mueller, Nora; Katz, Lois A; Haidet, Paul; Green, Alexander R; Borzecki, Ann M; Kressin, Nancy R
2012-12-01
Uncontrolled hypertension remains a significant problem for many patients. Few interventions to improve patients' hypertension self-management have had lasting effects. Previous work has focused largely on patients' beliefs as predictors of behavior, but little is understood about beliefs as they are embedded in patients' social contexts. This study aims to explore how patients' "explanatory models" of hypertension (understandings of the causes, mechanisms or pathophysiology, course of illness, symptoms and effects of treatment) and social context relate to their reported daily hypertension self-management behaviors. We conducted semi-structured qualitative interviews with a diverse group of patients at two large urban Veterans Administration medical centers. Participants were African-American, white and Latino Veterans Affairs (VA) primary care patients with uncontrolled blood pressure. We conducted thematic analysis using tools of grounded theory to identify key themes surrounding patients' explanatory models, social context and hypertension management behaviors. Patients' perceptions of the cause and course of hypertension, experiences of hypertension symptoms, and beliefs about the effectiveness of treatment were related to different hypertension self-management behaviors. Moreover, patients' daily-lived experiences, such as an isolated lifestyle, serious competing health problems, a lack of habits and routines, barriers to exercise and prioritizing lifestyle choices, also interfered with optimal hypertension self-management. Designing interventions to improve patients' hypertension self-management requires consideration of patients' explanatory models and their daily-lived experience. We propose a new conceptual model - the dynamic model of hypertension self-management behavior - which incorporates these key elements of patients' experiences.
Ward, Megan; Johnson, Steven D; Zalucki, Myron P
2013-04-01
One of the essential requirements for an introduced plant species to become invasive is an ability to reproduce outside the native range, particularly when initial populations are small. If a reproductive Allee effect is operating, plants in small populations will have reduced reproductive success relative to plants in larger populations. Alternatively, if plants in small populations experience less competition for pollination than those in large populations, they may actually have higher levels of reproductive success than plants in large populations. To resolve this uncertainty, we investigated how the per capita fecundity of plants was affected by population size in three invasive milkweed species. Field surveys of seed production in natural populations of different sizes but similar densities were conducted for three pollinator-dependent invasive species, namely Asclepias curassavica, Gomphocarpus fruticosus and G. physocarpus. Additionally, supplemental hand-pollinations were performed in small and large populations in order to determine whether reproductive output was limited by pollinator activity in these populations. Reproductive Allee effects were not detected in any of the study species. Instead, plants in small populations exhibited remarkably high levels of reproductive output compared to those in large populations. Increased fruit production following supplemental hand-pollinations suggested that the lower reproductive output of naturally pollinated plants in large populations is a consequence of pollen limitation rather than limitation due to abiotic resources. This is consistent with increased intraspecific competition for pollination amongst plants in large populations. It is likely that the invasion of these milkweed species in Australia has been enhanced because plants in small founding populations experience less intraspecific competition for pollinators than those in large populations, and thus have the ability to produce copious amounts of seeds.
Validating a large geophysical data set: Experiences with satellite-derived cloud parameters
NASA Technical Reports Server (NTRS)
Kahn, Ralph; Haskins, Robert D.; Knighton, James E.; Pursch, Andrew; Granger-Gallegos, Stephanie
1992-01-01
We are validating the global cloud parameters derived from the satellite-borne HIRS2 and MSU atmospheric sounding instrument measurements, and are using the analysis of these data as one prototype for studying large geophysical data sets in general. The HIRS2/MSU data set contains a total of 40 physical parameters, filling 25 MB/day; raw HIRS2/MSU data are available for a period exceeding 10 years. Validation involves developing a quantitative sense for the physical meaning of the derived parameters over the range of environmental conditions sampled. This is accomplished by comparing the spatial and temporal distributions of the derived quantities with similar measurements made using other techniques, and with model results. The data handling needed for this work is possible only with the help of a suite of interactive graphical and numerical analysis tools. Level 3 (gridded) data is the common form in which large data sets of this type are distributed for scientific analysis. We find that Level 3 data is inadequate for the data comparisons required for validation. Level 2 data (individual measurements in geophysical units) is needed. A sampling problem arises when individual measurements, which are not uniformly distributed in space or time, are used for the comparisons. Standard 'interpolation' methods involve fitting the measurements for each data set to surfaces, which are then compared. We are experimenting with formal criteria for selecting geographical regions, based upon the spatial frequency and variability of measurements, that allow us to quantify the uncertainty due to sampling. As part of this project, we are also dealing with ways to keep track of constraints placed on the output by assumptions made in the computer code. The need to work with Level 2 data introduces a number of other data handling issues, such as accessing data files across machine types, meeting large data storage requirements, accessing other validated data sets, processing speed and throughput for interactive graphical work, and problems relating to graphical interfaces.
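A small sketch of the sampling issue described above: binning hypothetical Level 2 retrievals onto a Level 3 grid while keeping the per-cell sample counts, so that sparsely sampled cells can be flagged before any comparison (the grid size, variable name and random data are illustrative only):

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical Level 2 retrievals: scattered (lat, lon, cloud_fraction) points.
lat = rng.uniform(-60, 60, 5000)
lon = rng.uniform(-180, 180, 5000)
cloud_fraction = rng.uniform(0, 1, 5000)

# Bin onto a 5-degree Level 3 grid, keeping both the mean and the sample count,
# so sparsely sampled cells can be flagged or down-weighted in comparisons.
lat_edges = np.arange(-90, 91, 5)
lon_edges = np.arange(-180, 181, 5)
counts, _, _ = np.histogram2d(lat, lon, bins=[lat_edges, lon_edges])
sums, _, _ = np.histogram2d(lat, lon, bins=[lat_edges, lon_edges], weights=cloud_fraction)
with np.errstate(invalid="ignore"):
    level3_mean = np.where(counts > 0, sums / counts, np.nan)

print("cells with fewer than 10 samples:", int(np.sum((counts > 0) & (counts < 10))))
```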
Bano, Kiran; Kennedy, Gareth F; Zhang, Jie; Bond, Alan M
2012-04-14
The theory for large amplitude Fourier transformed ac voltammetry at a rotating disc electrode is described. Resolution of time domain data into dc and ac harmonic components reveals that the mass transport for the dc component is controlled by convective-diffusion, while the background free higher order harmonic components are flow rate insensitive and mainly governed by linear diffusion. Thus, remarkable versatility is available; Levich behaviour of the dc component limiting current provides diffusion coefficient values and access to higher harmonics allows fast electrode kinetics to be probed. Two series of experiments (dc and ac voltammetry) have been required to extract these parameters; here large amplitude ac voltammetry with RDE methodology is used to demonstrate that kinetics and diffusion coefficient information can be extracted from a single experiment. To demonstrate the power of this approach, theoretical and experimental comparisons of data obtained for the reversible [Ru(NH(3))(6)](3+/2+) and quasi-reversible [Fe(CN)(6)](3-/4-) electron transfer processes are presented over a wide range of electrode rotation rates and with different concentrations and electrode materials. Excellent agreement of experimental and simulated data is achieved, which allows parameters such as electron transfer rate, diffusion coefficient, uncompensated resistance and others to be determined using a strategically applied approach that takes into account the different levels of sensitivity of each parameter to the dc or the ac harmonic.
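The Levich behaviour referred to above is the standard convective-diffusion result for the dc limiting current at a rotating disc electrode; in the usual notation (a textbook relation, not specific to this paper),

$$ i_L = 0.620\, n F A D^{2/3} \omega^{1/2} \nu^{-1/6} C $$

where $n$ is the number of electrons transferred, $F$ the Faraday constant, $A$ the electrode area, $D$ the diffusion coefficient, $\omega$ the angular rotation rate, $\nu$ the kinematic viscosity and $C$ the bulk concentration, so a plot of the dc limiting current against $\omega^{1/2}$ yields $D$.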
The Outdoor Atmospheric Simulation Chamber of Orleans-France (HELIOS)
NASA Astrophysics Data System (ADS)
Mellouki, A.; Véronique, D.; Grosselin, B.; Peyroux, F.; Benoit, R.; Ren, Y.; Idir, M.
2016-12-01
Atmospheric simulation chambers are among the most advanced tools for investigating atmospheric processes and deriving the physico-chemical parameters required for air quality and climate models. Recently, ICARE-CNRS at Orléans (France) has set up a new large outdoor simulation chamber, HELIOS. HELIOS is one of the most advanced simulation chambers in Europe. It is one of the largest outdoor chambers and is especially suited to process studies performed under realistic atmospheric conditions. HELIOS is a large hemispherical outdoor simulation chamber (volume of 90 m³) positioned on top of the ICARE-CNRS building at Orléans (47°50'18.39N; 1°56'40.03E). The chamber is made of FEP film ensuring more than 90 % solar light transmission. The chamber is protected against severe meteorological conditions by a moveable "box" which contains a series of xenon lamps, enabling experiments to be conducted using artificial light. This special design makes HELIOS a unique platform where experiments can be made using both types of irradiation. HELIOS is dedicated mainly to the investigation of chemical processes under different conditions (sunlight, artificial light and dark). The platform allows the same type of experiments to be conducted under both natural and artificial light irradiation. The available large range of complementary and highly sensitive instruments allows investigation of the radical chemistry, gas-phase processes and aerosol formation under realistic conditions. The characteristics of HELIOS will be presented as well as the first series of experimental results obtained so far.
Thermal dissociation and relaxation in vinyl fluoride, 1,1-difluoroethane and 1,3,5-triazine
NASA Astrophysics Data System (ADS)
Xu, Hui
This study reports measurements of the thermal dissociation of 1,1-difluoroethane in a shock tube. The experiments employ laser schlieren measurements of the rate of the dominant HF elimination using 10% 1,1-difluoroethane in Kr over 1500--2000 K and 43 < P < 424 torr. The product vinyl fluoride then dissociates, affecting the late density gradient. We include a laser schlieren study (1717--2332 K, 75 < P < 482 torr in 10% and 4% vinyl fluoride in Kr) of this dissociation. This latter work also includes a set of experiments using shock-tube time-of-flight mass spectrometry (4% vinyl fluoride in neon, 1500--1980 K, 500 < P < 1300 torr), which confirm the theoretical expectation that the only reaction in vinyl fluoride is HF elimination. The relaxation experiments (1--20% C2H3F in Kr, 415--1975 K, 5 < P < 50 torr, and 2% and 5% C2H4F2 in Kr, 700--1350 K, 6 < P < 22 torr) exhibit very rapid relaxation, and incubation delays should be negligible in dissociation. An RRKM model of dissociation in 1,1-difluoroethane based on a G3B3 calculation of the barrier and other properties fits the experiments but requires a very large
Pretest reference calculation for the Heated Axisymmetric Pillar (WIPP Room H in situ experiment)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Morgan, H.S.; Stone, C.M.
A pretest reference calculation for the Heated Axisymmetric Pillar or Room H experiment is presented in this report. The Heated Axisymmetric Pillar is one of several large scale in situ experiments currently under construction near Carlsbad, New Mexico, at the site of the Waste Isolation Pilot Plant (WIPP). This test is an intermediate step in validating numerical techniques for design and performance calculations for radioactive waste repositories in salt. The test consists of a cylindrically shaped pillar, centrally located in an annular drift, which is uniformly heated by blanket heaters. These heaters produce a thermal output of 135 W/m². This load will be supplied for a period of three years. Room H is heavily instrumented for monitoring both temperature increases due to the thermal loading and deformations due to creep of the salt. Data from the experiment are not available at the present time, but the measurements for Room H will eventually be compared to the calculation presented in this report to assess and improve thermal and mechanical modeling capabilities for the WIPP. The thermal/structural model used in the calculation represents the state of the art at the present time. A large number of plots are included since an appropriate result is required for every Room H gauge location. 56 refs., 97 figs., 4 tabs.
The AlpArray Seismic Network: A Large-Scale European Experiment to Image the Alpine Orogen
NASA Astrophysics Data System (ADS)
Hetényi, György; Molinari, Irene; Clinton, John; Bokelmann, Götz; Bondár, István; Crawford, Wayne C.; Dessa, Jean-Xavier; Doubre, Cécile; Friederich, Wolfgang; Fuchs, Florian; Giardini, Domenico; Gráczer, Zoltán; Handy, Mark R.; Herak, Marijan; Jia, Yan; Kissling, Edi; Kopp, Heidrun; Korn, Michael; Margheriti, Lucia; Meier, Thomas; Mucciarelli, Marco; Paul, Anne; Pesaresi, Damiano; Piromallo, Claudia; Plenefisch, Thomas; Plomerová, Jaroslava; Ritter, Joachim; Rümpker, Georg; Šipka, Vesna; Spallarossa, Daniele; Thomas, Christine; Tilmann, Frederik; Wassermann, Joachim; Weber, Michael; Wéber, Zoltán; Wesztergom, Viktor; Živčić, Mladen
2018-04-01
The AlpArray programme is a multinational, European consortium to advance our understanding of orogenesis and its relationship to mantle dynamics, plate reorganizations, surface processes and seismic hazard in the Alps-Apennines-Carpathians-Dinarides orogenic system. The AlpArray Seismic Network has been deployed with contributions from 36 institutions from 11 countries to map physical properties of the lithosphere and asthenosphere in 3D and thus to obtain new, high-resolution geophysical images of structures from the surface down to the base of the mantle transition zone. With over 600 broadband stations operated for 2 years, this seismic experiment is one of the largest simultaneously operated seismological networks in the academic domain, employing hexagonal coverage with station spacing at less than 52 km. This dense and regularly spaced experiment is made possible by the coordinated coeval deployment of temporary stations from numerous national pools, including ocean-bottom seismometers, which were funded by different national agencies. They combine with permanent networks, which also required the cooperation of many different operators. Together these stations ultimately fill coverage gaps. Following a short overview of previous large-scale seismological experiments in the Alpine region, we here present the goals, construction, deployment, characteristics and data management of the AlpArray Seismic Network, which will provide data that is expected to be unprecedented in quality to image the complex Alpine mountains at depth.
NASA Astrophysics Data System (ADS)
Chen, Y. L.
2015-12-01
Measurement technologies for river flow velocity are divided into intrusive and nonintrusive methods. Intrusive methods require in-field operations; the measuring process is time consuming and can damage both operator and instrument. Nonintrusive methods require fewer operators and reduce instrument damage because nothing is attached directly to the flow. Nonintrusive measurements may use radar or image velocimetry to measure velocities at the water surface. Image velocimetry, such as large scale particle image velocimetry (LSPIV), provides not only point velocities but the flow velocities over an area simultaneously. Flow properties over an area hold the promise of providing spatial information on flow fields. This study constructs a mobile system, UAV-LSPIV, which combines an unmanned aerial vehicle (UAV) with LSPIV to measure flows in the field. The mobile system consists of a six-rotor UAV helicopter, a Sony NEX-5T camera, a gimbal, an image transfer device, a ground station and a remote control device. The active gimbal helps keep the camera lens orthogonal to the water surface and reduces image distortion. The image transfer device allows the captured images to be monitored instantly. The operator controls the UAV with the remote control device through the ground station and can retrieve flight data such as flying height and the GPS coordinates of the UAV. The mobile system was then applied to field experiments. The deviation between velocities measured by UAV-LSPIV in the field experiments and by a handheld Acoustic Doppler Velocimeter (ADV) is under 8%. The results of the field experiments suggest that UAV-LSPIV can be effectively applied to surface flow studies.
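The core image-processing step behind LSPIV is cross-correlation of interrogation windows between successive frames. A minimal FFT-based sketch (not the authors' software) is given below, assuming pre-rectified grayscale frames, a known frame interval dt and a ground sampling distance m_per_px.

```python
import numpy as np

def window_velocity(frame_a, frame_b, dt, m_per_px):
    """Estimate one surface-velocity vector (u, v) in m/s from the same
    interrogation window in two consecutive frames, using FFT-based
    cross-correlation -- the basic operation behind (LS)PIV.

    frame_a, frame_b : 2-D grayscale arrays of identical shape
    dt               : time between frames (s); m_per_px : ground resolution
    """
    a = frame_a - frame_a.mean()
    b = frame_b - frame_b.mean()
    corr = np.fft.ifft2(np.fft.fft2(b) * np.conj(np.fft.fft2(a))).real
    corr = np.fft.fftshift(corr)
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    dy = peak[0] - corr.shape[0] // 2      # rows increase downwards
    dx = peak[1] - corr.shape[1] // 2
    return np.array([dx, dy]) * m_per_px / dt

# Tiling the rectified frames into windows and calling this per window yields
# the surface velocity field that is compared against the ADV point data.
```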
Pharmacist home visits: A 1-year experience from a community pharmacy.
Monte, Scott V; Passafiume, Sarah N; Kufel, Wesley D; Comerford, Patrick; Trzewieczynski, Dean P; Andrus, Kenneth; Brody, Peter M
2016-01-01
To provide experience on the methods and costs for delivering a large-scale community pharmacist home visit service. Independent urban community pharmacy, Buffalo, NY. Mobile Pharmacy Solutions provides traditional community pharmacy walk-in service and a suite of clinically oriented services, including outbound adherence calls linked to home delivery, payment planning, medication refill synchronization, adherence packaging, and pharmacist home visits. Pharmacist daily staffing included three dispensing pharmacists, one residency-trained pharmacist, and two postgraduate year 1 community pharmacy residents. A large-scale community pharmacy home visit service delivered over a 1-year period. Pharmacist time and cost to administer the home visit service as well as home visit request sources and description of patient demographics. A total of 172 visits were conducted (137 initial, 35 follow-up). Patients who received a home visit averaged 9.8 ± 5.2 medications and 3.0 ± 1.6 chronic disease states. On average, a home visit required 2.0 ± 0.8 hours, which included travel time. The percentages of visits completed by pharmacists and residents were 60% and 40%, respectively. The amounts of time to complete a visit were similar. Average home visit cost including pharmacist time and travel was $119 ($147 for a pharmacist, $77 for a resident). In this community pharmacy-based home visit service, costs are an important factor, with each pharmacist visit requiring 2 hours to complete. This experience provides a blueprint and real-world perspective for community pharmacies endeavoring to implement a home visit service and sets a foundation for future prospective trials to evaluate the impact of the service on important indicators of health and cost. Copyright © 2016 American Pharmacists Association®. Published by Elsevier Inc. All rights reserved.
Large number discrimination by mosquitofish.
Agrillo, Christian; Piffer, Laura; Bisazza, Angelo
2010-12-22
Recent studies have demonstrated that fish display rudimentary numerical abilities similar to those observed in mammals and birds. The mechanisms underlying the discrimination of small quantities (<4) were recently investigated while, to date, no study has examined the discrimination of large numerosities in fish. Subjects were trained to discriminate between two sets of small geometric figures using social reinforcement. In the first experiment mosquitofish were required to discriminate 4 from 8 objects with or without experimental control of the continuous variables that co-vary with number (area, space, density, total luminance). Results showed that fish can use the sole numerical information to compare quantities but that they preferentially use cumulative surface area as a proxy of the number when this information is available. A second experiment investigated the influence of the total number of elements to discriminate large quantities. Fish proved to be able to discriminate up to 100 vs. 200 objects, without showing any significant decrease in accuracy compared with the 4 vs. 8 discrimination. The third experiment investigated the influence of the ratio between the numerosities. Performance was found to decrease when decreasing the numerical distance. Fish were able to discriminate numbers when ratios were 1:2 or 2:3 but not when the ratio was 3:4. The performance of a sample of undergraduate students, tested non-verbally using the same sets of stimuli, largely overlapped that of fish. Fish are able to use pure numerical information when discriminating between quantities larger than 4 units. As observed in human and non-human primates, the numerical system of fish appears to have virtually no upper limit while the numerical ratio has a clear effect on performance. These similarities further reinforce the view of a common origin of non-verbal numerical systems in all vertebrates.
Social Experiments in the Mesoscale: Humans Playing a Spatial Prisoner's Dilemma
Grujić, Jelena; Fosco, Constanza; Araujo, Lourdes; Cuesta, José A.; Sánchez, Angel
2010-01-01
Background The evolutionary origin of cooperation among unrelated individuals remains a key unsolved issue across several disciplines. Prominent among the several mechanisms proposed to explain how cooperation can emerge is the existence of a population structure that determines the interactions among individuals. Many models have explored analytically and by simulation the effects of such a structure, particularly in the framework of the Prisoner's Dilemma, but the results of these models largely depend on details such as the type of spatial structure or the evolutionary dynamics. Therefore, experimental work suitably designed to address this question is needed to probe these issues. Methods and Findings We have designed an experiment to test the emergence of cooperation when humans play Prisoner's Dilemma on a network whose size is comparable to that of simulations. We find that the cooperation level declines to an asymptotic state with low but nonzero cooperation. Regarding players' behavior, we observe that the population is heterogeneous, consisting of a high percentage of defectors, a smaller one of cooperators, and a large group that shares features of the conditional cooperators of public goods games. We propose an agent-based model based on the coexistence of these different strategies that is in good agreement with all the experimental observations. Conclusions In our large experimental setup, cooperation was not promoted by the existence of a lattice beyond a residual level (around 20%) typical of public goods experiments. Our findings also indicate that both heterogeneity and a “moody” conditional cooperation strategy, in which the probability of cooperating also depends on the player's previous action, are required to understand the outcome of the experiment. These results could impact the way game theory on graphs is used to model human interactions in structured groups. PMID:21103058
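A toy version of the kind of agent-based lattice model the authors describe might look as follows. The update rule for the "moody" conditional cooperators and all probabilities are illustrative assumptions, not the fitted model of the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def step(actions, kinds, pc_after_c=0.8, pc_after_d=0.2):
    """One synchronous round on a square lattice with periodic boundaries.

    kinds : 0 = always defect, 1 = always cooperate,
            2 = "moody" conditional cooperator whose cooperation probability
                depends on its own previous action and on how many of its
                four neighbours cooperated last round.
    actions : 0/1 array of last-round actions (1 = cooperate).
    """
    nbr_coop = sum(np.roll(actions, s, axis=ax) for ax in (0, 1) for s in (-1, 1))
    p = np.where(actions == 1, pc_after_c, pc_after_d)   # mood from own last move
    p = np.clip(p * (0.5 + nbr_coop / 4.0), 0.0, 1.0)    # neighbourhood influence
    return np.where(kinds == 0, 0,
           np.where(kinds == 1, 1,
                    (rng.random(actions.shape) < p).astype(int)))

# lattice = rng.integers(0, 2, (50, 50)); kinds = rng.integers(0, 3, (50, 50))
# for _ in range(60):
#     lattice = step(lattice, kinds)
# print("cooperation level:", lattice.mean())
```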
Uncertainty and operational considerations in mass prophylaxis workforce planning.
Hupert, Nathaniel; Xiong, Wei; King, Kathleen; Castorena, Michelle; Hawkins, Caitlin; Wu, Cindie; Muckstadt, John A
2009-12-01
The public health response to an influenza pandemic or other large-scale health emergency may include mass prophylaxis using multiple points of dispensing (PODs) to deliver countermeasures rapidly to affected populations. Computer models created to date to determine "optimal" staffing levels at PODs typically assume stable patient demand for service. The authors investigated POD function under dynamic and uncertain operational environments. The authors constructed a Monte Carlo simulation model of mass prophylaxis (the Dynamic POD Simulator, or D-PODS) to assess the consequences of nonstationary patient arrival patterns on POD function under a variety of POD layouts and staffing plans. The performance of a standard POD layout is compared under steady-state and variable patient arrival rates that mimic real-life variation in patient demand. To achieve similar performance, PODs functioning under nonstationary patient arrival rates require higher staffing levels than would be predicted using the assumption of stationary arrival rates. Furthermore, PODs may develop severe bottlenecks unless staffing levels vary over time to meet changing patient arrival patterns. Efficient POD networks therefore require command and control systems capable of dynamically adjusting intra- and inter-POD staff levels to meet demand. In addition, under real-world operating conditions of heightened uncertainty, fewer large PODs will require a smaller total staff than many small PODs to achieve comparable performance. Modeling environments that capture the effects of fundamental uncertainties in public health disasters are essential for the realistic evaluation of response mechanisms and policies. D-PODS quantifies POD operational efficiency under more realistic conditions than have been modeled previously. The authors' experiments demonstrate that effective POD staffing plans must be responsive to variation and uncertainty in POD arrival patterns. These experiments highlight the need for command and control systems that can manage emergency response successfully.
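The effect of nonstationary arrivals can be reproduced with a very small Monte Carlo queue model. The sketch below (not D-PODS itself) compares a mid-shift surge against a flat arrival profile for the same staffing level; all rates and service times are chosen for illustration only.

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate_pod(arrival_rate, n_staff, service_min=5.0, horizon_min=720):
    """Minute-by-minute Monte Carlo of a single dispensing queue.

    arrival_rate(t) : expected arrivals in minute t (may vary with t)
    n_staff         : number of parallel dispensing staff
    Returns (mean queue length, maximum queue length) over the horizon.
    """
    queue, busy_until, trace = 0, np.zeros(n_staff), []
    for t in range(horizon_min):
        queue += rng.poisson(arrival_rate(t))          # new arrivals
        free = np.flatnonzero(busy_until <= t)         # idle staff
        start = min(queue, free.size)
        busy_until[free[:start]] = t + rng.exponential(service_min, start)
        queue -= start
        trace.append(queue)
    trace = np.array(trace)
    return trace.mean(), trace.max()

# surge = lambda t: 1.0 + 4.0 * np.exp(-((t - 360) / 60.0) ** 2)  # mid-shift peak
# flat = lambda t: 2.0                                            # steady demand
# print(simulate_pod(surge, 12), simulate_pod(flat, 12))
```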
A two-dimensional contaminant fate and transport model for the lower Athabasca River
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brownlee, B.G.; Booty, W.G.; MacInnis, G.A.
1995-12-31
The lower Athabasca River flows through the Athabasca Oil Sands deposits in northeastern Alberta. Two oil sands mining/extraction/upgrading plants operate near the river downstream from Fort McMurray. Process water is stored in large tailings ponds. One of the plants (Suncor) has a licensed discharge (mostly cooling water) to the river. This effluent contains low concentrations (≤ 1 µg/L) of various polycyclic aromatic compounds (PACs). Several tributary streams which cut through oil sands deposits are potential sources of hydrocarbons to the Athabasca. The authors have found that river suspended sediments give positive responses in a number of toxicity tests, using both direct and indirect (organic-solvent extract) methods. Several environmental impact assessments are required as a result of industry expansion. To provide an assessment tool for PACs, the authors are developing a two-dimensional contaminant fate and transport model for a 120-km portion of the Athabasca River downstream from Fort McMurray. Hydraulic calibration of the model was done using sodium and chloride from a major tributary as tracers. Two groups of compounds are being modelled: (1) PACs from the Suncor effluent, and (2) PACs from natural/background sources. PAC concentrations in the river were typically < 1 ng/L, requiring large volume extractions and highly sensitive analysis. Processes such as sediment-water partitioning and biodegradation are being estimated from field experiments using river water and suspended sediment. Photodegradation is likely unimportant in this turbid river due to low penetration of 280--350 nm light. Initially, volatilization will be modelled using estimated or literature values for Henry's constants, but may require more refined estimates from laboratory experiments.
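As a much-simplified illustration of the loss processes being parameterized, the sketch below computes a steady-state one-dimensional plug-flow concentration profile with lumped first-order biodegradation, volatilization and settling. The rate constants and velocity are placeholders, and the model described in the abstract is two-dimensional and far more detailed.

```python
import numpy as np

def downstream_profile(c0_ng_L, velocity_m_s, k_bio=0.05, k_vol=0.01,
                       k_settle=0.02, reach_km=120.0, n=240):
    """Steady-state 1-D plug-flow profile of a PAC downstream of a discharge,
    lumping biodegradation, volatilization and sediment settling into
    first-order rate constants (per day). Purely illustrative values.
    """
    x = np.linspace(0.0, reach_km, n)                     # km downstream
    travel_days = (x * 1000.0) / velocity_m_s / 86400.0   # travel time
    k_total = k_bio + k_vol + k_settle                    # 1/day
    return x, c0_ng_L * np.exp(-k_total * travel_days)

# x, c = downstream_profile(c0_ng_L=1.0, velocity_m_s=0.8)
```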
PharmTeX: a LaTeX-Based Open-Source Platform for Automated Reporting Workflow.
Rasmussen, Christian Hove; Smith, Mike K; Ito, Kaori; Sundararajan, Vijayakumar; Magnusson, Mats O; Niclas Jonsson, E; Fostvedt, Luke; Burger, Paula; McFadyen, Lynn; Tensfeldt, Thomas G; Nicholas, Timothy
2018-03-16
Every year, the pharmaceutical industry generates a large number of scientific reports related to drug research, development, and regulatory submissions. Many of these reports are created using text processing tools such as Microsoft Word. Given the large number of figures, tables, references, and other elements, this is often a tedious task involving hours of copying and pasting and substantial efforts in quality control (QC). In the present article, we present the LaTeX-based open-source reporting platform, PharmTeX, a community-based effort to make reporting simple, reproducible, and user-friendly. The PharmTeX creators put a substantial effort into simplifying the sometimes complex elements of LaTeX into user-friendly functions that rely on advanced LaTeX and Perl code running in the background. Using this setup makes LaTeX much more accessible for users with no prior LaTeX experience. A software collection was compiled for users not wanting to manually install the required software components. The PharmTeX templates allow tables to be included directly from mathematical software output, as well as figures in several formats. Code listings can be included directly from source. No previous experience and only a few hours of training are required to start writing reports using PharmTeX. PharmTeX significantly reduces the time required for creating a scientific report fully compliant with regulatory and industry expectations. QC is made much simpler, since there is a direct link between analysis output and report input. PharmTeX makes available to report authors the strengths of LaTeX document processing without the need for extensive training.
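The "direct link between analysis output and report input" can be illustrated with a few lines that write a LaTeX table fragment from analysis results for inclusion in the report source. This is generic Python and LaTeX, not the PharmTeX API, and the file and parameter names are invented.

```python
from pathlib import Path

def write_latex_table(rows, path="output/pk_summary.tex"):
    """Write analysis results as a LaTeX tabular fragment that the report
    pulls in with \\input{output/pk_summary}, so the numbers in the document
    are regenerated from the analysis rather than copied by hand."""
    lines = [r"\begin{tabular}{lrr}",
             r"Parameter & Estimate & RSE (\%) \\ \hline"]
    lines += [rf"{name} & {est:.3g} & {rse:.1f} \\" for name, est, rse in rows]
    lines.append(r"\end{tabular}")
    Path(path).parent.mkdir(parents=True, exist_ok=True)
    Path(path).write_text("\n".join(lines))

# Hypothetical output from a pharmacokinetic fit:
# write_latex_table([("CL (L/h)", 3.42, 5.1), ("V (L)", 48.7, 7.9)])
```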
Houdelet, Marcel; Galinski, Anna; Holland, Tanja; Wenzel, Kathrin; Schillberg, Stefan; Buyel, Johannes Felix
2017-04-01
Transient expression systems allow the rapid production of recombinant proteins in plants. Such systems can be scaled up to several hundred kilograms of biomass, making them suitable for the production of pharmaceutical proteins required at short notice, such as emergency vaccines. However, large-scale transient expression requires the production of recombinant Agrobacterium tumefaciens strains with the capacity for efficient gene transfer to plant cells. The complex media often used for the cultivation of this species typically include animal-derived ingredients that can contain human pathogens, thus conflicting with the requirements of good manufacturing practice (GMP). We replaced all the animal-derived components in yeast extract broth (YEB) cultivation medium with soybean peptone, and then used a design-of-experiments approach to optimize the medium composition, increasing the biomass yield while maintaining high levels of transient expression in subsequent infiltration experiments. The resulting plant peptone Agrobacterium medium (PAM) achieved a two-fold increase in OD 600 compared to YEB medium during a 4-L batch fermentation lasting 18 h. Furthermore, the yields of the monoclonal antibody 2G12 and the fluorescent protein DsRed were maintained when the cells were cultivated in PAM rather than YEB. We have thus demonstrated a simple, efficient and scalable method for medium optimization that reduces process time and costs. The final optimized medium for the cultivation of A. tumefaciens completely lacks animal-derived components, thus facilitating the GMP-compliant large-scale transient expression of recombinant proteins in plants. © 2017 The Authors. Biotechnology Journal published by WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
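A design-of-experiments screening step of the kind mentioned can be sketched as a two-level full-factorial design with a least-squares estimate of main effects. The factors, responses and numbers below are entirely hypothetical.

```python
import itertools
import numpy as np

def two_level_design(n_factors):
    """Coded (-1/+1) two-level full-factorial design matrix."""
    return np.array(list(itertools.product([-1, 1], repeat=n_factors)))

# Hypothetical factors: soy peptone, sucrose and MgSO4 concentration
X = two_level_design(3)
# Hypothetical OD600 readings for the eight runs (made-up numbers)
y = np.array([1.1, 1.4, 1.3, 1.9, 1.2, 1.6, 1.5, 2.3])

# Least-squares fit of the mean plus one main effect per factor
A = np.column_stack([np.ones(len(X)), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
print(dict(zip(["mean", "peptone", "sucrose", "MgSO4"], coef.round(3))))
```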
Test of the CLAS12 RICH large-scale prototype in the direct proximity focusing configuration
Anefalos Pereira, S.; Baltzell, N.; Barion, L.; ...
2016-02-11
A large area ring-imaging Cherenkov detector has been designed to provide clean hadron identification capability in the momentum range from 3 GeV/c up to 8 GeV/c for the CLAS12 experiments at the upgraded 12 GeV continuous electron beam accelerator facility of Jefferson Laboratory. The adopted solution foresees a novel hybrid optics design based on aerogel radiator, composite mirrors and highly packed, highly segmented photon detectors. Cherenkov light will either be imaged directly (forward tracks) or after two mirror reflections (large angle tracks). We report here the results of the tests of a large scale prototype of the RICH detector performed with the hadron beam of the CERN T9 experimental hall for the direct detection configuration. The tests demonstrated that the proposed design provides the required pion-to-kaon rejection factor of 1:500 in the whole momentum range.
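The physics behind the pion-to-kaon separation can be illustrated with the Cherenkov relation cos θ_c = 1/(nβ). The short sketch below evaluates it for pions and kaons across the stated momentum range, assuming a typical aerogel index of n = 1.05 rather than the exact detector parameters.

```python
import math

def cherenkov_angle_mrad(p_gev, mass_gev, n=1.05):
    """Cherenkov emission angle (mrad) for momentum p in a radiator of
    refractive index n; returns None below the radiation threshold."""
    beta = p_gev / math.hypot(p_gev, mass_gev)
    if n * beta <= 1.0:
        return None
    return 1000.0 * math.acos(1.0 / (n * beta))

M_PI, M_K = 0.1396, 0.4937  # GeV/c^2
for p in (3.0, 6.0, 8.0):
    print(p, cherenkov_angle_mrad(p, M_PI), cherenkov_angle_mrad(p, M_K))
# The pion-kaon angle difference shrinks toward 8 GeV/c, which is why high
# photon-detector segmentation is needed at the upper end of the range.
```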
Numerical modelling on stabilizing large magnetic island by RF current for disruption avoidance
NASA Astrophysics Data System (ADS)
Wang, Xiaojing; Yu, Qingquan; Zhang, Xiaodong; Zhu, Sizheng; Wang, Xiaoguang; Wu, Bin
2018-01-01
Numerical modelling of tearing mode stabilization by RF current driven by electron cyclotron current drive (ECCD) has been carried out for the purpose of disruption avoidance, focusing on stabilizing a magnetic island that can grow to a large width and might therefore cause plasma disruption. When the island has become large, a threshold in driven current for fully stabilizing the mode is found; below this threshold, the island width only slightly decreases. The island's O-point shifts radially towards the magnetic axis as the mode grows; as a result, applying ECCD at the minor radius of the island's O-point has a stronger effect on stabilizing a large island than applying it at the original equilibrium rational surface. During the island growth, the required driven current for mode stabilization increases with the island's width, indicating that it is more effective to apply ECCD as early as possible for disruption avoidance, as observed in experiments. The numerical results have been compared with those obtained from the modified Rutherford equation.
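A cartoon of the competition the modified Rutherford equation describes (a destabilizing drive toward a saturated island width versus an ECCD term that loses efficiency once the island outgrows the current deposition width) is sketched below. The functional forms and coefficients are illustrative only and are not taken from the paper; they merely reproduce the qualitative threshold behaviour.

```python
def island_width(i_rf, w0=0.12, w_sat=0.12, w_dep=0.02,
                 delta0=1.0, dt=1e-3, steps=20000):
    """Integrate a schematic modified-Rutherford-type equation,

        dw/dt = delta0*(1 - w/w_sat) - i_rf * w_dep / (w_dep**2 + w**2),

    where the first term drives the island toward its saturated width w_sat
    and the driven-current term is stabilizing but weakens once w exceeds
    the deposition width w_dep. All coefficients are illustrative."""
    w = w0
    for _ in range(steps):
        dwdt = delta0 * (1.0 - w / w_sat) - i_rf * w_dep / (w_dep**2 + w**2)
        w = max(w + dt * dwdt, 0.0)
        if w == 0.0:
            break
    return w

for i_rf in (0.05, 0.10, 0.20):   # weak, intermediate and above-threshold drive
    print(i_rf, round(island_width(i_rf), 3))
# Weak drive barely reduces the saturated island; above the threshold the
# island collapses entirely, mirroring the behaviour described in the text.
```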
Development of high purity large forgings for nuclear power plants
NASA Astrophysics Data System (ADS)
Tanaka, Yasuhiko; Sato, Ikuo
2011-10-01
The recent increase in the size of energy plants has been supported by the development of manufacturing technology for high purity large forgings for the key components of the plant. To assure the reliability and performance of the large forgings, refining technology to make high purity steels, casting technology for gigantic ingots, forging technology to homogenize the material and consolidate porosity are essential, together with the required heat treatment and machining technologies. To meet these needs, the double degassing method to reduce impurities, multi-pouring methods to cast the gigantic ingots, vacuum carbon deoxidization, the warm forging process and related technologies have been developed and further improved. Furthermore, melting facilities including vacuum induction melting and electro slag re-melting furnaces have been installed. By using these technologies and equipment, large forgings have been manufactured and shipped to customers. These technologies have also been applied to the manufacture of austenitic steel vessel components of the fast breeder reactors and components for fusion experiments.
Avionics test bed development plan
NASA Technical Reports Server (NTRS)
Harris, L. H.; Parks, J. M.; Murdock, C. R.
1981-01-01
A development plan for a proposed avionics test bed facility for the early investigation and evaluation of new concepts for the control of large space structures, orbiter attached flex body experiments, and orbiter enhancements is presented. A distributed data processing facility that utilizes the current laboratory resources for the test bed development is outlined. Future studies required for implementation, the management system for project control, and the baseline system configuration are defined. A background analysis of the specific hardware system for the preliminary baseline avionics test bed system is included.
Teaching and physics education research: bridging the gap.
Fraser, James M; Timan, Anneke L; Miller, Kelly; Dowd, Jason E; Tucker, Laura; Mazur, Eric
2014-03-01
Physics faculty, experts in evidence-based research, often rely on anecdotal experience to guide their teaching practices. Adoption of research-based instructional strategies is surprisingly low, despite the large body of physics education research (PER) and strong dissemination effort of PER researchers and innovators. Evidence-based PER has validated specific non-traditional teaching practices, but many faculty raise valuable concerns toward their applicability. We address these concerns and identify future studies required to overcome the gap between research and practice.
Control of an Experiment to Measure Acoustic Noise in the Space Shuttle
1989-06-01
validity of the analysis of acoustic waves whose collection was itself triggered by the occurrence of those same waves. As a result of all these factors...record the ambient noise, we will avoid the problem mentioned in Chapter 1, Section A. Get Away Special (GAS) on page 1. The data collected by this...few places where it was required, we used it. Along with the large collection of C language subroutines of general applicability, the routines we have
Foam structure, rheology and coarsening : the shape, feel and aging of random soap froth.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Reinelt, Douglas A.; van Swol, Frank B.; Hilgenfeldt, Sascha
2010-05-01
Simulations are in excellent agreement with experiments: structure - Matzke; shear modulus - Princen and Kiss: E = 3.30 σ/R₃₂ = 5.32/(1 + p) σ/(V)^(1/2), G ≈ 0.155 E = 0.512 σ/R₃₂. IPP theory captures the dependence of cell geometry on V and F. Future challenges are: simulating simple shearing flow is very expensive because of frequent topological transitions; random wet foams require very large simulations.
Mira: Argonne's 10-petaflops supercomputer
DOE Office of Scientific and Technical Information (OSTI.GOV)
Papka, Michael; Coghlan, Susan; Isaacs, Eric
2013-07-03
Mira, Argonne's petascale IBM Blue Gene/Q system, ushers in a new era of scientific supercomputing at the Argonne Leadership Computing Facility. An engineering marvel, the 10-petaflops supercomputer is capable of carrying out 10 quadrillion calculations per second. As a machine for open science, any researcher with a question that requires large-scale computing resources can submit a proposal for time on Mira, typically in allocations of millions of core-hours, to run programs for their experiments. This adds up to billions of hours of computing time per year.
Edge enhancement of color images using a digital micromirror device.
Di Martino, J Matías; Flores, Jorge L; Ayubi, Gastón A; Alonso, Julia R; Fernández, Ariel; Ferrari, José A
2012-06-01
A method for orientation-selective enhancement of edges in color images is proposed. The method utilizes the capacity of digital micromirror devices to generate a positive and a negative color replica of the image used as input. When both images are slightly displaced and imaged together, one obtains an image with enhanced edges. The proposed technique does not require a coherent light source or precise alignment. The proposed method could be potentially useful for processing large image sequences in real time. Validation experiments are presented.
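Numerically, the described superposition of a positive replica and a slightly displaced negative replica amounts to adding a shifted inverted copy of the image, which approximates a directional derivative. A minimal grayscale sketch (applied per channel for color images) is given below; it is not the authors' optical implementation.

```python
import numpy as np

def edge_enhance(img, shift=1, axis=1):
    """Superpose a positive replica and a slightly displaced negative replica
    of an image. The result is a constant background plus a directional
    difference, i.e. orientation-selective edge enhancement along `axis`."""
    negative = img.max() - img                     # negative replica
    displaced = np.roll(negative, shift, axis=axis)
    return img + displaced                         # incoherent superposition

# edges = edge_enhance(gray_image, shift=2, axis=0)  # vertical displacement
#                                                    # highlights horizontal edges
```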
Project Echo: Antenna Steering System
NASA Technical Reports Server (NTRS)
Klahn, R.; Norton, J. A.; Githens, J. A.
1961-01-01
The Project Echo communications experiment employed large, steerable,transmitting and receiving antennas at the ground terminals. It was necessary that these highly directional antennas be continuously and accurately pointed at the passing satellite. This paper describes a new type of special purpose data converter for directing narrow-beam communication antennas on the basis of predicted information. The system is capable of converting digital input data into real-time analog voltage commands with a dynamic accuracy of +/- 0.05 degree, which meets the requirements of the present antennas.
Slewing control experiment for a flexible panel
NASA Technical Reports Server (NTRS)
Juang, Jer-Nan
1987-01-01
Technology areas are identified in which better analytical and/or experimental methods are needed to adequately and accurately control the dynamic responses of multibody space platforms such as the space station. A generic space station solar panel is used to experimentally evaluate current control technologies. Active suppression of solar panel vibrations induced by large angle maneuvers is studied with a torque actuator at the root of the solar panel. These active suppression tests will identify the hardware requirements and adequacy of various controller designs.
Probing dark energy with atom interferometry
NASA Astrophysics Data System (ADS)
Burrage, Clare; Copeland, Edmund J.; Hinds, E. A.
2015-03-01
Theories of dark energy require a screening mechanism to explain why the associated scalar fields do not mediate observable long range fifth forces. The archetype of this is the chameleon field. Here we show that individual atoms are too small to screen the chameleon field inside a large high-vacuum chamber, and therefore can detect the field with high sensitivity. We derive new limits on the chameleon parameters from existing experiments, and show that most of the remaining chameleon parameter space is readily accessible using atom interferometry.
Automatic differential analysis of NMR experiments in complex samples.
Margueritte, Laure; Markov, Petar; Chiron, Lionel; Starck, Jean-Philippe; Vonthron-Sénécheau, Catherine; Bourjot, Mélanie; Delsuc, Marc-André
2018-06-01
Liquid state nuclear magnetic resonance (NMR) is a powerful tool for the analysis of complex mixtures of unknown molecules. This capacity has been used in many analytical approaches: metabolomics, identification of active compounds in natural extracts, and characterization of species. Such studies require the acquisition of many diverse NMR measurements on series of samples. Although acquisition can easily be performed automatically, the number of NMR experiments involved in these studies increases very rapidly, and this data avalanche requires resorting to automatic processing and analysis. We present here a program that allows the autonomous, unsupervised processing of a large corpus of 1D, 2D, and diffusion-ordered spectroscopy experiments from a series of samples acquired in different conditions. The program provides all the signal processing steps, as well as peak-picking and bucketing of 1D and 2D spectra; the program and its components are fully available. In an experiment mimicking the search for a bioactive species in a natural extract, we use it for the automatic detection of small amounts of artemisinin added to a series of plant extracts and for the generation of the spectral fingerprint of this molecule. This program, called Plasmodesma, is a novel tool that should be useful for deciphering complex mixtures, particularly in the discovery of biologically active natural products from plant extracts, but also in drug discovery or metabolomics studies. Copyright © 2017 John Wiley & Sons, Ltd.
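One of the reduction steps mentioned, bucketing of 1D spectra, can be sketched in a few lines. This is a generic equal-width bucketing routine with invented parameter values, not the Plasmodesma implementation.

```python
import numpy as np

def bucket_spectrum(ppm, intensity, bucket_width=0.04, ppm_range=(0.5, 10.0)):
    """Reduce a 1D NMR spectrum to equal-width buckets (integrated intensity
    per ppm interval), a standard reduction before comparing spectra."""
    edges = np.arange(ppm_range[0], ppm_range[1] + bucket_width, bucket_width)
    idx = np.digitize(ppm, edges)
    buckets = np.array([intensity[idx == k].sum() for k in range(1, len(edges))])
    centers = edges[:-1] + bucket_width / 2
    return centers, buckets

# Crude differential analysis of a spiked extract against a control:
# centers, b_spiked = bucket_spectrum(ppm, spec_spiked)
# _, b_ctrl = bucket_spectrum(ppm, spec_control)
# score = (b_spiked - b_ctrl) / (b_ctrl.std() + 1e-12)
```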
Networking for large-scale science: infrastructure, provisioning, transport and application mapping
NASA Astrophysics Data System (ADS)
Rao, Nageswara S.; Carter, Steven M.; Wu, Qishi; Wing, William R.; Zhu, Mengxia; Mezzacappa, Anthony; Veeraraghavan, Malathi; Blondin, John M.
2005-01-01
Large-scale science computations and experiments require unprecedented network capabilities in the form of large bandwidth and dynamically stable connections to support data transfers, interactive visualizations, and monitoring and steering operations. A number of component technologies dealing with the infrastructure, provisioning, transport and application mappings must be developed and/or optimized to achieve these capabilities. We present a brief account of the following technologies that contribute toward achieving these network capabilities: (a) DOE UltraScienceNet and NSF CHEETAH network testbeds that provide on-demand and scheduled dedicated network connections; (b) experimental results on transport protocols that achieve close to 100% utilization on dedicated 1Gbps wide-area channels; (c) a scheme for optimally mapping a visualization pipeline onto a network to minimize the end-to-end delays; and (d) interconnect configuration and protocols that provide multiple Gbps flows from Cray X1 to external hosts.
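Item (c) can be phrased as a small dynamic programme: place pipeline stages on an ordered chain of hosts so that the sum of computation and transfer delays is minimal. The sketch below is an illustrative formulation under that chain assumption, not the published scheme.

```python
from functools import lru_cache
import math

def map_pipeline(comp, link, data):
    """Map visualization-pipeline stages onto an ordered chain of hosts so
    that total end-to-end delay (computation + transfer) is minimized.

    comp[i][j] : time of stage i on host j
    link[j]    : per-unit transfer delay of the link between hosts j and j+1
    data[i]    : data volume sent from stage i to stage i+1
    Stages are placed in non-decreasing host order, so a simple dynamic
    programme over (stage, host) suffices. Purely illustrative."""
    n_stage, n_host = len(comp), len(comp[0])

    @lru_cache(maxsize=None)
    def best(i, j):  # minimal delay of stages i.. given stage i runs on host j
        if i == n_stage - 1:
            return comp[i][j]
        out = math.inf
        for j2 in range(j, n_host):
            hop = sum(link[j:j2]) * data[i]      # ship stage i output to host j2
            out = min(out, comp[i][j] + hop + best(i + 1, j2))
        return out

    return min(best(0, j) for j in range(n_host))

# print(map_pipeline(comp=[[4, 2, 9], [6, 3, 1], [5, 5, 2]],
#                    link=[1, 2], data=[3, 1]))
```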
From rotating atomic rings to quantum Hall states.
Roncaglia, M; Rizzi, M; Dalibard, J
2011-01-01
Considerable efforts are currently devoted to the preparation of ultracold neutral atoms in the strongly correlated quantum Hall regime. However, the necessary angular momentum is very large and in experiments with rotating traps this means spinning frequencies extremely near to the deconfinement limit; consequently, the required control on parameters turns out to be too stringent. Here we propose instead to follow a dynamic path starting from the gas initially confined in a rotating ring. The large moment of inertia of the ring-shaped fluid facilitates the access to large angular momenta, corresponding to giant vortex states. The trapping potential is then adiabatically transformed into a harmonic confinement, which brings the interacting atomic gas in the desired quantum-Hall regime. We provide numerical evidence that for a broad range of initial angular frequencies, the giant-vortex state is adiabatically connected to the bosonic ν = 1/2 Laughlin state.
Lampa, Samuel; Alvarsson, Jonathan; Spjuth, Ola
2016-01-01
Predictive modelling in drug discovery is challenging to automate as it often contains multiple analysis steps and might involve cross-validation and parameter tuning that create complex dependencies between tasks. With large-scale data or when using computationally demanding modelling methods, e-infrastructures such as high-performance or cloud computing are required, adding to the existing challenges of fault-tolerant automation. Workflow management systems can aid in many of these challenges, but the currently available systems are lacking in the functionality needed to enable agile and flexible predictive modelling. We here present an approach inspired by elements of the flow-based programming paradigm, implemented as an extension of the Luigi system which we name SciLuigi. We also discuss the experiences from using the approach when modelling a large set of biochemical interactions using a shared computer cluster.
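For readers unfamiliar with Luigi, the dependency-driven style it provides looks roughly like the sketch below. This uses plain Luigi rather than the SciLuigi extension, and the task and file names are invented.

```python
import luigi

class ComputeFeatures(luigi.Task):
    """Produce a feature file for a dataset; reruns only if the output is missing."""
    dataset = luigi.Parameter()

    def output(self):
        return luigi.LocalTarget(f"features/{self.dataset}.csv")

    def run(self):
        with self.output().open("w") as f:
            f.write("f1,f2\n0.1,0.2\n")

class TrainModel(luigi.Task):
    """Depends on the feature file and writes a model file, so the scheduler
    tracks the dependency chain and supports fault-tolerant re-execution."""
    dataset = luigi.Parameter()

    def requires(self):
        return ComputeFeatures(dataset=self.dataset)

    def output(self):
        return luigi.LocalTarget(f"models/{self.dataset}.model")

    def run(self):
        with self.input().open() as fin, self.output().open("w") as fout:
            fout.write(f"model trained on {len(fin.readlines())} rows\n")

# luigi.build([TrainModel(dataset="chembl_subset")], local_scheduler=True)
```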
A table top experiment to study plasma confined by a dipole magnet
NASA Astrophysics Data System (ADS)
Bhattacharjee, Sudeep; Baitha, Anuj Ram
2016-10-01
There has been a long quest to understand charged particle generation, confinement and the underlying complex processes in a plasma confined by a dipole magnet. Our earth's magnetosphere is an example of such a naturally occurring system. A few laboratory experiments have been designed for such investigations, such as the Levitated Dipole Experiment (LDX) at MIT, the Terrella experiment at Columbia University, and the Ring Trap-1 (RT-1) experiment at the University of Tokyo. However, these are large scale experiments, where the dipole magnetic field is created with superconducting coils, thereby necessitating power supplies and stringent cryogenic requirements. We report a table top experiment to investigate important physical processes in a dipole plasma. A strong cylindrical permanent magnet is employed to create the dipole field inside a vacuum chamber. The magnet is suspended and cooled by circulating chilled water. The plasma is heated by electromagnetic waves of 2.45 GHz and a second frequency in the range 6 - 11 GHz. Some of the initial results of measurements and numerical simulation of the magnetic field, visual observations of the first plasma, and spatial measurements of plasma parameters will be presented.
W.K.H. Panofsky Prize: The Long Journey to the Higgs Boson: CMS
NASA Astrophysics Data System (ADS)
Virdee, Tejinder
2017-01-01
There has been a rich harvest of physics from the experiments at the Large Hadron Collider (LHC). In July 2012, the ground-breaking discovery of the Higgs boson was made by the ATLAS and CMS experiments. This boson is a long-sought particle expected from the mechanism for spontaneous symmetry breaking in the electro-weak sector that provides an explanation of how elementary particles acquire mass. The discovery required experiments of unprecedented capability and complexity. This talk, complementing that of Peter Jenni, will trace the background to the search for the Higgs boson at the LHC, the conception, the construction and the operation of the CMS experiment, and its subsequent discovery of the boson. The SM is considered to be a low energy manifestation of a more complete theory - physics beyond the SM is therefore widely anticipated. Selected CMS results will be presented from the search for physics beyond the SM from the 13 TeV Run-2 at the LHC.
Sodium Handling Technology and Engineering Design of the Madison Dynamo Experiment.
NASA Astrophysics Data System (ADS)
Kendrick, R.; Forest, C. B.; O'Connell, R.; Wright, A.; Robinson, K.
1998-11-01
A new liquid metal MHD experiment is being constructed at the University of Wisconsin to test several key predictions of dynamo theory: magnetic instabilities driven by sheared flow, the effects of turbulence on current generation, and the back-reaction of the self-generated magnetic field on the fluid motion which brings saturation. This presentation describes the engineering design of the experiment, which is a 0.5 m radius spherical vessel, filled with liquid sodium at 150 degrees Celsius. The experiment is designed to achieve a magnetic Reynolds number in excess of 100, which requires approximately 80 hp of mechanical drive, producing flow velocities in sodium of 15 m/s through impellers. Handling liquid sodium offers a number of technical challenges, but routine techniques have been developed over the past several decades for safely handling large quantities for the fast breeder reactor. The handling strategy is discussed, technical details concerning seals and pressurization are presented, and safety elements are highlighted.
Initial operation with sodium in the Madison Dynamo Experiment.
NASA Astrophysics Data System (ADS)
Kendrick, R.; Spence, Ej; Forest, C. B.; O'Connell, R.; Nornberg, Md; Canary, Hw; Wright, A.; Robinson, K.
1999-11-01
A new liquid metal MHD experiment has been constructed at the University of Wisconsin to test several key predictions of dynamo theory: magnetic instabilities driven by sheared flow, the effects of turbulence on current generation, and the back-reaction of the self-generated magnetic field on the fluid motion which brings saturation. This presentation describes the engineering design of the experiment, which is a 0.5 m radius spherical vessel, filled with liquid sodium at 150 °C. The experiment is designed to achieve a magnetic Reynolds number in excess of 100, which requires approximately 80 hp of mechanical drive, producing flow velocities in sodium of 15 m/s through impellers. Handling liquid sodium offers a number of technical challenges, but routine techniques have been developed over the past several decades for safely handling large quantities for the fast breeder reactor. The handling strategy is discussed, technical details concerning seals and pressurization are presented, and safety elements are highlighted.
Back to the future with hands-on science: students' perceptions of learning anatomy and physiology.
Johnston, Amy Nicole Burne; McAllister, Margaret
2008-09-01
This article examines student perceptions of learning related to anatomy and physiology in a bachelor of nursing program. One strategy to teach the sciences is simulated learning, a technology that offers exciting potential. Virtual environments for laboratory learning may offer numerous benefits: teachers can convey information to a larger group of students, reducing the need for small laboratory classes; less equipment is required, thus containing ongoing costs; and students can learn in their own time and place. However, simulated learning may also diminish access to the teacher-student relationship and the opportunity for guided practice and guided linking of theory with practice. Without this hands-on experience, there is a risk that students will not engage as effectively, and thus conceptual learning and the development of critical thinking skills are diminished. However, student perceptions of these learning experiences are largely unknown. Thus, this study examined students' perceptions of anatomy and physiology laboratory experiences and the importance they placed on hands-on experience in laboratory settings.
An overview of occupational health and safety in Australia.
Smith, Derek Richard; Yamagata, Zentaro
2002-03-01
Australia is a developed country with a high standard of living, small population and large land area. Manufacturing is currently the largest economic contributor, although mining and agriculture are also significant industries. There are around 10 million employees in total, with retail trade and manufacturing being the largest employers. Manufacturing currently has the highest incidence of workplace injury, although around 5% of all Australian workers suffer from some kind of occupational disease or injury every year. Occupational Health and Safety (OHS) legislation is individually managed and enforced by the 8 states and territories. Training and registration for OHS professionals varies between the specialties and usually requires a combination of academic qualifications and workplace experience. Non-medical personnel constitute a large proportion of OHS professionals in Australia.
The implications of the COBE diffuse microwave radiation results for cosmic strings
NASA Technical Reports Server (NTRS)
Bennett, David P.; Stebbins, Albert; Bouchet, Francois R.
1992-01-01
We compare the anisotropies in the cosmic microwave background radiation measured by the COBE experiment to those predicted by cosmic string theories. We use an analytic model for the ΔT/T power spectrum that is based on our previous numerical simulations of strings, under the assumption that cosmic strings are the sole source of the measured anisotropy. This implies a value for the string mass per unit length of (1.5 ± 0.5) × 10⁻⁶ c²/G. This is within the range of values required for cosmic strings to successfully seed the formation of large-scale structures in the universe. These results clearly encourage further studies of ΔT/T and large-scale structure in the cosmic string model.
A multilevel probabilistic beam search algorithm for the shortest common supersequence problem.
Gallardo, José E
2012-01-01
The shortest common supersequence problem is a classical problem with many applications in different fields such as planning, Artificial Intelligence and especially Bioinformatics. Due to its NP-hardness, we cannot expect to solve this problem efficiently using conventional exact techniques. This paper presents a heuristic to tackle the problem based on the use, at different levels, of a probabilistic variant of a classical heuristic known as Beam Search. The proposed algorithm is empirically analysed and compared to current approaches in the literature. Experiments show that it provides better quality solutions in a reasonable time for medium and large instances of the problem. For very large instances, our heuristic also provides better solutions, but the required execution times may increase considerably.
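A minimal probabilistic beam search for the shortest common supersequence can be written as follows. The state encoding (covered prefix lengths in each input string) is standard, while the specific sampling weights are an illustrative stand-in for the selection rule analysed in the paper.

```python
import random

def scs_beam(strings, beam_width=10, greediness=2.0, seed=0):
    """Probabilistic beam search for a short common supersequence.

    A state is the tuple of prefix lengths already covered in each input
    string; appending character c advances every string whose next symbol
    is c. Extensions are sampled with probability proportional to
    (symbols advanced)**greediness instead of kept deterministically."""
    rng = random.Random(seed)
    beam = [((0,) * len(strings), "")]
    while True:
        finished = [s for pos, s in beam
                    if all(p == len(t) for p, t in zip(pos, strings))]
        if finished:
            return min(finished, key=len)
        candidates = []
        for pos, sup in beam:
            for c in {t[p] for p, t in zip(pos, strings) if p < len(t)}:
                new_pos = tuple(p + 1 if p < len(t) and t[p] == c else p
                                for p, t in zip(pos, strings))
                advanced = sum(np - p for np, p in zip(new_pos, pos))
                candidates.append((advanced, new_pos, sup + c))
        weights = [a ** greediness for a, _, _ in candidates]
        k = min(beam_width, len(candidates))
        chosen = set()
        while len(chosen) < k:   # sample k distinct extensions
            chosen.add(rng.choices(range(len(candidates)), weights)[0])
        beam = [(candidates[i][1], candidates[i][2]) for i in chosen]

# print(scs_beam(["AGGTAB", "GXTXAYB", "GTAB"]))
```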