Sample records for studies require large

  1. Advanced technology requirements for large space structures. Part 5: Atlas program requirements

    NASA Technical Reports Server (NTRS)

    Katz, E.; Lillenas, A. N.; Broddy, J. A.

    1977-01-01

    The results of a special study which identifies and assigns priorities to technology requirements needed to accomplish a particular scenario of future large area space systems are described. Proposed future systems analyzed for technology requirements included large Electronic Mail, Microwave Radiometer, and Radar Surveillance Satellites. Twenty technology areas were identified as requirements to develop the proposed space systems.

  2. A Nationwide Overview of Sight-Singing Requirements of Large-Group Choral Festivals

    ERIC Educational Resources Information Center

    Norris, Charles E.

    2004-01-01

    The purpose of this study was to examine sight-singing requirements at junior and senior high school large-group ratings-based choral festivals throughout the United States. Responses to the following questions were sought from each state: (1) Are there ratings-based large-group choral festivals? (2) Is sight-singing a requirement? (3) Are there…

  3. Student Perceptions of General Education Requirements at a Large Public University: No Surprises?

    ERIC Educational Resources Information Center

    Thompson, Clarissa A.; Eodice, Michele; Tran, Phuoc

    2015-01-01

    The current study surveyed students' knowledge of and perceptions about general education requirements at a large research-intensive university. Findings revealed that students harbored misconceptions about general education requirements and illuminated the reasons why students were choosing to take required general education courses at other…

  4. Large Deployable Reflector (LDR) feasibility study update

    NASA Technical Reports Server (NTRS)

    Alff, W. H.; Banderman, L. W.

    1983-01-01

    In 1982 a workshop was held to refine the science rationale for large deployable reflectors (LDR) and develop technology requirements that support the science rationale. At the end of the workshop, a set of LDR consensus systems requirements was established. The subject study was undertaken to update the initial LDR study using the new systems requirements. The study included mirror materials selection and configuration, thermal analysis, structural concept definition and analysis, dynamic control analysis and recommendations for further study. The primary emphasis was on the dynamic controls requirements and the sophistication of the controls system needed to meet LDR performance goals.

  5. Large Deployable Reflector Science and Technology Workshop. Volume 2: Scientific Rationale and Technology Requirements

    NASA Technical Reports Server (NTRS)

    Hollenbach, D. (Editor)

    1983-01-01

    The scientific rationale for the large deployable reflector (LDR) and the overall technological requirements are discussed. The main scientific objectives include studies of the origins of planets, stars, and galaxies, and of the ultimate fate of the universe. The envisioned studies require a telescope with a diameter of at least 20 m, diffraction-limited at wavelengths as short as 30-50 microns. In addition, light-bucket operation with 1 arcsec spatial resolution in the 2-4 micron wavelength region would be useful in studies of high-redshift galaxies. Such a telescope would provide a large increase in spectroscopic sensitivity and spatial resolving power compared with existing or planned infrared telescopes.

  6. Study of auxiliary propulsion requirements for large space systems, volume 2

    NASA Technical Reports Server (NTRS)

    Smith, W. W.; Machles, G. W.

    1983-01-01

    A range of single-Shuttle-launched large space systems was identified and characterized, including NASTRAN and loading-dynamics analyses. The disturbance environment, characterization of thrust level and APS mass requirements, and APS/LSS interactions were analyzed. State-of-the-art capabilities for chemical and ion propulsion were compared with the generated propulsion requirements to assess state-of-the-art limitations and the benefits of enhancing current technology.

  7. Platform options for the Space Station program

    NASA Technical Reports Server (NTRS)

    Mangano, M. J.; Rowley, R. W.

    1986-01-01

    Platforms for polar and 28.5 deg orbits were studied to determine the platform requirements and characteristics necessary to support the science objectives. Large platforms supporting the Earth-Observing System (EOS) were initially studied. Co-orbiting platforms were derived from these designs. Because cost estimates indicated that the large platform approach was likely to be too expensive, require several launches, and generally be excessively complex, studies of small platforms were undertaken. Results of these studies show the small platform approach to be technically feasible at lower overall cost. All designs maximized hardware inheritance from the Space Station program to reduce costs. Science objectives as defined at the time of these studies are largely achievable.

  8. Large Deployable Reflector (LDR) system concept and technology definition study. Analysis of space station requirements for LDR

    NASA Astrophysics Data System (ADS)

    Agnew, Donald L.; Vinkey, Victor F.; Runge, Fritz C.

    1989-04-01

    A study was conducted to determine how the Large Deployable Reflector (LDR) might benefit from the use of the space station for assembly, checkout, deployment, servicing, refurbishment, and technology development. Requirements that must be met by the space station to supply benefits for a selected scenario are summarized. Quantitative and qualitative data are supplied. Space station requirements for LDR which may be utilized by other missions are identified. A technology development mission for LDR is outlined and requirements summarized. A preliminary experiment plan is included. Space Station Data Base SAA 0020 and TDM 2411 are updated.

  9. Large Deployable Reflector (LDR) system concept and technology definition study. Analysis of space station requirements for LDR

    NASA Technical Reports Server (NTRS)

    Agnew, Donald L.; Vinkey, Victor F.; Runge, Fritz C.

    1989-01-01

    A study was conducted to determine how the Large Deployable Reflector (LDR) might benefit from the use of the space station for assembly, checkout, deployment, servicing, refurbishment, and technology development. Requirements that must be met by the space station to supply benefits for a selected scenario are summarized. Quantitative and qualitative data are supplied. Space station requirements for LDR which may be utilized by other missions are identified. A technology development mission for LDR is outlined and requirements summarized. A preliminary experiment plan is included. Space Station Data Base SAA 0020 and TDM 2411 are updated.

  10. Active large structures

    NASA Technical Reports Server (NTRS)

    Soosaar, K.

    1982-01-01

    Some performance requirements and development needs for the design of large space structures are described. Areas of study include: (1) dynamic response of large space structures; (2) structural control and systems integration; (3) attitude control; and (4) large optics and flexibility. Reference is made to a large space telescope.

  11. Autonomous Aerobraking: A Design, Development, and Feasibility Study

    NASA Technical Reports Server (NTRS)

    Prince, Jill L. H.; Powell, Richard W.; Murri, Dan

    2011-01-01

    Aerobraking, which uses atmospheric drag to decelerate a spacecraft, has been used four times to decrease the apoapsis of a spacecraft in a captured orbit around a planetary body with a significant atmosphere. While aerobraking requires minimal fuel, its long duration demands both a large operations staff and extensive Deep Space Network resources. A study to automate aerobraking has been sponsored by the NASA Engineering and Safety Center to determine the initial feasibility of equipping a spacecraft with the onboard capability for autonomous aerobraking, thus saving the millions of dollars incurred by a large aerobraking operations workforce and continuous DSN coverage. This paper describes the need for autonomous aerobraking; the development of the Autonomous Aerobraking Development Software, which includes an ephemeris estimator, an atmospheric density estimator, and maneuver calculation; and the plan forward for continuation of this study.

  12. High voltage cabling for high power spacecraft

    NASA Technical Reports Server (NTRS)

    Dunbar, W. G.

    1981-01-01

    Studies by NASA have shown that many of the space missions proposed for the time period 1980 to 2000 will require large spacecraft structures to be assembled in orbit. Large antennas and power systems of up to 2.5 MW are predicted to supply the electrical/electronic subsystems, solar electric subsystems, solar electric propulsion, and space processing for the near-term programs. Platforms 100 meters in length serving as stable foundations, utility stations, and supports for these multi-antenna, electronically powered mechanisms are also being considered. This paper includes the findings of an analytic and conceptual design study of large spacecraft power distribution, and of electrical loads and their influence on the cable and connector requirements for these proposed large spacecraft.

  13. Sight-Reading Requirements at Concert Band Festivals: A National Survey

    ERIC Educational Resources Information Center

    Paul, Timothy A.

    2010-01-01

    This study, a replication and extension of work by Norris (2004), examined sight-reading requirements at middle and high school large-group band festivals across the United States. As in the earlier investigation, answers to the following questions were solicited from all 50 states: (1) Are there ratings-based large-group band festivals? (2) Is…

  14. Space Construction Automated Fabrication Experiment Definition Study (SCAFEDS), part 3. Volume 3: Requirements

    NASA Technical Reports Server (NTRS)

    1979-01-01

    The performance, design, and verification requirements for the Space Construction Automated Fabrication Experiment (SCAFE) are defined. The SCAFE program defines, develops, and demonstrates the techniques, processes, and equipment required for the automatic fabrication of structural elements in space and for the assembly of such elements into a large, lightweight structure. The program defines a large structural platform to be constructed in orbit using the space shuttle as a launch vehicle and construction base.

  15. Extravehicular Crewman Work System (ECWS) study program. Volume 2: Construction

    NASA Technical Reports Server (NTRS)

    Wilde, R. C.

    1980-01-01

    The construction portion of the Extravehicular Crewman Work System Study defines the requirements and selects the concepts for the crewman work system required to support the construction of large structures in space.

  16. Requirements for a mobile communications satellite system. Volume 3: Large space structures measurements study

    NASA Technical Reports Server (NTRS)

    Akle, W.

    1983-01-01

    This study report defines a set of tests and measurements required to characterize the performance of a Large Space System (LSS) and to scale the resulting data to other LSS satellites. Requirements from the Mobile Communication Satellite (MSAT) configurations derived in the parent study were used. MSAT utilizes a large, mesh deployable antenna and encompasses a significant range of LSS technology issues in the areas of structures/dynamics, control, and performance predictability. In this study, performance requirements were developed for the antenna, with special emphasis on antenna surface accuracy and pointing stability. Instrumentation and measurement systems applicable to LSS were selected from existing or ongoing technology developments. Laser ranging and angulation systems, presently in breadboard status, form the backbone of the measurements. Following this, a set of ground, STS, and GEO-operational tests was investigated. A third-scale (15-meter) antenna system was selected for ground characterization followed by STS flight technology development. This selection ensures analytical scaling from ground to orbit as well as size scaling; other benefits are cost and the ability to perform reasonable ground tests. Detailed costs of the various tests and measurement systems were derived and are included in the report.

  17. Low-cost floating emergence net and bottle trap: Comparison of two designs

    USGS Publications Warehouse

    Cadmus, Pete; Pomeranz, Justin; Kraus, Johanna M.

    2016-01-01

    Sampling emergent aquatic insects is of interest to many freshwater ecologists. Many quantitative emergence traps require the use of aspiration for collection. However, aspiration is infeasible in studies with the large amounts of replication often required in large biomonitoring projects. We designed an economical, collapsible, pyramid-shaped floating emergence trap with an external collection bottle that avoids the need for aspiration. This design was compared experimentally to a design of similar dimensions that relied on aspiration, to ensure comparable results. The pyramid-shaped design captured twice as many total emerging insects. When a preservative was used in bottle collectors, >95% of the emergent abundance was collected in the bottle; when no preservative was used, >81% of the total insects were collected from the bottle. In addition to capturing fewer emergent insects, the traps that required aspiration took significantly longer to sample. Large studies and studies sampling remote locations could benefit from the economical construction, speed of sampling, and capture efficiency.

  18. Low-authority control synthesis for large space structures

    NASA Technical Reports Server (NTRS)

    Aubrun, J. N.; Margulies, G.

    1982-01-01

    The control of vibrations of large space structures by distributed sensors and actuators is studied. A procedure is developed for calculating the feedback loop gains required to achieve specified amounts of damping. For moderate damping (Low Authority Control) the procedure is purely algebraic, but it can be applied iteratively when larger amounts of damping are required and is generalized for arbitrary time invariant systems.

  19. pycola: N-body COLA method code

    NASA Astrophysics Data System (ADS)

    Tassev, Svetlin; Eisenstein, Daniel J.; Wandelt, Benjamin D.; Zaldarriaga, Matias

    2015-09-01

    pycola is a multithreaded Python/Cython N-body code implementing the Comoving Lagrangian Acceleration (COLA) method in the temporal and spatial domains, which trades accuracy at small scales to gain computational speed without sacrificing accuracy at large scales. This is especially useful for cheaply generating the large ensembles of accurate mock halo catalogs required to study galaxy clustering and weak lensing. The COLA method achieves its speed by calculating the large-scale dynamics exactly using LPT while letting the N-body code solve for the small scales, without requiring it to capture exactly the internal dynamics of halos.

  20. USE OF DISPOSABLE DIAPERS TO COLLECT URINE IN EXPOSURE STUDIES

    EPA Science Inventory

    Large studies of children's health as it relates to exposures to chemicals in the environment often require measurements of biomarkers of chemical exposures or effects in urine samples. But collection of urine samples from infants and toddlers is difficult. For large exposure s...

  1. Case Study Effectiveness in a Team-Teaching and General-Education Environment

    ERIC Educational Resources Information Center

    Olorunnisola, Anthony A.; Ramasubramanian, Srividya; Russill, Chris; Dumas, Josephine

    2003-01-01

    This paper examines the effectiveness of the case study method in a team-teaching environment designed to augment a large capstone communications course that satisfies general education requirements. Results from a survey revealed that the use of case study enhanced the otherwise missing connection between the large lecture and the recitation…

  2. Impact of rainfall on the moisture content of large woody fuels

    Treesearch

    Helen H. Mohr; Thomas A. Waldrop

    2013-01-01

    This unreplicated case study evaluates the impact of rainfall on large woody fuels over time. We know that one rainfall event may decrease the Keetch-Byram Drought Index, but this study shows no real increase in fuel moisture in 1,000-hour fuels after just one rainfall. Several rain events over time are required for the moisture content of large woody fuels to...

  3. Scientific Instrument Package for the large space telescope (SIP)

    NASA Technical Reports Server (NTRS)

    1972-01-01

    The feasibility of a scientific instrument package (SIP) that will satisfy the requirements of the large space telescope was established. A reference configuration serving as a study model and data which will aid in the trade-off studies leading to the final design configuration are reported.

  4. Control of Flexible Structures (COFS) Flight Experiment Background and Description

    NASA Technical Reports Server (NTRS)

    Hanks, B. R.

    1985-01-01

    A fundamental problem in designing and delivering large space structures to orbit is to provide sufficient structural stiffness and static configuration precision to meet performance requirements. These requirements are directly related to control requirements and the degree of control system sophistication available to supplement the as-built structure. Background and rationale are presented for a research study in structures, structural dynamics, and controls using a relatively large, flexible beam as a focus. This experiment would address fundamental problems applicable to large, flexible space structures in general and would involve a combination of ground tests, flight behavior prediction, and instrumented orbital tests. Intended to be multidisciplinary but basic within each discipline, the experiment should provide improved understanding and confidence in making design trades between structural conservatism and control system sophistication for meeting static shape and dynamic response/stability requirements. Quantitative results should be obtained for use in improving the validity of ground tests for verifying flight performance analyses.

  5. Weight optimization of ultra large space structures

    NASA Technical Reports Server (NTRS)

    Reinert, R. P.

    1979-01-01

    The paper describes the optimization of a solar power satellite structure for minimum mass and system cost. The solar power satellite is an ultra-large, low-frequency, lightly damped space structure; derivation of its structural design requirements had to accommodate gravity-gradient torques, which impose the primary loads; a lifetime of up to 100 years in the rigorous geosynchronous-orbit radiation environment; and the prevention of continuous wave motion in a solar array blanket suspended from a huge, lightly damped structure subject to periodic excitations. The satellite structural design required a parametric study of structural configurations and consideration of fabrication and assembly techniques, resulting in a final structure that met all requirements at a structural mass fraction of 10%.

  6. Instrument control software requirement specification for Extremely Large Telescopes

    NASA Astrophysics Data System (ADS)

    Young, Peter J.; Kiekebusch, Mario J.; Chiozzi, Gianluca

    2010-07-01

    Engineers in several observatories are now designing the next generation of optical telescopes, the Extremely Large Telescopes (ELT). These are very complex machines that will host sophisticated astronomical instruments to be used for a wide range of scientific studies. In order to carry out scientific observations, a software infrastructure is required to orchestrate the control of the multiple subsystems and functions. This paper will focus on describing the considerations, strategies and main issues related to the definition and analysis of the software requirements for the ELT's Instrument Control System using modern development processes and modelling tools like SysML.

  7. Crew size affects fire fighting efficiency: A progress report on time studies of the fire fighting job.

    Treesearch

    Donald N. Matthews

    1940-01-01

    Fire fighting is still largely a hand-work job in the heavy cover and fuel conditions and rugged topography of the Douglas fir region, in spite of recent advances that have been made in the use of machinery. Controlling a fire in this region requires immense amounts of work per unit of fire perimeter, so that large numbers of men are required to attack all but the...

  8. Identification of linearised RMS-voltage dip patterns based on clustering in renewable plants

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    García-Sánchez, Tania; Gómez-Lázaro, Emilio; Muljadi, Edward

    Generation units connected to the grid are currently required to meet low-voltage ride-through (LVRT) requirements. In most developed countries, these requirements also apply to renewable sources, mainly wind power plants and photovoltaic installations connected to the grid. This study proposes an alternative characterisation solution to classify and visualise a large number of collected events in light of current limits and requirements. The authors' approach is based on linearised root-mean-square (RMS) voltage trajectories, taking into account LVRT requirements, and a clustering process to identify the most likely pattern trajectories. The proposed solution gives extensive information on an event's severity by providing a simple but complete visualisation of the linearised RMS-voltage patterns. In addition, these patterns are compared to current LVRT requirements to determine similarities or discrepancies. A large number of collected events can then be automatically classified and visualised for comparative purposes. Real disturbances collected from renewable sources in Spain are used to assess the proposed solution. Extensive results and discussions are also included in this study.

  9. Compressor Study to Meet Large Civil Tilt Rotor Engine Requirements

    NASA Technical Reports Server (NTRS)

    Veres, Joseph P.

    2009-01-01

    A vehicle concept study has been made to meet the requirements of the Large Civil Tilt Rotorcraft vehicle mission. A vehicle concept was determined, and a notional turboshaft engine system study was conducted. The engine study defined requirements for the major engine components, including the compressor. The compressor design-point goal was to deliver a pressure ratio of 31:1 at an inlet weight flow of 28.4 lbm/sec. To perform a conceptual design of two potential compressor configurations to meet the design requirement, a mean-line compressor flow analysis and design code were used. The first configuration is an eight-stage axial compressor. Some challenges of the all-axial compressor are the small blade spans of the rear-block stages being 0.28 in., resulting in the last-stage blade tip clearance-to-span ratio of 2.4%. The second configuration is a seven-stage axial compressor, with a centrifugal stage having a 0.28-in. impeller-exit blade span. The compressors conceptual designs helped estimate the flow path dimensions, rotor leading and trailing edge blade angles, flow conditions, and velocity triangles for each stage.

  10. Compressor Study to Meet Large Civil Tilt Rotor Engine Requirements

    NASA Technical Reports Server (NTRS)

    Veres, Joseph P.

    2009-01-01

    A vehicle concept study has been made to meet the requirements of the Large Civil Tilt Rotorcraft vehicle mission. A vehicle concept was determined, and a notional turboshaft engine system study was conducted. The engine study defined requirements for the major engine components, including the compressor. The compressor design-point goal was to deliver a pressure ratio of 31:1 at an inlet weight flow of 28.4 lbm/sec. To perform a conceptual design of two potential compressor configurations to meet the design requirement, a mean-line compressor flow analysis and design code were used. The first configuration is an eight-stage axial compressor. Some challenges of the all-axial compressor are the small blade spans of the rear-block stages being 0.28 in., resulting in the last-stage blade tip clearance-to-span ratio of 2.4 percent. The second configuration is a seven-stage axial compressor, with a centrifugal stage having a 0.28-in. impeller-exit blade span. The compressors conceptual designs helped estimate the flow path dimensions, rotor leading and trailing edge blade angles, flow conditions, and velocity triangles for each stage.

  11. Exploring Google Earth Engine platform for big data processing: classification of multi-temporal satellite imagery for crop mapping

    NASA Astrophysics Data System (ADS)

    Shelestov, Andrii; Lavreniuk, Mykola; Kussul, Nataliia; Novikov, Alexei; Skakun, Sergii

    2017-02-01

    Many applied problems arising in agricultural monitoring and food security require reliable crop maps at national or global scale. Large-scale crop mapping requires processing and management of a large amount of heterogeneous satellite imagery acquired by various sensors, which consequently leads to a "Big Data" problem. The main objective of this study is to explore the efficiency of using the Google Earth Engine (GEE) platform when classifying multi-temporal satellite imagery, with the potential to apply the platform at a larger scale (e.g. country level) and with multiple sensors (e.g. Landsat-8 and Sentinel-2). In particular, multiple state-of-the-art classifiers available in the GEE platform are compared to produce a high-resolution (30 m) crop classification map for a large territory (28,100 km2 with 1.0 M ha of cropland). Though this study does not involve large volumes of data, it does address the efficiency of the GEE platform in effectively executing the complex workflows of satellite data processing required by large-scale applications such as crop mapping. The study discusses strengths and weaknesses of the classifiers, assesses the accuracies that can be achieved with different classifiers for the Ukrainian landscape, and compares them to a benchmark classifier using a neural network approach developed in our previous studies. The study is carried out for the Joint Experiment of Crop Assessment and Monitoring (JECAM) test site in Ukraine, covering the Kyiv region (north of Ukraine) in 2013. We found that GEE provides very good performance in terms of enabling access to remote sensing products through the cloud platform and providing pre-processing; however, in terms of classification accuracy, the neural network based approach outperformed the support vector machine (SVM), decision tree, and random forest classifiers available in GEE.

  12. Shuttle cryogenic supply system. Optimization study. Volume 5 B-1: Programmers manual for math models

    NASA Technical Reports Server (NTRS)

    1973-01-01

    A computer program for the rapid parametric evaluation of various types of cryogenic spacecraft systems is presented. The mathematical techniques of the program provide the capability for in-depth analysis combined with rapid problem solution for the production of a large quantity of soundly based trade-study data. The program requires a large data bank capable of providing characteristic performance data for a wide variety of component assemblies used in cryogenic systems. The program data requirements are divided into: (1) the semipermanent data tables and source data for performance characteristics, and (2) the variable input data, which contains input parameters that may be perturbed for parametric system studies.

  13. Influence of the large-small split effect on strategy choice in complex subtraction.

    PubMed

    Xiang, Yan Hui; Wu, Hao; Shang, Rui Hong; Chao, Xiaomei; Ren, Ting Ting; Zheng, Li Ling; Mo, Lei

    2018-04-01

    Two main theories have been used to explain the arithmetic split effect: decision-making process theory and strategy choice theory. Using the inequality paradigm, previous studies have confirmed that individuals tend to adopt a plausibility-checking strategy and a whole-calculation strategy to solve large and small split problems in complex addition arithmetic, respectively. This supports strategy choice theory, but it is unknown whether this theory also explains performance in solving different split problems in complex subtraction arithmetic. This study used small, intermediate and large split sizes, with each split condition being further divided into problems requiring and not requiring borrowing. The reaction times (RTs) for large and intermediate splits were significantly shorter than those for small splits, while accuracy was significantly higher for large and intermediate splits than for small splits, reflecting no speed-accuracy trade-off. Further, RTs and accuracy differed significantly between the borrow and no-borrow conditions only for small splits. This study indicates that strategy choice theory is suitable to explain the split effect in complex subtraction arithmetic. That is, individuals tend to choose the plausibility-checking strategy or the whole-calculation strategy according to the split size. © 2016 International Union of Psychological Science.

  14. Very Large Graphs for Information Extraction (VLG) Detection and Inference in the Presence of Uncertainty

    DTIC Science & Technology

    2015-09-21

    ...this framework, MIT LL carried out a one-year proof-of-concept study to determine the capabilities and challenges in the detection of anomalies in ... extremely large graphs [5]. Under this effort, two real datasets were considered, and algorithms for data modeling and anomaly detection were developed ... is required in a well-defined experimental framework for the detection of anomalies in very large graphs. This study is intended to inform future...

  15. An improved filter elution and cell culture assay procedure for evaluating public groundwater systems for culturable enteroviruses.

    PubMed

    Dahling, Daniel R

    2002-01-01

    Large-scale virus studies of groundwater systems require practical and sensitive procedures for both sample processing and viral assay. Filter adsorption-elution procedures have traditionally been used to process large-volume water samples for viruses. In this study, five filter elution procedures using cartridge filters were evaluated for their effectiveness in processing samples. Of the five procedures tested, the third method, which incorporated two separate beef extract elutions (one being an overnight filter immersion in beef extract), recovered 95% of seeded poliovirus compared with recoveries of 36 to 70% for the other methods. For viral enumeration, an expanded roller bottle quantal assay was evaluated using seeded poliovirus. This cytopathic-based method was considerably more sensitive than the standard plaque assay method. The roller bottle system was more economical than the plaque assay for the evaluation of comparable samples. Using roller bottles required less time and manipulation than the plaque procedure and greatly facilitated the examination of large numbers of samples. The combination of the improved filter elution procedure and the roller bottle assay for viral analysis makes large-scale virus studies of groundwater systems practical. This procedure was subsequently field tested during a groundwater study in which large-volume samples (exceeding 800 L) were processed through the filters.

  16. Definition of technology development missions for early space stations: Large space structures

    NASA Technical Reports Server (NTRS)

    Gates, R. M.; Reid, G.

    1984-01-01

    The objective studied is the definition of the testbed role of an early Space Station in the construction of large space structures. This is accomplished by defining the LSS technology development missions (TDMs) identified in phase 1. Design and operations trade studies are used to identify the best structural concepts and procedures for each TDM. Details of the TDM designs are then developed along with their operational requirements. Space Station resources required for each mission, both human and physical, are identified. The costs and development schedules for the TDMs provide an indication of the programs needed to develop these missions.

  17. Reading, Reforms, and Resources: How Elementary Teachers Teach Literacy in Contexts of Complex Educational Policies and Required Curriculum

    ERIC Educational Resources Information Center

    Waldron, Chad H.

    2014-01-01

    This descriptive, mixed method study investigated the literacy-related contextual factors and local curricular decision-making of experienced elementary literacy teachers in one large U.S. public school district. The study's guiding research question was: How do elementary in-service teachers teach literacy within the contexts of required literacy…

  18. Expression, purification, and characterization of almond (Prunus dulcis) allergen Pru du 4

    USDA-ARS?s Scientific Manuscript database

    Biochemical characterizations of food allergens are required for understanding the allergenicity of food allergens. Such studies require a relatively large amount of highly purified allergens. Profilins from numerous species are known to be allergens, including food allergens, such as almond (Prunus...

  19. Basic relationships for LTA technical analysis

    NASA Technical Reports Server (NTRS)

    Ausrotas, R. A.

    1975-01-01

    An introduction to airship performance is presented. Static lift equations are shown which, when combined with power requirements for conventional airships, allow parametric studies of range, payload, speed and airship size. It is shown that very large airships are required to attain reasonable speeds at transoceanic ranges.
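
    The static lift relation referred to above can be sketched numerically. This is a minimal illustration, assuming the standard buoyancy relation L = V(ρ_air − ρ_gas)g at sea-level densities; the function name and the example volume are illustrative, not values from the report:

```python
# Hedged sketch (not from the report): the standard static-lift relation
# L = V * (rho_air - rho_gas) * g, illustrating why very large airships
# are required to carry useful payloads. All numbers are illustrative.
RHO_AIR = 1.225     # kg/m^3, air at sea level
RHO_HELIUM = 0.169  # kg/m^3, helium at sea level

def static_lift_kg(envelope_volume_m3: float) -> float:
    """Gross buoyant lift in kilograms for a helium-filled envelope."""
    return envelope_volume_m3 * (RHO_AIR - RHO_HELIUM)

# A 200,000 m^3 envelope (roughly Hindenburg-class volume):
print(round(static_lift_kg(200_000) / 1000, 1), "tonnes gross lift")  # -> 211.2 tonnes gross lift
```

Doubling speed at a fixed range roughly cubes the power demand, which is why the parametric studies trade size against speed so sharply.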

  20. Impact of large field angles on the requirements for deformable mirror in imaging satellites

    NASA Astrophysics Data System (ADS)

    Kim, Jae Jun; Mueller, Mark; Martinez, Ty; Agrawal, Brij

    2018-04-01

    For certain imaging satellite missions, a large aperture with wide field-of-view is needed. In order to achieve diffraction limited performance, the mirror surface Root Mean Square (RMS) error has to be less than 0.05 waves. In the case of visible light, it has to be less than 30 nm. This requirement is difficult to meet as the large aperture will need to be segmented in order to fit inside a launch vehicle shroud. To reduce this requirement and to compensate for the residual wavefront error, Micro-Electro-Mechanical System (MEMS) deformable mirrors can be considered in the aft optics of the optical system. MEMS deformable mirrors are affordable and consume low power, but are small in size. Due to the major reduction in pupil size for the deformable mirror, the effective field angle is magnified by the diameter ratio of the primary and deformable mirror. For wide field of view imaging, the required deformable mirror correction is field angle dependent, impacting the required parameters of a deformable mirror such as size, number of actuators, and actuator stroke. In this paper, a representative telescope and deformable mirror system model is developed and the deformable mirror correction is simulated to study the impact of the large field angles in correcting a wavefront error using a deformable mirror in the aft optics.
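
    The field-angle magnification described in this abstract follows from pupil demagnification: relaying the beam onto a small MEMS mirror scales the field angle up by the primary-to-DM diameter ratio. A minimal sketch; the function name and the example numbers are illustrative assumptions, not values from the paper:

```python
# Hedged sketch of the pupil-magnification relation: demagnifying the pupil
# onto a small deformable mirror magnifies the field angle by the ratio of
# the primary diameter to the DM diameter. Numbers are illustrative only.
def effective_field_angle_deg(field_angle_deg: float,
                              primary_diameter_m: float,
                              dm_diameter_m: float) -> float:
    return field_angle_deg * primary_diameter_m / dm_diameter_m

# A 0.05 deg field angle on a 3 m primary, relayed to a 10 mm MEMS DM,
# becomes a 15 deg angle at the DM:
print(effective_field_angle_deg(0.05, 3.0, 0.010))
```

The magnified angle is what drives the field-dependent correction, and hence the actuator count and stroke, cited in the abstract.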

  1. Preliminary Engineering Study of Long-Lead Time Equipment Required for Large Lightweight Mirror Manufacture

    DTIC Science & Technology

    1981-06-01

    [OCR fragment of the report documentation page and glossary] Keywords: annealing; fusion-sealed mirrors; ULE mirrors; boule; large lightweight mirror core; low-expansion glass; coremaker; mirror blanks; forming furnace...experiments; grinder procurement. GLOSSARY: Alpha - coefficient of thermal expansion. Boule - the disc of glass formed in the furnace. Cell...turning over of large plates, cores, or mirrors. Flowout - method used to produce large-diameter plates from small-diameter boules. Glass - used in the

  2. Solar array study for solar electric propulsion spacecraft for the Encke rendezvous mission

    NASA Technical Reports Server (NTRS)

    Sequeira, E. A.; Patterson, R. E.

    1974-01-01

    The work performed on the design, analysis, and performance of a 20 kW rollup solar array capable of meeting the design requirements of a solar electric spacecraft for the 1980 Encke rendezvous mission is described. To meet the high power requirements of the proposed electric propulsion mission, solar arrays on the order of 186.6 sq m were defined. Because of the large weights involved with arrays of this size, consideration of array configurations is limited to lightweight, large-area concepts with maximum power-to-weight ratios. Items covered include solar array requirements and constraints, array concept selection and rationale, structural and electrical design considerations, and reliability considerations.

  3. Analytical Study of Self-Motivations among a Southwest Public University Nonpolitical Science Major Students in Required Political Science Courses

    ERIC Educational Resources Information Center

    Gasim, Gamal; Stevens, Tara; Zebidi, Amira

    2012-01-01

    All undergraduate students are required by state law to take six credit hours in political science. This study will help identify whether differences exist in self-determination among students enrolled in American Public Policy and American Government at a large, Southwestern public university. Because some types of motivation are associated with…

  4. Joint Distributed Regional Training Capacity: A Scoping Study

    DTIC Science & Technology

    2007-12-01

    use management mechanisms 4. Develop assessment tools to rapidly quantify temporary land-use disturbance risks. The development of such...the Army Environmental Requirements and Technology Assessments (AERTA) process to develop validated requirements upon which to base more focused...conducting a large environmental assessment study each time an exercise is planned is needlessly expensive and does not give the flexibility to

  5. Site productivity and diversity of the Middle Mountain long-term soil productivity study, West Virginia: Pre-experimental site characterization

    Treesearch

    Mary Beth Adams

    2018-01-01

    To better understand the impacts of a changing environment and interactions with forest management options for forest resources, including soil, large long-term experiments are required. Such experiments require careful documentation of reference or pre-experimental conditions. This publication describes the Middle Mountain Long-term Soil Productivity (LTSP) Study,...

  6. How Do I Satisfy the General Education Language Requirement? University Students' Attitudes toward Language Study

    ERIC Educational Resources Information Center

    Thomas, Juan Antonio

    2010-01-01

    This study aims to identify the two principal reasons why college students choose a certain language to satisfy a general education second language requirement by polling 172 students enrolled in first-year language courses in 13 languages at a large Northeastern research university. Students answered a questionnaire and chose the two main reasons…

  7. Managing Large Scale Project Analysis Teams through a Web Accessible Database

    NASA Technical Reports Server (NTRS)

    O'Neil, Daniel A.

    2008-01-01

    Large-scale space programs analyze thousands of requirements while mitigating safety, performance, schedule, and cost risks. These efforts involve a variety of roles with interdependent use cases and goals. For example, study managers and facilitators identify ground-rules and assumptions for a collection of studies required for a program or project milestone. Task leaders derive product requirements from the ground rules and assumptions and describe activities to produce needed analytical products. Discipline specialists produce the specified products and load results into a file management system. Organizational and project managers provide the personnel and funds to conduct the tasks. Each role has responsibilities to establish information linkages and provide status reports to management. Projects conduct design and analysis cycles to refine designs to meet the requirements and implement risk mitigation plans. At the program level, integrated design and analysis cycle studies are conducted to eliminate every 'to-be-determined' and develop plans to mitigate every risk. At the agency level, strategic studies analyze different approaches to exploration architectures and campaigns. This paper describes a web-accessible database developed by NASA to coordinate and manage tasks at three organizational levels. Other topics in this paper cover integration technologies and techniques for process modeling and enterprise architectures.

  8. A technology program for the development of the large deployable reflector for space based astronomy

    NASA Technical Reports Server (NTRS)

    Kiya, M. K.; Gilbreath, W. P.; Swanson, P. N.

    1982-01-01

    Technologies for the development of the Large Deployable Reflector (LDR), a NASA project for the 1990s, for infrared and submillimeter astronomy are presented. The proposed LDR is a 10-30 m diameter spaceborne observatory operating in the spectral region from 30 microns to one millimeter, where ground observations are nearly impossible. Scientific rationales for such a system include the study of ancient signals from galaxies at the edge of the universe, the study of star formation, and the observation of fluctuations in the cosmic background radiation. System requirements include the ability to observe faint objects at large distances and to map molecular clouds and H II regions. From these requirements, mass, photon noise, and tolerance budgets are developed. A strawman concept is established, and some alternate concepts are considered, but research is still necessary in the areas of segment, optical control, and instrument technologies.

  9. A recirculating stream aquarium for ecological studies.

    Treesearch

    Gordon H. Reeves; Fred H. Everest; Carl E. McLemore

    1983-01-01

    Investigations of the ecological behavior of fishes often require studies in both natural and artificial stream environments. We describe a large, recirculating stream aquarium and its controls, constructed for ecological studies at the Forestry Sciences Laboratory in Corvallis.

  10. Large Advanced Space Systems (LASS) computer-aided design program additions

    NASA Technical Reports Server (NTRS)

    Farrell, C. E.

    1982-01-01

    The LSS preliminary and conceptual design requires extensive iterative analysis because of the effects of structural, thermal, and control intercoupling. A computer-aided design program that will permit integrating and interfacing of required large space system (LSS) analyses is discussed. The primary objective of this program is the implementation of modeling techniques and analysis algorithms that permit interactive design and tradeoff studies of LSS concepts. Eight software modules were added to the program. The existing rigid body controls module was modified to include solar pressure effects. The new model generator modules and appendage synthesizer module are integrated (interfaced) to permit interactive definition and generation of LSS concepts. The mass properties module permits interactive specification of discrete masses and their locations. The other modules permit interactive analysis of orbital transfer requirements, antenna primary beam, and attitude control requirements.

  11. Definition of technology development missions for early space stations. Large space structures, phase 2, midterm review

    NASA Technical Reports Server (NTRS)

    1984-01-01

    The large space structures technology development missions to be performed on an early manned space station were studied and defined, and the resources needed and the design implications for an early space station to carry out these missions were determined. Emphasis is placed on greater detail in mission designs and space station resource requirements.

  12. Design of a large magnetic-bearing turbomolecular pump for NET

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bernhardt, K.H.; Conrad, A.; Dinner, P.J.

    1988-09-01

    The feasibility of developing large vacuum components for operation in fusion machines has been investigated in the framework of the European Fusion Technology Programme. The requirements and the results of the feasibility study for the large turbomolecular pump units (TMP) are presented. Design parameters for a single-flow 50,000 l/s TMP and for double-flow 15,000 and 50,000 l/s TMPs have been compared.

  13. A Study on Fast Gates for Large-Scale Quantum Simulation with Trapped Ions

    PubMed Central

    Taylor, Richard L.; Bentley, Christopher D. B.; Pedernales, Julen S.; Lamata, Lucas; Solano, Enrique; Carvalho, André R. R.; Hope, Joseph J.

    2017-01-01

    Large-scale digital quantum simulations require thousands of fundamental entangling gates to construct the simulated dynamics. Despite success in a variety of small-scale simulations, quantum information processing platforms have hitherto failed to demonstrate the combination of precise control and scalability required to systematically outmatch classical simulators. We analyse how fast gates could enable trapped-ion quantum processors to achieve the requisite scalability to outperform classical computers without error correction. We analyze the performance of a large-scale digital simulator, and find that fidelity of around 70% is realizable for π-pulse infidelities below 10⁻⁵ in traps subject to realistic rates of heating and dephasing. This scalability relies on fast gates: entangling gates faster than the trap period. PMID:28401945
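
    The quoted ~70% fidelity at π-pulse infidelities below 10⁻⁵ is consistent with simple error compounding over tens of thousands of gates. A hedged back-of-envelope sketch; the gate count below is an illustrative assumption, not a figure from the paper:

```python
# Hedged sketch: compounding a per-gate infidelity eps over N entangling
# gates gives an overall fidelity of roughly (1 - eps)**N ~ exp(-N * eps).
# The gate count N is an illustrative assumption, not from the paper.
def overall_fidelity(eps: float, n_gates: int) -> float:
    return (1.0 - eps) ** n_gates

# With eps = 1e-5, about 35,000 gates already brings fidelity near 70%:
print(round(overall_fidelity(1e-5, 35_000), 3))  # -> 0.705
```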

  14. A Study on Fast Gates for Large-Scale Quantum Simulation with Trapped Ions.

    PubMed

    Taylor, Richard L; Bentley, Christopher D B; Pedernales, Julen S; Lamata, Lucas; Solano, Enrique; Carvalho, André R R; Hope, Joseph J

    2017-04-12

    Large-scale digital quantum simulations require thousands of fundamental entangling gates to construct the simulated dynamics. Despite success in a variety of small-scale simulations, quantum information processing platforms have hitherto failed to demonstrate the combination of precise control and scalability required to systematically outmatch classical simulators. We analyse how fast gates could enable trapped-ion quantum processors to achieve the requisite scalability to outperform classical computers without error correction. We analyze the performance of a large-scale digital simulator, and find that fidelity of around 70% is realizable for π-pulse infidelities below 10⁻⁵ in traps subject to realistic rates of heating and dephasing. This scalability relies on fast gates: entangling gates faster than the trap period.

  15. A global probabilistic tsunami hazard assessment from earthquake sources

    USGS Publications Warehouse

    Davies, Gareth; Griffin, Jonathan; Lovholt, Finn; Glimsdal, Sylfest; Harbitz, Carl; Thio, Hong Kie; Lorito, Stefano; Basili, Roberto; Selva, Jacopo; Geist, Eric L.; Baptista, Maria Ana

    2017-01-01

    Large tsunamis occur infrequently but have the capacity to cause enormous numbers of casualties, damage to the built environment and critical infrastructure, and economic losses. A sound understanding of tsunami hazard is required to underpin management of these risks, and while tsunami hazard assessments are typically conducted at regional or local scales, globally consistent assessments are required to support international disaster risk reduction efforts, and can serve as a reference for local and regional studies. This study presents a global-scale probabilistic tsunami hazard assessment (PTHA), extending previous global-scale assessments based largely on scenario analysis. Only earthquake sources are considered, as they represent about 80% of the recorded damaging tsunami events. Globally extensive estimates of tsunami run-up height are derived at various exceedance rates, and the associated uncertainties are quantified. Epistemic uncertainties in the exceedance rates of large earthquakes often lead to large uncertainties in tsunami run-up. Deviations between modelled tsunami run-up and event observations are quantified, and found to be larger than suggested in previous studies. Accounting for these deviations in PTHA is important, as it leads to a pronounced increase in predicted tsunami run-up for a given exceedance rate.
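
    The exceedance-rate curves at the heart of a PTHA can be illustrated with a toy calculation: the rate at which a run-up threshold is exceeded is the sum of the annual rates of all scenarios whose modelled run-up reaches it. All numbers below are synthetic, not values from this assessment:

```python
# Hedged sketch (synthetic numbers): a hazard curve in a PTHA is built by
# summing the annual rates of every scenario whose modelled run-up meets
# or exceeds each threshold height.
def exceedance_rate(scenarios, threshold_m):
    """scenarios: list of (annual_rate, runup_m) pairs."""
    return sum(rate for rate, runup in scenarios if runup >= threshold_m)

# Four synthetic earthquake scenarios: rarer events produce larger run-up.
scenarios = [(1e-2, 0.5), (1e-3, 2.0), (1e-4, 5.0), (1e-5, 12.0)]
for h in (1.0, 4.0, 10.0):
    print(f"run-up >= {h} m at rate {exceedance_rate(scenarios, h):.2e}/yr")
```

Epistemic uncertainty in the earthquake rates propagates directly into the per-scenario `annual_rate` terms, which is why it dominates the run-up uncertainty the abstract describes.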

  16. Content Area Reading Instruction for Secondary Teacher Candidates: A Case Study of a State-Required Online Content Area Reading Course

    ERIC Educational Resources Information Center

    Biggs, Brad

    2014-01-01

    This dissertation examined a state-required, online preservice teacher course in content area reading instruction (CARI) at a large land-grant university in Minnesota. Few studies have been published to date on revitalized literacy teacher preparation efforts in CARI (see Vagle, Dillon, Davison-Jenkins, & LaDuca, 2005; Dillon, O'Brien,…

  17. Report of the Plasma Physics and Environmental Perturbation Laboratory (PPEPL) working groups. Volume 1: Plasma probes, wakes, and sheaths working group

    NASA Technical Reports Server (NTRS)

    1974-01-01

    It is shown in this report that comprehensive in-situ study of all aspects of the entire zone of disturbance caused by a body in a flowing plasma resulted in a large number of requirements on the shuttle-PPEPL facility. A large amount of the necessary in-situ observation can be obtained by adopting appropriate modes of performing the experiments. Requirements are indicated for worthwhile studies, of some aspects of the problems, which can be carried out effectively while imposing relatively few constraints on the early missions. Considerations for the desired growth and improvement of the PPEPL to facilitate more complete studies in later missions are also discussed. For Vol. 2, see N74-28170; for Vol. 3, see N74-28171.

  18. 30 CFR 49.40 - Requirements for large coal mines.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 30 Mineral Resources 1 2011-07-01 2011-07-01 false Requirements for large coal mines. 49.40 Section 49.40 Mineral Resources MINE SAFETY AND HEALTH ADMINISTRATION, DEPARTMENT OF LABOR EDUCATION AND TRAINING MINE RESCUE TEAMS Mine Rescue Teams for Underground Coal Mines § 49.40 Requirements for large coal...

  19. 30 CFR 49.40 - Requirements for large coal mines.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 30 Mineral Resources 1 2012-07-01 2012-07-01 false Requirements for large coal mines. 49.40 Section 49.40 Mineral Resources MINE SAFETY AND HEALTH ADMINISTRATION, DEPARTMENT OF LABOR EDUCATION AND TRAINING MINE RESCUE TEAMS Mine Rescue Teams for Underground Coal Mines § 49.40 Requirements for large coal...

  20. 30 CFR 49.40 - Requirements for large coal mines.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 30 Mineral Resources 1 2013-07-01 2013-07-01 false Requirements for large coal mines. 49.40 Section 49.40 Mineral Resources MINE SAFETY AND HEALTH ADMINISTRATION, DEPARTMENT OF LABOR EDUCATION AND TRAINING MINE RESCUE TEAMS Mine Rescue Teams for Underground Coal Mines § 49.40 Requirements for large coal...

  1. 30 CFR 49.40 - Requirements for large coal mines.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 30 Mineral Resources 1 2010-07-01 2010-07-01 false Requirements for large coal mines. 49.40 Section 49.40 Mineral Resources MINE SAFETY AND HEALTH ADMINISTRATION, DEPARTMENT OF LABOR EDUCATION AND TRAINING MINE RESCUE TEAMS Mine Rescue Teams for Underground Coal Mines § 49.40 Requirements for large coal...

  2. 30 CFR 49.40 - Requirements for large coal mines.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 30 Mineral Resources 1 2014-07-01 2014-07-01 false Requirements for large coal mines. 49.40 Section 49.40 Mineral Resources MINE SAFETY AND HEALTH ADMINISTRATION, DEPARTMENT OF LABOR EDUCATION AND TRAINING MINE RESCUE TEAMS Mine Rescue Teams for Underground Coal Mines § 49.40 Requirements for large coal...

  3. Breeding and Genetics Symposium: really big data: processing and analysis of very large data sets.

    PubMed

    Cole, J B; Newman, S; Foertter, F; Aguilar, I; Coffey, M

    2012-03-01

    Modern animal breeding data sets are large and getting larger, due in part to recent availability of high-density SNP arrays and cheap sequencing technology. High-performance computing methods for efficient data warehousing and analysis are under development. Financial and security considerations are important when using shared clusters. Sound software engineering practices are needed, and it is better to use existing solutions when possible. Storage requirements for genotypes are modest, although full-sequence data will require greater storage capacity. Storage requirements for intermediate and results files for genetic evaluations are much greater, particularly when multiple runs must be stored for research and validation studies. The greatest gains in accuracy from genomic selection have been realized for traits of low heritability, and there is increasing interest in new health and management traits. The collection of sufficient phenotypes to produce accurate evaluations may take many years, and high-reliability proofs for older bulls are needed to estimate marker effects. Data mining algorithms applied to large data sets may help identify unexpected relationships in the data, and improved visualization tools will provide insights. Genomic selection using large data requires a lot of computing power, particularly when large fractions of the population are genotyped. Theoretical improvements have made possible the inversion of large numerator relationship matrices, permitted the solving of large systems of equations, and produced fast algorithms for variance component estimation. Recent work shows that single-step approaches combining BLUP with a genomic relationship (G) matrix have similar computational requirements to traditional BLUP, and the limiting factor is the construction and inversion of G for many genotypes. 
A naïve algorithm for creating G for 14,000 individuals required almost 24 h to run, but custom libraries and parallel computing reduced that to 15 min. Large data sets also create challenges for the delivery of genetic evaluations that must be overcome in a way that does not disrupt the transition from conventional to genomic evaluations. Processing time is important, especially as real-time systems for on-farm decisions are developed. The ultimate value of these systems is to decrease time-to-results in research, increase accuracy in genomic evaluations, and accelerate rates of genetic improvement.
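
The genomic relationship matrix G discussed above is commonly constructed in the style of VanRaden's first method. The abstract does not say which construction was timed, so the following NumPy sketch is illustrative only:

```python
# Hedged sketch of a genomic relationship matrix in the style of VanRaden's
# first method: G = Z Z' / (2 * sum_j p_j * (1 - p_j)), with genotypes coded
# 0/1/2 and each marker centered by twice its allele frequency.
import numpy as np

def genomic_relationship(M: np.ndarray) -> np.ndarray:
    """M: individuals x markers genotype matrix with entries 0, 1, 2."""
    p = M.mean(axis=0) / 2.0        # allele frequency per marker
    Z = M - 2.0 * p                 # center each marker column
    denom = 2.0 * np.sum(p * (1.0 - p))
    return Z @ Z.T / denom

rng = np.random.default_rng(0)
M = rng.integers(0, 3, size=(5, 100)).astype(float)  # 5 individuals, 100 SNPs
G = genomic_relationship(M)
print(G.shape)  # (5, 5)
```

The dense matrix product is what dominates at scale, which is why custom libraries and parallelism yield the large speed-ups the abstract reports.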

  4. Procedures and equipment for staining large numbers of plant root samples for endomycorrhizal assay.

    PubMed

    Kormanik, P P; Bryan, W C; Schultz, R C

    1980-04-01

    A simplified method of clearing and staining large numbers of plant roots for vesicular-arbuscular (VA) mycorrhizal assay is presented. Equipment needed for handling multiple samples is described, and two formulations for the different chemical solutions are presented. Because one formulation contains phenol, its use should be limited to basic studies for which adequate laboratory exhaust hoods are available and great clarity of fungal structures is required. The second staining formulation, utilizing lactic acid instead of phenol, is less toxic, requires less elaborate laboratory facilities, and has proven to be completely satisfactory for VA assays.

  5. Advanced optical sensing and processing technologies for the distributed control of large flexible spacecraft

    NASA Technical Reports Server (NTRS)

    Williams, G. M.; Fraser, J. C.

    1991-01-01

    The objective was to examine state-of-the-art optical sensing and processing technology applied to controlling the motion of flexible spacecraft. Proposed large flexible space systems, such as optical telescopes and antennas, will require control over vast surfaces. Most likely, distributed control will be necessary, involving many sensors to accurately measure the surface. A similarly large number of actuators must act upon the system. The technical approach included reviewing proposed NASA missions to assess system needs and requirements. A candidate mission was chosen as a baseline study spacecraft for comparison of conventional and optical control components. Control system requirements of the baseline system were used for designing both a control system containing current off-the-shelf components and a system utilizing electro-optical devices for sensing and processing. State-of-the-art surveys of conventional sensor, actuator, and processor technologies were performed. A technology development plan is presented that lays out a logical, effective way to develop and integrate advancing technologies.

  6. TRANSITION FROM KINETIC TO MHD BEHAVIOR IN A COLLISIONLESS PLASMA

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Parashar, Tulasi N.; Matthaeus, William H.; Shay, Michael A.

    The study of kinetic effects in heliospheric plasmas requires representation of dynamics at sub-proton scales, but in most cases the system is driven by magnetohydrodynamic (MHD) activity at larger scales. The latter requirement challenges available computational resources, which raises the question of how large such a system must be to exhibit MHD traits at large scales while kinetic behavior is accurately represented at small scales. Here we study this implied transition from kinetic to MHD-like behavior using particle-in-cell (PIC) simulations, initialized using an Orszag–Tang Vortex. The PIC code treats protons, as well as electrons, kinetically, and we address the question of interest by examining several different indicators of MHD-like behavior.

  7. A Quantitative Examination of Critical Success Factors Comparing Agile and Waterfall Project Management Methodologies

    ERIC Educational Resources Information Center

    Pedersen, Mitra

    2013-01-01

    This study investigated the rate of success for IT projects using agile and standard project management methodologies. Any successful project requires use of project methodology. Specifically, large projects require formal project management methodologies or models, which establish a blueprint of processes and project planning activities. This…

  8. How much electrical energy storage do we need? A synthesis for the U.S., Europe, and Germany

    DOE PAGES

    Cebulla, Felix; Haas, Jannik; Eichman, Josh; ...

    2018-02-03

    Electrical energy storage (EES) is a promising flexibility source for prospective low-carbon energy systems. In the last couple of years, many studies for EES capacity planning have been produced. However, these resulted in a very broad range of power and energy capacity requirements for storage, making it difficult for policymakers to identify clear storage planning recommendations. Therefore, we studied 17 recent storage expansion studies pertinent to the U.S., Europe, and Germany. We then systemized the storage requirement per variable renewable energy (VRE) share and generation technology. Our synthesis reveals that with increasing VRE shares, the EES power capacity increases linearly, and the energy capacity exponentially. Further, by analyzing the outliers, the EES energy requirements can be at least halved. It becomes clear that grids dominated by photovoltaic energy call for more EES, while large shares of wind rely more on transmission capacity. Taking into account the energy mix clarifies, to a large degree, the apparent conflict of the storage requirements between the existing studies. Finally, there might exist a negative bias towards storage because transmission costs are frequently optimistic (by neglecting execution delays and social opposition) and storage can cope with uncertainties, but these issues are rarely acknowledged in the planning process.
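
    The scaling the synthesis reports (power capacity linear in VRE share, energy capacity exponential) can be recovered from study data by fitting a straight line to power directly and to the logarithm of energy. A self-contained sketch on synthetic numbers, not data from the synthesis:

```python
# Hedged sketch (synthetic data): fit power capacity linearly in VRE share,
# and energy capacity as an exponential, i.e. a straight line in log space.
import math

def fit_line(xs, ys):
    """Ordinary least-squares slope and intercept."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

vre_share  = [0.2, 0.4, 0.6, 0.8]
power_gw   = [10.0, 20.0, 30.0, 40.0]    # grows linearly with VRE share
energy_gwh = [5.0, 20.0, 80.0, 320.0]    # grows exponentially with VRE share

a, b = fit_line(vre_share, power_gw)     # power ~ a * share + b
k, c = fit_line(vre_share, [math.log(e) for e in energy_gwh])
# energy ~ exp(c) * exp(k * share)
print(round(a, 1), round(k, 2))  # -> 50.0 6.93
```

A large fitted `k` is the quantitative face of the synthesis's point: at high VRE shares, energy capacity, not power capacity, dominates storage needs.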

  9. How much electrical energy storage do we need? A synthesis for the U.S., Europe, and Germany

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cebulla, Felix; Haas, Jannik; Eichman, Josh

    Electrical energy storage (EES) is a promising flexibility source for prospective low-carbon energy systems. In the last couple of years, many studies for EES capacity planning have been produced. However, these resulted in a very broad range of power and energy capacity requirements for storage, making it difficult for policymakers to identify clear storage planning recommendations. Therefore, we studied 17 recent storage expansion studies pertinent to the U.S., Europe, and Germany. We then systemized the storage requirement per variable renewable energy (VRE) share and generation technology. Our synthesis reveals that with increasing VRE shares, the EES power capacity increases linearly, and the energy capacity exponentially. Further, by analyzing the outliers, the EES energy requirements can be at least halved. It becomes clear that grids dominated by photovoltaic energy call for more EES, while large shares of wind rely more on transmission capacity. Taking into account the energy mix clarifies, to a large degree, the apparent conflict of the storage requirements between the existing studies. Finally, there might exist a negative bias towards storage because transmission costs are frequently optimistic (by neglecting execution delays and social opposition) and storage can cope with uncertainties, but these issues are rarely acknowledged in the planning process.

  10. Shape accuracy requirements on starshades for large and small apertures

    NASA Astrophysics Data System (ADS)

    Shaklan, Stuart B.; Marchen, Luis; Cady, Eric

    2017-09-01

    Starshades have been designed to work with large and small telescopes alike. With smaller telescopes, the targets tend to be brighter and closer to the Solar System, and their putative planetary systems span angles that require starshades with radii of 10-30 m at distances of tens of Mm. With larger apertures, the light-collecting power enables studies of more numerous, fainter systems, requiring larger, more distant starshades with radii >50 m at distances of hundreds of Mm. Characterization using infrared wavelengths requires even larger starshades. A mitigating approach is to observe planets between the petals, where one can observe regions closer to the star but with reduced throughput and increased instrument scatter. We compare the starshade shape requirements, including petal shape, petal positioning, and other key terms, for the WFIRST 26 m starshade and the HABEX 72 m starshade concepts, over a range of working angles and telescope sizes. We also compare starshades having rippled and smooth edges and show that their performance is nearly identical.

  11. The Large Synoptic Survey Telescope OCS and TCS models

    NASA Astrophysics Data System (ADS)

    Schumacher, German; Delgado, Francisco

    2010-07-01

    The Large Synoptic Survey Telescope (LSST) is a project envisioned as a system of systems with demanding science, technical, and operational requirements that must perform as a fully integrated unit. The design and implementation of such a system poses significant engineering challenges in requirements analysis, detailed interface definition, and studies of operational modes and control strategies. The OMG Systems Modeling Language (SysML) has been selected as the framework for systems engineering analysis and documentation for the LSST. Models of the overall system architecture and the different observatory subsystems have been built, describing requirements, structure, interfaces, and behavior. In this paper we show the models for the Observatory Control System (OCS) and the Telescope Control System (TCS), and how this methodology has helped clarify the design and requirements. In one common language, the relationships of the OCS, TCS, Camera, and Data Management subsystems are captured with models of structure, behavior, and requirements, and the traceability between them.

  12. A new way to protect privacy in large-scale genome-wide association studies.

    PubMed

    Kamm, Liina; Bogdanov, Dan; Laur, Sven; Vilo, Jaak

    2013-04-01

    Increased availability of various genotyping techniques has initiated a race for finding genetic markers that can be used in diagnostics and personalized medicine. Although many genetic risk factors are known, key causes of common diseases with complex inheritance patterns are still unknown. Identification of such complex traits requires a targeted study over a large collection of data. Ideally, such studies bring together data from many biobanks. However, data aggregation on such a large scale raises many privacy issues. We show how to conduct such studies without violating the privacy of individual donors and without leaking the data to third parties. The presented solution has provable security guarantees. Supplementary data are available at Bioinformatics online.

  13. 77 FR 14505 - Proposed Information Collection; Comment Request; Gear-Marking Requirement for Atlantic Large...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-03-12

    ... Collection; Comment Request; Gear-Marking Requirement for Atlantic Large Whale Take Reduction Plan AGENCY... of large whales, especially right whales, due to incidental entanglement in the United States (U.S... Large Whale Take Reduction Plan (ALWTRP), developed under the authority of the Marine Mammal Protection...

  14. Study of Membrane Reflector Technology

    NASA Technical Reports Server (NTRS)

    Knapp, K.; Hedgepeth, J.

    1979-01-01

    Very large reflective surfaces are required by future spacecraft for such purposes as solar energy collection, antenna surfaces, thermal control, attitude and orbit control with solar pressure, and solar sailing. The performance benefits in large membrane reflector systems that may be derived from an advancement of thin-film and related structures technology are identified and quantified. The results of the study are reported and summarized; detailed technical discussions of various aspects of the study are included in several separate technical notes, which are referenced.

  15. Large area crop inventory experiment crop assessment subsystem software requirements document

    NASA Technical Reports Server (NTRS)

    1975-01-01

    The functional data processing requirements are described for the Crop Assessment Subsystem of the Large Area Crop Inventory Experiment. These requirements are used as a guide for software development and implementation.

  16. General Education Oral Communication Assessment and Student Preferences for Learning: E-Textbook versus Paper Textbook

    ERIC Educational Resources Information Center

    Dwyer, Karen Kangas; Davidson, Marlina M.

    2013-01-01

    As part of a yearly university mandated assessment of a large basic communication course that fulfills the oral communication general education requirement, this study examined student preferences for textbooks, reading, and learning. Specifically, basic course students ("N"=321) at a large state university in the Midwest were asked to…

  17. Turkish Version of Students' Ideas about Nature of Science Questionnaire: A Validation Study

    ERIC Educational Resources Information Center

    Cansiz, Mustafa; Cansiz, Nurcan; Tas, Yasemin; Yerdelen, Sundus

    2017-01-01

    Mass assessment of large samples' nature of science views has been one of the core concerns in science education research. Due to the impracticality of using open-ended questionnaires or conducting interviews with large groups, another line of research has been required for meaningful mass assessment of pupils' nature of science conceptions.…

  18. The microwave radiometer spacecraft: A design study

    NASA Technical Reports Server (NTRS)

    Wright, R. L. (Editor)

    1981-01-01

    A large passive microwave radiometer spacecraft with near all weather capability of monitoring soil moisture for global crop forecasting was designed. The design, emphasizing large space structures technology, characterized the mission hardware at the conceptual level in sufficient detail to identify enabling and pacing technologies. Mission and spacecraft requirements, design and structural concepts, electromagnetic concepts, and control concepts are addressed.

  19. Linear Approximation SAR Azimuth Processing Study

    NASA Technical Reports Server (NTRS)

    Lindquist, R. B.; Masnaghetti, R. K.; Belland, E.; Hance, H. V.; Weis, W. G.

    1979-01-01

    A segmented linear approximation of the quadratic phase function that is used to focus the synthetic antenna of a SAR was studied. Ideal focusing, using a quadratically varying phase function while radar target histories are gathered, requires a large number of complex multiplications. These can be largely eliminated by using linear approximation techniques. The result is a reduced processor size and chip count relative to ideally focused processing and a correspondingly increased feasibility for spaceworthy implementation. A preliminary design and sizing for a spaceworthy linear approximation SAR azimuth processor meeting requirements similar to those of the SEASAT-A SAR was developed. The study resulted in a design with approximately 1500 ICs, 1.2 cubic feet of volume, and 350 watts of power for a single-look, 4000 range cell azimuth processor with 25 meters resolution.
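    The idea of replacing the quadratic focusing phase with piecewise-linear segments can be sketched as follows; the segment counts and error figures are illustrative, not the processor's actual design parameters. Within each segment the phase advances by a constant increment, so the per-sample complex multiplications of exact focusing reduce to phase accumulations.

    ```python
    def segmented_error(n_seg, samples=1000):
        """Max error of an n_seg-piece chord approximation of t^2 on [0, 1].

        t^2 stands in for the quadratic phase history; each chord is the
        linear phase ramp a segment would apply instead.
        """
        h = 1.0 / n_seg
        worst = 0.0
        for i in range(samples + 1):
            t = i / samples
            k = min(int(t / h), n_seg - 1)         # segment index
            a = k * h
            chord = a * a + (2 * a + h) * (t - a)  # line through segment ends
            worst = max(worst, abs(chord - t * t))
        return worst

    # Error shrinks ~4x per doubling of segments (chord error bound h^2 / 4):
    for n in (4, 8, 16):
        print(n, segmented_error(n))
    ```

    The quadratic falloff of the error with segment width is what makes a modest number of linear segments sufficient, trading a small focusing error for the large hardware savings described in the abstract.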

  20. Acoustical case studies of three green buildings

    NASA Astrophysics Data System (ADS)

    Siebein, Gary; Lilkendey, Robert; Skorski, Stephen

    2005-04-01

    Case studies of three LEED-certified green buildings that required extensive acoustical retrofit work to become satisfactory work environments for their intended user groups will be used to define areas where green building design concepts and acoustical design concepts require reconciliation. Case study 1 is an office and conference center for a city environmental education agency: large open spaces intended to collect daylight through clerestory windows produced large, reverberant volumes with few acoustic finishes, rendering them unsuitable as open office space and as a conference room/auditorium. Case study 2 describes one of the first LEED Gold buildings in the southeast, whose primary design concepts were so narrowly focused on thermal and lighting issues that they often worked directly against basic acoustical requirements, resulting in sound levels of NC 50-55 in classrooms and faculty offices, crosstalk between classrooms, and poor room acoustics. Case study 3 is an environmental education and conference center with open public areas, very high ceilings, and reflective surfaces made entirely from wood and other environmentally friendly materials, which result in excessive loudness when the building is used by the number of people it was intended to serve.

  1. Beyond the Large Hadron Collider: A First Look at Cryogenics for CERN Future Circular Colliders

    NASA Astrophysics Data System (ADS)

    Lebrun, Philippe; Tavian, Laurent

    Following the first experimental discoveries at the Large Hadron Collider (LHC) and the recent update of the European strategy in particle physics, CERN has undertaken an international study of possible future circular colliders beyond the LHC. The study, conducted with the collaborative participation of interested institutes world-wide, considers several options for very high energy hadron-hadron, electron-positron and hadron-electron colliders to be installed in a quasi-circular underground tunnel in the Geneva basin, with a circumference of 80 km to 100 km. All these machines would make intensive use of advanced superconducting devices, i.e. high-field bending and focusing magnets and/or accelerating RF cavities, thus requiring large helium cryogenic systems operating at 4.5 K or below. Based on preliminary sets of parameters and layouts for the particle colliders under study, we discuss the main challenges of their cryogenic systems and present first estimates of the cryogenic refrigeration capacities required, with emphasis on the qualitative and quantitative steps to be accomplished with respect to the present state-of-the-art.

  2. Feasibility study of a synthesis procedure for array feeds to improve radiation performance of large distorted reflector antennas

    NASA Technical Reports Server (NTRS)

    Stutzman, W. L.; Smith, W. T.

    1990-01-01

    Surface errors on parabolic reflector antennas degrade the overall performance of the antenna. Space antenna structures are difficult to build, deploy, and control: they must maintain a nearly perfect parabolic shape in a harsh environment while remaining lightweight. Electromagnetic compensation for surface errors in large space reflector antennas can supplement mechanical compensation and has been the topic of several research studies. Most of these studies attempt to correct the focal plane fields of the reflector near the focal point and hence compensate for the distortions over the whole radiation pattern. An alternative approach to electromagnetic compensation is presented. The proposed technique uses pattern synthesis to compensate for the surface errors, with a localized algorithm in which pattern corrections are directed specifically toward portions of the pattern requiring improvement. The pattern synthesis technique does not require knowledge of the reflector surface; it uses radiation pattern data to perform the compensation.

  3. Large gamma-ray detector arrays and electromagnetic separators

    NASA Astrophysics Data System (ADS)

    Lee, I.-Yang

    2013-12-01

    The use of large gamma-ray detector arrays with electromagnetic separators is a powerful combination. Various types of gamma-ray detectors have been used: some, such as scintillation detector arrays, provide high detection efficiency; others use Ge detectors for good energy resolution; and recently developed Ge energy-tracking arrays give both a high peak-to-background ratio and position resolution. Similarly, different types of separators have been used to optimize performance under different experimental requirements and conditions. For example, gas-filled separators were used in heavy-element studies for their large efficiency and beam rejection factor, while vacuum separators with good isotope resolution were used in transfer and fragmentation reactions for the study of nuclei far from stability. This paper presents results from recent experiments using gamma-ray detector arrays in combination with electromagnetic separators and discusses the physics opportunities provided by these instruments. In particular, we review the performance of the instruments currently in use and discuss the requirements of instruments for future radioactive beam accelerator facilities.

  4. Rapid Geometry Creation for Computer-Aided Engineering Parametric Analyses: A Case Study Using ComGeom2 for Launch Abort System Design

    NASA Technical Reports Server (NTRS)

    Hawke, Veronica; Gage, Peter; Manning, Ted

    2007-01-01

    ComGeom2, a tool developed to generate Common Geometry representation for multidisciplinary analysis, has been used to create a large set of geometries for use in a design study requiring analysis by two computational codes. This paper describes the process used to generate the large number of configurations and suggests ways to further automate the process and make it more efficient for future studies. The design geometry for this study is the launch abort system of the NASA Crew Launch Vehicle.

  5. Large Space Antenna Systems Technology, 1984

    NASA Technical Reports Server (NTRS)

    Boyer, W. J. (Compiler)

    1985-01-01

    Papers are presented which provide a comprehensive review of space missions requiring large antenna systems and of the status of key technologies required to enable these missions. Topic areas include mission applications for large space antenna systems, large space antenna structural systems, materials and structures technology, structural dynamics and control technology, electromagnetics technology, large space antenna systems and the space station, and flight test and evaluation.

  6. A fast time-difference inverse solver for 3D EIT with application to lung imaging.

    PubMed

    Javaherian, Ashkan; Soleimani, Manuchehr; Moeller, Knut

    2016-08-01

    A class of sparse optimization techniques that require solely matrix-vector products, rather than explicit access to the forward matrix and its transpose, has received much attention in the past decade for dealing with large-scale inverse problems. This study tailors the so-called Gradient Projection for Sparse Reconstruction (GPSR) to large-scale time-difference three-dimensional electrical impedance tomography (3D EIT). 3D EIT typically requires a large number of voxels to cover the whole domain, so its application to real-time imaging, for example monitoring of lung function, remains scarce, since the large number of degrees of freedom of the problem greatly increases storage space and reconstruction time. This study shows the great potential of GPSR for large-size time-difference 3D EIT. Further studies are needed to improve its accuracy for imaging small-size anomalies.
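    The GPSR iteration referenced here can be sketched in a few lines. This is a minimal textbook version (split the signal into nonnegative parts, take a gradient step, project onto the nonnegative orthant), not the authors' tailored 3D-EIT implementation; the matrix, step size, and regularization weight are illustrative. Note that the loop only ever touches the forward matrix through matrix-vector products.

    ```python
    def gpsr(A, y, lam, step=0.9, iters=300):
        """Minimize 0.5*||y - A x||^2 + lam*||x||_1 by gradient projection.

        x is split as u - v with u, v >= 0; each iteration is a gradient
        step followed by projection onto the nonnegative orthant, using
        only products with A and its transpose.
        """
        m, n = len(A), len(A[0])
        u, v = [0.0] * n, [0.0] * n
        for _ in range(iters):
            x = [ui - vi for ui, vi in zip(u, v)]
            Ax = [sum(A[i][j] * x[j] for j in range(n)) for i in range(m)]
            r = [yi - axi for yi, axi in zip(y, Ax)]               # residual
            At_r = [sum(A[i][j] * r[i] for i in range(m)) for j in range(n)]
            u = [max(0.0, ui - step * (-g + lam)) for ui, g in zip(u, At_r)]
            v = [max(0.0, vi - step * ( g + lam)) for vi, g in zip(v, At_r)]
        return [ui - vi for ui, vi in zip(u, v)]

    # Sanity check: with A = I the minimizer is soft-thresholding of y.
    I5 = [[1.0 if i == j else 0.0 for j in range(5)] for i in range(5)]
    x = gpsr(I5, [2.0, 0.1, -1.5, 0.0, 3.0], lam=0.5)
    print(x)  # close to [1.5, 0, -1.0, 0, 2.5]
    ```

    In EIT-scale problems the matrix-vector products would be supplied by the forward solver rather than an explicit matrix, which is precisely why this family of methods suits large 3D reconstructions.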

  7. Written Parental Consent and the Use of Incentives in a Youth Smoking Prevention Trial: A Case Study from Project SPLASH

    ERIC Educational Resources Information Center

    Leakey, Tricia; Lunde, Kevin B.; Koga, Karin; Glanz, Karen

    2004-01-01

    More Institutional Review Boards (IRBs) are requiring written parental consent in school health intervention trials. Because this requirement presents a formidable challenge in conducting large-scale research, it is vital for investigators to share effective strategies learned from completed trials. Investigators for the recently completed Project…

  8. The Requirement for Vocational Skills in the Engineering Industry in the Areas of Modena and Vienna. Synthesis Report.

    ERIC Educational Resources Information Center

    Gatti, Mario; Mereu, Maria Grazia; Tagliaferro, Claudio; Markowitsch, Jorg; Neuberger, Robert

    Requirements for vocational skills in the engineering industry in Modena, Italy, and Vienna, Austria, were studied. In Modena, employees of a representative sample of 90 small, medium, and large firms in the mechanical processing, agricultural machinery, and sports car manufacturing sectors were interviewed. In Vienna, data were collected through…

  9. Integration of Digital Technology and Innovative Strategies for Learning and Teaching Large Classes: A Calculus Case Study

    ERIC Educational Resources Information Center

    Vajravelu, Kuppalapalle; Muhs, Tammy

    2016-01-01

    Successful science and engineering programs require proficiency and dynamics in mathematics classes to enhance the learning of complex subject matter with a sufficient amount of practical problem solving. Improving student performance and retention in mathematics classes requires inventive approaches. At the University of Central Florida (UCF) the…

  10. How Big Is Big Enough? Sample Size Requirements for CAST Item Parameter Estimation

    ERIC Educational Resources Information Center

    Chuah, Siang Chee; Drasgow, Fritz; Luecht, Richard

    2006-01-01

    Adaptive tests offer the advantages of reduced test length and increased accuracy in ability estimation. However, adaptive tests require large pools of precalibrated items. This study looks at the development of an item pool for one type of adaptive administration: the computer-adaptive sequential test. An important issue is the sample size required…

  11. Structural Feasibility Analysis of a Robotically Assembled Very Large Aperture Optical Space Telescope

    NASA Technical Reports Server (NTRS)

    Wilkie, William Keats; Williams, R. Brett; Agnes, Gregory S.; Wilcox, Brian H.

    2007-01-01

    This paper presents a feasibility study of robotically constructing a very large aperture optical space telescope on orbit. Since the largest engineering challenges are likely to reside in the design and assembly of the 150-m diameter primary reflector, this preliminary study focuses on that component. The same technology developed for construction of the primary would then be readily used for the smaller optical structures (secondary, tertiary, etc.). A reasonable set of ground and on-orbit loading scenarios is compiled from the literature and used to define the structural performance requirements and size the primary reflector. A surface precision analysis shows that active adjustment of the primary structure is required to meet stringent optical surface requirements. Two potential actuation strategies are discussed, along with potential actuation devices at the current state of the art. The findings of this research effort indicate that successful technology development, combined with further analysis, will likely enable such a telescope to be built in the future.

  12. AMTD - Advanced Mirror Technology Development in Mechanical Stability

    NASA Technical Reports Server (NTRS)

    Knight, J. Brent

    2015-01-01

    Analytical tools and processes are being developed at NASA Marshall Space Flight Center in support of the Advanced Mirror Technology Development (AMTD) project. One facet of optical performance is mechanical stability with respect to structural dynamics. Pertinent parameters are: (1) the spacecraft structural design, (2) the mechanical disturbances on board the spacecraft (sources of vibratory/transient motion such as reaction wheels), (3) the vibration isolation systems (invariably required to meet future science needs), and (4) the dynamic characteristics of the optical system itself. With stability requirements of future large-aperture space telescopes being in the low picometer regime, it is paramount that all sources of mechanical excitation be considered in both feasibility studies and detailed analyses. The primary objective of this paper is to lay out a path for performing feasibility studies of future large-aperture space telescope projects that require extreme stability. To that end, a high-level overview of a structural dynamic analysis process to assess an integrated spacecraft and optical system is included.

  13. Precision requirements and innovative manufacturing for ultrahigh precision laser interferometry of gravitational-wave astronomy

    NASA Astrophysics Data System (ADS)

    Ni, Wei-Tou; Han, Sen; Jin, Tao

    2016-11-01

    With the LIGO announcement of the first direct detection of gravitational waves (GWs), GW astronomy was formally ushered into our age. After one hundred years of theoretical investigation and fifty years of experimental endeavor, this is a historical landmark not just for physics and astronomy but also for industry and manufacturing. The challenge and opportunity for industry is precision and innovative manufacturing at large size: production of large and homogeneous optical components, optical diagnosis of large components, high-reflectance dielectric coating on large mirrors, manufacturing of components for ultrahigh vacuum of large volume, manufacturing of highly attenuating vibration isolation systems, production of high-power, high-stability single-frequency lasers, production of high-resolution positioning systems, etc. In this talk, we address these requirements and the methods to satisfy them. Optical diagnosis of large optical components requires a large phase-shifting interferometer; the 1.06 μm phase-shifting interferometer for testing LIGO optics and the recently built 24" phase-shifting interferometer in Chengdu, China, are examples. High-quality mirrors are crucial for laser interferometric GW detection, as they are for ring laser gyroscopes, high-precision laser stabilization via optical cavities, quantum optomechanics, cavity quantum electrodynamics, and vacuum birefringence measurement. There are stringent requirements on the substrate materials and coating methods. For cryogenic GW interferometers, appropriate coatings on sapphire or silicon are required for good thermal and homogeneity properties. Large ultrahigh-vacuum components and a highly attenuating vibration system, together with an efficient metrology system, are required and will be addressed. For space interferometry, drag-free technology and weak-light manipulation technology are a must. Drag-free technology is well developed; weak-light phase locking has been demonstrated in the laboratory, while weak-light manipulation technology still needs development.

  14. A Framework for Spatial Interaction Analysis Based on Large-Scale Mobile Phone Data

    PubMed Central

    Li, Weifeng; Cheng, Xiaoyun; Guo, Gaohua

    2014-01-01

    An overall understanding of spatial interaction and exact knowledge of its dynamic evolution are required in urban and transportation planning. This study aimed to analyze spatial interaction based on large-scale mobile phone data. The newly arisen mass dataset required a new methodology compatible with its peculiar characteristics. A three-stage framework is proposed in this paper, comprising data preprocessing, critical activity identification, and spatial interaction measurement. The proposed framework introduces frequent pattern mining and measures spatial interaction by the obtained associations. A case study of three communities in Shanghai was carried out as verification of the proposed method and demonstration of its practical application. The spatial interaction patterns and the representative features proved the rationality of the proposed framework. PMID:25435865
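    The frequent-pattern step of such a framework can be sketched with a toy co-occurrence count over simplified phone traces. The traces, zone labels, and support threshold below are invented for illustration; the paper's actual association measure and preprocessing may differ.

    ```python
    from collections import Counter
    from itertools import combinations

    # Each trace is the set of zones a phone was observed in during one
    # day (hypothetical data).
    traces = [
        {"A", "B"}, {"A", "B", "C"}, {"A", "B"},
        {"B", "C"}, {"A", "C"}, {"A", "B"},
    ]

    def frequent_pairs(traces, min_support):
        """Zone pairs co-occurring in at least min_support traces."""
        counts = Counter()
        for t in traces:
            for pair in combinations(sorted(t), 2):
                counts[pair] += 1
        return {p: c for p, c in counts.items() if c >= min_support}

    pairs = frequent_pairs(traces, min_support=3)
    print(pairs)  # ('A', 'B') co-occurs in 4 traces
    ```

    The support counts of such frequent pairs give a direct, data-driven proxy for the strength of spatial interaction between zones, which is the role frequent pattern mining plays in the proposed framework.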

  15. Motivators that Do Not Motivate: The Case of Chinese EFL Learners and the Influence of Culture on Motivation

    ERIC Educational Resources Information Center

    Chen, Judy F.; Warden, Clyde A.; Chang, Huo-Tsan

    2005-01-01

    Language learning motivation plays an important role in both research and teaching, yet language learners are still largely understood in terms of North American and European cultural values. This research explored language learning motivation constructs in a Chinese cultural setting, where large numbers of students are required to study English.…

  16. A Comparison of Linking Methods for Estimating National Trends in International Comparative Large-Scale Assessments in the Presence of Cross-national DIF

    ERIC Educational Resources Information Center

    Sachse, Karoline A.; Roppelt, Alexander; Haag, Nicole

    2016-01-01

    Trend estimation in international comparative large-scale assessments relies on measurement invariance between countries. However, cross-national differential item functioning (DIF) has been repeatedly documented. We ran a simulation study using national item parameters, which required trends to be computed separately for each country, to compare…

  17. Revision of the Rawls et al. (1982) pedotransfer functions for their applicability to US croplands

    USDA-ARS?s Scientific Manuscript database

    Large scale environmental impact studies typically involve the use of simulation models and require a variety of inputs, some of which may need to be estimated in absence of adequate measured data. As an example, soil water retention needs to be estimated for a large number of soils that are to be u...

  18. Analysis of high load dampers

    NASA Technical Reports Server (NTRS)

    Bhat, S. T.; Buono, D. F.; Hibner, D. H.

    1981-01-01

    High-load damping requirements for modern jet engines are discussed, along with the design of damping systems that could satisfy these requirements. To evaluate high-load damping requirements, engines in three major classes were studied: large transport engines, small general aviation engines, and military engines. Four damper concepts applicable to these engines were evaluated: multi-ring, cartridge, curved beam, and viscous/friction. The most promising damper concept was selected for each engine, and its performance was assessed relative to conventional dampers and in light of projected damping requirements for advanced jet engines.

  19. A modified 3D algorithm for road traffic noise attenuation calculations in large urban areas.

    PubMed

    Wang, Haibo; Cai, Ming; Yao, Yifan

    2017-07-01

    The primary objective of this study is the development and application of a 3D road traffic noise attenuation calculation algorithm. First, the traditional empirical method does not address problems caused by non-direct occlusion by buildings and by differing building heights. In contrast, this study considers the volume ratio of the buildings and the area ratio of the projection of buildings adjacent to the road. The influence of ground effects is analyzed, and the insertion loss due to barriers (both infinite and finite) is synthesized in the algorithm. Second, the impact of different road segmentations is analyzed. Based on the case of Pearl River New Town, 5° is recommended as the most appropriate scanning angle: the computational time is acceptable and the average error is approximately 3.1 dB. In addition, the algorithm requires only 1/17 of the time that the beam tracking method requires, at the cost of less precise results. Finally, the noise calculation for a large urban area with a high density of buildings shows the feasibility of the 3D noise attenuation calculation algorithm. The algorithm is expected to be applied in projects requiring large-area noise simulations.

  20. AAFE large deployable antenna development program: Executive summary

    NASA Technical Reports Server (NTRS)

    1977-01-01

    The large deployable antenna development program sponsored by the Advanced Applications Flight Experiments of the Langley Research Center is summarized. Projected user requirements for large-diameter deployable reflector antennas were reviewed, and trade-off studies were made for the selection of a design concept for 10-meter diameter reflectors. A hoop/column concept was selected as the baseline concept. Parametric data are presented for 15-meter, 30-meter, and 100-meter diameters. A 1.82-meter diameter engineering model that demonstrated the feasibility of the concept is described.

  1. Geostationary platform systems concepts definition follow-on study. Volume 2A: Technical Task 2 LSST special emphasis

    NASA Technical Reports Server (NTRS)

    1980-01-01

    The results of the Large Space Systems Technology special emphasis task are presented. The task was an analysis of structural requirements deriving from the initial Phase A Operational Geostationary Platform study.

  2. The costs and effectiveness of large Phase III pre-licensure vaccine clinical trials.

    PubMed

    Black, Steven

    2015-01-01

    Prior to the 1980s, most vaccines were licensed based upon safety and effectiveness studies in several hundred individuals. Beginning with the evaluation of Haemophilus influenzae type b conjugate vaccines, much larger pre-licensure trials became common. The pre-licensure trial for the Haemophilus influenzae oligosaccharide conjugate vaccine enrolled more than 60,000 children, and that of the seven-valent pneumococcal conjugate vaccine included almost 38,000 children. Although the trial sizes for both of these studies were driven by the sample size required to demonstrate efficacy, the sample size requirements for safety evaluations of other vaccines have subsequently increased. With the demonstration of an increased risk of intussusception following the Rotashield brand rotavirus vaccine, this trend has continued. However, routinely requiring safety studies of 20,000-50,000 or more participants has two major downsides. First, the cost of performing large safety trials routinely prior to licensure of a vaccine is very large, with some estimates as high as US$200 million for a single vaccine. This high financial cost engenders an opportunity cost whereby the number of vaccines that a company is willing or able to develop to meet public health needs becomes limited by this financial barrier. The second downside is that in the pre-licensure setting, such studies are very time consuming and substantially delay the availability of a beneficial vaccine. One might argue that in some situations this financial commitment is warranted, such as for evaluations of the risk of intussusception following newer rotavirus vaccines. However, it must be noted that while an increased risk of intussusception was not identified in large pre-licensure studies, an increased risk of this outcome has been identified in post-marketing evaluations. Thus, even the extensive pre-licensure evaluations conducted did not identify an associated risk.
The limitations of large Phase III trials have also been demonstrated in efficacy trials. Notably, pre-licensure trials of the pneumococcal conjugate vaccine severely underestimated its true effect and cost-effectiveness. In fact, in discussions prior to introduction of PCV7 in the USA, the vaccine was said to be not cost-effective, and some counseled against its introduction. In reality, following introduction, PCV7 has been shown to be highly cost-effective. In the last decade, new methods have been identified using large linked databases, such as the Vaccine Safety Datalink in the USA, that allow identification of an increased risk of an event within a few months of vaccine introduction and that can also screen for unanticipated, very rare events. In addition, the availability of electronic medical records and hospital discharge data in many settings allows for accurate assessment of vaccine effectiveness. Given the high financial and opportunity cost of requiring large pre-licensure safety studies, consideration could be given to 'conditional licensure' of vaccines whose delivery system is well characterized, in settings where sophisticated pharmacovigilance systems exist, on the condition that such licensure incorporate a requirement for rapid-cycle and other real-time evaluations of safety and effectiveness following introduction. This would allow for a more complete and timely evaluation of vaccines, lower the financial barrier to development of new vaccines, and thus allow a broader portfolio of vaccines to be developed and successfully introduced.

  3. Functional Network Architecture of Reading-Related Regions across Development

    ERIC Educational Resources Information Center

    Vogel, Alecia C.; Church, Jessica A.; Power, Jonathan D.; Miezin, Fran M.; Petersen, Steven E.; Schlaggar, Bradley L.

    2013-01-01

    Reading requires coordinated neural processing across a large number of brain regions. Studying relationships between reading-related regions informs the specificity of information processing performed in each region. Here, regions of interest were defined from a meta-analysis of reading studies, including a developmental study. Relationships…

  4. Need, utilization, and configuration of a large, multi-G centrifuge on the Space Station

    NASA Technical Reports Server (NTRS)

    Bonting, Sjoerd L.

    1987-01-01

    A large, multi-g centrifuge is required on the Space Station (1) to provide valid 1-g controls for the study of zero-g effects on animals and plants and to study readaptation to 1 g; (2) to store animals at 1 g prior to short-term zero-g experimentation; and (3) to permit g-level threshold studies of gravity effects. These requirements can be met by a 13-ft-diam., center-mounted centrifuge to which up to 48 modular habitats with animals (squirrel monkey, rat, mouse) and plants are attached. The advantages of locating this centrifuge in an attached short module together with the vivarium, a common environmental control and life support system, a general-purpose work station, and storage of food, water, and supplies are elaborated. Servicing and operation of the centrifuge, as well as minimizing its impact on other Space Station functions, are also considered.
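    For scale, the spin rate such a centrifuge needs to produce 1 g follows from the centripetal relation g = ω²r. A quick check for the 13-ft diameter quoted, taking the habitat radius at the rim (an assumption for illustration):

```python
import math

G = 9.81                          # m/s^2, target centripetal acceleration (1 g)
radius_m = (13 * 0.3048) / 2      # 13 ft diameter converted to radius in metres

omega = math.sqrt(G / radius_m)   # required angular rate, rad/s
rpm = omega * 60 / (2 * math.pi)
print(f"{rpm:.1f} rpm")           # about 21 rpm
```

Threshold studies at a fractional g-level a would scale the rate by sqrt(a/g), which is the kind of trade the g-level threshold studies in the record would exercise.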

  5. Exploration Planetary Surface Structural Systems: Design Requirements and Compliance

    NASA Technical Reports Server (NTRS)

    Dorsey, John T.

    2011-01-01

    The Lunar Surface Systems Project developed system concepts that would be necessary to establish and maintain a permanent human presence on the Lunar surface. A variety of specific system implementations were generated as a part of the scenarios, some level of system definition was completed, and masses estimated for each system. Because the architecture studies generally spawned a large number of system concepts and the studies were executed in a short amount of time, the resulting system definitions had very low design fidelity. This paper describes the development sequence required to field a particular structural system: 1) Define Requirements, 2) Develop the Design and 3) Demonstrate Compliance of the Design to all Requirements. This paper also outlines and describes in detail the information and data that are required to establish structural design requirements and outlines the information that would comprise a planetary surface system Structures Requirements document.

  6. The electrical performance of Ag Zn batteries for the Venus multi-probe mission

    NASA Technical Reports Server (NTRS)

    Palandati, C.

    1975-01-01

    An evaluation of 5 Ah and 21 Ah silver-zinc batteries was made to determine their suitability for meeting the energy storage requirements of the bus vehicle, three small probes, and the large probe for the Venus multi-probe mission. The evaluation included a 5 Ah battery for the small probe, a 21 Ah battery for the large probe, one battery of each size for bus vehicle power, a periodic cycling test on each size of battery, and a wet stand test of charged and discharged cells of both cell designs. The study of the probe and bus vehicle batteries included both electrical and thermal simulation of the entire mission. The effects of the various test parameters on silver migration and zinc penetration of the cellophane separators were determined by visual and X-ray fluorescence analysis. The 5 Ah batteries supported the power requirements for the bus vehicle and small probe. The 21 Ah large probe battery supplied the required mission power. Both probe batteries delivered in excess of 132 percent of rated capacity at the completion of the mission simulation.

  7. A preliminary study of a very large space radiometric antenna

    NASA Technical Reports Server (NTRS)

    Agrawal, P. K.

    1979-01-01

    An approach used to compute the size of a special radiometric reflector antenna is presented. Operating at 1 GHz, this reflector is required to produce 200 simultaneous contiguous beams, each with a 3 dB footprint of 1 km from an assumed satellite height of 650 km. The overall beam efficiency for each beam is required to be more than 90%.
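    The reflector size implied by these numbers can be estimated from diffraction: a 1-km footprint from 650 km altitude requires a beamwidth of roughly 1.5 mrad, and for a uniformly illuminated circular aperture the 3 dB beamwidth is about 1.22 λ/D (the tapered illumination needed for >90% beam efficiency would widen this somewhat, so treat the result as a lower bound). A rough sketch:

```python
c = 3.0e8                      # speed of light, m/s
f = 1.0e9                      # operating frequency, Hz
wavelength = c / f             # 0.3 m at 1 GHz

footprint_m = 1_000.0
altitude_m = 650_000.0
theta = footprint_m / altitude_m   # required beamwidth in rad (small-angle approx.)

D = 1.22 * wavelength / theta      # diffraction-limited aperture diameter, m
print(f"{D:.0f} m")                # roughly 240 m
```

This is why the record describes the antenna as "very large": even at only 1 GHz, a 1-km footprint from orbit demands an aperture hundreds of metres across.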

  8. Effect of initial planting spacing on wood properties of unthinned loblolly pine at age 21

    Treesearch

    Alexander III Clark; Lewis Jordan; Laurie Schimleck; Richard F. Daniels

    2008-01-01

    Young, fast growing, intensively managed plantation loblolly pine (Pinus taeda L.) contains a large proportion of juvenile wood that may not have the stiffness required to meet the design requirements for southern pine dimension lumber. An unthinned loblolly pine spacing study was sampled to determine the effect of initial spacing on wood stiffness,...

  9. Design Optimization of a Variable-Speed Power Turbine

    NASA Technical Reports Server (NTRS)

    Hendricks, Eric S.; Jones, Scott M.; Gray, Justin S.

    2014-01-01

    NASA's Rotary Wing Project is investigating technologies that will enable the development of revolutionary civil tilt rotor aircraft. Previous studies have shown that for large tilt rotor aircraft to be viable, the rotor speeds need to be slowed significantly during the cruise portion of the flight. This requirement to slow the rotors during cruise presents an interesting challenge to the propulsion system designer as efficient engine performance must be achieved at two drastically different operating conditions. One potential solution to this challenge is to use a transmission with multiple gear ratios and shift to the appropriate ratio during flight. This solution will require a large transmission that is likely to be maintenance intensive and will require a complex shifting procedure to maintain power to the rotors at all times. An alternative solution is to use a fixed gear ratio transmission and require the power turbine to operate efficiently over the entire speed range. This concept is referred to as a variable-speed power-turbine (VSPT) and is the focus of the current study. This paper explores the design of a variable speed power turbine for civil tilt rotor applications using design optimization techniques applied to NASA's new meanline tool, the Object-Oriented Turbomachinery Analysis Code (OTAC).

  10. A Weight Comparison of Several Attitude Controls for Satellites

    NASA Technical Reports Server (NTRS)

    Adams, James J.; Chilton, Robert G.

    1959-01-01

    A brief theoretical study has been made for the purpose of estimating and comparing the weight of three different types of controls that can be used to change the attitude of a satellite. The three types of controls are jet reaction, inertia wheel, and a magnetic bar which interacts with the magnetic field of the earth. An idealized task which imposed severe requirements on the angular motion of the satellite was used as the basis for comparison. The results showed that a control for one axis can be devised which will weigh less than 1 percent of the total weight of the satellite. The inertia-wheel system offers weight-saving possibilities if a large number of cycles of operation are required, whereas the jet system would be preferred if a limited number of cycles are required. The magnetic-bar control requires such a large magnet that it is impractical for the example application but might be of value for supplying small trimming moments about certain axes.

  11. Use of Patient Registries and Administrative Datasets for the Study of Pediatric Cancer

    PubMed Central

    Rice, Henry E.; Englum, Brian R.; Gulack, Brian C.; Adibe, Obinna O.; Tracy, Elizabeth T.; Kreissman, Susan G.; Routh, Jonathan C.

    2015-01-01

    Analysis of data from large administrative databases and patient registries is increasingly being used to study childhood cancer care, although the value of these data sources remains unclear to many clinicians. Interpretation of large databases requires a thorough understanding of how the dataset was designed, how data were collected, and how to assess data quality. This review will detail the role of administrative databases and registry databases for the study of childhood cancer, tools to maximize information from these datasets, and recommendations to improve the use of these databases for the study of pediatric oncology. PMID:25807938

  12. Solar thermal propulsion for planetary spacecraft

    NASA Technical Reports Server (NTRS)

    Sercel, J. C.

    1985-01-01

    Previous studies have shown that many desirable planetary exploration missions require large injection delta-V. Solar Thermal Rocket (STR) propulsion, under study for orbit-raising applications, may enhance or enable such high-energy missions. The technology required for thermal control of liquid hydrogen propellant is available for the required storage duration. Self-deploying, inflatable solar concentrators are under study. The mass penalty for passive cryogenic thermal control, liquid hydrogen tanks, and solar concentrators does not compromise the specific impulse advantage afforded by the STR compared with chemical propulsion systems. An STR injection module is characterized and its performance is evaluated by comparison to electric propulsion options for the Saturn Orbiter Titan Probe (SOTP) and Uranus Flyby Uranus Probe (UFUP) missions.

  13. Windvan laser study

    NASA Technical Reports Server (NTRS)

    1985-01-01

    The goal of defining a CO2 laser transmitter approach suited to Shuttle Coherent Atmospheric Lidar Experiment (SCALE) requirements is discussed. The adaptation of the existing WINDVAN system to the shuttle environment is addressed. The size, weight, reliability, and efficiency of the existing WINDVAN system are largely compatible with SCALE requirements. Repackaging is needed for compatibility with the vacuum and thermal environments. Changes are required to ensure survival through launch and landing mechanical, vibration, and acoustic loads. Existing WINDVAN thermal management approaches that depend on convection need to be upgraded for zero-gravity operation.

  14. 46 CFR 15.530 - Large passenger vessels.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 46 Shipping 1 2010-10-01 2010-10-01 false Large passenger vessels. 15.530 Section 15.530 Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY MERCHANT MARINE OFFICERS AND SEAMEN MANNING REQUIREMENTS Manning Requirements; Inspected Vessels § 15.530 Large passenger vessels. (a) The owner or operator of a U...

  15. 46 CFR 15.530 - Large passenger vessels.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 46 Shipping 1 2013-10-01 2013-10-01 false Large passenger vessels. 15.530 Section 15.530 Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY MERCHANT MARINE OFFICERS AND SEAMEN MANNING REQUIREMENTS Manning Requirements; Inspected Vessels § 15.530 Large passenger vessels. (a) The owner or operator of a U...

  16. 46 CFR 15.530 - Large passenger vessels.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 46 Shipping 1 2012-10-01 2012-10-01 false Large passenger vessels. 15.530 Section 15.530 Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY MERCHANT MARINE OFFICERS AND SEAMEN MANNING REQUIREMENTS Manning Requirements; Inspected Vessels § 15.530 Large passenger vessels. (a) The owner or operator of a U...

  17. 46 CFR 15.530 - Large passenger vessels.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 46 Shipping 1 2011-10-01 2011-10-01 false Large passenger vessels. 15.530 Section 15.530 Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY MERCHANT MARINE OFFICERS AND SEAMEN MANNING REQUIREMENTS Manning Requirements; Inspected Vessels § 15.530 Large passenger vessels. (a) The owner or operator of a U...

  18. 46 CFR 15.530 - Large passenger vessels.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 46 Shipping 1 2014-10-01 2014-10-01 false Large passenger vessels. 15.530 Section 15.530 Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY MERCHANT MARINE OFFICERS AND SEAMEN MANNING REQUIREMENTS Manning Requirements; Inspected Vessels § 15.530 Large passenger vessels. (a) The owner or operator of a U...

  19. First steps to lunar manufacturing: Results of the 1988 Space Studies Institute Lunar Systems Workshop

    NASA Technical Reports Server (NTRS)

    Maryniak, Gregg E.

    1992-01-01

    Prior studies by NASA and the Space Studies Institute have looked at the infrastructure required for the construction of solar power satellites (SPS) and other valuable large space systems from lunar materials. This paper discusses the results of a Lunar Systems Workshop conducted in January 1988. The workshop identified components of the infrastructure that could be implemented in the near future to create a revenue stream. These revenues could then be used to 'bootstrap' the additional elements required to begin the commercial use of nonterrestrial materials.

  20. Exploring asynchronous brainstorming in large groups: a field comparison of serial and parallel subgroups.

    PubMed

    de Vreede, Gert-Jan; Briggs, Robert O; Reiter-Palmon, Roni

    2010-04-01

    The aim of this study was to compare the results of two different modes of using multiple groups (instead of one large group) to identify problems and develop solutions. Many of the complex problems facing organizations today require the use of very large groups or collaborations of groups from multiple organizations. There are many logistical problems associated with the use of such large groups, including the ability to bring everyone together at the same time and location. A field study involving two different organizations compared group productivity and satisfaction under two approaches: (a) multiple small groups, each completing the entire process from start to end and combining the results at the end (parallel mode); and (b) multiple subgroups, each building on the work provided by previous subgroups (serial mode). Groups using the serial mode produced more elaborations than parallel groups, whereas parallel groups produced more unique ideas than serial groups. No significant differences were found in satisfaction with process and outcomes between the two modes. The preferred mode depends on the type of task facing the group. Parallel groups are better suited to tasks for which a variety of new ideas is needed, whereas serial groups are best suited when elaboration and in-depth thinking on the solution are required. Results of this research can guide the development of facilitated sessions of large groups or "teams of teams."

  1. A study on the required performance of a 2G HTS wire for HTS wind power generators

    NASA Astrophysics Data System (ADS)

    Sung, Hae-Jin; Park, Minwon; Go, Byeong-Soo; Yu, In-Keun

    2016-05-01

    YBCO or REBCO coated conductor (2G) materials have been developed for their superior performance at high magnetic fields and temperatures. Power system applications based on high temperature superconducting (HTS) 2G wire technology are attracting attention, including large-scale wind power generators. In particular, to solve problems associated with the foundations and mechanical structure of offshore wind turbines, due to the large diameter and heavy weight of the generator, an HTS generator is suggested as one of the key technologies. Many researchers have tried to develop feasible large-scale HTS wind power generator technologies. In this paper, a study on the required performance of a 2G HTS wire for large-scale wind power generators is discussed. A 12 MW class large-scale wind turbine and an HTS generator are designed using 2G HTS wire. The total length of the 2G HTS wire for the 12 MW HTS generator is estimated, and the essential prerequisites of the 2G HTS wire based generator are described. The magnetic field distributions of a pole module are illustrated, and the mechanical stress and strain of the pole module are analysed. Finally, a reasonable price for 2G HTS wire for commercialization of the HTS generator is suggested, reflecting the results of electromagnetic and mechanical analyses of the generator.
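    To first order, the total 2G wire length such a design requires scales as pole count × turns per pole × mean turn perimeter. A toy estimate with assumed values (these numbers are illustrative, not taken from the paper):

```python
poles = 40                     # assumed pole count for a 12 MW direct-drive machine
turns_per_pole = 200           # assumed turns in each field coil
mean_turn_perimeter_m = 2.5    # assumed mean length of one turn, m

wire_km = poles * turns_per_pole * mean_turn_perimeter_m / 1000
print(f"{wire_km:.0f} km of 2G HTS wire")
```

Because the wire requirement runs to tens of kilometres, the per-metre price of 2G conductor dominates the generator cost, which is why the record closes by suggesting a target wire price for commercialization.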

  2. Cloud computing for genomic data analysis and collaboration.

    PubMed

    Langmead, Ben; Nellore, Abhinav

    2018-04-01

    Next-generation sequencing has made major strides in the past decade. Studies based on large sequencing data sets are growing in number, and public archives for raw sequencing data have been doubling in size every 18 months. Leveraging these data requires researchers to use large-scale computational resources. Cloud computing, a model whereby users rent computers and storage from large data centres, is a solution that is gaining traction in genomics research. Here, we describe how cloud computing is used in genomics for research and large-scale collaborations, and argue that its elasticity, reproducibility and privacy features make it ideally suited for the large-scale reanalysis of publicly available archived data, including privacy-protected data.
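A doubling time of 18 months compounds quickly, which is the core of the argument for elastic cloud resources over fixed institutional clusters. A one-line check of the implied growth over a decade:

```python
doubling_months = 18
months = 120                   # one decade

growth = 2 ** (months / doubling_months)   # compound growth factor
print(f"~{growth:.0f}x archive growth in 10 years")
```

Roughly a hundredfold increase per decade: storage and compute provisioned for today's archives is badly undersized within a few years.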

  3. Design, analysis, and control of a large transport aircraft utilizing selective engine thrust as a backup system for the primary flight control. Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    Gerren, Donna S.

    1995-01-01

    A study has been conducted to determine the capability to control a very large transport airplane with engine thrust. This study consisted of the design of an 800-passenger airplane with a range of 5000 nautical miles, design and evaluation of a flight control system, and design and piloted simulation evaluation of a thrust-only backup flight control system. Location of the four wing-mounted engines was varied to optimize the propulsive control capability, and the time constant of the engine response was studied. The goal was to provide level 1 flying qualities. The engine location and engine time constant did not have a large effect on the control capability. The airplane design did meet level 1 flying qualities based on frequencies, damping ratios, and time constants in the longitudinal and lateral-directional modes. Project pilots consistently rated the flying qualities as either level 1 or level 2 based on Cooper-Harper ratings. However, because of the limited control forces and moments, the airplane design fell short of meeting the time required to achieve a 30 deg bank and the time required to respond to a control input.

  4. JPL Large Advanced Antenna Station Array Study

    NASA Technical Reports Server (NTRS)

    1978-01-01

    In accordance with study requirements, two antennas are described: a 30 meter standard antenna and a 34 meter modified antenna, along with a candidate array configuration for each. Modified antenna trade analyses are summarized, risks analyzed, costs presented, and a final antenna array configuration recommendation made.

  5. Large Animal Models for Foamy Virus Vector Gene Therapy

    PubMed Central

    Trobridge, Grant D.; Horn, Peter A.; Beard, Brian C.; Kiem, Hans-Peter

    2012-01-01

    Foamy virus (FV) vectors have shown great promise for hematopoietic stem cell (HSC) gene therapy. Their ability to efficiently deliver transgenes to multi-lineage long-term repopulating cells in large animal models suggests they will be effective for several human hematopoietic diseases. Here, we review FV vector studies in large animal models, including the use of FV vectors with the mutant O6-methylguanine-DNA methyltransferase, MGMTP140K to increase the number of genetically modified cells after transplantation. In these studies, FV vectors have mediated efficient gene transfer to polyclonal repopulating cells using short ex vivo transduction protocols designed to minimize the negative effects of ex vivo culture on stem cell engraftment. In this regard, FV vectors appear superior to gammaretroviral vectors, which require longer ex vivo culture to effect efficient transduction. FV vectors have also compared favorably with lentiviral vectors when directly compared in the dog model. FV vectors have corrected leukocyte adhesion deficiency and pyruvate kinase deficiency in the dog large animal model. FV vectors also appear safer than gammaretroviral vectors based on a reduced frequency of integrants near promoters and also near proto-oncogenes in canine repopulating cells. Together, these studies suggest that FV vectors should be highly effective for several human hematopoietic diseases, including those that will require relatively high percentages of gene-modified cells to achieve clinical benefit. PMID:23223198

  6. Accountable care organizations: financial advantages of larger hospital organizations.

    PubMed

    Camargo, Rodrigo; Camargo, Thaisa; Deslich, Stacie; Paul, David P; Coustasse, Alberto

    2014-01-01

    Accountable care organizations (ACOs) are groups of providers who agree to accept responsibility for elevating the health status of a defined group of patients, with the goal of enabling people to take charge of their health and engage in shared decision making with providers. The large initial investment required (estimated at $1.8 million) to develop an ACO implies that the participation of large health care organizations, especially hospitals and health systems, is required for success. Findings of this study suggest that ACOs based in a larger hospital organization are more likely to meet Centers for Medicare and Medicaid Services criteria for formation because of the financial and structural assets of those entities.

  7. Technologies for low radio frequency observations of the Cosmic Dawn

    NASA Astrophysics Data System (ADS)

    Jones, D. L.

    2014-03-01

    The Jet Propulsion Laboratory (JPL) is developing concepts and technologies for low frequency radio astronomy space missions aimed at observing highly redshifted neutral Hydrogen from the Dark Ages. This is the period of cosmic history between the recombination epoch when the microwave background radiation was produced and the re-ionization of the intergalactic medium by the first generation of stars (Cosmic Dawn). This period, at redshifts z > ~20, is a critical epoch for the formation and evolution of large-scale structure in the universe. The 21-cm spectral line of Hydrogen provides the most promising method for directly studying the Dark Ages, but the corresponding frequencies at such large redshifts are only tens of MHz and thus require space-based observations to avoid terrestrial RFI and ionospheric absorption and refraction. This paper reports on the status of several low frequency technology development activities at JPL, including deployable bi-conical dipoles for a planned lunar-orbiting mission, and both rover-deployed and inflation-deployed long dipole antennas for use on the lunar surface. In addition, recent results from laboratory testing of low frequency receiver designs are presented. Finally, several concepts for space-based imaging interferometers utilizing deployable low frequency antennas are described. Some of these concepts involve large numbers of antennas and consequently a large digital cross-correlator will be needed. JPL has studied correlator architectures that greatly reduce the DC power required for this step, which can dominate the power consumption of real-time signal processing. Strengths and weaknesses of each mission concept are discussed in the context of the additional technology development required.

  8. Simulator study of flight characteristics of a large twin-fuselage cargo transport airplane during approach and landing

    NASA Technical Reports Server (NTRS)

    Grantham, W. D.; Deal, P. L.; Keyser, G. L., Jr.; Smith, P. M.

    1983-01-01

    A six degree-of-freedom, ground-based simulator study was conducted to evaluate the low-speed flight characteristics of a twin-fuselage cargo transport airplane and to compare these characteristics with those of a large, single-fuselage (reference) transport configuration which was similar to the Lockheed C-5C airplane. The primary piloting task was the approach and landing. The results indicated that in order to achieve 'acceptable' low-speed handling qualities on the twin-fuselage concept, considerable stability and control augmentation was required, and although the augmented airplane could be landed safely under adverse conditions, the roll performance of the aircraft had to be improved appreciably before the handling qualities were rated as 'satisfactory.' These ground-based simulation results indicated that a value of t_phi=30 (the time required to bank 30 deg) of less than 6 sec should result in 'acceptable' roll response characteristics, and when t_phi=30 is less than 3.8 sec, 'satisfactory' roll response should be attainable on such large and unusually configured aircraft as the subject twin-fuselage cargo transport concept.
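    The time-to-bank criterion can be checked against a standard first-order roll model: with roll-mode time constant tau and steady-state roll rate p_ss, the bank angle follows phi(t) = p_ss * (t - tau*(1 - exp(-t/tau))), and the criterion value is the smallest t with phi(t) = 30 deg. A sketch with illustrative numbers (not the simulator's actual parameters):

```python
import math

def time_to_bank(phi_target, p_ss, tau, dt=0.01):
    """Time for a first-order roll response to reach phi_target (deg),
    given steady-state roll rate p_ss (deg/s) and roll time constant tau (s)."""
    t = 0.0
    while p_ss * (t - tau * (1.0 - math.exp(-t / tau))) < phi_target:
        t += dt
    return t

# Assumed values: 10 deg/s achievable roll rate, 1 s roll-mode time constant
t30 = time_to_bank(30.0, p_ss=10.0, tau=1.0)
print(f"time to bank 30 deg ~ {t30:.1f} s")
# ~4 s: 'acceptable' (< 6 s) but not 'satisfactory' (< 3.8 s) by the study's cut-offs
```

With these assumed numbers the configuration would clear the 6 s 'acceptable' bound but miss the 3.8 s 'satisfactory' bound, mirroring the roll-performance shortfall the simulator study reports.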

  9. Electrofishing effort requirements for estimating species richness in the Kootenai River, Idaho

    USGS Publications Warehouse

    Watkins, Carson J.; Quist, Michael C.; Shepard, Bradley B.; Ireland, Susan C.

    2016-01-01

    This study was conducted on the Kootenai River, Idaho, to provide insight on sampling requirements and to optimize future monitoring efforts associated with the response of fish assemblages to habitat rehabilitation. Our objective was to define the electrofishing effort (m) needed to have a 95% probability of sampling 50, 75, and 100% of the observed species richness and to evaluate the relative influence of depth, velocity, and instream woody cover on sample size requirements. Side-channel habitats required more sampling effort to achieve 75 and 100% of the total species richness than main-channel habitats. The sampling effort required to have a 95% probability of sampling 100% of the species richness was 1100 m for main-channel sites and 1400 m for side-channel sites. We hypothesized that the difference in sampling requirements between main- and side-channel habitats was largely due to differences in habitat characteristics and species richness between the two habitat types. In general, main-channel habitats had lower species richness than side-channel habitats. Habitat characteristics (i.e., depth, current velocity, and woody instream cover) were not related to sample size requirements. Our guidelines will improve sampling efficiency during monitoring efforts in the Kootenai River and provide insight on sampling designs for other large western river systems where electrofishing is used to assess fish assemblages.
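    The kind of effort-versus-richness calculation behind such guidelines can be sketched simply: if species i is detected in each 100 m of electrofishing with probability p_i, independently, then the effort m giving a 95% chance of recording every species satisfies the product of per-species detection probabilities 1 - (1 - p_i)^(m/100) reaching 0.95. The detection probabilities below are assumed for illustration, not estimated from the Kootenai River data:

```python
def effort_for_full_richness(p_per_100m, target=0.95, step=100):
    """Smallest effort (m, in multiples of step) at which the probability of
    detecting every species at least once reaches the target."""
    m = 0
    while True:
        m += step
        prob_all = 1.0
        for p in p_per_100m:
            prob_all *= 1.0 - (1.0 - p) ** (m / 100.0)
        if prob_all >= target:
            return m

# Assumed per-100 m detection probabilities for a small assemblage,
# from common (0.9) to rare (0.2) species
effort = effort_for_full_richness([0.9, 0.6, 0.3, 0.2])
print(f"{effort} m of electrofishing")   # 1500 m with these assumptions
```

The rarest species dominates the requirement, which is consistent with the study's finding that the more species-rich side-channel habitats needed the longest transects.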

  10. The Application of Microwave Incineration to Regenerative Life Support

    NASA Technical Reports Server (NTRS)

    Sun, Sidney C.; Srinivasan, Venkatesh; Covington, Al (Technical Monitor)

    1995-01-01

    Future human exploration missions will require life support systems that are highly regenerative, requiring minimum resupply, enabling the crews to be largely self-sufficient. Solid wastes generated in space will be processed to recover usable material. Researchers at NASA Ames Research Center are studying a commercially-produced microwave incinerator as a solid waste processor. This paper will describe the results of testing to-date.

  11. Affordable and Lightweight High-Resolution X-ray Optics for Astronomical Missions

    NASA Technical Reports Server (NTRS)

    Zhang, W. W.; Biskach, M. P.; Bly, V. T.; Carter, J. M.; Chan, K. W.; Gaskin, J. A.; Hong, M.; Hohl, B. R.; Jones, W. D.; Kolodziejczak, J. J.

    2014-01-01

    Future x-ray astronomical missions require x-ray mirror assemblies that provide both high angular resolution and large photon collecting area. In addition, as x-ray astronomy undertakes more sensitive sky surveys, a large field of view is becoming increasingly important as well. Since implementation of these requirements must be carried out in broad political and economic contexts, any technology that meets these performance requirements must also be financially affordable and implementable on a reasonable schedule. In this paper we report on progress of an x-ray optics development program that has been designed to address all of these requirements. The program adopts a segmented optical design and is thereby capable of making both small and large mirror assemblies for missions of any size. This program has five technical elements: (1) fabrication of mirror substrates, (2) coating, (3) alignment, (4) bonding, and (5) mirror module systems engineering and testing. In the past year we have made progress in each of these five areas, advancing the angular resolution of mirror modules from the 10.8 arc-second half-power diameter (HPD) reported a year ago to 8.3 arc-seconds now. These mirror modules have been subjected to and passed all environmental tests, including vibration, acoustic, and thermal vacuum tests. As such, this technology is ready for implementing a mission that requires a 10-arc-second mirror assembly. Further development in the next two years would make it ready for a mission requiring a 5-arc-second mirror assembly. We expect that, by the end of this decade, this technology will enable the x-ray astrophysical community to compete effectively for a major x-ray mission in the 2020s that would require one or more 1-arc-second mirror assemblies for imaging, spectroscopic, timing, and survey studies.

  12. Extended duration Orbiter life support definition

    NASA Technical Reports Server (NTRS)

    Kleiner, G. N.; Thompson, C. D.

    1978-01-01

    Extending the baseline seven-day Orbiter mission to 30 days or longer and operating with a solar power module as the primary source for electrical power requires changes to the existing environmental control and life support (ECLS) system. The existing ECLS system imposes penalties on longer missions which limit the Orbiter capabilities and changes are required to enhance overall mission objectives. Some of these penalties are: large quantities of expendables, the need to dump or store large quantities of waste material, the need to schedule fuel cell operation, and a high landing weight penalty. This paper presents the study ground rules and examines the limitations of the present ECLS system against Extended Duration Orbiter mission requirements. Alternate methods of accomplishing ECLS functions for the Extended Duration Orbiter are discussed. The overall impact of integrating these options into the Orbiter are evaluated and significant Orbiter weight and volume savings with the recommended approaches are described.

  13. Could the extensive use of rare elements in renewable energy technologies become a cause for concern?

    NASA Astrophysics Data System (ADS)

    Bradshaw, A. M.; Reuter, B.; Hamacher, T.

    2015-08-01

    The energy transformation process beginning to take place in many countries as a response to climate change will reduce substantially the consumption of fossil fuels, but at the same time cause a large increase in the demand for other raw materials. Whereas it is difficult to estimate the quantities of, for example, iron, copper and aluminium required, the situation is somewhat simpler for the rare elements that might be needed in a sustainable energy economy based largely on photovoltaic sources, wind and possibly nuclear fusion. We consider briefly each of these technologies and discuss the supply risks associated with the rare elements required, if they were to be used in the quantities that might be required for a global energy transformation process. In passing, we point out the need in resource studies to define the terms "rare", "scarce" and "critical" and to use them in a consistent way.

  14. Control of fluxes in metabolic networks

    PubMed Central

    Basler, Georg; Nikoloski, Zoran; Larhlimi, Abdelhalim; Barabási, Albert-László; Liu, Yang-Yu

    2016-01-01

    Understanding the control of large-scale metabolic networks is central to biology and medicine. However, existing approaches either require specifying a cellular objective or can only be used for small networks. We introduce new coupling types describing the relations between reaction activities, and develop an efficient computational framework, which does not require any cellular objective for systematic studies of large-scale metabolism. We identify the driver reactions facilitating control of 23 metabolic networks from all kingdoms of life. We find that unicellular organisms require a smaller degree of control than multicellular organisms. Driver reactions are under complex cellular regulation in Escherichia coli, indicating their preeminent role in facilitating cellular control. In human cancer cells, driver reactions play pivotal roles in malignancy and represent potential therapeutic targets. The developed framework helps us gain insights into regulatory principles of diseases and facilitates design of engineering strategies at the interface of gene regulation, signaling, and metabolism. PMID:27197218
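    The notion of coupled reaction activities can be illustrated on a toy stoichiometric network. The sketch below is a minimal illustration, not the authors' framework; the three-reaction network and the `null_space` helper are invented for this example. It computes the steady-state flux space of a linear pathway and shows that all three reactions are fully coupled, so any one of them can act as a driver reaction for the whole pathway.

```python
import numpy as np

# Toy network: uptake (v1), conversion A -> B (v2), secretion (v3).
# Rows of S are the internal metabolites A and B; S @ v = 0 at steady state.
S = np.array([[1.0, -1.0, 0.0],
              [0.0, 1.0, -1.0]])

def null_space(mat, tol=1e-10):
    """Orthonormal basis of {v : mat @ v = 0} via SVD."""
    _, s, vt = np.linalg.svd(mat)
    rank = int((s > tol).sum())
    return vt[rank:].T  # trailing rows of Vh span the null space

N = null_space(S)
# The flux space is one-dimensional, so every steady-state flux vector is a
# scalar multiple of a single mode: the reactions are fully coupled, and
# fixing any one flux ("driving" one reaction) fixes the entire pathway.
ratios = N[:, 0] / N[0, 0]
print(N.shape[1], np.allclose(ratios, 1.0))
```

For genome-scale networks the same idea is applied with dedicated flux-coupling algorithms rather than a dense SVD.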

  15. Initial Technology Assessment for the Large-Aperture UV-Optical-Infrared (LUVOIR) Mission Concept Study

    NASA Technical Reports Server (NTRS)

    Bolcar, Matthew R.; Feinberg, Lee; France, Kevin; Rauscher, Bernard J.; Redding, David; Schiminovich, David

    2016-01-01

    The NASA Astrophysics Division's 30-Year Roadmap prioritized a future large-aperture space telescope operating in the ultraviolet/optical/infrared wavelength regime. The Association of Universities for Research in Astronomy envisioned a similar observatory, the High Definition Space Telescope, and a multi-institution group also studied the Advanced Technology Large Aperture Space Telescope. In all three cases, a broad science case is outlined, combining general astrophysics with the search for biosignatures via direct imaging and spectroscopic characterization of habitable exoplanets. We present an initial assessment of the technologies that would enable such an observatory, which is currently being studied for the 2020 Decadal Survey by the Large UV/Optical/Infrared (LUVOIR) Surveyor Science and Technology Definition Team. We present the technology prioritization for the 2016 technology cycle and define the required technology capabilities and current state-of-the-art performance. Current, planned, and recommended technology development efforts are also reported.

  16. Simulator study of flight characteristics of several large, dissimilar, cargo transport airplanes during approach and landing

    NASA Technical Reports Server (NTRS)

    Grantham, W. D.; Smith, P. M.; Deal, P. L.; Neely, W. R., Jr.

    1984-01-01

    A six-degree-of-freedom, ground-based simulator study is conducted to evaluate the low-speed flight characteristics of four dissimilar cargo transport airplanes. These characteristics are compared with those of a large, present-day (reference) transport configuration similar to the Lockheed C-5A airplane. The four very large transport concepts evaluated consist of single-fuselage, twin-fuselage, triple-fuselage, and span-loader configurations. The primary piloting task is the approach and landing operation. The results of this study indicate that all four concepts have unsatisfactory longitudinal and lateral-directional low-speed flight characteristics and that considerable stability and control augmentation would be required to improve these handling qualities to a satisfactory level. Through the use of rate-command/attitude-hold augmentation in the pitch and roll axes, together with several turn-coordination features, the handling qualities of all four large transports simulated are improved appreciably.
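    The rate-command/attitude-hold augmentation mentioned above can be sketched as a two-loop control law. The single-axis pitch model and all gains below are invented for illustration; the study's actual augmentation was implemented on a full six-degree-of-freedom simulation.

```python
def simulate_rcah(theta0=5.0, dt=0.02, t_end=10.0, k_theta=1.5, k_q=4.0):
    """Return pitch attitude (deg) after t_end seconds of attitude hold.

    Toy one-axis model: theta0 is an initial attitude offset, and with the
    stick released the controller holds the reference attitude (0 deg).
    """
    theta, q = theta0, 0.0   # pitch attitude (deg), pitch rate (deg/s)
    theta_hold = 0.0         # attitude captured when the stick is released
    t = 0.0
    while t < t_end:
        q_cmd = k_theta * (theta_hold - theta)  # outer attitude-hold loop
        elevator = k_q * (q_cmd - q)            # inner rate-command loop
        q_dot = elevator - 0.8 * q              # toy pitch dynamics + damping
        q += q_dot * dt
        theta += q * dt
        t += dt
    return theta

final = simulate_rcah()
# With the stick centered, the augmented airplane flies back to the held
# attitude instead of drifting, which is what makes the task easier to fly.
```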

  17. Initial Technology Assessment for the Large UV-Optical-Infrared (LUVOIR) Mission Concept Study

    NASA Technical Reports Server (NTRS)

    Bolcar, Matthew R.; Feinberg, Lee D.; France, Kevin; Rauscher, Bernard J.; Redding, David; Schiminovich, David

    2016-01-01

    The NASA Astrophysics Division's 30-Year Roadmap prioritized a future large-aperture space telescope operating in the ultraviolet-optical-infrared wavelength regime. The Association of Universities for Research in Astronomy envisioned a similar observatory, the High Definition Space Telescope, and a multi-institution group also studied the Advanced Technology Large Aperture Space Telescope. In all three cases, a broad science case is outlined, combining general astrophysics with the search for biosignatures via direct imaging and spectroscopic characterization of habitable exoplanets. We present an initial assessment of the technologies that would enable such an observatory, which is currently being studied for the 2020 Decadal Survey by the Large UV-Optical-Infrared (LUVOIR) Surveyor Science and Technology Definition Team. We present the technology prioritization for the 2016 technology cycle and define the required technology capabilities and current state-of-the-art performance. Current, planned, and recommended technology development efforts are also reported.

  18. Technology requirements and readiness for very large vehicles

    NASA Technical Reports Server (NTRS)

    Conner, D. W.

    1979-01-01

    Common concerns of very large vehicles in the areas of economics, transportation system interfaces and operational problems were reviewed regarding their influence on vehicle configurations and technology. Fifty-four technology requirements were identified which are judged to be unique, or particularly critical, to very large vehicles. The requirements were about equally divided among the four general areas of aero/hydrodynamics, propulsion and acoustics, structures, and vehicle systems and operations. The state of technology readiness was judged to be poor to fair for slightly more than one half of the requirements. In the classic disciplinary areas, the state of technology readiness appears to be more advanced than for vehicle systems and operations.

  19. [Ultrasound guided percutaneous nephrolithotripsy].

    PubMed

    Guliev, B G

    2014-01-01

    The study aimed to evaluate the effectiveness and outcomes of ultrasound-guided percutaneous nephrolithotripsy (PNL) for the treatment of patients with large renal pelvic stones. The results of PNL in 138 patients who underwent surgery for kidney stones from 2011 to 2013 were analyzed. Seventy patients (Group 1) underwent surgery with combined ultrasound and radiological guidance, and 68 patients (Group 2) with ultrasound guidance only. The study included patients with renal pelvic stones larger than 2.2 cm requiring the formation of a single percutaneous access tract. Operative time, the number of intra- and postoperative complications, blood loss, and length of stay were compared between the groups. Percutaneous access was successfully performed in all patients. Postoperative complications (exacerbation of chronic pyelonephritis, gross hematuria) were observed in 14.3% of patients in Group 1 and in 14.7% of patients in Group 2. Bleeding requiring blood transfusion and injuries of adjacent organs were not registered. Efficacy of PNL in Group 1 was 95.7%; 3 (4.3%) patients required additional interventions. In Group 2, the effectiveness of PNL was 94.1%; 4 (5.9%) patients additionally underwent extracorporeal lithotripsy. There were no significant differences in the effectiveness of PNL, the volume of blood loss, or the duration of hospitalization. Ultrasound-guided PNL can be performed for large pelvic stones with sufficient dilation of the renal cavities, thus reducing the radiation exposure of patients and medical staff.

  20. Deployable antenna phase A study

    NASA Technical Reports Server (NTRS)

    Schultz, J.; Bernstein, J.; Fischer, G.; Jacobson, G.; Kadar, I.; Marshall, R.; Pflugel, G.; Valentine, J.

    1979-01-01

    Applications for large deployable antennas were re-examined, flight demonstration objectives were defined, the flight article (antenna) was preliminarily designed, and the flight program and ground development program, including the support equipment, were defined for a proposed space transportation system flight experiment to demonstrate a large (50 to 200 meter) deployable antenna system. Tasks described include: (1) performance requirements analysis; (2) system design and definition; (3) orbital operations analysis; and (4) programmatic analysis.

  1. Hybrid Propulsion Technology Program

    NASA Technical Reports Server (NTRS)

    Jensen, G. E.; Holzman, A. L.

    1990-01-01

    Future launch systems of the United States will require improvements in booster safety, reliability, and cost. In order to increase payload capabilities, performance improvements are also desirable. The hybrid rocket motor (HRM) offers the potential for improvements in all of these areas. Designs are presented for two sizes of hybrid boosters: a large 4.57 m (180 in.) diameter booster duplicating the Advanced Solid Rocket Motor (ASRM) vacuum thrust-time profile, and a smaller 2.44 m (96 in.) diameter booster at one-quarter the thrust level. The large booster would be used in tandem, while eight small boosters would be used to achieve the same total thrust. These preliminary designs were generated as part of the NASA Hybrid Propulsion Technology Program, the first phase of an eventual three-phase program culminating in the demonstration of a large subscale engine. The initial trade and sizing studies resulted in preferred motor diameters, operating pressures, nozzle geometry, and fuel grain systems for both the large and small boosters. These data were then used for specific performance predictions in terms of payload and for the definition and selection of requirements for the major components: the oxidizer feed system, nozzle, and thrust vector system. All of the parametric studies were performed using realistic fuel regression models based upon specific experimental data.
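    Fuel regression models of the kind used in such parametric studies follow the classical power law r = a·G_ox^n. The coefficients and geometry below are placeholders, not the program's experimental values; the sketch just shows why regression rate tails off as a circular port opens up during the burn.

```python
import math

def regression_rate(g_ox, a=3.0e-5, n=0.62):
    """Fuel surface regression rate (m/s) for oxidizer mass flux g_ox (kg/m^2/s)."""
    return a * g_ox ** n

def port_radius_history(r0, mdot_ox, dt=0.1, t_burn=60.0):
    """Euler-integrate port radius growth for a circular fuel port."""
    r, t, hist = r0, 0.0, []
    while t < t_burn:
        g_ox = mdot_ox / (math.pi * r * r)  # flux drops as the port opens
        r += regression_rate(g_ox) * dt
        hist.append(r)
        t += dt
    return hist

hist = port_radius_history(r0=0.2, mdot_ox=20.0)
# Early steps grow the port faster than late steps: the mass flux, and with
# it the regression rate, decreases as the port radius increases.
```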

  2. Characterization of the electromechanical properties of EAP materials

    NASA Technical Reports Server (NTRS)

    Bar-Cohen, Yoseph; Sherrit, Stewart; Bhattacharya, Kaushik; Lih, Shyh-Shiuh

    2001-01-01

    Electroactive polymers (EAP) are an emerging class of actuation materials. Their large electrically induced strains (longitudinal or bending), low density, mechanical flexibility, and ease of processing offer advantages over traditional electroactive materials. However, before the capability of these materials can be exploited, their electrical and mechanical behavior must be properly quantified. Two general types of EAP can be identified. The first is ionic EAP, which requires relatively low voltages (<10 V) to achieve large bending deflections; this class usually needs to be hydrated, and electrochemical reactions may occur. The second is electronic EAP, which involves electrostrictive and/or Maxwell stresses; these materials require large electric fields (>100 MV/m) to achieve longitudinal deformations in the range of 4-360%. Some of the difficulties in characterizing EAP include nonlinear properties, large compliance (a large mismatch with metal electrodes), and nonhomogeneity resulting from processing. To support the need for reliable data, the authors are developing characterization techniques to quantify the electroactive responses and material properties of EAP materials. The emphasis of the current study is on electromechanical issues related to the ion-exchange type of EAP, also known as IPMC. The analysis, experiments, and test results are discussed in this paper.

  3. Gas Turbine Characteristics for a Large Civil Tilt-Rotor (LCTR)

    NASA Technical Reports Server (NTRS)

    Snyder, Christopher A.; Thurman, Douglas R.

    2010-01-01

    In support of the Fundamental Aeronautics Program, Subsonic Rotary Wing Project, an engine system study has been undertaken to help define and understand some of the major gas turbine engine parameters required to meet performance and weight targets defined by earlier vehicle system studies. These previous vehicle studies are reviewed to help define gas turbine performance goals, and the assumptions and analysis methods used are described. Performance and weight estimates for a few conceptual gas turbine engines meeting these requirements are given and discussed, including estimated performance over a wide speed variation (down to 50 percent power turbine rpm at high torque). Finally, areas needing further effort are suggested and discussed.

  4. A comparison of quality and utilization problems in large and small group practices.

    PubMed

    Gleason, S C; Richards, M J; Quinnell, J E

    1995-12-01

    Physicians practicing in large, multispecialty medical groups share an organizational culture that differs from that of physicians in small or independent practices. Since 1980, there has been a sharp increase in the size of multispecialty group practice organizations, in part because of increased efficiencies of large group practices. The greater number of physicians and support personnel in a large group practice also requires a relatively more sophisticated management structure. The efficiencies, conveniences, and management structure of a large group practice provide an optimal environment to practice medicine. However, a search of the literature found no data linking a large group practice environment to practice outcomes. The purpose of the study reported in this article was to determine if physicians in large practices have fewer quality and utilization problems than physicians in small or independent practices.

  5. Impacts of savanna trees on forage quality for a large African herbivore

    PubMed Central

    De Kroon, Hans; Prins, Herbert H. T.

    2008-01-01

    Recently, the cover of large trees in African savannas has declined rapidly due to elephant pressure, frequent fires, and charcoal production. The reduction in large trees could have consequences for large herbivores through a change in forage quality. In Tarangire National Park in northern Tanzania, we studied the impact of large savanna trees on forage quality for wildebeest by collecting samples of dominant grass species in open grassland and under and around large Acacia tortilis trees. Grasses growing under trees had a much higher forage quality than grasses from the open field, as indicated by a more favourable leaf/stem ratio, higher protein concentrations, and lower fibre concentrations. Analysing the grass leaf data with a linear programming model indicated that large savanna trees could be essential for the survival of wildebeest, the dominant herbivore in Tarangire. Due to the high fibre content and low nutrient and protein concentrations of grasses from the open field, maximum fibre intake is reached before nutrient requirements are satisfied. All requirements can only be satisfied by combining forage from open grassland with forage from under or around tree canopies. Forage quality was also higher around dead trees than in the open field, so forage quality does not decline immediately after trees die, which explains why the negative effects of reduced tree numbers probably go unnoticed at first. In conclusion, our results suggest that continued destruction of large trees could affect future numbers of large herbivores in African savannas, and better protection of large trees is probably necessary to sustain high animal densities in these ecosystems. PMID:18309522
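    The linear-programming argument can be reproduced with a two-forage toy model. The nutrient numbers below are invented, not the paper's measurements; they are chosen so that open-grassland grass alone hits the fibre ceiling before the protein requirement is met, while a mixed diet is feasible.

```python
from scipy.optimize import linprog

protein = [0.05, 0.12]  # kg protein per kg dry grass (open field, under tree)
fibre = [0.75, 0.55]    # kg fibre per kg dry grass
protein_req, fibre_max = 0.4, 4.0  # daily protein need / fibre intake ceiling

c = [1.0, 1.0]                          # minimise total dry-matter intake
A_ub = [[-p for p in protein], fibre]   # protein >= req  ->  -protein <= -req
b_ub = [-protein_req, fibre_max]        # fibre <= ceiling

# Open-field grass only (under-tree intake forced to zero): infeasible,
# because the fibre ceiling is hit before the protein requirement is met.
open_only = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, 0)])
# Both forages available: a feasible diet exists.
mixed = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)])
print(open_only.success, mixed.success)
```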

  6. Eating in the absence of hunger in adolescents: intake after a large-array meal compared with that after a standardized meal.

    PubMed

    Shomaker, Lauren B; Tanofsky-Kraff, Marian; Zocca, Jaclyn M; Courville, Amber; Kozlosky, Merel; Columbo, Kelli M; Wolkoff, Laura E; Brady, Sheila M; Crocker, Melissa K; Ali, Asem H; Yanovski, Susan Z; Yanovski, Jack A

    2010-10-01

    Eating in the absence of hunger (EAH) is typically assessed by measuring youths' intake of palatable snack foods after a standard meal designed to reduce hunger. Because energy intake required to reach satiety varies among individuals, a standard meal may not ensure the absence of hunger among participants of all weight strata. The objective of this study was to compare adolescents' EAH observed after access to a very large food array with EAH observed after a standardized meal. Seventy-eight adolescents participated in a randomized crossover study during which EAH was measured as intake of palatable snacks after ad libitum access to a very large array of lunch-type foods (>10,000 kcal) and after a lunch meal standardized to provide 50% of the daily estimated energy requirements. The adolescents consumed more energy and reported less hunger after the large-array meal than after the standardized meal (P values < 0.001). They consumed ≈70 kcal less EAH after the large-array meal than after the standardized meal (295 ± 18 compared with 365 ± 20 kcal; P < 0.001), but EAH intakes after the large-array meal and after the standardized meal were positively correlated (P values < 0.001). The body mass index z score and overweight were positively associated with EAH in both paradigms after age, sex, race, pubertal stage, and meal intake were controlled for (P values ≤ 0.05). EAH is observable and positively related to body weight regardless of whether youth eat in the absence of hunger from a very large-array meal or from a standardized meal. This trial was registered at clinicaltrials.gov as NCT00631644.

  7. Cost-Efficient Storage of Cryogens

    NASA Technical Reports Server (NTRS)

    Fesmire, J. E.; Sass, J. P.; Nagy, Z.; Sojoumer, S. J.; Morris, D. L.; Augustynowicz, S. D.

    2007-01-01

    NASA's cryogenic infrastructure that supports launch vehicle operations and propulsion testing is reaching an age where major refurbishment will soon be required. Key elements of this infrastructure are the large double-walled cryogenic storage tanks used for both space vehicle launch operations and rocket propulsion testing at the various NASA field centers. Perlite powder has historically been the insulation material of choice for these large storage tank applications. New bulk-fill insulation materials, including glass bubbles and aerogel beads, have been shown to provide improved thermal and mechanical performance. A research testing program was conducted to investigate the thermal performance benefits and to identify the operational considerations and risks associated with applying these new materials in large cryogenic storage tanks. The program was divided into three main areas: material testing (thermal conductivity and physical characterization), tank demonstration testing (liquid nitrogen and liquid hydrogen), and system studies (thermal modeling, economic analysis, and insulation changeout). The results of this research show that more energy-efficient insulation solutions are possible for large-scale cryogenic storage tanks worldwide and summarize the operational requirements that should be considered for these applications.
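    The practical stake of the insulation comparison can be seen with a back-of-envelope Fourier conduction estimate. The conductivities, tank geometry, and temperature difference below are representative assumptions for evacuated bulk-fill insulation, not the program's test data.

```python
LN2_LATENT_HEAT = 199e3  # J/kg, latent heat of vaporization of liquid nitrogen

def boiloff_kg_per_day(k, area=500.0, thickness=1.0, delta_t=220.0):
    """LN2 boil-off for an annulus of conductivity k (W/m-K), Q = k*A*dT/t."""
    q_watts = k * area * delta_t / thickness   # steady-state heat leak
    return q_watts / LN2_LATENT_HEAT * 86400.0  # kg of LN2 lost per day

perlite = boiloff_kg_per_day(1.2e-3)  # assumed ~1.2 mW/m-K, evacuated perlite
bubbles = boiloff_kg_per_day(0.6e-3)  # assumed ~0.6 mW/m-K, glass bubbles
# Halving the effective conductivity halves the daily boil-off loss, which
# is where the economic benefit of the newer bulk-fill materials comes from.
print(round(perlite, 1), round(bubbles, 1))
```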

  8. Natural supersymmetry without light Higgsinos

    DOE PAGES

    Cohen, Timothy; Kearney, John; Luty, Markus A.

    2015-04-08

    In this study, we present a mechanism that allows a large Higgsino mass without large fine-tuning. The Higgs is a pseudo-Nambu-Goldstone boson (PNGB) of the global symmetry breaking pattern SO(5)→SO(4). Because of the PNGB nature of the light Higgs, the SO(5)-invariant Higgsino mass does not directly contribute to the Higgs mass. Large couplings in the Higgs sector that spontaneously breaks SO(5) minimize the tuning, and are also motivated by the requirements of generating a sufficiently large Higgs quartic coupling and of maintaining a natural approximate global SO(5) symmetry. When these conditions are imposed, theories of this type predict heavy Higgsinos. This construction differs from composite Higgs models in that no new particles are introduced to form complete SO(5) multiplets involving the top quark: the stop is the only top partner. Compatibility with Higgs coupling measurements requires cancellations among contributions to the Higgs mass-squared parameter at the 10% level. An important implication of this construction is that the compressed region of stop and sbottom searches can still be natural.

  9. Assessment Intelligence in Small Group Learning

    ERIC Educational Resources Information Center

    Xing, Wanli; Wu, Yonghe

    2014-01-01

    Assessment of groups in a CSCL context is a challenging task fraught with many confounding factors to be collected and measured. Previously documented studies are by and large summative in nature, and some process-oriented methods require time-intensive coding of qualitative data. This study attempts to resolve these problems for teachers to assess groups…

  10. Case Study Analysis of the Impacts of Water Acquisition for Hydraulic Fracturing on Local Water Availability

    EPA Science Inventory

    Hydraulic fracturing (HF) is used to develop unconventional gas reserves, but the technology requires large volumes of water, placing demands on local water resources and potentially creating conflict with other users and ecosystems. This study examines the balance between water ...

  11. Signal Detection Theory as a Tool for Successful Student Selection

    ERIC Educational Resources Information Center

    van Ooijen-van der Linden, Linda; van der Smagt, Maarten J.; Woertman, Liesbeth; te Pas, Susan F.

    2017-01-01

    Prediction accuracy of academic achievement for admission purposes requires adequate "sensitivity" and "specificity" of admission tools, yet the available information on the validity and predictive power of admission tools is largely based on studies using correlational and regression statistics. The goal of this study was to…
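    The signal-detection quantities involved can be computed directly from an admission outcome table. The counts below are hypothetical; the example shows how sensitivity, specificity, and the discriminability index d′ summarize an admission tool's predictive power in a way a single correlation does not.

```python
from statistics import NormalDist

# Hypothetical outcomes: the tool flagged applicants as likely to succeed.
tp, fn = 40, 10  # successful students flagged / missed by the tool
fp, tn = 20, 30  # unsuccessful students flagged / correctly not flagged

sensitivity = tp / (tp + fn)  # hit rate
specificity = tn / (tn + fp)  # correct-rejection rate
z = NormalDist().inv_cdf      # inverse standard-normal CDF
d_prime = z(sensitivity) - z(1 - specificity)  # discriminability index
print(sensitivity, specificity, round(d_prime, 3))
```

Separating discriminability (d′) from the decision criterion is the core of the signal-detection view of selection decisions.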

  12. APPLYING OPERATIONAL ANALYSIS TO URBAN EDUCATIONAL SYSTEMS, A WORKING PAPER.

    ERIC Educational Resources Information Center

    SISSON, ROGER L.

    Operations research concepts are potentially useful for the study of such large urban school district problems as information flow, physical structure of the district, administrative decision making, board policy functions, and the budget structure. Operational analysis requires (1) identification of the system under study, (2) identification of…

  13. Mission Study for Generation-X: A Large Area and High Angular Resolution Observatory to Study the Early Universe

    NASA Technical Reports Server (NTRS)

    Brissenden, Roger

    2005-01-01

    In this report we provide a summary of the technical progress achieved during the last year of the Generation-X Vision Mission Study, along with a brief programmatic status. The Generation-X (Gen-X) Vision Mission Study investigates the science requirements, mission concepts and technology drivers for an X-ray telescope designed to study the new frontier of astrophysics: the birth and evolution of the first stars, galaxies and black holes in the early Universe. X-ray astronomy offers an opportunity to detect these via the activity of the black holes, and the supernova explosions and gamma-ray burst afterglows of the massive stars. However, such objects are beyond the grasp of current missions, whether operating or under development. Our team has conceived a Gen-X Vision Mission based on an X-ray observatory with 100 m² collecting area at 1 keV (1000 times larger than Chandra) and 0.1 arcsecond angular resolution (several times better than Chandra and 50 times better than the Constellation-X resolution goal). Such a high-energy observatory will be capable of detecting the earliest black holes and galaxies in the Universe, and will also study extremes of density, gravity, magnetic fields, and kinetic energy which cannot be created in laboratories. In our study we develop the mission concept and define candidate technologies and performance requirements for Gen-X. The baseline Gen-X mission involves four 8 m diameter X-ray telescopes operating at Sun-Earth L2, which we trade against an alternate concept of a single 26 m diameter telescope with focal plane instruments on a separate spacecraft. A telescope of this size will require either robotic or human-assisted in-flight assembly. The required effective area implies that extremely lightweight grazing-incidence X-ray optics must be developed. To achieve the required areal density, at least 100 times lower than for Chandra, we study 0.2 mm thick mirrors with active on-orbit figure control. We also study the suite of required detectors, including a large-FOV, high-angular-resolution imager, a cryogenic imaging spectrometer, and a reflection grating spectrometer.

  14. Measuring the embodied energy in drinking water supply systems: a case study in the Great Lakes region.

    PubMed

    Mo, Weiwei; Nasiri, Fuzhan; Eckelman, Matthew J; Zhang, Qiong; Zimmerman, Julie B

    2010-12-15

    A sustainable supply of both energy and water is critical to long-term national security, effective climate policy, natural resource sustainability, and social wellbeing. These two critical resources are inextricably and reciprocally linked; the production of energy requires large volumes of water, while the treatment and distribution of water is also significantly dependent upon energy. In this paper, a hybrid analysis approach is proposed to estimate embodied energy and to perform a structural path analysis of drinking water supply systems. The applicability of this approach is then tested through a case study of a large municipal water utility (the city of Kalamazoo) in the Great Lakes region to provide insights on the issues of water-energy pricing and carbon footprints. Kalamazoo drinking water requires approximately 9.2 MJ/m³ of energy to produce, 30% of which is associated with indirect inputs such as system construction and treatment chemicals.
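    The headline figures can be turned into a quick sketch of the utility's energy balance. Only the 9.2 MJ/m³ total and the 30% indirect share come from the abstract; the 40,000 m³/day production volume is an invented illustration.

```python
total_mj_per_m3 = 9.2  # embodied energy of Kalamazoo drinking water
indirect_share = 0.30  # fraction from construction, chemicals, etc.

indirect = total_mj_per_m3 * indirect_share  # MJ/m^3 of indirect inputs
direct = total_mj_per_m3 - indirect          # MJ/m^3 of direct energy use
kwh_per_m3 = total_mj_per_m3 / 3.6           # 1 kWh = 3.6 MJ

# Hypothetical 40,000 m^3/day utility: annual embodied energy in GJ.
annual_gj = total_mj_per_m3 * 40_000 * 365 / 1000
print(round(direct, 2), round(kwh_per_m3, 2), round(annual_gj))
```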

  15. Advances in multi-scale modeling of solidification and casting processes

    NASA Astrophysics Data System (ADS)

    Liu, Baicheng; Xu, Qingyan; Jing, Tao; Shen, Houfa; Han, Zhiqiang

    2011-04-01

    The development of the aviation, energy and automobile industries requires an advanced integrated product/process R&D systems which could optimize the product and the process design as well. Integrated computational materials engineering (ICME) is a promising approach to fulfill this requirement and make the product and process development efficient, economic, and environmentally friendly. Advances in multi-scale modeling of solidification and casting processes, including mathematical models as well as engineering applications are presented in the paper. Dendrite morphology of magnesium and aluminum alloy of solidification process by using phase field and cellular automaton methods, mathematical models of segregation of large steel ingot, and microstructure models of unidirectionally solidified turbine blade casting are studied and discussed. In addition, some engineering case studies, including microstructure simulation of aluminum casting for automobile industry, segregation of large steel ingot for energy industry, and microstructure simulation of unidirectionally solidified turbine blade castings for aviation industry are discussed.

  16. Radiatively coupled thermionic and thermoelectric power system concept

    NASA Technical Reports Server (NTRS)

    Shimada, K.; Ewell, R.

    1981-01-01

    The study showed that large power systems (about 100 kW) utilizing radiatively coupled thermionic or thermoelectric converters could be designed so that the power subsystem could be contained in a Space Shuttle bay as part of an electrically propelled spacecraft. The radiatively coupled system requires a large number of individual converters, since the transferred heat is smaller than with a conductively coupled system, but the advantages of the new system indicate merit for further study. These advantages are (1) good electrical isolation between converters and the heat source, (2) physical separation of converters from the heat source (making system fabrication manageable), and (3) elimination of the radiator heat pipes required in an all-heat-pipe power system. In addition, the specific weight of radiatively coupled power systems compares favorably with that of all-heat-pipe systems.

  17. Motion simulator study of longitudinal stability requirements for large delta wing transport airplanes during approach and landing with stability augmentation systems failed

    NASA Technical Reports Server (NTRS)

    Snyder, C. T.; Fry, E. B.; Drinkwater, F. J., III; Forrest, R. D.; Scott, B. C.; Benefield, T. D.

    1972-01-01

    A ground-based simulator investigation was conducted in preparation for, and correlation with, an in-flight simulator program. The objective of these studies was to define minimum acceptable levels of static longitudinal stability for the landing approach following stability augmentation system failures. The airworthiness authorities are presently attempting to establish the requirements for civil transports with only the backup flight control system operating. Using a baseline configuration representative of a large delta wing transport, 20 different configurations, many representing negative static margins, were assessed by three research test pilots in 33 hours of piloted operation. Verification of the baseline model to be used in the TIFS experiment was provided by computed and piloted comparisons with a well-validated reference airplane simulation. Pilot comments and ratings are included, as well as preliminary tracking performance and workload data.

  18. Dumbo heavy lifter aircraft

    NASA Technical Reports Server (NTRS)

    Riester, Peter; Ellis, Colleen; Wagner, Michael; Orren, Scott; Smith, Byron; Skelly, Michael; Zgraggen, Craig; Webber, Matt

    1992-01-01

    The world is rapidly changing from one with two military superpowers, with which most countries were aligned, to one with many smaller military powers. In this environment, the United States cannot depend on the availability of operating bases from which to respond to crises requiring military intervention. Several studies (e.g. the SAB Global Reach, Global Power Study) have indicated an increased need to be able to rapidly transport large numbers of troops and equipment from the continental United States to potential trouble spots throughout the world. To this end, a request for proposals (RFP) for the concept design of a large aircraft capable of 'projecting' a significant military force without reliance on surface transportation was developed. These design requirements are: minimum payload of 400,000 pounds at 2.5 g maneuver load factor; minimum unfueled range of 6,000 nautical miles; and aircraft must operate from existing domestic air bases and use existing airbases or sites of opportunity at the destination.

  19. Concepts and analysis for precision segmented reflector and feed support structures

    NASA Technical Reports Server (NTRS)

    Miller, Richard K.; Thomson, Mark W.; Hedgepeth, John M.

    1990-01-01

    Several issues surrounding the design of a large (20-meter diameter) Precision Segmented Reflector are investigated. These include the development of a reflector support truss geometry that permits deployment into the required doubly-curved shape without significant member strains. For deployable and erectable reflector support trusses, the reduction of structural redundancy was analyzed to achieve reduced weight and complexity. The stiffness and accuracy of such reduced-member trusses, however, were found to be affected to an unexpected degree. The Precision Segmented Reflector designs were developed with performance requirements representative of the Reflector application. A novel deployable sunshade concept was developed, and a detailed parametric study of various feed support structural concepts was performed. The results of the detailed study reveal what may be the most desirable feed support structure geometry for Precision Segmented Reflector/Large Deployable Reflector applications.

  20. Atmospheric density models

    NASA Technical Reports Server (NTRS)

    Mueller, A. C.

    1977-01-01

    An atmospheric model developed by Jacchia, quite accurate but requiring a large amount of computer storage and execution time, was found to be ill-suited for the space shuttle onboard program. The development of a simple atmospheric density model to simulate the Jacchia model was studied. Required characteristics including variation with solar activity, diurnal variation, variation with geomagnetic activity, semiannual variation, and variation with height were met by the new atmospheric density model.
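The abstract describes replacing the full Jacchia model with a simple density fit that still captures variation with height, solar activity, and local time. As a hedged illustration only, the sketch below uses an invented exponential form with illustrative constants; it is not the actual onboard model or the Jacchia fit:

```python
import math

def density(h_km, f107=150.0, local_solar_hour=12.0):
    """Toy thermospheric density (kg/m^3): exponential falloff with
    altitude, scaled by solar flux (F10.7) and a diurnal-bulge term.
    All constants are illustrative, not values from the study."""
    rho0, h0, scale = 3.8e-9, 200.0, 58.0  # reference density, altitude, scale height
    base = rho0 * math.exp(-(h_km - h0) / scale)
    solar = 1.0 + 0.3 * (f107 - 150.0) / 150.0            # solar-activity modulation
    diurnal = 1.0 + 0.3 * math.cos((local_solar_hour - 14.0) * math.pi / 12.0)
    return base * solar * diurnal
```

The diurnal term peaks near 14:00 local solar time, mirroring the afternoon density bulge that such models typically represent.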

  1. A link between eumelanism and calcium physiology in the barn owl

    NASA Astrophysics Data System (ADS)

    Roulin, Alexandre; Dauwe, Tom; Blust, Ronny; Eens, Marcel; Beaud, Michel

    2006-09-01

    In many animals, melanin-based coloration is strongly heritable and is largely insensitive to the environment and body condition. According to the handicap principle, such a trait may not reveal individual quality because the production of different melanin-based colorations often entails similar costs. However, a recent study showed that the production of eumelanin pigments requires relatively large amounts of calcium, potentially implying that melanin-based coloration is associated with physiological processes requiring calcium. If this is the case, eumelanism may be traded off against other metabolic processes that require the same element. We used a correlative approach to examine, for the first time, this proposition in the barn owl, a species in which individuals vary in the amount, size, and blackness of eumelanic spots. For this purpose, we measured calcium concentration in the left humerus of 85 dead owls. Results showed that the humeri of heavily spotted individuals had a higher concentration of calcium. This suggests either that plumage spottiness signals the ability to absorb calcium from the diet for both eumelanin production and storage in bones, or that lightly spotted individuals use more calcium for metabolic processes at the expense of calcium storage in bones. Our study supports the idea that eumelanin-based coloration is associated with a number of physiological processes requiring calcium.
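The correlative approach described above amounts to testing for association between two continuous measures (spot coverage and humerus calcium). A minimal sketch of that statistic, with hand-rolled Pearson correlation and no dependence on the study's actual data:

```python
def pearson_r(xs, ys):
    """Plain Pearson product-moment correlation between two samples;
    the kind of test that would relate spottiness to bone calcium.
    Any data passed in here would be hypothetical, not the study's."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy / (sxx * syy) ** 0.5
```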

  2. Tropomodulin 1 Regulation of Actin Is Required for the Formation of Large Paddle Protrusions Between Mature Lens Fiber Cells.

    PubMed

    Cheng, Catherine; Nowak, Roberta B; Biswas, Sondip K; Lo, Woo-Kuen; FitzGerald, Paul G; Fowler, Velia M

    2016-08-01

    To elucidate the proteins required for specialized small interlocking protrusions and large paddle domains at lens fiber cell tricellular junctions (vertices), we developed a novel method to immunostain single lens fibers and studied changes in cell morphology due to loss of tropomodulin 1 (Tmod1), an F-actin pointed end-capping protein. We investigated F-actin and F-actin-binding protein localization in interdigitations of Tmod1+/+ and Tmod1-/- single mature lens fibers. F-actin-rich small protrusions and large paddles were present along cell vertices of Tmod1+/+ mature fibers. In contrast, Tmod1-/- mature fiber cells lack normal paddle domains, while small protrusions were unaffected. In Tmod1+/+ mature fibers, Tmod1, β2-spectrin, and α-actinin are localized in large puncta in valleys between paddles; but in Tmod1-/- mature fibers, β2-spectrin was dispersed while α-actinin was redistributed at the base of small protrusions and rudimentary paddles. Fimbrin and Arp3 (actin-related protein 3) were located in puncta at the base of small protrusions, while N-cadherin and ezrin outlined the cell membrane in both Tmod1+/+ and Tmod1-/- mature fibers. These results suggest that distinct F-actin organizations are present in small protrusions versus large paddles. Formation and/or maintenance of large paddle domains depends on a β2-spectrin-actin network stabilized by Tmod1. α-Actinin-crosslinked F-actin bundles are enhanced in the absence of Tmod1, indicating altered cytoskeleton organization. Formation of small protrusions is likely facilitated by Arp3-branched and fimbrin-bundled F-actin networks, which do not depend on Tmod1. This is the first work to reveal the F-actin-associated proteins required for the formation of paddles between lens fibers.

  3. I'll take that to go: Big data bags and minimal identifiers for exchange of large, complex datasets

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chard, Kyle; D'Arcy, Mike; Heavner, Benjamin D.

    Big data workflows often require the assembly and exchange of complex, multi-element datasets. For example, in biomedical applications, the input to an analytic pipeline can be a dataset consisting of thousands of images and genome sequences assembled from diverse repositories, requiring a description of the contents of the dataset in a concise and unambiguous form. Typical approaches to creating datasets for big data workflows assume that all data reside in a single location, requiring costly data marshaling and permitting errors of omission and commission because dataset members are not explicitly specified. We address these issues by proposing simple methods and tools for assembling, sharing, and analyzing large and complex datasets that scientists can easily integrate into their daily workflows. These tools combine a simple and robust method for describing data collections (BDBags), data descriptions (Research Objects), and simple persistent identifiers (Minids) to create a powerful ecosystem of tools and services for big data analysis and sharing. We present these tools and use biomedical case studies to illustrate their use for the rapid assembly, sharing, and analysis of large datasets.
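The "explicitly specified members" idea rests on fixity: each file in the bag is listed with a checksum, so omissions and corruptions are detectable. As a hedged sketch of that core (the real BDBag tooling adds tag files, fetch.txt for remote members, Research Object metadata, and Minid minting, none of which is shown here):

```python
import hashlib
import os

def make_manifest(root):
    """BagIt-style fixity manifest: map each file's path (relative to
    the bag root) to its SHA-256 digest, streamed in chunks so very
    large files never sit fully in memory."""
    manifest = {}
    for dirpath, _, files in os.walk(root):
        for name in sorted(files):
            path = os.path.join(dirpath, name)
            h = hashlib.sha256()
            with open(path, "rb") as f:
                for chunk in iter(lambda: f.read(65536), b""):
                    h.update(chunk)
            manifest[os.path.relpath(path, root)] = h.hexdigest()
    return manifest
```

A receiver recomputes the same manifest and compares: a missing key is an omission, a mismatched digest is corruption or substitution.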

  4. 3D finite element analysis of tightening process of bolt and nut connections with pitch difference

    NASA Astrophysics Data System (ADS)

    Liu, X.; Noda, N.-A.; Sano, Y.; Huang, Y. T.; Takase, Y.

    2018-06-01

    In a wide industrial field, the bolt-nut joint is unitized as an important machine element and anti-loosening performance is always required. In this paper, the effect of a slight pitch difference between a bolt and nut is studied. Firstly, by varying the pitch difference, the prevailing torque required for the nut rotation, before the nut touches the clamped body, is measured experimentally. Secondly, the tightening torque is determined as a function of the axial force of the bolt after the nut touches the clamped body. The results show that a large pitch difference may provide a large prevailing torque that produces an anti-loosening effect, although a very large pitch difference may deteriorate the bolt axial force under a given tightening torque. Thirdly, a suitable pitch difference is determined taking into account the anti-loosening and clamping abilities. Furthermore, the chamfered corners at the nut ends are considered, and it is found that the 3D finite element analysis that includes the chamfered nut threads agrees well with the experimental observation. Finally, the most desirable pitch difference for improving anti-loosening is proposed.
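The torque-tension relationship the study measures is, in its standard short form, T = K·F·d, where K is the nut factor lumping thread and bearing friction. As a hedged reference point only (the paper derives the relation from FEA and experiment; the K value below is a common textbook default, not from the study):

```python
def tightening_torque(axial_force_n, nominal_dia_m, nut_factor=0.2):
    """Short-form torque-tension relation T = K * F * d, in N*m.
    nut_factor (K) lumps thread and under-head friction; 0.2 is a
    conventional default for unlubricated steel, not a study value."""
    return nut_factor * axial_force_n * nominal_dia_m
```

For a pitch-difference joint, the prevailing torque measured before the nut seats would add on top of this seating-torque estimate.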

  5. Low-thrust chemical orbit transfer propulsion

    NASA Technical Reports Server (NTRS)

    Pelouch, J. J., Jr.

    1979-01-01

    The need for large structures in high orbit is reported in terms of the many mission opportunities which require such structures. Mission and transportation options for large structures are presented, and it is shown that low-thrust propulsion is an enabling requirement for some missions and greatly enhancing to many others. Electric and low-thrust chemical propulsion are compared, and the need for, and requirements of, low-thrust chemical propulsion are discussed in terms of the interactions that are perceived to exist between the propulsion system and the large structure.

  6. Tolerance Studies of the Mu2e Solenoid System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lopes, M. L.; Ambrosio, G.; Buehler, M.

    2014-01-01

    The muon-to-electron conversion experiment at Fermilab is designed to explore charged lepton flavor violation. It is composed of three large superconducting solenoids, namely, the production solenoid, the transport solenoid, and the detector solenoid. Each subsystem has a set of field requirements. Tolerance sensitivity studies of the magnet system were performed with the objective of demonstrating that the present magnet design meets all the field requirements. Systematic and random errors were considered on the position and alignment of the coils. The study helps to identify the critical sources of error, which are translated into coil manufacturing and mechanical support tolerances.
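A random-error tolerance study of the kind described is typically a Monte Carlo exercise: perturb each coil within a candidate tolerance, evaluate the field, and count how often the requirement still holds. The sketch below is a hedged illustration with a made-up linearized field response; the real Mu2e study would use a full magnetic model, not this toy:

```python
import random

def field_error(offsets_mm):
    """Hypothetical linearized response: field deviation grows with
    each coil's positional offset. Coefficient is illustrative only."""
    return sum(0.01 * abs(dx) for dx in offsets_mm)

def monte_carlo_pass_rate(n_coils, tol_mm, limit, trials=2000, seed=1):
    """Perturb every coil uniformly within +/- tol_mm and report the
    fraction of trials in which the field requirement (limit) is met."""
    rng = random.Random(seed)
    ok = 0
    for _ in range(trials):
        offsets = [rng.uniform(-tol_mm, tol_mm) for _ in range(n_coils)]
        if field_error(offsets) <= limit:
            ok += 1
    return ok / trials
```

Sweeping `tol_mm` until the pass rate drops below an acceptance threshold is what "translates" field requirements into manufacturing tolerances.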

  7. Large Deployable Reflector (LDR) - A concept for an orbiting submillimeter-infrared telescope for the 1990s

    NASA Technical Reports Server (NTRS)

    Swanson, P. N.; Gulkis, S.; Kuiper, T. B. H.; Kiya, M.

    1983-01-01

    The history and background of the Large Deployable Reflector (LDR) are reviewed. The results of the June 1982 Asilomar (CA) workshop are incorporated into the LDR science objectives and telescope concept. The areas where the LDR may have the greatest scientific impact are in the study of star formation and planetary systems in our own and nearby galaxies and in cosmological studies of the structure and evolution of the early universe. The observational requirements for these and other scientific studies give rise to a set of telescope functional requirements. These, in turn, are satisfied by an LDR configuration which is a Cassegrain design with a 20 m diameter, actively controlled, segmented primary reflector, diffraction limited at a wavelength of 30 to 50 microns. Technical challenges in the LDR development include construction of high tolerance mirror segments, surface figure measurement, figure control, vibration control, pointing, cryogenics, and coherent detectors. Project status and future plans for the LDR are discussed.

  8. Evaluation of genotoxicity testing of FDA approved large molecule therapeutics.

    PubMed

    Sawant, Satin G; Fielden, Mark R; Black, Kurt A

    2014-10-01

    Large molecule therapeutics (MW > 1000 daltons) are not expected to enter the cell and thus have reduced potential to interact directly with DNA or related physiological processes. Genotoxicity studies are therefore not relevant and typically not required for large molecule therapeutic candidates. Regulatory guidance supports this approach; however, there are examples of marketed large molecule therapeutics where sponsors have conducted genotoxicity studies. A retrospective analysis was performed on genotoxicity studies of United States FDA approved large molecule therapeutics since 1998, identified through the Drugs@FDA website. This information was used to provide a data-driven rationale for genotoxicity evaluations of large molecule therapeutics. Fifty-three of the 99 therapeutics identified were tested for genotoxic potential. None of the therapeutics tested showed a positive outcome in any study except the peptide glucagon (GlucaGen®), which showed equivocal in vitro results, as stated in the product labeling. Scientific rationale and data from this review indicate that testing of a majority of large molecule modalities does not add value to risk assessment and supports current regulatory guidance. Similarly, the data do not support testing of peptides containing only natural amino acids. Peptides containing non-natural amino acids and small molecules in conjugated products may need to be tested. Copyright © 2014 Elsevier Inc. All rights reserved.

  9. Low-Field-Triggered Large Magnetostriction in Iron-Palladium Strain Glass Alloys.

    PubMed

    Ren, Shuai; Xue, Dezhen; Ji, Yuanchao; Liu, Xiaolian; Yang, Sen; Ren, Xiaobing

    2017-09-22

    Development of miniaturized magnetostriction-associated devices requires low-field-triggered large magnetostriction. In this study, we acquired a large magnetostriction (800 ppm) triggered by a low saturation field (0.8 kOe) in iron-palladium (Fe-Pd) alloys. Magnetostriction enhancement jumping from 340 to 800 ppm was obtained with a slight increase in Pd concentration from 31.3 to 32.3 at. %. Further analysis showed that such a slight increase led to suppression of the long-range ordered martensitic phase and resulted in a frozen short-range ordered strain glass state. This strain glass state possessed a two-phase nanostructure with nanosized frozen strain domains embedded in the austenite matrix, which was responsible for the unique magnetostriction behavior. Our study provides a way to design novel magnetostrictive materials with low-field-triggered large magnetostriction.

  10. Separation of large DNA molecules by applying pulsed electric field to size exclusion chromatography-based microchip

    NASA Astrophysics Data System (ADS)

    Azuma, Naoki; Itoh, Shintaro; Fukuzawa, Kenji; Zhang, Hedong

    2018-02-01

    Through electrophoresis driven by a pulsed electric field, we succeeded in separating large DNA molecules with an electrophoretic microchip based on size exclusion chromatography (SEC), which was proposed in our previous study. The conditions of the pulsed electric field required to achieve the separation were determined by numerical analyses using our originally proposed separation model. From the numerical results, we succeeded in separating large DNA molecules (λ DNA and T4 DNA) within 1600 s, which was approximately half of that achieved under a direct electric field in our previous study. Our SEC-based electrophoresis microchip will be one of the effective tools to meet the growing demand of faster and more convenient separation of large DNA molecules, especially in the field of epidemiological research of infectious diseases.

  11. Feasibility study for the application of the large format camera as a payload for the Orbiter program

    NASA Technical Reports Server (NTRS)

    1978-01-01

    The large format camera (LFC) designed as a 30 cm focal length cartographic camera system that employs forward motion compensation in order to achieve the full image resolution provided by its 80 degree field angle lens is described. The feasibility of application of the current LFC design to deployment in the orbiter program as the Orbiter Camera Payload System was assessed and the changes that are necessary to meet such a requirement are discussed. Current design and any proposed design changes were evaluated relative to possible future deployment of the LFC on a free flyer vehicle or in a WB-57F. Preliminary mission interface requirements for the LFC are given.

  12. Solar Cell and Array Technology Development for NASA Solar Electric Propulsion Missions

    NASA Technical Reports Server (NTRS)

    Piszczor, Michael; McNatt, Jeremiah; Mercer, Carolyn; Kerslake, Tom; Pappa, Richard

    2012-01-01

    NASA is currently developing advanced solar cell and solar array technologies to support future exploration activities. These advanced photovoltaic technology development efforts are needed to enable very large (multi-hundred kilowatt) power systems that must be compatible with solar electric propulsion (SEP) missions. The technology being developed must address a wide variety of requirements and cover the necessary advances in solar cell, blanket integration, and large solar array structures that are needed for this class of missions. This paper will summarize NASA's plans for high power SEP missions, initial mission studies and power system requirements, plans for advanced photovoltaic technology development, and the status of specific cell and array technology development and testing that have already been conducted.

  13. Software Engineering for Scientific Computer Simulations

    NASA Astrophysics Data System (ADS)

    Post, Douglass E.; Henderson, Dale B.; Kendall, Richard P.; Whitney, Earl M.

    2004-11-01

    Computer simulation is becoming a very powerful tool for analyzing and predicting the performance of fusion experiments. Simulation efforts are evolving from including only a few effects to many effects, from small teams with a few people to large teams, and from workstations and small processor count parallel computers to massively parallel platforms. Successfully making this transition requires attention to software engineering issues. We report on the conclusions drawn from a number of case studies of large scale scientific computing projects within DOE, academia and the DoD. The major lessons learned include attention to sound project management including setting reasonable and achievable requirements, building a good code team, enforcing customer focus, carrying out verification and validation and selecting the optimum computational mathematics approaches.

  14. Cancer prevention clinical trials.

    PubMed

    Nixon, D W

    1994-01-01

    Many kinds of cancer are preventable. Avoidance of tobacco would essentially eliminate lung cancer and most head and neck cancers as well. Other common cancers (breast, colon, prostate) are related to diet and therefore may also be preventable, at least in part. Abundant epidemiologic and laboratory data link specific nutrients including fat, fiber and vitamins to cancer so that appropriate manipulation of these constituents might reduce cancer risk. Determination of appropriate manipulations requires prospective clinical trials in humans. Approximately 40 such trials are in progress. Some have been completed with encouraging results. Future large scale trials will require designs that overcome the barriers of cost, large subject numbers and long study duration. The use of "intermediate markers" rather than cancer end points is a strategy that will help overcome these barriers.

  15. Large Deployable Reflector (LDR) Requirements for Space Station Accommodations

    NASA Technical Reports Server (NTRS)

    Crowe, D. A.; Clayton, M. J.; Runge, F. C.

    1985-01-01

    Top level requirements for assembly and integration of the Large Deployable Reflector (LDR) Observatory at the Space Station are examined. Concepts are currently under study for LDR which will provide a sequel to the Infrared Astronomy Satellite and the Space Infrared Telescope Facility. LDR will provide a spectacular capability over a very broad spectral range. The Space Station will provide an essential facility for the initial assembly and check out of LDR, as well as a necessary base for refurbishment, repair and modification. By providing a manned platform, the Space Station will remove the time constraint on assembly associated with use of the Shuttle alone. Personnel safety during necessary EVA is enhanced by the presence of the manned facility.

  17. Concurrent access to a virtual microscope using a web service oriented architecture

    NASA Astrophysics Data System (ADS)

    Corredor, Germán.; Iregui, Marcela; Arias, Viviana; Romero, Eduardo

    2013-11-01

    Virtual microscopy (VM) facilitates visualization and deployment of histopathological virtual slides (VS), a useful tool for education, research and diagnosis. In recent years it has become popular, yet its use is still limited, largely because of the very large size of VS, typically of the order of gigabytes. Such volumes of data require efficacious and efficient strategies to access VS content. In an educational or research scenario, several users may need to access and interact with VS at the same time, so, due to the large data size, a very expensive and powerful infrastructure is usually required. This article introduces a novel JPEG2000-based service oriented architecture for streaming and visualizing very large images under scalable strategies, which in addition does not require very specialized infrastructure. Results suggest that the proposed architecture enables transmission and simultaneous visualization of large images while using resources efficiently and offering users proper response times.
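The reason a gigapixel slide can be served to many concurrent users is that a tiled, multi-resolution codestream (which JPEG2000 provides natively) lets the server send only the tiles a viewport touches. A hedged sketch of that core calculation (tile size and the flat single-level layout are assumptions for illustration, not the article's protocol):

```python
def visible_tiles(view_x, view_y, view_w, view_h, tile_size=256):
    """Return the (col, row) indices of the tiles at one pyramid level
    that intersect a viewport. The server streams only these tiles, so
    per-request cost depends on viewport size, not full-slide size."""
    c0, r0 = view_x // tile_size, view_y // tile_size
    c1 = (view_x + view_w - 1) // tile_size
    r1 = (view_y + view_h - 1) // tile_size
    return [(c, r) for r in range(r0, r1 + 1) for c in range(c0, c1 + 1)]
```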

  18. Large Deployable Reflector Technologies for Future European Telecom and Earth Observation Missions

    NASA Astrophysics Data System (ADS)

    Ihle, A.; Breunig, E.; Dadashvili, L.; Migliorelli, M.; Scialino, L.; van't Klosters, K.; Santiago-Prowald, J.

    2012-07-01

    This paper presents requirements, analysis and design results for European large deployable reflectors (LDR) for space applications. For telecommunications, the foreseeable use of large reflectors is associated with the continuous demand for improved performance of mobile services. On the other hand, several earth observation (EO) missions can be identified carrying either active or passive remote sensing instruments (or both), in which a large effective aperture is needed, e.g., BIOMASS. From the European point of view there is total dependence on US industry, as such LDRs are not available from European suppliers. The RESTEO study is part of a number of ESA led activities to facilitate European LDR development. This paper is focused on the structural-mechanical aspects of this study. We identify the general requirements for LDRs with special emphasis on launcher accommodation for EO missions. In the next step, optimal concepts for the LDR structure and the RF surface are reviewed. Regarding the RF surface, both a knitted metal mesh and a shell membrane based on carbon fibre reinforced silicone (CFRS) are considered. In terms of the backing structure, the peripheral ring concept is identified as most promising and a large number of options for the deployment kinematics are discussed. Of those, pantographic kinematics and a conical peripheral ring are selected. A preliminary design for these two most promising LDR concepts is performed, which includes static, modal and kinematic simulation and also techniques to generate the reflector nets.

  19. LSST system analysis and integration task for an advanced science and application space platform

    NASA Technical Reports Server (NTRS)

    1980-01-01

    To support the development of an advanced science and application space platform (ASASP), requirements were defined for a representative set of payloads requiring large separation distances, selected from the Science and Applications Space Platform data base. These payloads were a 100 meter diameter atmospheric gravity wave antenna, a 100 meter by 100 meter particle beam injection experiment, a 2 meter diameter, 18 meter long astrometric telescope, and a 15 meter diameter, 35 meter long large ambient deployable IR telescope. A low earth orbit at 500 km altitude and 56 deg inclination was selected as being the best compromise for meeting payload requirements. Platform subsystems were defined which would support the payload requirements and a physical platform concept was developed. Structural system requirements which included utilities accommodation, interface requirements, and platform strength and stiffness requirements were developed. An attitude control system concept was also described. The resultant ASASP concept was analyzed and technological developments deemed necessary in the area of large space systems were recommended.

  20. Hyponatremia and fractures: should hyponatremia be further studied as a potential biochemical risk factor to be included in FRAX algorithms?

    PubMed

    Ayus, J C; Bellido, T; Negri, A L

    2017-05-01

    The Fracture Risk Assessment Tool (FRAX®) was developed by the WHO Collaborating Centre for metabolic bone diseases to evaluate fracture risk of patients. It is based on patient models that integrate the risk associated with clinical variables and bone mineral density (BMD) at the femoral neck. The clinical risk factors included in FRAX were chosen to include only well-established and independent variables related to skeletal fracture risk. The FRAX tool has acquired worldwide acceptance despite having several limitations. FRAX models have not included biochemical derangements in estimation of fracture risk due to the lack of validation in large prospective studies. Recently, there has been an increasing number of studies showing a relationship between hyponatremia and the occurrence of fractures. Hyponatremia is the most frequent electrolyte abnormality measured in the clinic, and serum sodium concentration is a very reproducible, affordable, and readily obtainable measurement. Thus, we think that hyponatremia should be further studied as a biochemical risk factor for predicting skeletal fractures, particularly those at the hip, which carry the greatest morbidity and mortality. Achieving this will require the collection of large patient cohorts from diverse geographical locations that include a measure of serum sodium in addition to the other FRAX variables, in both sexes and over a wide age range. It would also require the inclusion of data on duration and severity of hyponatremia. Information will be required both on the risk of fracture associated with the occurrence and length of exposure to hyponatremia and on the relationship with the other risk variables included in FRAX, as well as on the independent effect on the occurrence of death, which is increased by hyponatremia.

  1. A design study of a reaction control system for a V/STOL fighter/attack aircraft

    NASA Technical Reports Server (NTRS)

    Beard, B. B.; Foley, W. H.

    1983-01-01

    Attention is given to a short takeoff vertical landing (STOVL) aircraft reaction control system (RCS) design study. The STOVL fighter/attack aircraft employs an existing turbofan engine, and its hover requirement places a premium on weight reduction, which eliminates prospective nonairbreathing RCSs. A simple engine compressor bleed RCS degrades overall performance to an unacceptable degree, and the supersonic requirement precludes the large volume alternatives of thermal or ejector thrust augmentation systems as well as the ducting of engine exhaust gases and the use of a dedicated turbojet. The only system which addressed performance criteria without requiring major engine modifications was a dedicated load compressor driven by an auxiliary power unit.

  2. Solar power satellite: System definition study. Part 1, volume 1: Executive summary

    NASA Technical Reports Server (NTRS)

    1977-01-01

    A study of the solar power satellite system, which represents a means of tapping baseload electric utility power from the sun on a large scale, was summarized. Study objectives, approach, and planning are presented along with an energy conversion evaluation. Basic requirements were considered in regard to space transportation, construction, and maintainability.

  3. Economics of ion propulsion for large space systems

    NASA Technical Reports Server (NTRS)

    Masek, T. D.; Ward, J. W.; Rawlin, V. K.

    1978-01-01

    This study of advanced electrostatic ion thrusters for space propulsion was initiated to determine the suitability of the baseline 30-cm thruster for future missions and to identify other thruster concepts that would better satisfy mission requirements. The general scope of the study was to review mission requirements, select thruster designs to meet these requirements, assess the associated thruster technology requirements, and recommend short- and long-term technology directions that would support future thruster needs. Preliminary design concepts for several advanced thrusters were developed to assess the potential practical difficulties of a new design. This study produced useful general methodologies for assessing both planetary and earth orbit missions. For planetary missions, the assessment is in terms of payload performance as a function of propulsion system technology level. For earth orbit missions, the assessment is made on the basis of cost (cost sensitivity to propulsion system technology level).

  4. Survey of Large Methane Emitters in North America

    NASA Astrophysics Data System (ADS)

    Deiker, S.

    2017-12-01

    It has been theorized that methane emissions in the oil and gas industry follow log normal or "fat tail" distributions, with large numbers of small sources for every very large source. Such distributions would have significant policy and operational implications. Unfortunately, by their very nature such distributions would require large sample sizes to verify. Until recently, such large-scale studies would be prohibitively expensive. The largest public study to date sampled 450 wells, an order of magnitude too low to effectively constrain these models. During 2016 and 2017, Kairos Aerospace conducted a series of surveys with the LeakSurveyor imaging spectrometer mounted on light aircraft. This small, lightweight instrument was designed to rapidly locate large emission sources. The resulting survey covers over three million acres of oil and gas production. This includes over 100,000 wells, thousands of storage tanks and over 7,500 miles of gathering lines. This data set allows us to now probe the distribution of large methane emitters. Results of this survey, and implications for methane emission distribution, methane policy and LDAR will be discussed.
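The "fat tail" claim is easy to make concrete: under a log-normal with a wide spread, a small fraction of sources accounts for most of the total. The sketch below draws synthetic emission rates (parameters are illustrative, not fitted to any survey data) and computes the emissions share of the top few percent of sources:

```python
import random

def share_from_top(n=100000, frac=0.05, mu=0.0, sigma=2.0, seed=7):
    """Draw n log-normal 'emission rates' and return the fraction of
    total emissions contributed by the top `frac` of sources.
    mu/sigma are illustrative, not fitted distribution parameters."""
    rng = random.Random(seed)
    rates = sorted((rng.lognormvariate(mu, sigma) for _ in range(n)),
                   reverse=True)
    top = int(n * frac)
    return sum(rates[:top]) / sum(rates)
```

With sigma = 2 the top 5% of sources carry well over half of total emissions, which is why surveys that can rapidly locate only the large emitters still capture most of the inventory.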

  5. An Engineering Design Reference Mission for a Future Large-Aperture UVOIR Space Observatory

    NASA Astrophysics Data System (ADS)

    Thronson, Harley A.; Bolcar, Matthew R.; Clampin, Mark; Crooke, Julie A.; Redding, David; Rioux, Norman; Stahl, H. Philip

    2016-01-01

    From the 2010 NRC Decadal Survey and the NASA Thirty-Year Roadmap, Enduring Quests, Daring Visions, to the recent AURA report, From Cosmic Birth to Living Earths, multiple community assessments have recommended development of a large-aperture UVOIR space observatory capable of achieving a broad range of compelling scientific goals. Of these priority science goals, the most technically challenging is the search for spectroscopic biomarkers in the atmospheres of exoplanets in the solar neighborhood. Here we present an engineering design reference mission (EDRM) for the Advanced Technology Large-Aperture Space Telescope (ATLAST), which was conceived from the start as capable of breakthrough science paired with an emphasis on cost control and cost effectiveness. An EDRM allows the engineering design trade space to be explored in depth to determine what are the most demanding requirements and where there are opportunities for margin against requirements. Our joint NASA GSFC/JPL/MSFC/STScI study team has used community-provided science goals to derive mission needs, requirements, and candidate mission architectures for a future large-aperture, non-cryogenic UVOIR space observatory. The ATLAST observatory is designed to operate at a Sun-Earth L2 orbit, which provides a stable thermal environment and excellent field of regard. Our reference designs have emphasized a serviceable 36-segment 9.2 m aperture telescope that stows within a five-meter diameter launch vehicle fairing. As part of our cost-management effort, this particular reference mission builds upon the engineering design for JWST. Moreover, it is scalable to a variety of launch vehicle fairings. Performance needs developed under the study are traceable to a variety of additional reference designs, including options for a monolithic primary mirror.

  6. Singlet oxygen detection in biological systems: Uses and limitations.

    PubMed

    Koh, Eugene; Fluhr, Robert

    2016-07-02

    The study of singlet oxygen in biological systems is challenging in many ways. Singlet oxygen is a short-lived, highly reactive molecule that interacts with many biomolecules, making it difficult to quantify accurately. Several methods have been developed to study this elusive molecule, but most studies thus far have focused on conditions that produce relatively large amounts of singlet oxygen. However, more sensitive methods are required as one begins to explore the lower levels of singlet oxygen involved in signaling and regulatory processes. Here we discuss the various methods used in the study of singlet oxygen, and outline their uses and limitations.

  7. A Preliminary Study of a Solar-Probe Mission

    NASA Technical Reports Server (NTRS)

    Dugan, Duane W.

    1961-01-01

    A preliminary study is made of some problems associated with the sending of an instrumented probe close to the Sun for the purpose of gathering and telemetering back to Earth information concerning solar phenomena and circumsolar space. The problems considered are primarily those relating to heating and to launch requirements. A nonanalytic discussion of the communications problem of a solar-probe mission is presented to obtain order-of-magnitude estimates of the output and weight of an auxiliary power supply which might be required. From the study it is believed that approaches to the Sun as close as about 4 or 5 million miles do not present insuperable difficulties insofar as heating and communications are concerned. Guidance requirements, in general, do not appear to be stringent. However, in terms of current experience, velocity requirements may be large. It is found, for example, that to achieve perihelion distances between the orbit of Mercury and the visible disc of the Sun, total burnout velocities ranging between 50,000 and 100,000 feet per second are required.
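    The quoted burnout velocities can be cross-checked with a back-of-the-envelope vis-viva estimate. This is a sketch under idealized two-body, coplanar-transfer assumptions, not the report's actual analysis; the constants and the function name are illustrative:

```python
import math

MU_SUN = 1.327e20          # solar gravitational parameter, m^3/s^2
R_EARTH_ORBIT = 1.496e11   # 1 AU in m
V_EARTH = 29_780           # Earth's heliocentric orbital speed, m/s
V_ESC = 11_186             # escape speed from Earth's surface, m/s
M_PER_MILE = 1609.344
FT_PER_M = 3.28084

def burnout_speed_fps(perihelion_miles: float) -> float:
    """Ideal total burnout speed (ft/s) for a transfer from Earth's
    orbit down to the given perihelion distance."""
    r_p = perihelion_miles * M_PER_MILE
    a = (R_EARTH_ORBIT + r_p) / 2            # semi-major axis of transfer ellipse
    # vis-viva: heliocentric speed at aphelion (Earth's distance)
    v_aphelion = math.sqrt(MU_SUN * (2 / R_EARTH_ORBIT - 1 / a))
    v_inf = V_EARTH - v_aphelion             # hyperbolic excess wrt Earth
    # combine with Earth escape speed to get burnout speed from the surface
    return math.sqrt(V_ESC**2 + v_inf**2) * FT_PER_M

print(burnout_speed_fps(4e6))   # perihelion of about 4 million miles
```

For a perihelion near 4 million miles this yields roughly 80,000 ft/s, consistent with the 50,000-100,000 ft/s range cited in the abstract.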

  8. Solving large scale structure in ten easy steps with COLA

    NASA Astrophysics Data System (ADS)

    Tassev, Svetlin; Zaldarriaga, Matias; Eisenstein, Daniel J.

    2013-06-01

    We present the COmoving Lagrangian Acceleration (COLA) method: an N-body method for solving for Large Scale Structure (LSS) in a frame that is comoving with observers following trajectories calculated in Lagrangian Perturbation Theory (LPT). Unlike standard N-body methods, the COLA method can straightforwardly trade accuracy at small scales in order to gain computational speed without sacrificing accuracy at large scales. This is especially useful for cheaply generating large ensembles of accurate mock halo catalogs required to study galaxy clustering and weak lensing, as those catalogs are essential for performing detailed error analysis for ongoing and future surveys of LSS. As an illustration, we ran a COLA-based N-body code on a box of size 100 Mpc/h with particles of mass ≈ 5 × 10^9 Msolar/h. Running the code with only 10 timesteps was sufficient to obtain an accurate description of halo statistics down to halo masses of at least 10^11 Msolar/h. This is only at a modest speed penalty when compared to mocks obtained with LPT. A standard detailed N-body run is orders of magnitude slower than our COLA-based code. The speed-up we obtain with COLA is due to the fact that we calculate the large-scale dynamics exactly using LPT, while letting the N-body code solve for the small scales, without requiring it to capture exactly the internal dynamics of halos. Achieving a similar level of accuracy in halo statistics without the COLA method requires at least 3 times more timesteps than when COLA is employed.
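    The core idea — integrating only the residual motion about an analytically known trajectory, so that a handful of coarse timesteps suffice — can be illustrated with a one-particle toy. This is an analogy only, not the actual particle-mesh COLA code: a pendulum stands in for gravity, and its linearized solution plays the role of the LPT trajectory:

```python
import math

def accel(x):
    """Full nonlinear 'gravity' for the toy problem: a pendulum."""
    return -math.sin(x)

def leapfrog(x, v, dt, n):
    """Plain kick-drift-kick leapfrog on the full equation of motion."""
    for _ in range(n):
        v += 0.5 * dt * accel(x)
        x += dt * v
        v += 0.5 * dt * accel(x)
    return x, v

def cola_style(x0, dt, n):
    """Evolve only the residual r(t) about the analytic linear solution
    x_ref(t) = x0*cos(t) -- the toy analogue of an LPT trajectory."""
    def res_accel(r, t):
        x_ref = x0 * math.cos(t)
        return accel(x_ref + r) - (-x_ref)   # subtract the reference acceleration
    r, vr, t = 0.0, 0.0, 0.0
    for _ in range(n):
        vr += 0.5 * dt * res_accel(r, t)
        r += dt * vr
        t += dt
        vr += 0.5 * dt * res_accel(r, t)
    return x0 * math.cos(t) + r              # reassemble the full position

# five coarse steps in the "comoving" frame vs. a fine brute-force run
x_cola = cola_style(0.5, dt=0.4, n=5)
x_true, _ = leapfrog(0.5, 0.0, dt=1e-3, n=2000)
print(abs(x_cola - x_true))
```

Because the residual is small and smooth, the coarse-stepped residual integration stays close to the finely resolved solution, mirroring how COLA captures large-scale dynamics with very few timesteps.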

  9. 50 CFR 216.92 - Dolphin-safe requirements for tuna harvested in the ETP by large purse seine vessels.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 50 Wildlife and Fisheries 7 2010-10-01 2010-10-01 false Dolphin-safe requirements for tuna... MAMMALS REGULATIONS GOVERNING THE TAKING AND IMPORTING OF MARINE MAMMALS Dolphin Safe Tuna Labeling § 216.92 Dolphin-safe requirements for tuna harvested in the ETP by large purse seine vessels. (a) U.S...

  10. 50 CFR 216.92 - Dolphin-safe requirements for tuna harvested in the ETP by large purse seine vessels.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 50 Wildlife and Fisheries 9 2011-10-01 2011-10-01 false Dolphin-safe requirements for tuna... MAMMALS REGULATIONS GOVERNING THE TAKING AND IMPORTING OF MARINE MAMMALS Dolphin Safe Tuna Labeling § 216.92 Dolphin-safe requirements for tuna harvested in the ETP by large purse seine vessels. (a) U.S...

  11. 50 CFR 216.92 - Dolphin-safe requirements for tuna harvested in the ETP by large purse seine vessels.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 50 Wildlife and Fisheries 10 2012-10-01 2012-10-01 false Dolphin-safe requirements for tuna... MAMMALS REGULATIONS GOVERNING THE TAKING AND IMPORTING OF MARINE MAMMALS Dolphin Safe Tuna Labeling § 216.92 Dolphin-safe requirements for tuna harvested in the ETP by large purse seine vessels. (a) U.S...

  12. 50 CFR 216.92 - Dolphin-safe requirements for tuna harvested in the ETP by large purse seine vessels.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 50 Wildlife and Fisheries 10 2014-10-01 2014-10-01 false Dolphin-safe requirements for tuna... MAMMALS REGULATIONS GOVERNING THE TAKING AND IMPORTING OF MARINE MAMMALS Dolphin Safe Tuna Labeling § 216.92 Dolphin-safe requirements for tuna harvested in the ETP by large purse seine vessels. (a) U.S...

  13. 50 CFR 216.92 - Dolphin-safe requirements for tuna harvested in the ETP by large purse seine vessels.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 50 Wildlife and Fisheries 10 2013-10-01 2013-10-01 false Dolphin-safe requirements for tuna... MAMMALS REGULATIONS GOVERNING THE TAKING AND IMPORTING OF MARINE MAMMALS Dolphin Safe Tuna Labeling § 216.92 Dolphin-safe requirements for tuna harvested in the ETP by large purse seine vessels. (a) U.S...

  14. Performance of ceramic superconductors in magnetic bearings

    NASA Technical Reports Server (NTRS)

    Kirtley, James L., Jr.; Downer, James R.

    1993-01-01

    Magnetic bearings are large-scale applications of magnet technology, quite similar in certain ways to synchronous machinery. They require substantial flux density over relatively large volumes of space. Large flux density is required to achieve satisfactory force density. Satisfactory dynamic response requires that magnetic circuit permeances not be too large, implying large air gaps. Superconductors, which offer large magnetomotive forces and high flux density in low-permeance circuits, appear to be desirable in these situations. Flux densities substantially in excess of those possible with iron can be produced, and no ferromagnetic material is required. Thus the inductance of active coils can be made low, indicating good dynamic response of the bearing system. The principal difficulty in using superconductors is, of course, the deep cryogenic temperatures at which they must operate. Because of the difficulties in working with liquid helium, the possibility of superconductors that can be operated in liquid nitrogen is thought to extend the number and range of applications of superconductivity. Critical temperatures of about 98 K have been demonstrated in a class of materials which are, in fact, ceramics, and these new materials have attracted considerable public attention. There is, however, a difficulty with the ceramic superconducting materials developed to date: current densities sufficient for use in large-scale applications have not been demonstrated. To be useful, superconductors must be capable of carrying substantial currents in the presence of large magnetic fields. The possible use of ceramic superconductors in magnetic bearings is investigated and discussed, and the requirements that superconductors operating at liquid-nitrogen temperature must meet to be competitive with niobium-titanium superconductors operating at liquid-helium temperature are identified.

  15. Recurrent respiratory papillomatosis: a longitudinal study comparing severity associated with human papilloma viral types 6 and 11 and other risk factors in a large pediatric population.

    PubMed

    Wiatrak, Brian J; Wiatrak, Deborah W; Broker, Thomas R; Lewis, Linda

    2004-11-01

    A database was developed for prospective, longitudinal study of recurrent respiratory papillomatosis (RRP) in a large population of pediatric patients. Data recorded for each patient included epidemiological factors, human papilloma virus (HPV) type, clinical course, staged severity of disease at each surgical intervention, and frequency of surgical intervention. The study hypothesizes that patients with HPV type 11 (HPV-11) and patients younger than 3 years of age at diagnosis are at risk for more aggressive and extensive disease. The 10-year prospective epidemiological study used disease staging for each patient with an original scoring system. Severity scores were updated at each surgical procedure. Parents of children with RRP referred to the authors' hospital completed a detailed epidemiological questionnaire at the initial visit or at the first return visit after the study began. At the first endoscopic debridement after study enrollment, tissue was obtained and submitted for HPV typing using polymerase chain reaction techniques and in situ hybridization. Staging of disease severity was performed in real time at each endoscopic procedure using an RRP scoring system developed by one of the authors (B.J.W.). The frequency of endoscopic operative debridement was recorded for each patient. Information in the database was analyzed to identify statistically significant relationships between extent of disease and/or HPV type, patient age at diagnosis, and selected epidemiological factors. The study may represent the first longitudinal prospective analysis of a large pediatric RRP population. Fifty-eight of the 73 patients in the study underwent HPV typing. Patients infected with HPV-11 were significantly more likely to have higher severity scores, require more frequent surgical intervention, and require adjuvant therapy to control disease progression. 
In addition, patients with HPV-11 RRP were significantly more likely to develop tracheal disease, to require tracheotomy, and to develop pulmonary disease. Patients receiving a diagnosis of RRP before 3 years of age had significantly higher severity scores, higher frequencies of surgical intervention, and greater likelihood of requiring adjuvant medical therapy. Patients with Medicaid insurance had significantly higher severity scores and required more frequent surgical debridement. Birth by cesarean section appeared to be a significant risk factor for more severe disease and necessity of more frequent surgical intervention. Statistical analysis of the relationships among epidemiological factors, HPV type, and clinical course revealed that patients with HPV-11 and patients younger than 3 years of age at RRP diagnosis are prone to develop more aggressive disease as represented by higher severity scores at endoscopic debridement, more frequent operative debridement procedures per year, a greater requirement for adjuvant therapy, and greater likelihood of tracheal disease with tracheotomy.

  16. Airship economics

    NASA Technical Reports Server (NTRS)

    Neumann, R. D.; Hackney, L. R. M.

    1975-01-01

    Projected operating and manufacturing costs of a large airship design which are considered practical with today's technology and environment are discussed. Data and information developed during an 18-month study on the questions of feasibility, engineering, economics, and production problems related to a large metalclad-type airship are considered. An overview of other classic airship designs is provided, along with the rationale for selecting metalclad as the most prudent and most economical design for the 1970-80 era. Crew operation, ATC, and enroute requirements are covered, along with questions of handling, maintenance, and application of systems to the large airship.

  17. COLAcode: COmoving Lagrangian Acceleration code

    NASA Astrophysics Data System (ADS)

    Tassev, Svetlin V.

    2016-02-01

    COLAcode is a serial particle-mesh-based N-body code illustrating the COLA (COmoving Lagrangian Acceleration) method; it solves for Large Scale Structure (LSS) in a frame that is comoving with observers following trajectories calculated in Lagrangian Perturbation Theory (LPT). It differs from standard N-body codes by trading accuracy at small scales to gain computational speed without sacrificing accuracy at large scales. This is useful for generating large ensembles of accurate mock halo catalogs required to study galaxy clustering and weak lensing; such catalogs are needed to perform detailed error analysis for ongoing and future surveys of LSS.

  18. Producing Hydrogen With Sunlight

    NASA Technical Reports Server (NTRS)

    Biddle, J. R.; Peterson, D. B.; Fujita, T.

    1987-01-01

    Costs high but reduced by further research. Producing hydrogen fuel on large scale from water by solar energy practical if plant costs reduced, according to study. Sunlight attractive energy source because it is free and because photon energy converts directly to chemical energy when it breaks water molecules into diatomic hydrogen and oxygen. Conversion process low in efficiency and photochemical reactor must be spread over large area, requiring large investment in plant. Economic analysis pertains to generic photochemical processes. Does not delve into details of photochemical reactor design because detailed reactor designs do not exist at this early stage of development.

  19. More efficient irrigation may compensate for increases in irrigation water requirements due to climate change in the Mediterranean area

    NASA Astrophysics Data System (ADS)

    Fader, Marianela; Shi, Sinan; von Bloh, Werner; Bondeau, Alberte; Cramer, Wolfgang

    2017-04-01

    Irrigation in the Mediterranean is of vital importance for food security, employment and economic development. We present a recently published study [1] that estimates the current level of water demand for Mediterranean agriculture and simulates the potential impacts of climate change, population growth and transitions to water-saving irrigation and conveyance technologies. The results indicate that, at present, the Mediterranean region could save 35% of water by implementing more efficient irrigation and conveyance systems, with large differences in the saving potentials across countries. Under climate change, more efficient irrigation is of vital importance for counteracting increases in irrigation water requirements. The Mediterranean area as a whole might face an increase in gross irrigation requirements between 4% and 18% from climate change alone by the end of the century if irrigation systems and conveyance are not improved. Population growth increases these numbers to 22% and 74%, respectively, affecting mainly the Southern and Eastern Mediterranean. However, improved irrigation technologies and conveyance systems have large water-saving potentials, especially in the Eastern Mediterranean. Both the Eastern and the Southern Mediterranean would need around 35% more water than today if they could afford some degree of modernization of irrigation and conveyance systems and benefit from the CO2-fertilization effect. However, in some scenarios water scarcity may constrain the supply of the irrigation water needed in the future in Algeria, Libya, Israel, Jordan, Lebanon, Syria, Serbia, Morocco, Tunisia and Spain. In this study, vegetation growth, phenology, agricultural production, and irrigation water requirements and withdrawal were simulated with the process-based ecohydrological and agro-ecosystem model LPJmL ("Lund-Potsdam-Jena managed Land") after a large development effort [2] that comprised an improved representation of Mediterranean crops.

  20. Triage: care of the critically ill and injured during pandemics and disasters: CHEST consensus statement.

    PubMed

    Christian, Michael D; Sprung, Charles L; King, Mary A; Dichter, Jeffrey R; Kissoon, Niranjan; Devereaux, Asha V; Gomersall, Charles D

    2014-10-01

    Pandemics and disasters can result in large numbers of critically ill or injured patients who may overwhelm available resources despite implementing surge-response strategies. If this occurs, critical care triage, which includes both prioritizing patients for care and rationing scarce resources, will be required. The suggestions in this chapter are important for all who are involved in large-scale pandemics or disasters with multiple critically ill or injured patients, including front-line clinicians, hospital administrators, and public health or government officials. The Triage topic panel reviewed previous task force suggestions and the literature to identify 17 key questions, for which specific literature searches were then conducted to identify studies upon which evidence-based recommendations could be made. No studies of sufficient quality were identified. Therefore, the panel developed expert opinion-based suggestions using a modified Delphi process. Suggestions from the previous task force that were not being updated were also included for validation by the expert panel. The suggestions from the task force outline the key principles upon which critical care triage should be based as well as a path for the development of the required plans, processes, and infrastructure. This article provides 11 suggestions regarding the principles upon which critical care triage should be based and policies to guide critical care triage. Ethical and efficient critical care triage is a complex process that requires significant planning and preparation. At present, the prognostic tools required to produce an effective decision support system (triage protocol), as well as the infrastructure, processes, legal protections, and training, are largely lacking in most jurisdictions. Therefore, critical care triage should be a last resort after mass critical care surge strategies have been exhausted.

  1. The Generation-X X-ray Observatory Vision Mission and Technology Study

    NASA Technical Reports Server (NTRS)

    Figueroa-Feliciano, Enectali

    2004-01-01

    The new frontier in astrophysics is the study of the birth and evolution of the first stars, galaxies and black holes in the early Universe. X-ray astronomy opens a window into these objects by studying the emission from black holes, supernova explosions and the gamma-ray burst afterglows of massive stars. However, such objects are beyond the grasp of current or near-future observatories. X-ray imaging and spectroscopy of such distant objects will require an X-ray telescope with large collecting area and high angular resolution. Our team has conceived the Generation-X Vision Mission based on an X-ray observatory with 100 sq m collecting area at 1 keV (1000 times larger than Chandra) and 0.1 arcsecond angular resolution (several times better than Chandra and 50 times better than the Constellation-X resolution goal). Such an observatory would be capable of detecting the earliest black holes and galaxies in the Universe, and will also study extremes of density, gravity, magnetic fields, and kinetic energy which cannot be created in laboratories. NASA has selected the Generation-X mission for study under its Vision Mission Program. We describe the studies being performed to develop the mission concept and define candidate technologies and performance requirements for Generation-X. The baseline Generation-X mission involves four 8 m diameter X-ray telescopes operating at Sun-Earth L2. We trade against an alternate concept of a single 26 m diameter telescope with focal plane instruments on a separate spacecraft. A telescope of this size will require either robotic or human-assisted in-flight assembly. The required effective area implies that extremely lightweight grazing-incidence X-ray optics must be developed. To achieve the required areal density, at least 100 times lower than Chandra's, we will study 0.1 mm thick mirrors which have active on-orbit figure control. 
We discuss the suite of required detectors, including a large-FOV high-angular-resolution imager, a cryogenic imaging spectrometer, and a grating spectrometer. We outline the development roadmap to confront the many technological challenges of implementing the Generation-X mission.

  2. Eyeglass Large Aperture, Lightweight Space Optics FY2000 - FY2002 LDRD Strategic Initiative

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hyde, R

    2003-02-10

    A series of studies by the Air Force, the National Reconnaissance Office and NASA have identified the critical role played by large optics in fulfilling many of the space-related missions of these agencies. Whether it is the Next Generation Space Telescope for NASA, high resolution imaging systems for NRO, or beam weaponry for the Air Force, the diameter of the primary optic is central to achieving high resolution (imaging) or a small spot size on target (lethality). While the detailed requirements differ for each application (high resolution imaging over the visible and near-infrared for earth observation, high damage threshold but single-wavelength operation for directed energy), the challenges of a large, lightweight primary optic which is space compatible and operates with high efficiency are the same. The advantage of such large optics for national surveillance applications is that they permit these observations to be carried out with much greater effectiveness than with smaller optics. For laser weapons, the advantage is that they permit more tightly focused beams, which can be leveraged into either greater effective range, reduced laser power, and/or smaller on-target spot sizes; weapon systems can be made either much more effective or much less expensive. This application requires only single-wavelength capability, but places an emphasis upon robust, rapidly targetable optics. The advantage of large aperture optics for astronomy is that they increase the sensitivity and resolution with which we can view the universe. This can be utilized either for general purpose astronomy, allowing us to examine greater numbers of objects in more detail and at greater range, or it can enable the direct detection and detailed examination of extra-solar planets. This application requires large apertures (for both light-gathering and resolution reasons), with broad-band spectral capability, but does not emphasize either large fields-of-view or pointing agility. 
Despite differences in their requirements and implementations, the fundamental difficulty in utilizing large aperture optics is the same for all of these applications: It is extremely difficult to design large aperture space optics which are both optically precise and can meet the practical requirements for launch and deployment in space. At LLNL we have developed a new concept (Eyeglass) which uses large diffractive optics to solve both of these difficulties; greatly reducing both the mass and the tolerance requirements for large aperture optics. During previous LDRD-supported research, we developed this concept, built and tested broadband diffractive telescopes, and built 50 cm aperture diffraction-limited diffractive lenses (the largest in the world). This work is fully described in UCRL-ID-136262, Eyeglass: A Large Aperture Space Telescope. However, there is a large gap between optical proof-of-principle with sub-meter apertures, and actual 50 meter space telescopes. This gap is far too large (both in financial resources and in spacecraft expertise) to be filled internally at LLNL; implementation of large aperture diffractive space telescopes must be done externally using non-LLNL resources and expertise. While LLNL will never become the primary contractor and integrator for large space optical systems, our natural role is to enable these devices by developing the capability of producing very large diffractive optics. Accordingly, the purpose of the Large Aperture, Lightweight Space Optics Strategic Initiative was to develop the technology to fabricate large, lightweight diffractive lenses. The additional purpose of this Strategic Initiative was, of course, to demonstrate this lens-fabrication capability in a fashion compellingly enough to attract the external support necessary to continue along the path to full-scale space-based telescopes. 
During this 3-year effort (FY2000-FY2002) we have developed the capability of optically smoothing and diffractively patterning thin, meter-sized sheets of glass into lens panels. We have also developed alignment and seaming techniques which allow individual lens panels to be assembled together, forming a much larger, segmented, diffractive lens. The capabilities provided by this LDRD-supported developmental effort were then demonstrated by the fabrication and testing of a lightweight, 5 meter aperture, diffractive lens.

  3. Thermographic Imaging of Defects in Anisotropic Composites

    NASA Technical Reports Server (NTRS)

    Plotnikov, Y. A.; Winfree, W. P.

    2000-01-01

    Composite materials are of increasing interest to the aerospace industry as a result of their weight-versus-performance characteristics. One of the disadvantages of composites is the high cost of fabrication and post-fabrication inspection with conventional ultrasonic scanning systems. The high cost of inspection is driven by the need for scanning systems which can follow large curved surfaces. Additionally, either large water tanks or water squirters are required to couple the ultrasound into the part. Thermographic techniques offer significant advantages over conventional ultrasonics by not requiring physical coupling between the part and the sensor. A thermographic system can easily inspect large curved surfaces without requiring a surface-following scanner. However, implementation of Thermal Nondestructive Evaluation (TNDE) for flaw detection in composite materials and structures requires determining its limits. Advanced algorithms have been developed to enable locating and sizing defects in carbon fiber reinforced plastic (CFRP). Thermal tomography is a very promising method for visualizing the size and location of defects in materials such as CFRP. However, further investigations are required to determine its capabilities for inspection of thick composites. In the present work we studied the influence of anisotropy on the reconstructed image of a defect generated by an inversion technique. The composite material is considered homogeneous with macroscopic properties: thermal conductivity K, specific heat c, and density rho. The simulation process involves two sequential steps: solving the three-dimensional transient heat diffusion equation for a sample with a defect, then estimating the defect location and size from the surface spatial and temporal thermal distributions calculated in the simulations (the inverse problem).
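    The forward step of such a simulation can be sketched with an explicit finite-difference (FTCS) update of the heat equation. This 2-D sketch with direction-dependent conductivity is illustrative only — the study itself solves the full 3-D anisotropic problem, and the grid, conductivities, and function name here are assumptions:

```python
import numpy as np

def ftcs_step(T, kx, ky, dx, dt):
    """One explicit (FTCS) update of the 2-D anisotropic heat equation
    dT/dt = kx*d2T/dx2 + ky*d2T/dy2 on a uniform grid with fixed (zero)
    boundaries. Stable when dt*(kx + ky)/dx**2 <= 0.5."""
    lx = kx * dt / dx**2
    ly = ky * dt / dx**2
    Tn = T.copy()
    Tn[1:-1, 1:-1] = (T[1:-1, 1:-1]
                      + lx * (T[1:-1, 2:] - 2*T[1:-1, 1:-1] + T[1:-1, :-2])
                      + ly * (T[2:, 1:-1] - 2*T[1:-1, 1:-1] + T[:-2, 1:-1]))
    return Tn

# point heat source diffusing anisotropically (fast axis = x, e.g. fibre direction)
T = np.zeros((41, 41))
T[20, 20] = 1.0
for _ in range(100):
    T = ftcs_step(T, kx=0.8, ky=0.2, dx=1.0, dt=0.4)
```

After the loop, the heat spot is visibly elongated along the high-conductivity axis, which is exactly the anisotropy effect that complicates the inverse problem of sizing a subsurface defect from surface temperature maps.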

  4. Radiometer requirements for Earth-observation systems using large space antennas

    NASA Technical Reports Server (NTRS)

    Keafer, L. S., Jr.; Harrington, R. F.

    1983-01-01

    Requirements are defined for Earth-observation microwave radiometry for the decade of the 1990's using large space antenna (LSA) systems with apertures in the range from 50 to 200 m. General Earth-observation needs, specific measurement requirements, orbit mission guidelines and constraints, and general radiometer requirements are defined. General Earth-observation needs are derived from NASA's basic space science program. Specific measurands include soil moisture, sea surface temperature, salinity, water roughness, ice boundaries, and water pollutants. Measurements are required with spatial resolution from 10 to 1 km and with temporal resolution from 3 days to 1 day. The primary orbit altitude and inclination ranges are 450 to 2200 km and 60 to 98 deg, respectively. Contiguous large-scale coverage of several land and ocean areas over the globe dictates large (several hundred kilometers) swaths. Radiometer measurements are made in the frequency range from 1 to 37 GHz, preferably with dual-polarization radiometers with a minimum of 90 percent beam efficiency. Reflector-surface root-mean-square deviation tolerances range from 1/30 to 1/100 of a wavelength.
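    The connection between the surface-tolerance and beam-efficiency figures can be sanity-checked with the Ruze formula, a standard antenna-theory relation (a textbook cross-check, not necessarily the report's own analysis):

```python
import math

def ruze_efficiency(rms_over_wavelength: float) -> float:
    """Ruze formula: aperture-efficiency factor for a reflector whose
    surface RMS error is the given fraction of the wavelength."""
    return math.exp(-(4 * math.pi * rms_over_wavelength) ** 2)

for frac in (1/30, 1/100):
    print(f"eps = lambda/{round(1/frac)}: efficiency factor {ruze_efficiency(frac):.3f}")
```

A lambda/30 surface gives an efficiency factor of about 0.84, while lambda/100 gives about 0.98 — bracketing the ~90 percent beam-efficiency target quoted above.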

  5. Fostering Resilience in Beginning Special Education Teachers

    ERIC Educational Resources Information Center

    Belknap, Bridget M.

    2012-01-01

    This qualitative study identified perceptions of risk and resilience in four different teaching roles of first-year, secondary special education teachers in three school districts in a large metropolitan area. The study sample consisted of nine women in their first year of teaching who were also completing the requirements of a master's…

  6. BOILER DESIGN CRITERIA FOR DRY SORBENT SO2 CONTROL WITH LOW-NOX BURNERS: NEW UNIT APPLICATIONS

    EPA Science Inventory

    The report describes a study to define boiler modifications required to achieve 70% SO2 removal with sorbent injection on a large tangentially fired utility boiler without supplemental spray drying. The study is a follow on to a recently completed broader evaluation of boiler des...

  7. How Can Intercultural School Development Succeed? The Perspective of Teachers and Teacher Educators

    ERIC Educational Resources Information Center

    Kiel, Ewald; Syring, Marcus; Weiss, Sabine

    2017-01-01

    The large number of newly arrived individuals from other countries, particularly of young people, has had an enormous impact on the school system in Germany. The present study investigated requirements for successful intercultural school development. The study used investigative group discussions, where the groups were composed of teachers and…

  8. Getting Vaccinated against HPV: Attitudes, Intentions and Perceived Barriers of Female Undergraduates

    ERIC Educational Resources Information Center

    Burke, Sloane C.; Vail-Smith, Karen; White, David M.; Baker, Elizabeth; Mitchell, Terri

    2010-01-01

    This study examines college women's intention to receive the Human Papilloma Virus (HPV) vaccine and their perceived barriers to being vaccinated. The study reports findings from an online questionnaire completed by 856 undergraduate women enrolled in a required personal health course at a large (27,000 plus) southeastern university. The majority…

  9. Student Thoughts and Perceptions on Curriculum Reform

    ERIC Educational Resources Information Center

    VanderJagt, Douglas D.

    2013-01-01

    The purpose of this qualitative case study was to examine how students experience and respond to Michigan's increased graduation requirements. The study was conducted in a large, suburban high school that instituted a change to a trimester system in response to the state mandate. A criterion-based sample of 16 students, both college bound and…

  10. "Invisible" Bilingualism--"Invisible" Language Ideologies: Greek Teachers' Attitudes Towards Immigrant Pupils' Heritage Languages

    ERIC Educational Resources Information Center

    Gkaintartzi, Anastasia; Kiliari, Angeliki; Tsokalidou, Roula

    2015-01-01

    This paper presents data from two studies--a nationwide quantitative research and an ethnographic study--on Greek schoolteachers' attitudes towards immigrant pupils' bilingualism. The quantitative data come from a large-scale questionnaire survey, which aimed at the investigation of the needs and requirements for the implementation of a pilot…

  11. Innovative space x-ray telescopes

    NASA Astrophysics Data System (ADS)

    Hudec, R.; Inneman, A.; Pina, L.; Sveda, L.; Ticha, H.; Brozek, V.

    2017-11-01

    We report on progress in innovative X-ray mirror development, with a focus on the requirements of future X-ray astronomy space projects. Various future projects in X-ray astronomy and astrophysics will require large, lightweight, yet highly accurate segments with multiple thin shells or foils. The large Wolter 1 grazing-incidence multiple mirror arrays, the Kirkpatrick-Baez modules, and the large Lobster-Eye X-ray telescope modules in the Schmidt arrangement may serve as examples. All these space projects will require high-quality, lightweight segmented shells (shaped, bent, or flat foils) with high X-ray reflectivity and excellent mechanical stability.

  12. Big questions, big science: meeting the challenges of global ecology.

    PubMed

    Schimel, David; Keller, Michael

    2015-04-01

    Ecologists are increasingly tackling questions that require significant infrastructure, large experiments, networks of observations, and complex data and computation. Key hypotheses in ecology increasingly require more investment, and larger data sets, than can be collected by a single investigator's or a group of investigators' labs, sustained for longer than a typical grant. Large-scale projects are expensive, so their scientific return on the investment has to justify the opportunity cost: the science forgone because resources were expended on a large project rather than supporting a number of individual projects. In addition, their management must be accountable and efficient in the use of significant resources, requiring the use of formal systems engineering and project management to mitigate the risk of failure. Mapping the scientific method into formal project management requires both scientists able to work in that context and a project implementation team sensitive to the unique requirements of ecology. Sponsoring agencies experience many external and internal pressures that push them towards counterproductive project management, but a scientific community aware of and experienced in large-project science can mitigate these tendencies. For big ecology to result in great science, ecologists must become informed, aware, and engaged in the advocacy and governance of large ecological projects.

  13. Integration and Analysis of Neighbor Discovery and Link Quality Estimation in Wireless Sensor Networks

    PubMed Central

    Radi, Marjan; Dezfouli, Behnam; Abu Bakar, Kamalrulnizam; Abd Razak, Shukor

    2014-01-01

    Network connectivity and link quality information are the fundamental requirements of wireless sensor network protocols to perform their desired functionality. Most of the existing discovery protocols have focused only on the neighbor discovery problem, while only a few provide integrated neighbor search and link estimation. As these protocols require careful parameter adjustment before network deployment, they cannot provide scalable and accurate network initialization in large-scale dense wireless sensor networks with random topology. Furthermore, the performance of these protocols has not yet been fully evaluated. In this paper, we perform a comprehensive simulation study on the efficiency of employing adaptive protocols compared to the existing nonadaptive protocols for initializing sensor networks with random topology. In this regard, we propose adaptive network initialization protocols which integrate the initial neighbor discovery with the link quality estimation process to initialize large-scale dense wireless sensor networks without requiring any parameter adjustment before network deployment. To the best of our knowledge, this work is the first attempt to provide a detailed simulation study on the performance of integrated neighbor discovery and link quality estimation protocols for initializing sensor networks. This study can help system designers to determine the most appropriate approach for different applications. PMID:24678277

  14. A Fast Evaluation Method for Energy Building Consumption Based on the Design of Experiments

    NASA Astrophysics Data System (ADS)

    Belahya, Hocine; Boubekri, Abdelghani; Kriker, Abdelouahed

    2017-08-01

    The building sector is a major energy consumer in Algeria, accounting for 42% of consumption. The need for energy has continued to grow at an inordinate rate, owing to the lack of legislation on energy performance in this large consuming sector. Another reason is users' changing requirements for maintaining comfort, especially in summer in the arid regions of southern Algeria, where the town of Ouargla is a typical example; there, air conditioning drives a large share of electricity consumption. To achieve a high-performance building envelope, the major envelope parameters must be optimized; the design of experiments (DOE) method can identify the most influential parameters and eliminate the less important ones. Studying a building is often complex and time-consuming because of the large number of parameters to consider. This study focuses on reducing the computing time and determining the major parameters of building energy consumption, such as building area, shape factor, orientation, and wall-to-window ratio, in order to propose models that minimize the seasonal energy consumption due to air-conditioning needs.
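
    The screening logic described in this record can be sketched with a two-level factorial design. The factor names and the toy response below are illustrative assumptions, not values taken from the study:

```python
import itertools
import numpy as np

# Hypothetical two-level screening design over three envelope parameters
# (factor names are illustrative, not those used in the study).
factors = ["wall_to_window_ratio", "shape_factor", "orientation"]
design = np.array(list(itertools.product([-1, 1], repeat=len(factors))))

# Toy response: pretend cooling energy is dominated by the first factor.
rng = np.random.default_rng(1)
response = (100.0 - 8.0 * design[:, 0] + 2.0 * design[:, 1]
            + rng.normal(0.0, 0.5, len(design)))

# Main effect of each factor: mean response at +1 minus mean at -1.
effects = {name: response[design[:, i] == 1].mean()
                 - response[design[:, i] == -1].mean()
           for i, name in enumerate(factors)}
```

    Ranking the absolute effects identifies the dominant parameters, which is how a DOE screen prunes the parameter space before detailed simulation.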

  15. Evaluation of US rear underride guard regulation for large trucks using real-world crashes.

    PubMed

    Brumbelow, Matthew L; Blanar, Laura

    2010-11-01

    Current requirements for rear underride guards on large trucks are set by the National Highway Traffic Safety Administration in Federal Motor Vehicle Safety Standards (FMVSS) 223 and 224. The standards have been in place since 1998, but their adequacy has not been evaluated apart from two series of controlled crash tests. The current study used detailed reviews of real-world crashes from the Large Truck Crash Causation Study to assess the ability of guards that comply with certain aspects of the regulation to mitigate passenger vehicle underride. It also evaluated the dangers posed by underride of large trucks that are exempt from guard requirements. For the 115 cases meeting the inclusion criteria, coded data, case narratives, photographs, and measurements were used to examine the interaction between study vehicles. The presence and type of underride guard was determined, and its performance in mitigating underride was categorized. Overall, almost one-half of the passenger vehicles had underride damage classified as severe or catastrophic. These vehicles accounted for 23 of the 28 in which occupants were killed. For the cases involving trailers with underride guards compliant with one or both FMVSS, guard deformation or complete failure was frequent and most commonly due to weak attachments, buckling of the trailer chassis, or bending of the lateral end of the guard under narrow overlap loading. Most of the truck units studied qualified for at least one of the FMVSS exemptions. The two largest groups were trailers with small wheel setbacks and single-unit straight trucks. Dump trucks represented a particularly hazardous category of straight truck. The current study suggests several weaknesses in the rear underride guard regulation. 
The standard allows too much ground clearance, the quasi-static test conditions allow guard designs that fail in narrow overlap crashes, and certifying guards independent of trailers leads to systems with inadequate attachment and chassis strength. Additionally, the regulation should be expanded to cover a higher percentage of the large truck fleet.

  16. Chronic Subdural Hematoma Treated by Small or Large Craniotomy with Membranectomy as the Initial Treatment

    PubMed Central

    Kim, Jae-Hong; Kim, Jung-Hee; Kong, Min-Ho; Song, Kwan-Young

    2011-01-01

    Objective There are few studies comparing small and large craniotomies for the initial treatment of chronic subdural hematoma (CSDH) presenting with non-liquefied hematoma, multilayered intrahematomal loculations, or organization/calcification on computed tomography and magnetic resonance imaging. These procedures were compared to determine which would produce superior postoperative results. Methods Between 2001 and 2009, 317 consecutive patients were surgically treated for CSDH at our institution. Of these, 16 patients underwent a small craniotomy with partial membranectomy and 42 patients underwent a large craniotomy with extended membranectomy as the initial treatment. A retrospective review was performed to compare the postoperative outcomes of these two techniques, focusing on improvement of neurological status, complications, reoperation rate, and days of post-operative hospitalization. Results The mean ages were 69.4±12.1 and 55.6±9.3 years in the small and large craniotomy groups, respectively. The recurrence of hematomas requiring reoperation occurred in 50% and 10% of the small and large craniotomy patients, respectively (p<0.001). There were no significant differences in postoperative neurological status, complications, or days of hospital stay between these two groups. Conclusion Among the cases of CSDH initially requiring craniotomy, the large craniotomy with extended membranectomy technique reduced the reoperation rate, compared to that of the small craniotomy with partial membranectomy technique. PMID:22053228
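
    The reoperation rates reported here (8/16 vs. 4/42) can be checked against the quoted p<0.001 with a Pearson chi-square test. This is a back-of-the-envelope sketch; the abstract does not say which test the authors actually used:

```python
import math

# 2x2 table from the abstract: [reoperation, no reoperation]
# rows: small craniotomy (8 of 16), large craniotomy (4 of 42)
table = [[8, 8], [4, 38]]

row_totals = [sum(r) for r in table]
col_totals = [sum(c) for c in zip(*table)]
n = sum(row_totals)

# Pearson chi-square statistic, no continuity correction
chi2 = sum((table[i][j] - row_totals[i] * col_totals[j] / n) ** 2
           / (row_totals[i] * col_totals[j] / n)
           for i in range(2) for j in range(2))
p_value = math.erfc(math.sqrt(chi2 / 2))  # upper tail of chi-square, df = 1
```

    With these counts the statistic is about 11.6 and p is on the order of 7e-4, consistent with the reported p<0.001.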

  17. Support of an Active Science Project by a Large Information System: Lessons for the EOS Era

    NASA Technical Reports Server (NTRS)

    Angelici, Gary L.; Skiles, J. W.; Popovici, Lidia Z.

    1993-01-01

    The ability of large information systems to support the changing data requirements of active science projects is being tested in a NASA collaborative study. This paper briefly profiles both the active science project and the large information system involved in this effort and offers some observations about the effectiveness of the project support. This is followed by lessons that are important for those participating in large information systems that need to support active science projects or that make available the valuable data produced by these projects. We learned in this work that it is difficult for a large information system focused on long term data management to satisfy the requirements of an on-going science project. For example, in order to provide the best service, it is important for all information system staff to keep focused on the needs and constraints of the scientists in the development of appropriate services. If the lessons learned in this and other science support experiences are not applied by those involved with large information systems of the EOS (Earth Observing System) era, then the final data products produced by future science projects may not be robust or of high quality, thereby making the conduct of the project science less efficacious and reducing the value of these unique suites of data for future research.

  18. Material processing: AI-MSG modification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Woolsey, C.C.; Carnazzola, A.

    1973-12-18

    This specification establishes fabrication processing requirements such as cleaning, welding, brazing, and post-weld heat treating for the modification of the Atomics International (AI) Modular Steam Generator (MSG) for use in the Large Leak Test Rig (LLTR) for the study of sodium-water reactions.

  19. Efficient Bayesian mixed model analysis increases association power in large cohorts

    PubMed Central

    Loh, Po-Ru; Tucker, George; Bulik-Sullivan, Brendan K; Vilhjálmsson, Bjarni J; Finucane, Hilary K; Salem, Rany M; Chasman, Daniel I; Ridker, Paul M; Neale, Benjamin M; Berger, Bonnie; Patterson, Nick; Price, Alkes L

    2014-01-01

    Linear mixed models are a powerful statistical tool for identifying genetic associations and avoiding confounding. However, existing methods are computationally intractable in large cohorts, and may not optimize power. All existing methods require O(MN²) time (where N = #samples and M = #SNPs) and implicitly assume an infinitesimal genetic architecture in which effect sizes are normally distributed, which can limit power. Here, we present a far more efficient mixed model association method, BOLT-LMM, which requires only a small number of O(MN)-time iterations and increases power by modeling more realistic, non-infinitesimal genetic architectures via a Bayesian mixture prior on marker effect sizes. We applied BOLT-LMM to nine quantitative traits in 23,294 samples from the Women’s Genome Health Study (WGHS) and observed significant increases in power, consistent with simulations. Theory and simulations show that the boost in power increases with cohort size, making BOLT-LMM appealing for GWAS in large cohorts. PMID:25642633
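
    The O(MN) vs. O(MN²) distinction comes from never forming the N×N genetic relationship matrix: an iterative solver only needs matrix-vector products, which the raw genotype matrix supplies directly. A toy numpy sketch of that identity (not the BOLT-LMM code):

```python
import numpy as np

rng = np.random.default_rng(0)
N, M = 300, 1000                      # samples, SNPs (toy sizes)
X = rng.integers(0, 3, size=(N, M)).astype(float)
X -= X.mean(axis=0)                   # mean-center genotypes

def matvec(u, delta=1.0):
    # (X X^T / M + delta * I) @ u in O(MN) time and memory,
    # without ever materializing the N x N relationship matrix.
    return X @ (X.T @ u) / M + delta * u

u = rng.standard_normal(N)
grm = X @ X.T / M                     # explicit GRM: O(MN^2) to build
direct = grm @ u + u                  # same product via the full matrix (delta = 1)
```

    An iterative solver (e.g. conjugate gradient) built on `matvec` is what keeps each mixed-model iteration linear in both M and N.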

  20. Geomorphic analysis of large alluvial rivers

    NASA Astrophysics Data System (ADS)

    Thorne, Colin R.

    2002-05-01

    Geomorphic analysis of a large river presents particular challenges and requires a systematic and organised approach because of the spatial scale and system complexity involved. This paper presents a framework and blueprint for geomorphic studies of large rivers developed in the course of basic, strategic and project-related investigations of a number of large rivers. The framework demonstrates the need to begin geomorphic studies early in the pre-feasibility stage of a river project and carry them through to implementation and post-project appraisal. The blueprint breaks down the multi-layered and multi-scaled complexity of a comprehensive geomorphic study into a number of well-defined and semi-independent topics, each of which can be performed separately to produce a clearly defined, deliverable product. Geomorphology increasingly plays a central role in multi-disciplinary river research and the importance of effective quality assurance makes it essential that audit trails and quality checks are hard-wired into study design. The structured approach presented here provides output products and production trails that can be rigorously audited, ensuring that the results of a geomorphic study can stand up to the closest scrutiny.

  1. The Space Station as a Construction Base for Large Space Structures

    NASA Technical Reports Server (NTRS)

    Gates, R. M.

    1985-01-01

    The feasibility of using the Space Station as a construction site for large space structures is examined. An overview is presented of the results of a program entitled Definition of Technology Development Missions (TDM's) for Early Space Stations - Large Space Structures. The definition of LSS technology development missions must be responsive to the needs of future space missions which require large space structures. Long range plans for space were assembled by reviewing Space System Technology Models (SSTM) and other published sources. Those missions which will use large space structures were reviewed to determine the objectives which must be demonstrated by technology development missions. The three TDM's defined during this study are: (1) a construction storage/hangar facility; (2) a passive microwave radiometer; and (3) a precision optical system.

  2. Are attributes of organizational performance in large health care organizations relevant in primary care practices?

    PubMed

    Orzano, A John; Tallia, Alfred F; Nutting, Paul A; Scott-Cawiezell, Jill; Crabtree, Benjamin F

    2006-01-01

    Are organizational attributes associated with better health outcomes in large health care organizations applicable to primary care practices? In comparative case studies of two community family practices, it was found that attributes of organizational performance identified in larger health care organizations must be tailored to their unique context of primary care. Further work is required to adapt or establish the significance of the attributes of management infrastructure and information mastery.

  3. Applicability of PM3 to transphosphorylation reaction path: Toward designing a minimal ribozyme

    NASA Technical Reports Server (NTRS)

    Manchester, John I.; Shibata, Masayuki; Setlik, Robert F.; Ornstein, Rick L.; Rein, Robert

    1993-01-01

    A growing body of evidence shows that RNA can catalyze many of the reactions necessary both for replication of genetic material and for the possible transition into the modern protein-based world. However, contemporary ribozymes are too large to have self-assembled from a prebiotic oligonucleotide pool. Still, it is likely that the major features of the earliest ribozymes have been preserved as molecular fossils in the catalytic RNA of today. Therefore, the search for a minimal ribozyme has been aimed at finding the necessary structural features of a modern ribozyme (Beaudry and Joyce, 1990). Both a three-dimensional model and quantum chemical calculations are required to quantitatively determine the effects of structural features of the ribozyme on the reaction it catalyzes. Previous studies of the reaction path have been conducted at the ab initio level, but these methods are limited to small models because of their enormous computational requirements. Semiempirical methods have been applied to large systems in the past; their accuracy, however, must first be established. We therefore assessed the semiempirical PM3 method on a simple model of the ribozyme-catalyzed reaction, the hydrolysis of phosphoric acid, and find that the results are qualitatively similar to ab initio results using large basis sets. Therefore, PM3 is suitable for studying the reaction path of the ribozyme-catalyzed reaction.

  4. Exploring the performance of large-N radio astronomical arrays

    NASA Astrophysics Data System (ADS)

    Lonsdale, Colin J.; Doeleman, Sheperd S.; Cappallo, Roger J.; Hewitt, Jacqueline N.; Whitney, Alan R.

    2000-07-01

    New radio telescope arrays are currently being contemplated which may be built using hundreds, or even thousands, of relatively small antennas. These include the One Hectare Telescope of the SETI Institute and UC Berkeley, the LOFAR telescope planned for the New Mexico desert surrounding the VLA, and possibly the ambitious international Square Kilometer Array (SKA) project. Recent and continuing advances in signal transmission and processing technology make it realistic to consider full cross-correlation of signals from such a large number of antennas, permitting the synthesis of an aperture with much greater fidelity than in the past. In principle, many advantages in instrumental performance are gained by this 'large-N' approach to the design, most of which require the development of new algorithms. Because new instruments of this type are expected to outstrip the performance of current instruments by wide margins, much of their scientific productivity is likely to come from the study of objects which are currently unknown. For this reason, instrumental flexibility is of special importance in design studies. A research effort has begun at Haystack Observatory to explore large-N performance benefits, and to determine what array design properties and data reduction algorithms are required to achieve them. The approach to these problems, involving a sophisticated data simulator, algorithm development, and exploration of array configuration parameter space, will be described, and progress to date will be summarized.
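
    The correlator cost behind the "large-N" design question is quadratic in antenna count, since full cross-correlation forms one baseline per antenna pair:

```python
def baselines(n_antennas: int) -> int:
    # One visibility per unordered antenna pair: n * (n - 1) / 2
    return n_antennas * (n_antennas - 1) // 2
```

    A 27-antenna array like the VLA yields 351 baselines; a thousand-element large-N array yields 499,500, which is why full cross-correlation of hundreds or thousands of small antennas has only recently become realistic.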

  5. Large space systems technology electronics: Data and power distribution

    NASA Technical Reports Server (NTRS)

    Dunbar, W. G.

    1980-01-01

    The development of hardware technology and manufacturing techniques required to meet space platform and antenna system needs in the 1980s is discussed. Preliminary designs for manned and automatically assembled space power system cables, connectors, and grounding and bonding materials and techniques are reviewed. Connector concepts, grounding design requirements, and bonding requirements are discussed. The problem of particulate debris contamination for large structure spacecraft is addressed.

  6. Nutrigenomics and metabolomics will change clinical nutrition and public health practice: insights from studies on dietary requirements for choline

    PubMed Central

    Zeisel, Steven H

    2008-01-01

    Science is beginning to understand how genetic variation and epigenetic events alter requirements for, and responses to, nutrients (nutrigenomics). At the same time, methods for profiling almost all of the products of metabolism in a single sample of blood or urine are being developed (metabolomics). Relations between diet and nutrigenomic and metabolomic profiles and between those profiles and health have become important components of research that could change clinical practice in nutrition. Most nutrition studies assume that all persons have average dietary requirements, and the studies often do not plan for a large subset of subjects who differ in requirements for a nutrient. Large variances in responses that occur when such a population exists can result in statistical analyses that argue for a null effect. If nutrition studies could better identify responders and differentiate them from nonresponders on the basis of nutrigenomic or metabolomic profiles, the sensitivity to detect differences between groups could be greatly increased, and the resulting dietary recommendations could be appropriately targeted. It is not certain that nutrition will be the clinical specialty primarily responsible for nutrigenomics or metabolomics, because other disciplines currently dominate the development of portions of these fields. However, nutrition scientists' depth of understanding of human metabolism can be used to establish a role in the research and clinical programs that will arise from nutrigenomic and metabolomic profiling. Investments made today in training programs and in research methods could ensure a new foundation for clinical nutrition in the future. PMID:17823415
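
    The statistical point made here -- a true effect in a responder subset diluted into a near-null pooled result -- can be illustrated with simulated data. The effect size and group sizes below are invented for illustration only:

```python
import math
import random

random.seed(3)
# Invented cohort: 20 "responders" with a true effect, 80 without.
responders = [random.gauss(1.5, 1.0) for _ in range(20)]
nonresponders = [random.gauss(0.0, 1.0) for _ in range(80)]
pooled = responders + nonresponders

def t_stat(xs):
    # One-sample t statistic against a null mean of zero.
    n = len(xs)
    mean = sum(xs) / n
    var = sum((x - mean) ** 2 for x in xs) / (n - 1)
    return mean / math.sqrt(var / n)
```

    The responder subgroup produces a far larger t statistic than the pooled cohort, mirroring the argument that profiles which separate responders from nonresponders would greatly increase sensitivity.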

  7. Aversive properties of negative incentive shifts in Fischer 344 and Lewis rats

    PubMed Central

    Brewer, Adam; Johnson, Patrick; Stein, Jeff; Schlund, Michael; Williams, Dean C.

    2018-01-01

    Research on incentive contrast highlights that reward value is not absolute but rather is based upon comparisons we make to rewards we have received and expect to receive. Both human and nonhuman studies on incentive contrast show that shifting from a larger more-valued reward to a smaller less-valued reward is associated with long periods of nonresponding—a negative contrast effect. In this investigation, we used two different genetic rat strains, Fischer 344 and Lewis rats that putatively differ in their sensitivity to aversive stimulation, to assess the aversive properties of large-to-small reward shifts (negative incentive shifts). Additionally, we examined the extent to which increasing cost (fixed-ratio requirements) modulates negative contrast effects. In the presence of a cue that signaled the upcoming reward magnitude, lever pressing was reinforced with one of two different magnitudes of food (large or small). This design created two contrast shifts (small-to-large, large-to-small) and two shifts used as control conditions (small-to-small, large-to-large). Results showed a significant interaction between rat strain and cost requirements only during the negative incentive shift with the emotionally reactive Fischer 344 rats exhibiting significantly longer response latencies with increasing cost, highlighting greater negative contrast. These findings are more consistent with emotionality accounts of negative contrast and results of neurophysiological research that suggests shifting from a large to a small reward is aversive. Findings also highlight how subjective reward value and motivation is a product of gene-environment interactions. PMID:27864048

  8. Sub-aperture stitching test of a cylindrical mirror with large aperture

    NASA Astrophysics Data System (ADS)

    Xue, Shuai; Chen, Shanyong; Shi, Feng; Lu, Jinfeng

    2016-09-01

    Cylindrical mirrors are key optics in high-end defense and scientific research equipment such as high-energy laser weapons and synchrotron radiation systems. However, surface error testing technology for them has developed slowly; as a result, their optical fabrication quality cannot meet requirements, and the development of the associated equipment is hindered. A Computer-Generated Hologram (CGH) is commonly utilized as a null for testing cylindrical optics. However, since the fabrication process for large-aperture CGHs is not yet mature, null tests of large-aperture cylindrical optics are limited by the aperture of the CGH. Hence, a CGH null test combined with sub-aperture stitching is proposed to break the CGH aperture limit for testing cylindrical optics, and the design of CGHs for testing cylindrical surfaces is analyzed. Moreover, because of their particular shape, the misalignment aberrations of cylindrical surfaces differ from those of rotationally symmetric surfaces, so existing stitching algorithms for rotationally symmetric surfaces cannot meet the requirements of stitching cylindrical surfaces. We therefore analyze the misalignment aberrations of cylindrical surfaces and study a stitching algorithm for measuring large-aperture cylindrical optics. Finally, we test a large-aperture cylindrical mirror to verify the validity of the proposed method.
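
    The core of any stitching algorithm -- estimating the relative misalignment of adjacent sub-apertures from their overlap and removing it -- can be shown in one dimension. This toy sketch handles only a piston offset; a real cylindrical stitcher must also fit tilt and the cylinder-specific misalignment terms discussed in the record:

```python
import numpy as np

# Toy 1-D sub-aperture stitching: two profiles overlap; estimate and
# remove the relative piston offset from the overlap region.
surface = np.sin(np.linspace(0, 3, 100))   # "true" surface profile
sub_a = surface[:60] + 0.0                 # reference sub-aperture
sub_b = surface[40:] + 0.7                 # second sub-aperture, piston error 0.7

overlap_a, overlap_b = sub_a[40:60], sub_b[:20]
piston = (overlap_b - overlap_a).mean()    # least-squares piston estimate
stitched = np.concatenate([sub_a, sub_b[20:] - piston])
```

    After removing the estimated piston, the stitched profile reproduces the full surface, which is the sense in which stitching extends a small-aperture test to a large aperture.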

  9. Flow adjustment inside large finite-size wind farms approaching the infinite wind farm regime

    NASA Astrophysics Data System (ADS)

    Wu, Ka Ling; Porté-Agel, Fernando

    2017-04-01

    Due to the increasing number and growing size of wind farms, the distance among them continues to decrease. It is therefore necessary to understand how these large finite-size wind farms and their wakes can interfere with atmospheric boundary layer (ABL) dynamics and with adjacent wind farms. Fully developed flow inside wind farms has been extensively studied through numerical simulations of infinite wind farms, in which the transport of momentum and energy is purely vertical and their advection is neglected. However, less attention has been paid to the length of wind farm required to reach this asymptotic regime and to the ABL dynamics at the leading and trailing edges of large finite-size wind farms. Large eddy simulations are performed in this study to investigate the flow adjustment inside large finite-size wind farms in a conventionally neutral boundary layer, including the effect of the Coriolis force and free-atmosphere stratification from 1 to 5 K/km. For the large finite-size wind farms considered in the present work, when the potential temperature lapse rate is 5 K/km, the wind-farm length must exceed the ABL height by two orders of magnitude for the flow inside the farms to approach the fully developed regime. An entrance fetch of approximately 40 times the ABL height is also required for this flow adjustment. In the fully developed flow regime of the large finite-size wind farms, the flow characteristics match those of infinite wind farms, even though the two cases have different adjustment length scales. The role of advection in the entrance and exit regions of the large finite-size wind farms is also examined. The interaction between the internal boundary layer that develops above the large finite-size wind farms and the ABL is compared across the different potential temperature lapse rates. It is shown that the potential temperature lapse rate determines whether the flow inside the large finite-size wind farms adjusts to the fully developed regime. The flow characteristics of the wakes of these large finite-size wind farms are reported in order to forecast their effect on adjacent wind farms. A power deficit as large as 8% is found at a distance of 10 km downwind of the large finite-size wind farms.

  10. Micrometeorologic methods for measuring the post-application volatilization of pesticides

    USGS Publications Warehouse

    Majewski, M.S.

    1999-01-01

    A wide variety of micrometeorological measurement methods can be used to estimate the postapplication volatilization of pesticides from treated fields. All these estimation methods require that the entire study area, including the area surrounding the actual study site, have the same surficial characteristics, and that the pesticide under investigation be applied as quickly and as uniformly as possible before any measurements are made. Methods such as aerodynamic profile, energy balance, eddy correlation, and relaxed eddy accumulation require a large (typically 1 or more hectare) study area so that the flux measurements can be made in a well-developed atmospheric boundary layer under steady-state conditions. The area surrounding the study plot should have surficial characteristics similar to those of the study plot, with sufficient upwind extent that the wind speed and temperature gradients are fully developed. Mass balance methods such as integrated horizontal flux and trajectory simulation do not require a large source area, but the area surrounding the study plot should still have similar surficial characteristics. None of the micrometeorological techniques for estimating the postapplication volatilization fluxes of pesticides disturb the environment or the soil processes that influence gas exchange from the surface to the atmosphere. They allow for continuous measurements and provide a temporally averaged flux value over a large area. If the behavior of volatilizing pesticides and the importance of the volatilization process in redistributing pesticides in the environment are to be fully understood, it is critical that we understand not only the processes that govern pesticide entry into the lower atmosphere, but also how much of the millions of kilograms of pesticides applied annually are introduced into, and redistributed by, the atmosphere. We also must be aware of the assumptions and limitations of the estimation techniques used, and adapt the field of pesticide volatilization flux measurement to advances in atmospheric science.
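
    For the aerodynamic-profile method mentioned in this record, a common neutral-stability form is the Thornthwaite-Holzman flux-gradient relation. This sketch omits the atmospheric-stability corrections that any real field application would require:

```python
import math

VON_KARMAN = 0.4  # von Karman constant

def aerodynamic_flux(u1, u2, c1, c2, z1, z2):
    """Thornthwaite-Holzman flux estimate between heights z1 < z2,
    assuming neutral stability (no stability correction terms).

    u1, u2 : wind speeds at z1 and z2 (m/s)
    c1, c2 : pesticide vapor concentrations at z1 and z2
    """
    return (VON_KARMAN ** 2 * (u2 - u1) * (c1 - c2)
            / math.log(z2 / z1) ** 2)
```

    A positive flux (volatilization toward the atmosphere) results when wind speed increases and concentration decreases with height, which is the gradient signature these field campaigns measure.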

  11. Mean-field dynamo in a turbulence with shear and kinetic helicity fluctuations.

    PubMed

    Kleeorin, Nathan; Rogachevskii, Igor

    2008-03-01

    We study the effects of kinetic helicity fluctuations in a turbulence with large-scale shear using two different approaches: the spectral tau approximation and the second-order correlation approximation (or first-order smoothing approximation). These two approaches demonstrate that homogeneous kinetic helicity fluctuations alone with zero mean value in a sheared homogeneous turbulence cannot cause a large-scale dynamo. A mean-field dynamo is possible when the kinetic helicity fluctuations are inhomogeneous, which causes a nonzero mean alpha effect in a sheared turbulence. On the other hand, the shear-current effect can generate a large-scale magnetic field even in a homogeneous nonhelical turbulence with large-scale shear. This effect was investigated previously for large hydrodynamic and magnetic Reynolds numbers. In this study we examine the threshold required for the shear-current dynamo versus Reynolds number. We demonstrate that there is no need for a developed inertial range in order to maintain the shear-current dynamo (e.g., the threshold in the Reynolds number is of the order of 1).
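
    Both the alpha effect and the shear-current effect enter through the mean electromotive force in the mean-field induction equation; the textbook form below is a standard reference point, not an equation reproduced from this paper:

```latex
\frac{\partial \overline{\mathbf{B}}}{\partial t}
  = \nabla \times \left( \overline{\mathbf{V}} \times \overline{\mathbf{B}}
  + \alpha \, \overline{\mathbf{B}}
  - \eta_T \, \nabla \times \overline{\mathbf{B}} \right)
```

    A nonzero mean alpha (here arising only from inhomogeneous helicity fluctuations) or an off-diagonal shear-current contribution to the electromotive force is what permits large-scale field growth against the turbulent diffusivity η_T.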

  12. Risk of incident clinical diagnosis of AD-type dementia attributable to pathology-confirmed vascular disease

    PubMed Central

    Dodge, Hiroko H.; Zhu, Jian; Woltjer, Randy; Nelson, Peter T.; Bennett, David A.; Cairns, Nigel J.; Fardo, David W.; Kaye, Jeffrey A.; Lyons, Deniz-Erten; Mattek, Nora; Schneider, Julie A.; Silbert, Lisa C.; Xiong, Chengjie; Yu, Lei; Schmitt, Frederick A.; Kryscio, Richard J.; Abner, Erin L.

    2016-01-01

    Introduction: Presence of cerebrovascular pathology may increase the risk of a clinical diagnosis of AD. Methods: We examined the excess risk of incident clinical diagnosis of AD (probable and possible AD) posed by the presence of lacunes and large infarcts beyond AD pathology, using data from the Statistical Modelling of Aging and Risk of Transition (SMART) study, a consortium of longitudinal cohort studies with over 2000 autopsies. We created six mutually exclusive pathology patterns combining three levels of AD pathology (low, moderate, or high) and two levels of vascular pathology (without lacunes and large infarcts, or with lacunes and/or large infarcts). Results: The coexistence of lacunes and large infarcts results in a higher likelihood of clinical diagnosis of AD only when the AD pathology burden is low. Discussion: Our results reinforce the diagnostic importance of AD pathology in clinical AD. Further harmonization of assessment approaches for vascular pathologies is required. PMID:28017827

  13. Space station automation study: Automation requirements derived from space manufacturing concepts. Volume 1: Executive summary

    NASA Technical Reports Server (NTRS)

    1984-01-01

    The electroepitaxial process and the Very Large Scale Integration (VLSI) circuit (chip) facilities were chosen because each requires a very high degree of automation, and therefore involves extensive use of teleoperators, robotics, process mechanization, and artificial intelligence. Both cover a raw-materials process and a sophisticated multi-step process, and are therefore highly representative of the kinds of difficult operation, maintenance, and repair challenges that can be expected for any type of space manufacturing facility. Generic areas were identified which will require significant further study. The initial design will be based on terrestrial state-of-the-art hard automation. One hundred candidate missions were evaluated on the basis of automation potential and availability of meaningful knowledge. The design requirements and unconstrained design concepts developed for the two missions are presented.

  14. Wing planform geometry effects on large subsonic military transport airplanes. Final technical report March 1976-February 1977

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kulfan, R.M.; Vachal, J.D.

    1978-02-01

    A preliminary design study of large turbulent flow military transport aircraft has been made. The study airplanes were designed to carry a heavy payload (350,000 lb) for a long range (10,000 nmi). The study tasks included: wing geometry/cruise speed optimization of a large cantilever-wing military transport airplane; preliminary design and performance evaluation of a strut-braced-wing transport airplane; and structural analyses of large-span cantilever and strut-braced wings of graphite/epoxy sandwich construction (1985 technology). The best cantilever wing planform for minimum takeoff gross weight and minimum fuel requirements, as determined using statistical weight evaluations, has a high aspect ratio, low sweep, low thickness/chord ratio, and a cruise Mach number of 0.76. A near-optimum wing planform with greater speed capability (M = 0.78) has an aspect ratio of 12, quarter-chord sweep of 20 deg, and thickness/chord ratio of 0.14/0.08 (inboard/outboard).

  15. Technical Training Requirements of Middle Management in the Greek Textile and Clothing Industries.

    ERIC Educational Resources Information Center

    Fotinopoulou, K.; Manolopoulos, N.

    A case study of 16 companies in the Greek textile and clothing industry elicited the training needs of the industry's middle managers. The study concentrated on large and medium-sized work units, using a lengthy questionnaire. The study found that middle managers increasingly need to solve problems and ensure the reliability of new equipment and…

  16. Conducting Causal Effects Studies in Science Education: Considering Methodological Trade-Offs in the Context of Policies Affecting Research in Schools

    ERIC Educational Resources Information Center

    Taylor, Joseph; Kowalski, Susan; Wilson, Christopher; Getty, Stephen; Carlson, Janet

    2013-01-01

    This paper focuses on the trade-offs that lie at the intersection of methodological requirements for causal effect studies and policies that affect how and to what extent schools engage in such studies. More specifically, current federal funding priorities encourage large-scale randomized studies of interventions in authentic settings. At the same…

  17. Identification of Skills Standards for Entry Level Legal Office Support Staff in Urban Oklahoma: A Delphi Study

    ERIC Educational Resources Information Center

    Reese Ward, Tonya Maria

    2010-01-01

    Scope and Method of Study: The purpose of this study was to use industry experts to identify critical skills or competencies perceived by the legal profession to be required by competent team members in the legal office environment. Specifically, this study focused on fulfilling this purpose in the context of urban Oklahoma, where a large number…

  18. Nordic registry-based cohort studies: Possibilities and pitfalls when combining Nordic registry data.

    PubMed

    Maret-Ouda, John; Tao, Wenjing; Wahlin, Karl; Lagergren, Jesper

    2017-07-01

    All five Nordic countries (Denmark, Finland, Iceland, Norway and Sweden) have nationwide registries with similar data structure and validity, as well as personal identity numbers enabling linkage between registries. These resources provide opportunities for medical research that is based on large registry-based cohort studies with long and complete follow-up. This review describes practical aspects, opportunities and challenges encountered when setting up all-Nordic registry-based cohort studies. Relevant articles describing registries often used for medical research in the Nordic countries were retrieved. Further, our experiences of conducting this type of study, including planning, acquiring permissions, data retrieval and data cleaning and handling, and the possibilities and challenges we have encountered are described. Combining data from the Nordic countries makes it possible to create large and powerful cohorts. The main challenges include obtaining all permissions within each country, usually in the local language, and retrieving the data. These challenges emphasise the importance of having experienced collaborators within each country. Following the acquisition of data, data management requires the understanding of the differences between the variables to be used in the various countries. A concern is the long time required between initiation and completion. Nationwide Nordic registries can be combined into cohorts with high validity and statistical power, but the considerable expertise, workload and time required to complete such cohorts should not be underestimated.

  19. Control of fluxes in metabolic networks.

    PubMed

    Basler, Georg; Nikoloski, Zoran; Larhlimi, Abdelhalim; Barabási, Albert-László; Liu, Yang-Yu

    2016-07-01

    Understanding the control of large-scale metabolic networks is central to biology and medicine. However, existing approaches either require specifying a cellular objective or can only be used for small networks. We introduce new coupling types describing the relations between reaction activities, and develop an efficient computational framework, which does not require any cellular objective for systematic studies of large-scale metabolism. We identify the driver reactions facilitating control of 23 metabolic networks from all kingdoms of life. We find that unicellular organisms require a smaller degree of control than multicellular organisms. Driver reactions are under complex cellular regulation in Escherichia coli, indicating their preeminent role in facilitating cellular control. In human cancer cells, driver reactions play pivotal roles in malignancy and represent potential therapeutic targets. The developed framework helps us gain insights into regulatory principles of diseases and facilitates design of engineering strategies at the interface of gene regulation, signaling, and metabolism. © 2016 Basler et al.; Published by Cold Spring Harbor Laboratory Press.
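The idea of coupled reaction activities can be illustrated on a toy network: at steady state the feasible fluxes satisfy S v = 0, and reactions whose fluxes are proportional in every nullspace vector are fully coupled. The minimal rational nullspace computation below is a generic illustration of this concept, not the framework developed in the paper.

```python
from fractions import Fraction


def nullspace(S):
    """Rational nullspace basis of a stoichiometric matrix S (S v = 0)."""
    rows = [[Fraction(x) for x in r] for r in S]
    m, n = len(rows), len(rows[0])
    pivots, r = [], 0
    for c in range(n):
        piv = next((i for i in range(r, m) if rows[i][c] != 0), None)
        if piv is None:
            continue
        rows[r], rows[piv] = rows[piv], rows[r]
        rows[r] = [x / rows[r][c] for x in rows[r]]  # normalize pivot row
        for i in range(m):
            if i != r and rows[i][c] != 0:
                f = rows[i][c]
                rows[i] = [a - f * b for a, b in zip(rows[i], rows[r])]
        pivots.append(c)
        r += 1
    basis = []
    for fc in (c for c in range(n) if c not in pivots):
        v = [Fraction(0)] * n
        v[fc] = Fraction(1)
        for pr, pc in enumerate(pivots):
            v[pc] = -rows[pr][fc]
        basis.append(v)
    return basis
```

For the linear chain "-> A -> B ->" (reactions v1, v2, v3), S = [[1, -1, 0], [0, 1, -1]] has the one-dimensional nullspace spanned by (1, 1, 1): all three fluxes are fully coupled, so controlling any one reaction fixes the others.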

  20. Attitude maneuvers of a solar-powered electric orbital transfer vehicle

    NASA Astrophysics Data System (ADS)

    Jenkin, Alan B.

    1992-08-01

    Attitude maneuver requirements of a solar-powered electric orbital transfer vehicle have been studied in detail. This involved evaluation of the yaw, pitch, and roll profiles and associated angular accelerations needed to simultaneously steer the vehicle thrust vector and maintain the solar array pointed toward the sun. Maintaining the solar array pointed exactly at the sun leads to snap roll maneuvers which have very high (theoretically unbounded) accelerations, thereby imposing large torque requirements. The problem is exacerbated by the large solar arrays which are needed to generate the high levels of power needed by electric propulsion devices. A method of eliminating the snap roll maneuvers is presented. The method involves the determination of relaxed roll profiles which approximate a forced transition between alternate exact roll profiles and incur only small errors in solar array pointing. The method makes it feasible to perform the required maneuvers using currently available attitude control technology such as reaction wheels, hot gas jets, or gimballed main engines.

  1. Solar electric propulsion and interorbital transportation

    NASA Technical Reports Server (NTRS)

    Austin, R. E.

    1978-01-01

    In-house MSFC and contracted systems studies have evaluated the requirements associated with candidate SEP missions, and the results point to a standard system approach for both program flexibility and economy. The prospects for economical space transportation in the 1980s have already provided a stimulus for Space Industrialization (SI) planning. Two SI initiatives that are used as examples for interorbital transportation requirements are discussed: Public Service Platforms and the Satellite Power System. The interorbital requirements for SI range from support of manned geosynchronous missions to transfers of bulk cargo and large, delicate space structures from low earth orbit to geosynchronous orbit.

  2. On the energy budget in the current disruption region. [of geomagnetic tail

    NASA Technical Reports Server (NTRS)

    Hesse, Michael; Birn, Joachim

    1993-01-01

    This study investigates the energy budget in the current disruption region of the magnetotail, coincident with a pre-onset thin current sheet, around substorm onset time, using published observational data and theoretical estimates. We find that the current disruption/dipolarization process typically requires energy inflow into the primary disruption region. The disruption/dipolarization process is therefore endoenergetic, i.e., it requires energy input to operate. We therefore argue that some other simultaneously operating process, possibly a large-scale magnetotail instability, is required to provide the necessary energy input into the current disruption region.

  3. Design studies of large aperture, high-resolution Earth science microwave radiometers compatible with small launch vehicles

    NASA Technical Reports Server (NTRS)

    Schroeder, Lyle C.; Bailey, M. C.; Harrington, Richard F.; Kendall, Bruce M.; Campbell, Thomas G.

    1994-01-01

    High-spatial-resolution microwave radiometer sensing from space with reasonable swath widths and revisit times favors large-aperture systems. However, with traditional precision antenna design, the size and weight requirements for such systems conflict with the need to emphasize small launch vehicles. This paper describes tradeoffs between the science requirements, basic operational parameters, and expected sensor performance for selected satellite radiometer concepts utilizing novel lightweight, compactly packaged real apertures. Antenna, feed, and radiometer subsystem design and calibration are presented. Preliminary results show that novel lightweight real apertures coupled with state-of-the-art radiometer designs are compatible with small launch systems and hold promise for high-resolution earth science measurements of sea ice, precipitation, soil moisture, sea surface temperature, and ocean wind speeds.

  4. Aerobrake concepts for NTP systems study

    NASA Technical Reports Server (NTRS)

    Cruz, Manuel I.

    1992-01-01

    Design concepts are described for landing large spacecraft masses on the Mars surface in support of manned missions with interplanetary transportation using Nuclear Thermal Propulsion (NTP). Included are the mission and systems analyses, trade studies and sensitivity analyses, design analyses, technology assessment, and derived requirements to support this concept. The mission phases include the Mars de-orbit, entry, terminal descent, and terminal touchdown. The study focuses primarily on Mars surface delivery from orbit after Mars orbit insertion using an NTP. The requirements associated with delivery of logistical supplies, habitats, and other equipment on minimum energy Earth to Mars transfers are also addressed in a preliminary fashion.

  5. Legacy Phosphorus Effect and Need to Re-calibrate Soil Test P Methods for Organic Crop Production.

    NASA Astrophysics Data System (ADS)

    Dao, Thanh H.; Schomberg, Harry H.; Cavigelli, Michel A.

    2015-04-01

    Phosphorus (P) is a required nutrient for the normal development and growth of plants, and supplemental P is needed in most cultivated soils. Large inputs of cover crop residues and nutrient-rich animal manure are added to supply the nutrients needed for optimal production of organic grain crops and forages. Because of increased interest in these alternative systems, the effects of crop rotations and tillage management of the near-surface zone on labile P forms were studied after 18 years in soil under conventional and organic crop management systems in the mid-Atlantic region of the U.S. Soil nutrient surpluses, likely caused by low grain yields, resulted in large pools of exchangeable phosphate-P and equally large pools of enzyme-labile organic P (Po) in soils under organic management. In addition, the difference in P loading rates between the conventional and organic treatments, as guided by routine soil test recommendations, suggested that overestimation of plant P requirements contributed to soil P surpluses, because routine soil testing procedures did not account for the presence and size of the soil enzyme-labile Po pool. The effect of large P additions is long-lasting, as they continued to contribute to elevated soil total bioactive P concentrations 12 or more years later. Consequently, accurate estimates of crop P requirements, P turnover in soil, and real-time plant and soil sensing systems are critical considerations for optimal management of manure-derived nutrients in organic crop production.

  6. Estimating ecosystem service changes as a precursor to modeling

    EPA Science Inventory

    EPA's Future Midwestern Landscapes Study will project changes in ecosystem services (ES) for alternative future policy scenarios in the Midwestern U.S. Doing so for detailed landscapes over large spatial scales will require serial application of economic and ecological models. W...

  7. Atmospheric rendezvous feasibility study

    NASA Technical Reports Server (NTRS)

    Schaezler, A. D.

    1972-01-01

    A study was carried out to determine the feasibility of using atmospheric rendezvous to increase the efficiency of space transportation and to determine the most effective implementation. It is concluded that atmospheric rendezvous is feasible and can be utilized in a space transportation system to reduce size of the orbiter vehicle, provide a powered landing with go-around capability for every mission, and achieve lateral range performance that exceeds requirements. A significantly lighter booster and reduced launch fuel requirements are additional benefits that can be realized with a system that includes a large subsonic airplane for recovery of the orbiter. Additional reduction in booster size is possible if the airplane is designed for recovery of the booster by towing. An airplane about the size of the C-5A is required.

  8. LDR cryogenics

    NASA Technical Reports Server (NTRS)

    Nast, T.

    1988-01-01

    A brief summary of the requirements for LDR cryogenic cooling from the 1985 Large Deployable Reflector (LDR) Asilomar 2 workshop is presented. The heat rates are simply the sum of the individual heat rates from the instruments. Consideration of duty cycle will have a dramatic effect on cooling requirements. There are many possible combinations of cooling techniques for each of the three temperature zones. It is clear that much further system study is needed to determine what type of cooling system is required (He-2, hybrid, or mechanical) and what size and power are required. As the instruments, along with their duty cycles and heat rates, become better defined, it will be possible to better determine the optimum cooling systems.
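The abstract's point that the cooling load is the sum of instrument heat rates, strongly modulated by duty cycle, amounts to a one-line duty-cycle-weighted sum. The sketch below is generic arithmetic; the instrument names and values are made-up placeholders, not actual LDR data.

```python
def average_heat_load(instruments):
    """Duty-cycle-weighted cooling load in watts.

    instruments: {name: (heat_rate_w, duty_cycle)} with duty_cycle in [0, 1].
    Values are illustrative assumptions, not LDR instrument data.
    """
    return sum(q * duty for q, duty in instruments.values())
```

For example, a 10 W instrument operated half the time plus a 4 W instrument operated continuously gives an average load of 9 W, rather than the 14 W peak, which is why duty cycle matters so much for sizing the cooler.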

  9. Landing Force Organizational Systems Study (LFOSS).

    DTIC Science & Technology

    1979-01-01

    moving, and field fortifications. A heavy crawler tractor is required for large earth-moving missions in tough soil conditions where the smaller tractor is...current T/O's and T/E's. The primary sources for each item of equipment were the work directive, the required operational capability (ROC), and the...developments have not been used as a source of information because of the inability to determine the specific project impact prior to its entrance

  10. Open solutions to distributed control in ground tracking stations

    NASA Technical Reports Server (NTRS)

    Heuser, William Randy

    1994-01-01

    The advent of high-speed local area networks has made it possible to interconnect small, powerful computers to function together as a single large computer. Today, distributed computer systems are the new paradigm for large-scale computing systems. However, the communications provided by the local area network are only one part of the solution. The services and protocols used by the application programs to communicate across the network are as indispensable as the local area network, and the selection of services and protocols that do not match the system requirements will limit the capabilities, performance, and expansion of the system. Proprietary solutions are available but are usually limited to a select set of equipment. However, there are two solutions based on 'open' standards. The question that must be answered is: 'which one is the best for my job?' This paper examines a model for tracking stations and their requirements for interprocessor communications in the next century. The model and requirements are matched with the models and services provided by five different software architectures and supporting protocol solutions. Several key services are examined in detail to determine which services and protocols most closely match the requirements for the tracking station environment. The study reveals that the protocols are tailored to the problem domains for which they were originally designed. Further, the study reveals that the process control model is the closest match to the tracking station model.

  11. Space station accommodations for lunar base elements: A study

    NASA Technical Reports Server (NTRS)

    Weidman, Deene J.; Cirillo, William; Llewellyn, Charles; Kaszubowski, Martin; Kienlen, E. Michael, Jr.

    1987-01-01

    The results of a study conducted at NASA-LaRC to assess the impact on the space station of accommodating a Manned Lunar Base are documented. Included in the study are assembly activities for all infrastructure components, resupply and operations support for lunar base elements, crew activity requirements, the effect of lunar activities on Cape Kennedy operations, and the effect on space station science missions. Technology needs to prepare for such missions are also defined. Results of the study indicate that the space station can support the manned lunar base missions with the addition of a Fuel Depot Facility and a heavy lift launch vehicle to support the large launch requirements.

  12. Singlet oxygen detection in biological systems: Uses and limitations

    PubMed Central

    Koh, Eugene; Fluhr, Robert

    2016-01-01

    ABSTRACT The study of singlet oxygen in biological systems is challenging in many ways. Singlet oxygen is a relatively unstable ephemeral molecule, and its properties make it highly reactive with many biomolecules, making it difficult to quantify accurately. Several methods have been developed to study this elusive molecule, but most studies thus far have focused on those conditions that produce relatively large amounts of singlet oxygen. However, the need for more sensitive methods is required as one begins to explore the levels of singlet oxygen required in signaling and regulatory processes. Here we discuss the various methods used in the study of singlet oxygen, and outline their uses and limitations. PMID:27231787

  13. INFERENCE FOR INDIVIDUAL-LEVEL MODELS OF INFECTIOUS DISEASES IN LARGE POPULATIONS.

    PubMed

    Deardon, Rob; Brooks, Stephen P; Grenfell, Bryan T; Keeling, Matthew J; Tildesley, Michael J; Savill, Nicholas J; Shaw, Darren J; Woolhouse, Mark E J

    2010-01-01

    Individual Level Models (ILMs), a new class of models, are being applied to infectious epidemic data to aid in the understanding of the spatio-temporal dynamics of infectious diseases. These models are highly flexible and intuitive, and can be parameterised under a Bayesian framework via Markov chain Monte Carlo (MCMC) methods. Unfortunately, this parameterisation can be difficult to implement due to intense computational requirements when calculating the full posterior for large, or even moderately large, susceptible populations, or when missing data are present. Here we detail a methodology that can be used to estimate parameters for such large, and/or incomplete, data sets. This is done in the context of a study of the UK 2001 foot-and-mouth disease (FMD) epidemic.
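A common spatial ILM form gives each susceptible individual an infection probability driven by a distance-weighted sum over infectious individuals, P = 1 - exp(-alpha * sum_j d_ij^(-beta)). The sketch below is a generic illustration of that form and its O(S x I) cost per likelihood evaluation, which is the computational bottleneck for large populations; the parameter names are assumptions, not taken from the paper.

```python
import math


def infection_prob(alpha, beta, susceptible, infectious):
    """Per-timestep infection probabilities under a simple spatial ILM.

    susceptible, infectious: {id: (x, y)} coordinate dicts.
    P(i) = 1 - exp(-alpha * sum over infectious j of d_ij^(-beta)).
    """
    probs = {}
    for i, (xi, yi) in susceptible.items():
        pressure = sum(
            math.hypot(xi - xj, yi - yj) ** (-beta)
            for (xj, yj) in infectious.values()
        )
        probs[i] = 1.0 - math.exp(-alpha * pressure)
    return probs
```

Every MCMC iteration must re-evaluate this pairwise sum across the whole population, which is why exact posterior computation becomes intractable for large susceptible populations and motivates the approximations the paper develops.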

  14. Evaluation of Genetic Algorithm Concepts using Model Problems. Part 1; Single-Objective Optimization

    NASA Technical Reports Server (NTRS)

    Holst, Terry L.; Pulliam, Thomas H.

    2003-01-01

    A genetic-algorithm-based optimization approach is described and evaluated using a simple hill-climbing model problem. The model problem utilized herein allows for the broad specification of a large number of search spaces, including spaces with an arbitrary number of genes or decision variables and an arbitrary number of hills or modes. In the present study, only single-objective problems are considered. Results indicate that the genetic algorithm optimization approach is flexible in application and extremely reliable, providing optimal results for all problems attempted. The most difficult problems - those with large hyper-volumes and multi-mode search spaces containing a large number of genes - require a large number of function evaluations for GA convergence, but they always converge.
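A minimal genetic algorithm on a multi-mode hill landscape makes the abstract's setup concrete. This sketch is not the paper's algorithm: the two-hill fitness function, tournament selection, blend crossover, and all parameter values are assumptions chosen for illustration.

```python
import math
import random


def fitness(x):
    """Two-mode test landscape on [0, 1]; the higher hill is near x = 0.75."""
    return 0.8 * math.exp(-200 * (x - 0.25) ** 2) + math.exp(-200 * (x - 0.75) ** 2)


def evolve(pop_size=60, generations=80, p_mut=0.1, seed=1):
    """Single-objective GA: tournament selection, blend crossover, Gaussian mutation."""
    rng = random.Random(seed)
    pop = [rng.random() for _ in range(pop_size)]
    for _ in range(generations):
        def pick():  # binary tournament: keep the fitter of two random parents
            a, b = rng.choice(pop), rng.choice(pop)
            return a if fitness(a) > fitness(b) else b
        children = []
        for _ in range(pop_size):
            child = 0.5 * (pick() + pick())  # blend (midpoint) crossover
            if rng.random() < p_mut:
                child = min(1.0, max(0.0, child + rng.gauss(0, 0.05)))
            children.append(child)
        pop = children
    return max(pop, key=fitness)
```

Runs converge onto one of the two hills; with more genes and more modes, far more function evaluations are needed before convergence, which mirrors the abstract's finding.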

  15. Optical design of the STAR-X telescope

    NASA Astrophysics Data System (ADS)

    Saha, Timo T.; Zhang, William W.; McClelland, Ryan S.

    2017-08-01

    Top-level science objectives of the Survey and Time-domain Astrophysical Research eXplorer (STAR-X) include investigations of the most violent explosions in the universe, study of the growth of black holes across cosmic time and mass scale, and measurement of how structure formation heats the majority of baryons in the universe. To meet these objectives, the STAR-X telescope requires a field of view of about 1 square degree and an angular resolution of 5 arc-seconds or better across a large part of the field of view. The on-axis effective area at 1 keV should be about 2,000 cm2. Payload cost and launch considerations limit the outer diameter, focal length, and mass to 1.3 meters, 5 meters, and 250 kilograms, respectively. The telescope design is based on a segmented meta-shell approach we have developed at Goddard Space Flight Center. The telescope mirror shells are divided into segments, and individual shells are nested inside each other to meet the effective area requirements in the 0.5-6.0 keV range. We consider Wolter-Schwarzschild and Modified Wolter-Schwarzschild telescopes. These designs offer an excellent PSF over a large field of view. Nested shells are vulnerable to stray light problems. We have designed a multi-component baffle system to eliminate direct and single-reflection light paths inside the mirror assembly. Large numbers of internal and external baffles are required to prevent stray rays from reaching the focal plane. We have developed a simple ray-trace tool to determine the dimensions and locations of the baffles. In this paper, we present the results of our trade studies, baffle design studies, and optical performance analyses of the STAR-X telescope.
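The quoted 5 m focal length translates directly into a focal-plane plate scale via standard small-angle optics arithmetic (about 206265 arcsec per radian). The conversion below is textbook geometry, not a value taken from the paper.

```python
import math

ARCSEC_PER_RAD = 180.0 / math.pi * 3600.0  # ~206265 arcsec per radian


def plate_scale_arcsec_per_mm(focal_length_m):
    """Small-angle plate scale at the focal plane of a telescope."""
    return ARCSEC_PER_RAD / (focal_length_m * 1000.0)
```

For the 5 m focal length, the plate scale is about 41.25 arcsec/mm, so a 5 arcsec PSF spans roughly 0.12 mm at the focal plane, which sets the detector pixel-size regime for the survey.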

  16. Preliminary feasibility study of pallet-only mode for magnetospheric and plasmas in space payloads, volume 4

    NASA Technical Reports Server (NTRS)

    1976-01-01

    Results of studies performed on the magnetospheric and plasma portion of the AMPS are presented. Magnetospheric and plasma in space experiments and instruments are described along with packaging (palletization) concepts. The described magnetospheric and plasma experiments were considered as separate entities. Instrumentation requirements and operations were formulated to provide sufficient data for unambiguous interpretation of results without relying upon other experiments of the series. Where ground observations are specified, an assumption was made that large-scale additions or modifications to existing facilities were not required.

  17. Advanced subsonic long-haul transport terminal area compatibility study. Volume 1: Compatibility assessment

    NASA Technical Reports Server (NTRS)

    1974-01-01

    An analysis was made to identify the airplane research and technology necessary to ensure that advanced transport aircraft can accommodate forecast traffic without adverse impact on airport communities. Projections were made of the delay, noise, and emissions impact of future aircraft fleets on a typical large urban airport. Design requirements, based on these projections, were developed for an advanced-technology, long-haul, subsonic transport. A baseline aircraft was modified to fulfill the design requirements for terminal area compatibility. Technical and economic comparisons were made between these and other aircraft configured to support the study.

  18. Cryogenics for LDR

    NASA Technical Reports Server (NTRS)

    Kittel, Peter

    1988-01-01

    Three cryogenic questions of importance to Large Deployable Reflector (LDR) are discussed: the primary cooling requirement, the secondary cooling requirement, and the instrument changeout requirement.

  19. Solving large scale structure in ten easy steps with COLA

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tassev, Svetlin; Zaldarriaga, Matias; Eisenstein, Daniel J., E-mail: stassev@cfa.harvard.edu, E-mail: matiasz@ias.edu, E-mail: deisenstein@cfa.harvard.edu

    2013-06-01

    We present the COmoving Lagrangian Acceleration (COLA) method: an N-body method for solving for Large Scale Structure (LSS) in a frame that is comoving with observers following trajectories calculated in Lagrangian Perturbation Theory (LPT). Unlike standard N-body methods, the COLA method can straightforwardly trade accuracy at small scales in order to gain computational speed without sacrificing accuracy at large scales. This is especially useful for cheaply generating large ensembles of accurate mock halo catalogs required to study galaxy clustering and weak lensing, as those catalogs are essential for performing detailed error analysis for ongoing and future surveys of LSS. As an illustration, we ran a COLA-based N-body code on a box of size 100 Mpc/h with particles of mass ≈ 5 × 10^9 M_sun/h. Running the code with only 10 timesteps was sufficient to obtain an accurate description of halo statistics down to halo masses of at least 10^11 M_sun/h. This is only at a modest speed penalty when compared to mocks obtained with LPT. A standard detailed N-body run is orders of magnitude slower than our COLA-based code. The speed-up we obtain with COLA is due to the fact that we calculate the large-scale dynamics exactly using LPT, while letting the N-body code solve for the small scales, without requiring it to capture exactly the internal dynamics of halos. Achieving a similar level of accuracy in halo statistics without the COLA method requires at least 3 times more timesteps than when COLA is employed.

  20. 76 FR 71625 - Position Limits for Futures and Swaps

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-11-18

    .... In the 1920s and into the 1930s, a series of studies and reports found that large speculative... Dodd-Frank Act amended the CEA to direct the Commission to define the relevant factors to be considered... Section 719 of the Dodd-Frank Act specifically requires the Commission `to conduct a study of the effects...

  1. An Exploration of Attitudes towards Modern Biotechnology: A Study among Dutch Secondary School Students

    ERIC Educational Resources Information Center

    Klop, Tanja; Severiens, Sabine

    2007-01-01

    Modern biotechnology will have a large impact on society and requires informed decision-making and critical attitudes toward biotechnology among the public. This study aims to explore these attitudes in secondary education. For this purpose, a questionnaire was constructed according to the general tripartite theory of attitudes. A total of 574…

  2. Microsoft Excel®: Is It an Important Job Skill for College Graduates?

    ERIC Educational Resources Information Center

    Formby, Sam K.; Medlin, B. Dawn; Ellington, Virginia

    2017-01-01

    Several studies have found that a large percentage of middle-skilled jobs require at least a basic understanding of spreadsheets, and some even require advanced-level skills. A study was conducted at a four-year university to identify Excel skill sets that were determined as necessary by employers of the university's current students, advisory boards,…

  3. The Influence of Digital Interactive Textbook Instruction on Student Learning Preferences, Outcomes, and Motivation

    ERIC Educational Resources Information Center

    O'Bannon, Blanche W.; Skolits, Gary J.; Lubke, Jennifer K.

    2017-01-01

    This study examined achievement when an interactive textbook (iBook) was used in place of lecture to teach students to create instructional flipcharts for a Promethean interactive whiteboard. The study was conducted with students enrolled in a required technology course for teachers at a large research-intensive university in the southeastern…

  4. Heritage Language Maintenance and Education in the Greek Sociolinguistic Context: Albanian Immigrant Parents' Views

    ERIC Educational Resources Information Center

    Gkaintartzi, Anastasia; Kiliari, Angeliki; Tsokalidou, Roula

    2016-01-01

    This paper presents data from two studies--a nationwide quantitative research and an ethnographic study--on immigrant parents' perspectives about heritage language maintenance and education in Greek state schools. The quantitative data come from a large-scale questionnaire survey, which aimed at the investigation of the needs and requirements for…

  5. International manned lunar base - Beginning the 21st century in space

    NASA Astrophysics Data System (ADS)

    Smith, Harlan J.; Gurshtejn, Aleksandr A.; Mendell, Wendell

    An evaluation is made of requirements for, and advantages in, the creation of a manned lunar base whose functions emphasize astronomical investigations. These astronomical studies would be able to capitalize on the lunar environment's ultrahigh vacuum, highly stable surface, dark and cold sky, low gravity, absence of wind, isolation from terrestrial 'noise', locally usable ceramic raw materials, and hemispherical craters capable of supporting large radiotelescope dishes. Large telescope structures would be nearly free of the gravity and wind loads that complicate their design on Earth.

  6. Scale-Up: Improving Large Enrollment Physics Courses

    NASA Astrophysics Data System (ADS)

    Beichner, Robert

    1999-11-01

    The Student-Centered Activities for Large Enrollment University Physics (SCALE-UP) project is working to establish a learning environment that will promote increased conceptual understanding, improved problem-solving performance, and greater student satisfaction, while still maintaining class sizes of approximately 100. We are also addressing the new ABET engineering accreditation requirements for inquiry-based learning along with communication and team-oriented skills development. Results of studies of our latest classroom design, plans for future classroom space, and the current iteration of instructional materials will be discussed.

  7. Satellite power system (SPS) concept definition study. Volume 3: Experimental verification definition

    NASA Technical Reports Server (NTRS)

    Hanley, G. M.

    1980-01-01

    An evolutionary Satellite Power Systems development plan was prepared. Planning analysis was directed toward the evolution of a scenario that met the stated objectives, was technically possible and economically attractive, and took into account constraining considerations, such as requirements for very large scale end-to-end demonstration in a compressed time frame, the relative cost/technical merits of ground testing versus space testing, and the need for large mass flow capability to low Earth orbit and geosynchronous orbit at reasonable cost per pound.

  8. Perspectives in astrophysical databases

    NASA Astrophysics Data System (ADS)

    Frailis, Marco; de Angelis, Alessandro; Roberto, Vito

    2004-07-01

    Astrophysics has become a domain extremely rich in scientific data. Data mining tools are needed for information extraction from such large data sets. This calls for an approach to data management emphasizing the efficiency and simplicity of data access; efficiency is obtained using multidimensional access methods, and simplicity is achieved by properly handling metadata. Moreover, clustering and classification techniques on large data sets pose additional requirements in terms of computation and memory scalability and interpretability of results. In this study we review some possible solutions.
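    As a hypothetical illustration of the multidimensional access pattern such systems optimize (the catalog records and field names below are invented, not from the paper), consider a rectangular range query over (ra, dec):

```python
# Toy catalog of sky sources; a real archive would hold millions of rows.
records = [
    {"name": "src-a", "ra": 10.2, "dec": -5.1, "mag": 14.0},
    {"name": "src-b", "ra": 11.0, "dec": -4.8, "mag": 15.2},
    {"name": "src-c", "ra": 50.3, "dec": 20.0, "mag": 12.7},
]

def range_query(recs, ra_min, ra_max, dec_min, dec_max):
    """Brute-force 2-D box query; a multidimensional index (R-tree, k-d tree)
    answers the same question without scanning every record."""
    return [r["name"] for r in recs
            if ra_min <= r["ra"] <= ra_max and dec_min <= r["dec"] <= dec_max]

print(range_query(records, 10.0, 12.0, -6.0, -4.0))   # ['src-a', 'src-b']
```

    The linear scan is correct but does work proportional to the catalog size; the multidimensional access methods mentioned above exist precisely to answer such box queries in sublinear time.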

  9. Electrofishing Effort Required to Estimate Biotic Condition in Southern Idaho Rivers

    EPA Science Inventory

    An important issue surrounding biomonitoring in large rivers is the minimum sampling effort required to collect an adequate number of fish for accurate and precise determinations of biotic condition. During the summer of 2002, we sampled 15 randomly selected large-river sites in...

  10. The production of multiprotein complexes in insect cells using the baculovirus expression system.

    PubMed

    Abdulrahman, Wassim; Radu, Laura; Garzoni, Frederic; Kolesnikova, Olga; Gupta, Kapil; Osz-Papai, Judit; Berger, Imre; Poterszman, Arnaud

    2015-01-01

    The production of a homogeneous protein sample in sufficient quantities is not only an essential prerequisite for structural investigations but also represents a rate-limiting step for many functional studies. In the cell, a large fraction of eukaryotic proteins exists as large multicomponent assemblies with many subunits, which act in concert to catalyze specific activities. Many of these complexes cannot be obtained from endogenous source material, so recombinant expression and reconstitution are then required to overcome this bottleneck. This chapter describes current strategies and protocols for the efficient production of multiprotein complexes in large quantities and of high quality, using the baculovirus/insect cell expression system.

  11. The Large Area Crop Inventory Experiment (LACIE). Part 3: A systematic approach to the practical application of remote-sensing technology

    NASA Technical Reports Server (NTRS)

    Murphy, J. D.; Dideriksen, R. I.

    1975-01-01

    The application of remote sensing technology by the U.S. Department of Agriculture (USDA) is examined. The activities of the USDA Remote-Sensing User Requirement Task Force are described; these include cataloging USDA requirements for earth resources data, determining which requirements would return maximum benefits from remote sensing technology, and developing a plan for acquiring, processing, analyzing, and distributing data to satisfy those requirements. Emphasis is placed on the large area crop inventory experiment and its relationship to the task force.

  12. Development of the Large-Scale Forcing Data to Support MC3E Cloud Modeling Studies

    NASA Astrophysics Data System (ADS)

    Xie, S.; Zhang, Y.

    2011-12-01

    The large-scale forcing fields (e.g., vertical velocity and advective tendencies) are required to run single-column and cloud-resolving models (SCMs/CRMs), which are the two key modeling frameworks widely used to link field data to climate model developments. In this study, we use an advanced objective analysis approach to derive the required forcing data from the soundings collected by the Midlatitude Continental Convective Cloud Experiment (MC3E) in support of its cloud modeling studies. MC3E is the latest major field campaign, conducted during the period 22 April 2011 to 06 June 2011 in south-central Oklahoma through a joint effort between the DOE ARM program and the NASA Global Precipitation Measurement Program. One of its primary goals is to provide a comprehensive dataset that can be used to describe the large-scale environment of convective cloud systems and evaluate model cumulus parameterizations. The objective analysis used in this study is the constrained variational analysis method. A unique feature of this approach is the use of domain-averaged surface and top-of-the-atmosphere (TOA) observations (e.g., precipitation and radiative and turbulent fluxes) as constraints to adjust atmospheric state variables from soundings by the smallest possible amount to conserve column-integrated mass, moisture, and static energy, so that the final analysis data are dynamically and thermodynamically consistent. To address potential uncertainties in the surface observations, an ensemble forcing dataset will be developed. Multi-scale forcing will also be created for simulating convective systems at various scales. At the meeting, we will provide more details about the forcing development and present some preliminary analysis of the characteristics of the large-scale forcing structures for several selected convective systems observed during MC3E.
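    The constrained variational idea described above (adjust the state by the smallest possible amount so that a column-integrated budget matches an observed constraint) can be sketched for a single linear constraint. This is a toy illustration with hypothetical weights and values, not the MC3E analysis code; it uses the closed-form Lagrange-multiplier solution for a least-squares adjustment.

```python
# Minimize sum((x'_i - x_i)^2) subject to sum(w_i * x'_i) = c.
# The Lagrange condition gives x' = x + lam * w with
# lam = (c - sum(w*x)) / sum(w*w).

def adjust_to_constraint(x, w, c):
    """Minimally adjust x (in the L2 sense) so that sum(w_i * x_i) == c."""
    residual = c - sum(wi * xi for wi, xi in zip(w, x))
    lam = residual / sum(wi * wi for wi in w)
    return [xi + lam * wi for xi, wi in zip(x, w)]

# Hypothetical moisture values at 4 levels, equal layer weights,
# observed column budget 10.0 (raw values sum to 9.5).
x = [2.0, 3.0, 2.5, 2.0]
w = [1.0, 1.0, 1.0, 1.0]
x_adj = adjust_to_constraint(x, w, 10.0)
print(sum(x_adj))   # -> 10.0; each level was nudged by only 0.125
```

    The real analysis applies this principle simultaneously to mass, moisture, and static energy budgets, but the "smallest possible adjustment subject to a budget constraint" structure is the same.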

  13. Operational fitness of box truss antennas in response to dynamic slewing

    NASA Technical Reports Server (NTRS)

    Bachtell, E. E.; Bettadapur, S. S.; Schartel, W. A.; Karanian, L. A.

    1985-01-01

    A parametric study was performed to define slewing capability of large satellites along with associated system changes or subsystem weight and complexity impacts. The satellite configuration and structural arrangement from the Earth Observation Spacecraft (EOS) study was used as the baseline spacecraft. Varying slew rates, settling times, damping, maneuver frequencies, and attitude hold times provided the data required to establish applicability to a wide range of potential missions. The key elements of the study are: (1) determine the dynamic transient response of the antenna system; (2) calculate the system errors produced by the dynamic response; (3) determine if the antenna has exceeded operational requirements at completion of the slew, and if so; (4) determine when the antenna has settled to the operational requirements. The slew event is not considered complete until the antenna is within operational limits.

  14. Proximity operations concept design study, task 6

    NASA Technical Reports Server (NTRS)

    Williams, A. N.

    1990-01-01

    The feasibility of using optical technology to perform the mission of the proximity operations communications subsystem on Space Station Freedom was determined. Proximity operations mission requirements are determined and the relationship to the overall operational environment of the space station is defined. From this information, the design requirements of the communication subsystem are derived. Based on these requirements, a preliminary design is developed and the feasibility of implementation determined. To support the Orbital Maneuvering Vehicle and National Space Transportation System, the optical system development is straightforward. The requirements on extra-vehicular activity are such as to allow large fields of uncertainty, thus exacerbating the acquisition problem; however, an approach is given that could mitigate this problem. In general, it is found that such a system could indeed perform the proximity operations mission requirement, with some development required to support extra-vehicular activity.

  15. Developing a NASA strategy for the verification of large space telescope observatories

    NASA Astrophysics Data System (ADS)

    Crooke, Julie A.; Gunderson, Johanna A.; Hagopian, John G.; Levine, Marie

    2006-06-01

    In July 2005, the Office of Program Analysis and Evaluation (PA&E) at NASA Headquarters was directed to develop a strategy for verification of the performance of large space telescope observatories, which occurs predominantly in a thermal vacuum test facility. A mission model of the expected astronomical observatory missions over the next 20 years was identified along with performance, facility and resource requirements. Ground testing versus alternatives was analyzed to determine the pros, cons and break points in the verification process. Existing facilities and their capabilities were examined across NASA, industry and other government agencies as well as the future demand for these facilities across NASA's Mission Directorates. Options were developed to meet the full suite of mission verification requirements, and performance, cost, risk and other analyses were performed. Findings and recommendations from the study were presented to the NASA Administrator and the NASA Strategic Management Council (SMC) in February 2006. This paper details the analysis, results, and findings from this study.

  16. Implementation and value of using a split-plot reader design in a study of digital breast tomosynthesis in a breast cancer assessment clinic

    NASA Astrophysics Data System (ADS)

    Mall, Suneeta; Brennan, Patrick C.; Mello-Thoms, Claudia

    2015-03-01

    The rapid evolution in medical imaging has led to an increased number of recurrent trials, primarily to ensure that the efficacy of new imaging techniques is known. The cost associated with time and resources in conducting such trials is usually high. The recruitment of participants, in a medium to large reader study, is often very challenging, as the demanding number of cases discourages involvement with the trial. We aim to evaluate the efficacy of Digital Breast Tomosynthesis (DBT) in a recall assessment clinic in Australia in a prospective multi-reader-multi-case (MRMC) trial. Conducting such a study with the more commonly used fully crossed MRMC design would require more cases and more cases read per reader, which was not viable in our setting. With an aim to perform a cost-effective yet statistically efficient clinical trial, we evaluated alternative study designs, particularly the split-plot MRMC design, and compared and contrasted it with the more commonly used fully crossed MRMC design. Our results suggest that the alternative split-plot MRMC design could be very beneficial for medium to large clinical trials, and that the cost associated with conducting such trials can be greatly reduced without adversely affecting the variance of the study. We have also noted an inverse dependency between the number of readers and the number of cases required to achieve a target variance. This suggests that split-plot designs could also be very beneficial for studies that focus on cases that are hard to procure or readers that are hard to recruit. We believe that our results may be relevant to other researchers seeking to design medium to large clinical trials.
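    The inverse reader/case dependency can be illustrated with a toy variance model. The decomposition var = a/R + b/C for R readers and C cases, and the coefficient values below, are assumptions chosen for illustration only, not the study's estimates.

```python
# Toy model: study variance shrinks with more readers (a/R term) and with
# more cases (b/C term). For a fixed target variance, fewer readers must be
# compensated by more cases, and vice versa.

def cases_needed(target_var, readers, a=0.04, b=2.0):
    """Solve a/R + b/C = target_var for C (coefficients are illustrative)."""
    remaining = target_var - a / readers
    if remaining <= 0:
        raise ValueError("target unreachable with this many readers")
    return b / remaining

for r in (5, 10, 20):
    print(r, round(cases_needed(0.02, r)))   # required cases fall as readers rise
```

    Under these assumed coefficients, doubling the reader panel from 5 to 10 cuts the required caseload from about 167 to 125, which is the tradeoff the abstract describes.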

  17. A Pipeline for Large Data Processing Using Regular Sampling for Unstructured Grids

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Berres, Anne Sabine; Adhinarayanan, Vignesh; Turton, Terece

    2017-05-12

    Large simulation data requires a lot of time and computational resources to compute, store, analyze, visualize, and run user studies on. Today, the largest cost of a supercomputer is not hardware but maintenance, in particular energy consumption. Our goal is to balance energy consumption and the cognitive value of visualizations of the resulting data. This requires us to go through the entire processing pipeline, from simulation to user studies. To reduce the amount of resources, data can be sampled or compressed. While this adds more computation time, the computational overhead is negligible compared to the simulation time. We built a processing pipeline using regular sampling as an example. The reasons for this choice are two-fold: using a simple example reduces unnecessary complexity, as we know what to expect from the results, and it provides a good baseline for future, more elaborate sampling methods. We measured time and energy for each test we ran, and we conducted user studies on Amazon Mechanical Turk (AMT) for a range of different results we produced through sampling.
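    A minimal sketch of regular sampling in this spirit: scattered (x, y, value) samples from an unstructured grid are reduced to cell means on a small regular grid. The function name, binning rule, and resolution are assumptions for illustration, not the authors' pipeline.

```python
# Reduce scattered samples to a regular nx-by-ny grid of per-cell means.
# Cells with no samples are left as None.

def regular_sample(points, nx, ny, xmin, xmax, ymin, ymax):
    """points: iterable of (x, y, value). Returns ny rows of nx cell means."""
    sums = [[0.0] * nx for _ in range(ny)]
    counts = [[0] * nx for _ in range(ny)]
    dx = (xmax - xmin) / nx
    dy = (ymax - ymin) / ny
    for x, y, v in points:
        i = min(int((x - xmin) / dx), nx - 1)   # clamp points on the far edge
        j = min(int((y - ymin) / dy), ny - 1)
        sums[j][i] += v
        counts[j][i] += 1
    return [[sums[j][i] / counts[j][i] if counts[j][i] else None
             for i in range(nx)] for j in range(ny)]

pts = [(0.1, 0.1, 1.0), (0.2, 0.3, 3.0), (0.9, 0.9, 5.0)]
grid = regular_sample(pts, 2, 2, 0.0, 1.0, 0.0, 1.0)
print(grid)   # [[2.0, None], [None, 5.0]]
```

    The reduced grid is far cheaper to store and render than the original unstructured data, which is the resource/fidelity tradeoff the pipeline measures.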

  18. Young workers in the construction industry and initial OSH-training when entering work life.

    PubMed

    Holte, Kari Anne; Kjestveit, Kari

    2012-01-01

    Studies have found that young workers are at risk for injuries. The risk for accidents is high within construction, indicating that young workers may be especially vulnerable in this industry. In Norway, it is possible to enter the construction industry as a full-time worker at the age of 18. The aim of this paper was to explore how young construction workers are received at their workplace with regard to OHS training. The study was designed as a qualitative case study. Each case consisted of a young worker or apprentice (< 25 years), a colleague, the immediate superior, the OHS manager, and a safety representative in the company. The interviews were recorded and analyzed through content analysis. The results showed that there were differences between large and small companies: large companies had more formalized routines and systems for receiving and training young workers. These routines, however, depended more on requirements set by legislators and contractors than on company size, since the legislation imposes different requirements that affect OHS.

  19. Human Mars EDL Pathfinder Study: Assessment of Technology Development Gaps and Mitigations

    NASA Technical Reports Server (NTRS)

    Lillard, Randolph; Olejniczak, Joe; Polsgrove, Tara; Cianciolo, Alice Dwyer; Munk, Michelle; Whetsel, Charles; Drake, Bret

    2017-01-01

    This paper presents the results of a NASA-initiated Agency-wide assessment to better characterize the risks, and potential mitigation approaches, associated with landing human-class Entry, Descent, and Landing (EDL) systems on Mars. Due to the criticality and long-lead nature of advancing EDL techniques, it is necessary to determine an appropriate strategy to improve the capability to land large payloads. A key focus of this study was to understand the principal EDL risks and to determine what "must" be tested at Mars. This process identified the various risks and potential risk mitigation strategies, along with the key near-term technology development efforts required and the environments in which those technology demonstrations were best suited. The study identified key risks along with advantages of each entry technology. In addition, it found that, provided the EDL concept of operations (con ops) minimizes large-scale transition events, there is no technology requirement for a Mars precursor demonstration. Instead, NASA should take a direct path to a human-scale lander.

  20. Evaluating markers for the early detection of cancer: overview of study designs and methods.

    PubMed

    Baker, Stuart G; Kramer, Barnett S; McIntosh, Martin; Patterson, Blossom H; Shyr, Yu; Skates, Steven

    2006-01-01

    The field of cancer biomarker development has been evolving rapidly. New developments in both the biologic and statistical realms are providing increasing opportunities for the evaluation of markers for both early detection and diagnosis of cancer. Here we review the major conceptual and methodological issues in cancer biomarker evaluation, with an emphasis on recent developments in statistical methods, together with practical recommendations. We organized this review by type of study: preliminary performance, retrospective performance, prospective performance, and cancer screening evaluation. For each type of study, we discuss methodologic issues, provide examples, and discuss strengths and limitations. Preliminary performance studies are useful for quickly winnowing down the number of candidate markers; however, their results may not apply to the ultimate target population, asymptomatic subjects. If stored specimens from cohort studies with clinical cancer endpoints are available, retrospective studies provide a quick and valid way to evaluate performance of the markers, or changes in the markers, prior to the onset of clinical symptoms. Prospective studies have a restricted role because they require large sample sizes, and, if the endpoint is cancer on biopsy, there may be bias due to overdiagnosis. Cancer screening studies require very large sample sizes and long follow-up, but are necessary for evaluating the marker as a trigger of early intervention.

  1. Using a novel flood prediction model and GIS automation to measure the valley and channel morphology of large river networks

    EPA Science Inventory

    Traditional methods for measuring river valley and channel morphology require intensive ground-based surveys which are often expensive, time consuming, and logistically difficult to implement. The number of surveys required to assess the hydrogeomorphic structure of large river n...

  2. Development of an interactive data base management system for capturing large volumes of data.

    PubMed

    Moritz, T E; Ellis, N K; VillaNueva, C B; Steeger, J E; Ludwig, S T; Deegan, N I; Shroyer, A L; Henderson, W G; Sethi, G K; Grover, F L

    1995-10-01

    Accurate collection and successful management of data are problems common to all scientific studies. For studies in which large quantities of data are collected by means of questionnaires and/or forms, data base management becomes quite laborious and time consuming. Data base management comprises data collection, data entry, data editing, and data base maintenance. In this article, the authors describe the development of an interactive data base management (IDM) system for the collection of more than 1,400 variables from a targeted population of 6,000 patients undergoing heart surgery requiring cardiopulmonary bypass. The goals of the IDM system are to increase the accuracy and efficiency with which this large amount of data is collected and processed, to reduce research nurse work load through automation of certain administrative and clerical activities, and to improve the process for implementing a uniform study protocol, standardized forms, and definitions across sites.

  3. Monitoring and control requirement definition study for Dispersed Storage and Generation (DSG), volume 1

    NASA Technical Reports Server (NTRS)

    1980-01-01

    Twenty-four functional requirements were prepared under six categories and serve to indicate how to integrate dispersed storage and generation (DSG) systems with the distribution and other portions of the electric utility system. Results indicate that there are no fundamental technical obstacles to prevent the connection of dispersed storage and generation to the distribution system. However, a communication system of some sophistication is required to integrate the distribution system and the dispersed generation sources for effective control. The large span of generator sizes, from 10 kW to 30 MW, means that a variety of remote monitoring and control approaches may be required. Increased effort is required to develop demonstration equipment to perform the DSG monitoring and control functions and to acquire experience with this equipment in the utility distribution environment.

  4. Large-scale dynamo action precedes turbulence in shearing box simulations of the magnetorotational instability

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bhat, Pallavi; Ebrahimi, Fatima; Blackman, Eric G.

    2016-07-06

    Here, we study the dynamo generation (exponential growth) of large-scale (planar averaged) fields in unstratified shearing box simulations of the magnetorotational instability (MRI). In contrast to previous studies restricted to horizontal (x–y) averaging, we also demonstrate the presence of large-scale fields when vertical (y–z) averaging is employed instead. By computing space–time planar averaged fields and power spectra, we find large-scale dynamo action in the early MRI growth phase – a previously unidentified feature. Non-axisymmetric linear MRI modes with low horizontal wavenumbers and vertical wavenumbers near that of expected maximal growth, amplify the large-scale fields exponentially before turbulence and high wavenumber fluctuations arise. Thus the large-scale dynamo requires only linear fluctuations but not non-linear turbulence (as defined by mode–mode coupling). Vertical averaging also allows for monitoring the evolution of the large-scale vertical field and we find that a feedback from horizontal low wavenumber MRI modes provides a clue as to why the large-scale vertical field sustains against turbulent diffusion in the non-linear saturation regime. We compute the terms in the mean field equations to identify the individual contributions to large-scale field growth for both types of averaging. The large-scale fields obtained from vertical averaging are found to compare well with global simulations and quasi-linear analytical analysis from a previous study by Ebrahimi & Blackman. We discuss the potential implications of these new results for understanding the large-scale MRI dynamo saturation and turbulence.
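    The two planar-averaging choices contrasted above can be sketched as follows. This is an illustrative toy (the function names and the tiny 2x2x2 grid are assumptions, not the paper's code): given a field B[z][y][x], horizontal (x-y) averaging leaves a mean profile in z, while vertical (y-z) averaging leaves a mean profile in x.

```python
def xy_average(B):
    """Average over x and y; returns the mean field as a function of z."""
    return [sum(sum(row) for row in plane) / (len(plane) * len(plane[0]))
            for plane in B]

def yz_average(B):
    """Average over y and z; returns the mean field as a function of x."""
    nz, ny, nx = len(B), len(B[0]), len(B[0][0])
    return [sum(B[k][j][i] for k in range(nz) for j in range(ny)) / (nz * ny)
            for i in range(nx)]

B = [[[1.0, 2.0], [3.0, 4.0]],    # z = 0 plane, rows indexed by y
     [[5.0, 6.0], [7.0, 8.0]]]    # z = 1 plane
print(xy_average(B))   # [2.5, 6.5]
print(yz_average(B))   # [4.0, 5.0]
```

    A field that averages to zero one way can have a nonzero mean the other way, which is why the choice of averaging plane determines which large-scale field component is visible.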

  6. [Bioequivalence of dermatological topical medicines:the Brazilian scenario and the challenges for health surveillance].

    PubMed

    Soares, Kelen Carine Costa; Moraes, Marcelo Vogler; Gelfuso, Guilherme Martins; Gratieri, Taís

    2015-11-01

    The comparative evaluation required for the registration of generic topical medicines in Brazil is conducted by means of a pharmaceutical equivalence study, which merely assesses the physical/chemical and microbiological parameters of the formulations. At the international level, clinical or pharmacodynamic studies are now being required to prove the efficacy and safety of semisolid topical generic formulations. This work compares the registration requirements for topical formulations across the various regulatory authorities and presents a survey of topical medicines registered in Brazil prior to 2013. The survey revealed that many more copies of these formulations were registered in Brazil than in the USA. This fact, together with the large number of studies in the literature showing a lack of bioequivalence among topical medications, clearly demonstrates the need to realign Brazilian legislation with respect to the technical requirements for the registration of generic and similar medications for dermatological topical application in Brazil.

  7. [Challenges to implementation of the ECG reading center in ELSA-Brasil].

    PubMed

    Ribeiro, Antonio Luiz; Pereira, Samuel Vianney da Cunha; Bergmann, Kaiser; Ladeira, Roberto Marini; Oliveira, Rackel Aguiar Mendes; Lotufo, Paulo A; Mill, José Geraldo; Barreto, Sandhi Maria

    2013-06-01

    Electrocardiography is an established low-cost method of cardiovascular assessment, utilized for decades in large epidemiological studies. Nonetheless, its use in large epidemiological studies presents challenges, especially when seeking to develop a reading center. This article describes the process, difficulties, and challenges of implementing an electrocardiogram reading center in the Brazilian Longitudinal Study of Adult Health (ELSA-Brasil). Among the issues discussed, we have emphasized: the criteria for selection of the electrocardiography machine and the central system for storage and management of the machines; the required personnel; the procedures for acquisition and transmission of electrocardiograms to the Reading Center; coding systems, with emphasis on the Minnesota code; ethical and practical issues regarding the delivery of reports to study participants; and aspects related to quality control.

  8. Efficiency and economics of large scale hydrogen liquefaction. [for future generation aircraft requirements

    NASA Technical Reports Server (NTRS)

    Baker, C. R.

    1975-01-01

    Liquid hydrogen is being considered as a substitute for conventional hydrocarbon-based fuels for future generations of commercial jet aircraft. Its acceptance will depend, in part, upon the technology and cost of liquefaction. The process and economic requirements for providing a sufficient quantity of liquid hydrogen to service a major airport are described. The design is supported by thermodynamic studies which determine the effect of process arrangement and operating parameters on the process efficiency and work of liquefaction.

  9. Technology Requirements for Information Management

    NASA Technical Reports Server (NTRS)

    Graves, Sara; Knoblock, Craig A.; Lannom, Larry

    2002-01-01

    This report provides the results of a panel study conducted into the technology requirements for information management in support of application domains of particular government interest, including digital libraries, mission operations, and scientific research. The panel concluded that it was desirable to have a coordinated program of R&D that pursues a science of information management focused on an environment typified by applications of government interest - highly distributed with very large amounts of data and a high degree of heterogeneity of sources, data, and users.

  10. FFTFIL; a filtering program based on two-dimensional Fourier analysis of geophysical data

    USGS Publications Warehouse

    Hildenbrand, T.G.

    1983-01-01

    The filtering program 'fftfil' performs a variety of operations commonly required in geophysical studies of gravity, magnetic, and terrain data. Filtering operations are carried out in the wave number domain, where the Fourier coefficients of the input data are multiplied by the response of the selected filter. Input grids can be large (2 ≤ number of rows or columns ≤ 1024) and are not required to have numbers of rows and columns equal to powers of two.
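    The wavenumber-domain operation described above (transform the data, multiply each Fourier coefficient by the filter response, invert) can be illustrated in one dimension with a pure-Python DFT; fftfil itself operates on 2-D grids, and this sketch is not the program's code.

```python
import cmath

def dft(x):
    """Naive discrete Fourier transform (O(n^2); fine for illustration)."""
    n = len(x)
    return [sum(x[t] * cmath.exp(-2j * cmath.pi * k * t / n) for t in range(n))
            for k in range(n)]

def idft(X):
    """Inverse DFT with the 1/n normalization."""
    n = len(X)
    return [sum(X[k] * cmath.exp(2j * cmath.pi * k * t / n) for k in range(n)) / n
            for t in range(n)]

# A mean of 1.5 plus a ripple at the Nyquist wavenumber.
signal = [1.0, 2.0, 1.0, 2.0, 1.0, 2.0, 1.0, 2.0]
X = dft(signal)
# Low-pass response: keep only wavenumber 0 (the mean), zero everything else.
response = [1.0 if k == 0 else 0.0 for k in range(len(X))]
filtered = [x.real for x in idft([Xk * r for Xk, r in zip(X, response)])]
print(filtered)   # every value is ~1.5: the short-wavelength ripple is gone
```

    Replacing the all-or-nothing response with a tapered one gives the smoother band-pass and derivative filters typical of gravity and magnetic processing; the multiply-in-the-transform-domain structure is unchanged.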

  11. ms_lims, a simple yet powerful open source laboratory information management system for MS-driven proteomics.

    PubMed

    Helsens, Kenny; Colaert, Niklaas; Barsnes, Harald; Muth, Thilo; Flikka, Kristian; Staes, An; Timmerman, Evy; Wortelkamp, Steffi; Sickmann, Albert; Vandekerckhove, Joël; Gevaert, Kris; Martens, Lennart

    2010-03-01

    MS-based proteomics produces large amounts of mass spectra that require processing, identification and possibly quantification before interpretation can be undertaken. High-throughput studies require automation of these various steps, and management of the data in association with the results obtained. We here present ms_lims (http://genesis.UGent.be/ms_lims), a freely available, open-source system based on a central database to automate data management and processing in MS-driven proteomics analyses.

  12. Tropomodulin 1 Regulation of Actin Is Required for the Formation of Large Paddle Protrusions Between Mature Lens Fiber Cells

    PubMed Central

    Cheng, Catherine; Nowak, Roberta B.; Biswas, Sondip K.; Lo, Woo-Kuen; FitzGerald, Paul G.; Fowler, Velia M.

    2016-01-01

    Purpose To elucidate the proteins required for specialized small interlocking protrusions and large paddle domains at lens fiber cell tricellular junctions (vertices), we developed a novel method to immunostain single lens fibers and studied changes in cell morphology due to loss of tropomodulin 1 (Tmod1), an F-actin pointed end–capping protein. Methods We investigated F-actin and F-actin–binding protein localization in interdigitations of Tmod1+/+ and Tmod1−/− single mature lens fibers. Results F-actin–rich small protrusions and large paddles were present along cell vertices of Tmod1+/+ mature fibers. In contrast, Tmod1−/− mature fiber cells lack normal paddle domains, while small protrusions were unaffected. In Tmod1+/+ mature fibers, Tmod1, β2-spectrin, and α-actinin are localized in large puncta in valleys between paddles; but in Tmod1−/− mature fibers, β2-spectrin was dispersed while α-actinin was redistributed at the base of small protrusions and rudimentary paddles. Fimbrin and Arp3 (actin-related protein 3) were located in puncta at the base of small protrusions, while N-cadherin and ezrin outlined the cell membrane in both Tmod1+/+ and Tmod1−/− mature fibers. Conclusions These results suggest that distinct F-actin organizations are present in small protrusions versus large paddles. Formation and/or maintenance of large paddle domains depends on a β2-spectrin–actin network stabilized by Tmod1. α-Actinin–crosslinked F-actin bundles are enhanced in absence of Tmod1, indicating altered cytoskeleton organization. Formation of small protrusions is likely facilitated by Arp3-branched and fimbrin-bundled F-actin networks, which do not depend on Tmod1. This is the first work to reveal the F-actin–associated proteins required for the formation of paddles between lens fibers. PMID:27537257

  13. Conserving large-river fishes: Is the highway analogy an appropriate paradigm?

    USGS Publications Warehouse

    Galat, D.L.; Zweimuller, I.

    2001-01-01

    A tenet of the flood pulse concept, the highway analogy, states that the main channel of large floodplain rivers is used by fishes mainly as a route for gaining access to floodplain habitats. We examined this proposition by analyzing habitat use for freshwater fishes in 4 large rivers in the United States (Colorado, Columbia, Mississippi, Missouri) and 4 in Europe (Danube, Rhine, Rhône, Volga). Fish species from floodplain segments of each river were classified as fluvial specialist, fluvial dependent, or macrohabitat generalist based on literature and expert opinion. We also summarized the proportion of imperiled and introduced fishes present in each of these categories. The high proportion (mean ± 1 SD = 29 ± 17.5%) of fluvial specialist fishes inhabiting north-temperate large rivers was inconsistent with the highway analogy. Most members of the families Petromyzontidae, Acipenseridae, Hiodontidae, Osmeridae, Salmonidae, and Gobiidae require flowing water during some life stage. Between 29 and 100% of the native fish assemblage was of conservation concern, and from 50 to 85% of these fishes required riverine habitats to complete their life cycles. Macrohabitat generalists are adapted to capitalize on floodplain habitats and composed from 44 to 96% of introduced fishes in the rivers studied. Habitat diversity inherent in main-channel complexes of unaltered large rivers and reestablished in regulated large rivers is essential to meet life-history needs of native fluvial fishes while discouraging expansion of introduced species. Restoration of north-temperate large rivers and their native fish fauna should incorporate the dynamic interplay among main channel, floodplain, and tributary habitats and processes.

  14. Electrotherapy for the treatment of painful diabetic peripheral neuropathy: a review.

    PubMed

    Pieber, Karin; Herceg, Malvina; Paternostro-Sluga, Tatjana

    2010-04-01

    To review different types of electrotherapy for the treatment of painful diabetic peripheral neuropathy. A structured search of the electronic database MEDLINE was performed from the time of its initiation to July 2009. Articles in English and German were selected. The efficacy of different types of electrotherapy for painful diabetic peripheral neuropathy has been evaluated in 15 studies; the effects of transcutaneous electrical nerve stimulation are consistent. The beneficial effects of prolonged use have been reported in three large studies and one small study. The effects of frequency-modulated electromagnetic neural stimulation were assessed in one large study, and a significant reduction in pain was reported. Treatment with pulsed and static electromagnetic fields has been investigated in two small and three large studies, and analgesic benefits have been reported. In one large study focusing on pulsed electromagnetic fields, no beneficial effect on pain was registered. Only small studies were found concerning other types of electrotherapy, such as pulsed-dose electrical stimulation, high-frequency external muscle stimulation or high-tone external muscle stimulation. The conclusions drawn in these articles are diverse. Shortcomings and problems, including a poor study design, were observed in some. Further randomized, double-blind, placebo-controlled studies comprising larger sample sizes, a longer duration of treatment, and longer follow-up assessments are required.

  15. Multibody dynamic simulation of knee contact mechanics

    PubMed Central

    Bei, Yanhong; Fregly, Benjamin J.

    2006-01-01

    Multibody dynamic musculoskeletal models capable of predicting muscle forces and joint contact pressures simultaneously would be valuable for studying clinical issues related to knee joint degeneration and restoration. Current three-dimensional multi-body knee models are either quasi-static with deformable contact or dynamic with rigid contact. This study proposes a computationally efficient methodology for combining multibody dynamic simulation methods with a deformable contact knee model. The methodology requires preparation of the articular surface geometry, development of efficient methods to calculate distances between contact surfaces, implementation of an efficient contact solver that accounts for the unique characteristics of human joints, and specification of an application programming interface for integration with any multibody dynamic simulation environment. The current implementation accommodates natural or artificial tibiofemoral joint models, small or large strain contact models, and linear or nonlinear material models. Applications are presented for static analysis (via dynamic simulation) of a natural knee model created from MRI and CT data and dynamic simulation of an artificial knee model produced from manufacturer’s CAD data. Small and large strain natural knee static analyses required 1 min of CPU time and predicted similar contact conditions except for peak pressure, which was higher for the large strain model. Linear and nonlinear artificial knee dynamic simulations required 10 min of CPU time and predicted similar contact force and torque but different contact pressures, which were lower for the nonlinear model due to increased contact area. This methodology provides an important step toward the realization of dynamic musculoskeletal models that can predict in vivo knee joint motion and loading simultaneously. PMID:15564115
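
    Deformable contact models of the kind described above are often built on an elastic-foundation ("bed of springs") formulation, in which pressure at each contact element depends only on the local surface overlap. A minimal sketch, using illustrative cartilage-like values for modulus, Poisson ratio, and layer thickness (assumptions, not the study's parameters):

```python
def foundation_pressure(d, E=5.0e6, nu=0.45, h=4.0e-3):
    """Linear elastic-foundation contact pressure [Pa] for surface overlap d [m].

    E (elastic modulus, Pa), nu (Poisson ratio), and h (layer thickness, m)
    are illustrative cartilage-like values, not parameters from the study.
    """
    if d <= 0.0:  # surfaces separated: no contact pressure
        return 0.0
    k = (1.0 - nu) * E / ((1.0 + nu) * (1.0 - 2.0 * nu) * h)
    return k * d

p = foundation_pressure(0.2e-3)  # 0.2 mm overlap: pressure on the order of 1 MPa
```

A nonlinear material model would replace the linear term `k * d` with a strain-dependent expression, spreading load over a larger contact area.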

  16. An improved method for bivariate meta-analysis when within-study correlations are unknown.

    PubMed

    Hong, Chuan; D Riley, Richard; Chen, Yong

    2018-03-01

    Multivariate meta-analysis, which jointly analyzes multiple and possibly correlated outcomes in a single analysis, is becoming increasingly popular in recent years. An attractive feature of the multivariate meta-analysis is its ability to account for the dependence between multiple estimates from the same study. However, standard inference procedures for multivariate meta-analysis require the knowledge of within-study correlations, which are usually unavailable. This limits standard inference approaches in practice. Riley et al proposed a working model and an overall synthesis correlation parameter to account for the marginal correlation between outcomes, where the only data needed are those required for a separate univariate random-effects meta-analysis. As within-study correlations are not required, the Riley method is applicable to a wide variety of evidence synthesis situations. However, the standard variance estimator of the Riley method is not entirely correct under many important settings. As a consequence, the coverage of a function of pooled estimates may not reach the nominal level even when the number of studies in the multivariate meta-analysis is large. In this paper, we improve the Riley method by proposing a robust variance estimator, which is asymptotically correct even when the model is misspecified (ie, when the likelihood function is incorrect). Simulation studies of a bivariate meta-analysis, in a variety of settings, show a function of pooled estimates has improved performance when using the proposed robust variance estimator. In terms of individual pooled estimates themselves, the standard variance estimator and robust variance estimator give similar results to the original method, with appropriate coverage. The proposed robust variance estimator performs well when the number of studies is relatively large. Therefore, we recommend the use of the robust method for meta-analyses with a relatively large number of studies (eg, m≥50). When the sample size is relatively small, we recommend the use of the robust method under the working independence assumption. We illustrate the proposed method through 2 meta-analyses. Copyright © 2017 John Wiley & Sons, Ltd.
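
    The univariate random-effects synthesis that the Riley approach builds on can be sketched with the standard DerSimonian-Laird estimator; the effect sizes and variances below are made-up illustration data, and this is the univariate building block, not the Riley method itself:

```python
import math

def dersimonian_laird(effects, variances):
    """Pool per-study effect sizes with a DerSimonian-Laird random-effects model."""
    k = len(effects)
    w = [1.0 / v for v in variances]                    # inverse-variance weights
    sw = sum(w)
    fixed = sum(wi * yi for wi, yi in zip(w, effects)) / sw
    # Cochran's Q measures between-study heterogeneity around the fixed-effect mean
    q = sum(wi * (yi - fixed) ** 2 for wi, yi in zip(w, effects))
    c = sw - sum(wi * wi for wi in w) / sw
    tau2 = max(0.0, (q - (k - 1)) / c)                  # between-study variance
    w_star = [1.0 / (v + tau2) for v in variances]      # random-effects weights
    pooled = sum(wi * yi for wi, yi in zip(w_star, effects)) / sum(w_star)
    se = math.sqrt(1.0 / sum(w_star))
    return pooled, se, tau2

# three hypothetical studies with identical within-study variance
pooled, se, tau2 = dersimonian_laird([0.1, 0.5, 0.9], [0.02, 0.02, 0.02])
```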

  17. Requirements for clinical information modelling tools.

    PubMed

    Moreno-Conde, Alberto; Jódar-Sánchez, Francisco; Kalra, Dipak

    2015-07-01

    This study proposes consensus requirements for clinical information modelling tools that can support modelling tasks in medium/large scale institutions. Rather than identify which functionalities are currently available in existing tools, the study has focused on functionalities that should be covered in order to provide guidance about how to evolve the existing tools. After identifying a set of 56 requirements for clinical information modelling tools based on a literature review and interviews with experts, a classical Delphi study methodology was applied to conduct a two round survey in order to classify them as essential or recommended. Essential requirements are those that must be met by any tool that claims to be suitable for clinical information modelling, and if we one day have a certified tools list, any tool that does not meet essential criteria would be excluded. Recommended requirements are those more advanced requirements that may be met by tools offering a superior product or only needed in certain modelling situations. According to the answers provided by 57 experts from 14 different countries, we found a high level of agreement to enable the study to identify 20 essential and 21 recommended requirements for these tools. It is expected that this list of identified requirements will guide developers on the inclusion of new basic and advanced functionalities that have strong support by end users. This list could also guide regulators in order to identify requirements that could be demanded of tools adopted within their institutions. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  18. Design, development and fabrication of a deployable/retractable truss beam model for large space structures application

    NASA Technical Reports Server (NTRS)

    Adams, Louis R.

    1987-01-01

    The design requirements for a truss beam model are reviewed. The concept behind the beam is described. Pertinent analysis and studies concerning beam definition, deployment loading, joint compliance, etc. are given. Design, fabrication and assembly procedures are discussed.

  19. CHURCHILL COUNTY, NEVADA ARSENIC STUDY: WATER CONSUMPTION AND EXPOSURE BIOMARKERS

    EPA Science Inventory

    The US Environmental Protection Agency is required to reevaluate the Maximum Contaminant Level (MCL) for arsenic in 2006. To provide data for reducing uncertainties in assessing health risks associated with exposure to low levels (<200 µg/L) of arsenic, a large scale biomarker st...

  20. Molybdenum Accumulation in Marine Sediments as an Indicator of Hypoxic Water Conditions (NACAETAC)

    EPA Science Inventory

    Direct monitoring of hypoxic water column conditions over large spatial and temporal extents is difficult due to the substantial logistical and financial investment required. Recent studies have indicated that concentrations of molybdenum (Mo) in marine sediments may serve as a u...

  1. Satellite Systems for Instructional Radio.

    ERIC Educational Resources Information Center

    Jamison, Dean; And Others

    Recent studies suggest that Educational Television (ETV) broadcast from geostationary satellites can markedly reduce the cost and time required to provide educational opportunity for the citizens of large, less-developed countries. The sheer volume of educational needs precludes, however, the possibility of satisfying very many of them with only a…

  2. Quality Assurance Project Plan - Modeling the Impact of Hydraulic Fracturing on Water Resources Based on Water Acquisition Scenarios

    EPA Pesticide Factsheets

    This planning document describes the quality assurance/quality control activities and technical requirements that will be used during the research study. The goal of this project is to evaluate the potential impacts of large volume water withdrawals.

  3. The Systems Revolution

    ERIC Educational Resources Information Center

    Ackoff, Russell L.

    1974-01-01

    The major organizational and social problems of our time do not lend themselves to the reductionism of traditional analytical and disciplinary approaches. They must be attacked holistically, with a comprehensive systems approach. The effective study of large-scale social systems requires the synthesis of science with the professions that use it.…

  4. EVALUATION OF ANALYTICAL METHODS FOR DETERMINING PESTICIDES IN BABY FOOD AND ADULT DUPLICATE-DIET SAMPLES

    EPA Science Inventory

    Determinations of pesticides in food are often complicated by the presence of fats and require multiple cleanup steps before analysis. Cost-effective analytical methods are needed for conducting large-scale exposure studies. We examined two extraction methods, supercritical flu...

  5. Narrating Neoliberalism: Alternative Education Teachers' Conceptions of Their Changing Roles

    ERIC Educational Resources Information Center

    Golden, Noah Asher

    2018-01-01

    The signifier "alternative" in education has largely shifted from progressive or humanizing pedagogies to deficit framings requiring alternate graduation criteria. This development is part of broader neoliberal educational reform efforts that disrupt longstanding conceptions of teachers' roles. This study serves to investigate long-term…

  6. Identifying Metabolically Active Chemicals Using a Consensus Quantitative Structure Activity Relationship Model for Estrogen Receptor Binding

    EPA Science Inventory

    Traditional toxicity testing provides insight into the mechanisms underlying toxicological responses but requires a high investment in a large number of resources. The new paradigm of testing approaches involves rapid screening studies able to evaluate thousands of chemicals acro...

  7. Initial strides for invent-VTE: Towards global collaboration to accelerate clinical research in venous thromboembolism.

    PubMed

    Rodger, Marc; Langlois, Nicole; Middeldorp, Saskia; Kahn, Susan; Sandset, Per Morten; Brighton, Timothy; Huisman, Menno V; Meyer, Guy; Konstantinides, Stavros; Ageno, Walter; Morange, Pierre; Garcia, David; Kreuziger, Lisa Baumann; Young, Laura; Key, Nigel; Monreal, Manuel; Jiménez, David

    2018-03-01

    Venous thromboembolism (VTE) represents a major global burden of disease and requires collaborative efforts to conduct large, high-quality investigator-initiated and academically sponsored studies addressing the most relevant clinical questions. Owing to increasing regulatory requirements, the highly competitive nature of peer-reviewed funding and costs associated with conducting large, multinational clinical trials, completing practice-changing research constitutes a growing challenge for clinical investigators. As clinical trialists interested in VTE, we founded INVENT (International Network of Venous Thromboembolism Clinical Research Networks) in an effort to promote and accelerate patient-oriented, investigator-initiated, international collaborative research, to identify, prioritize and answer key clinical research questions for patients with VTE. We report on our activities to formalize the INVENT network and our accomplishments in our first year. Copyright © 2018 Elsevier Ltd. All rights reserved.

  8. A Primer on Infectious Disease Bacterial Genomics

    PubMed Central

    Petkau, Aaron; Knox, Natalie; Graham, Morag; Van Domselaar, Gary

    2016-01-01

    The number of large-scale genomics projects is increasing due to the availability of affordable high-throughput sequencing (HTS) technologies. The use of HTS for bacterial infectious disease research is attractive because one whole-genome sequencing (WGS) run can replace multiple assays for bacterial typing, molecular epidemiology investigations, and more in-depth pathogenomic studies. The computational resources and bioinformatics expertise required to accommodate and analyze the large amounts of data pose new challenges for researchers embarking on genomics projects for the first time. Here, we present a comprehensive overview of a bacterial genomics project from beginning to end, with a particular focus on the planning and computational requirements for HTS data, and provide a general understanding of the analytical concepts to develop a workflow that will meet the objectives and goals of HTS projects. PMID:28590251

  9. Effect of steady and time-harmonic magnetic fields on macrosegragation in alloy solidification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Incropera, F.P.; Prescott, P.J.

    Buoyancy-induced convection during the solidification of alloys can contribute significantly to the redistribution of alloy constituents, thereby creating large composition gradients in the final ingot. Termed macrosegregation, the condition diminishes the quality of the casting and, in the extreme, may require that the casting be remelted. The deleterious effects of buoyancy-driven flows may be suppressed through application of an external magnetic field, and in this study the effects of both steady and time-harmonic fields have been considered. For a steady magnetic field, extremely large field strengths would be required to effectively dampen convection patterns that contribute to macrosegregation. However, by reducing spatial variations in temperature and composition, turbulent mixing induced by a time-harmonic field reduces the number and severity of segregates in the final casting.

  10. Optimizing liquid effluent monitoring at a large nuclear complex.

    PubMed

    Chou, Charissa J; Barnett, D Brent; Johnson, Vernon G; Olson, Phil M

    2003-12-01

    Effluent monitoring typically requires a large number of analytes and samples during the initial or startup phase of a facility. Once a baseline is established, the analyte list and sampling frequency may be reduced. Although there is a large body of literature relevant to the initial design, few, if any, published papers exist on updating established effluent monitoring programs. This paper statistically evaluates four years of baseline data to optimize the liquid effluent monitoring efficiency of a centralized waste treatment and disposal facility at a large defense nuclear complex. Specific objectives were to: (1) assess temporal variability in analyte concentrations, (2) determine operational factors contributing to waste stream variability, (3) assess the probability of exceeding permit limits, and (4) streamline the sampling and analysis regime. Results indicated that the probability of exceeding permit limits was one in a million under normal facility operating conditions, sampling frequency could be reduced, and several analytes could be eliminated. Furthermore, indicators such as gross alpha and gross beta measurements could be used in lieu of more expensive specific isotopic analyses (radium, cesium-137, and strontium-90) for routine monitoring. Study results were used by the state regulatory agency to modify monitoring requirements for a new discharge permit, resulting in an annual cost savings of US dollars 223,000. This case study demonstrates that statistical evaluation of effluent contaminant variability coupled with process knowledge can help plant managers and regulators streamline analyte lists and sampling frequencies based on detection history and environmental risk.
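
    The permit-limit exceedance probability described above can be sketched by fitting a log-normal distribution to baseline analyte concentrations, a common assumption for effluent data; the sample values and limit below are hypothetical:

```python
import math
import statistics

def exceedance_probability(samples, limit):
    """P(concentration > limit), assuming concentrations are log-normal."""
    logs = [math.log(x) for x in samples]
    mu = statistics.fmean(logs)      # mean of log-concentrations
    sigma = statistics.stdev(logs)   # sample std. dev. of log-concentrations
    z = (math.log(limit) - mu) / sigma
    # standard-normal upper-tail probability via the complementary error function
    return 0.5 * math.erfc(z / math.sqrt(2.0))

# hypothetical baseline concentrations vs. a permit limit of 10 (same units)
p = exceedance_probability([1.2, 0.9, 1.5, 1.1, 0.8, 1.3], limit=10.0)
```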

  11. Comparison of Multi-Scale Digital Elevation Models for Defining Waterways and Catchments Over Large Areas

    NASA Astrophysics Data System (ADS)

    Harris, B.; McDougall, K.; Barry, M.

    2012-07-01

    Digital Elevation Models (DEMs) allow for the efficient and consistent creation of waterways and catchment boundaries over large areas. Studies of waterway delineation from DEMs are usually undertaken over small or single catchment areas due to the nature of the problems being investigated. Improvements in Geographic Information Systems (GIS) techniques, software, hardware and data allow for analysis of larger data sets and also facilitate a consistent tool for the creation and analysis of waterways over extensive areas. However, rarely are they developed over large regional areas because of the lack of available raw data sets and the amount of work required to create the underlying DEMs. This paper examines definition of waterways and catchments over an area of approximately 25,000 km2 to establish the optimal DEM scale required for waterway delineation over large regional projects. The comparative study analysed multi-scale DEMs over two test areas (Wivenhoe catchment, 543 km2 and a detailed 13 km2 within the Wivenhoe catchment) including various data types, scales, quality, and variable catchment input parameters. Historic and available DEM data was compared to high resolution Lidar based DEMs to assess variations in the formation of stream networks. The results identified that, particularly in areas of high elevation change, DEMs at 20 m cell size created from broad scale 1:25,000 data (combined with more detailed data or manual delineation in flat areas) are adequate for the creation of waterways and catchments at a regional scale.
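
    The standard first steps in DEM-based waterway delineation are D8 flow direction (each cell drains to its steepest-descent neighbour) and flow accumulation (counting upstream cells). A toy-grid sketch of those two steps, not the GIS workflow used in the study:

```python
# D8 neighbour offsets: the 8 cells surrounding (row, col)
NEIGHBOURS = [(-1, -1), (-1, 0), (-1, 1), (0, -1), (0, 1), (1, -1), (1, 0), (1, 1)]

def d8_flow_dir(dem):
    """Map each cell to its steepest-descent neighbour (None for pits/outlets)."""
    rows, cols = len(dem), len(dem[0])
    flow = {}
    for r in range(rows):
        for c in range(cols):
            best, best_drop = None, 0.0
            for dr, dc in NEIGHBOURS:
                nr, nc = r + dr, c + dc
                if 0 <= nr < rows and 0 <= nc < cols:
                    dist = (dr * dr + dc * dc) ** 0.5      # 1 or sqrt(2)
                    drop = (dem[r][c] - dem[nr][nc]) / dist
                    if drop > best_drop:                   # strictly downhill only
                        best, best_drop = (nr, nc), drop
            flow[(r, c)] = best
    return flow

def flow_accumulation(dem):
    """Count of cells draining through each cell, including itself."""
    flow = d8_flow_dir(dem)
    acc = {cell: 1 for cell in flow}
    # visit cells highest-first so every upstream count is final before being passed on
    for cell in sorted(flow, key=lambda rc: -dem[rc[0]][rc[1]]):
        if flow[cell] is not None:
            acc[flow[cell]] += acc[cell]
    return acc

dem = [[9, 8, 7],
       [8, 6, 5],
       [7, 5, 3]]
acc = flow_accumulation(dem)  # the low corner (2, 2) collects all 9 cells
```

Stream networks are then extracted by thresholding the accumulation grid, which is where the DEM cell size compared in the study directly affects the delineated waterways.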

  12. Environmental impact assessment and environmental audit in large-scale public infrastructure construction: the case of the Qinghai-Tibet Railway.

    PubMed

    He, Guizhen; Zhang, Lei; Lu, Yonglong

    2009-09-01

    Large-scale public infrastructure projects have featured in China's modernization course since the early 1980s. During the early stages of China's rapid economic development, public attention focused on the economic and social impact of high-profile construction projects. In recent years, however, we have seen a shift in public concern toward the environmental and ecological effects of such projects, and today governments are required to provide valid environmental impact assessments prior to allowing large-scale construction. The official requirement for the monitoring of environmental conditions has led to an increased number of debates in recent years regarding the effectiveness of Environmental Impact Assessments (EIAs) and Governmental Environmental Audits (GEAs) as environmental safeguards in instances of large-scale construction. Although EIA and GEA are conducted by different institutions and have different goals and enforcement potential, these two practices can be closely related in terms of methodology. This article cites the construction of the Qinghai-Tibet Railway as an instance in which EIA and GEA offer complementary approaches to environmental impact management. This study concludes that the GEA approach can serve as an effective follow-up to the EIA and establishes that the EIA lays a base for conducting future GEAs. The relationship that emerges through a study of the Railway's construction calls for more deliberate institutional arrangements and cooperation if the two practices are to be used in concert to optimal effect.

  13. Mastery learning for health professionals using technology-enhanced simulation: a systematic review and meta-analysis.

    PubMed

    Cook, David A; Brydges, Ryan; Zendejas, Benjamin; Hamstra, Stanley J; Hatala, Rose

    2013-08-01

    Competency-based education requires individualization of instruction. Mastery learning, an instructional approach requiring learners to achieve a defined proficiency before proceeding to the next instructional objective, offers one approach to individualization. The authors sought to summarize the quantitative outcomes of mastery learning simulation-based medical education (SBME) in comparison with no intervention and nonmastery instruction, and to determine what features of mastery SBME make it effective. The authors searched MEDLINE, EMBASE, CINAHL, ERIC, PsycINFO, Scopus, key journals, and previous review bibliographies through May 2011. They included original research in any language evaluating mastery SBME, in comparison with any intervention or no intervention, for practicing and student physicians, nurses, and other health professionals. Working in duplicate, they abstracted information on trainees, instructional design (interactivity, feedback, repetitions, and learning time), study design, and outcomes. They identified 82 studies evaluating mastery SBME. In comparison with no intervention, mastery SBME was associated with large effects on skills (41 studies; effect size [ES] 1.29 [95% confidence interval, 1.08-1.50]) and moderate effects on patient outcomes (11 studies; ES 0.73 [95% CI, 0.36-1.10]). In comparison with nonmastery SBME instruction, mastery learning was associated with large benefit in skills (3 studies; effect size 1.17 [95% CI, 0.29-2.05]) but required more time. Pretraining and additional practice improved outcomes but, again, took longer. Studies exploring enhanced feedback and self-regulated learning in the mastery model showed mixed results. Limited evidence suggests that mastery learning SBME is superior to nonmastery instruction but takes more time.

  14. Algorithm sensitivity analysis and parameter tuning for tissue image segmentation pipelines

    PubMed Central

    Kurç, Tahsin M.; Taveira, Luís F. R.; Melo, Alba C. M. A.; Gao, Yi; Kong, Jun; Saltz, Joel H.

    2017-01-01

    Motivation: Sensitivity analysis and parameter tuning are important processes in large-scale image analysis. They are very costly because the image analysis workflows are required to be executed several times to systematically correlate output variations with parameter changes or to tune parameters. An integrated solution with minimum user interaction that uses effective methodologies and high performance computing is required to scale these studies to large imaging datasets and expensive analysis workflows. Results: The experiments with two segmentation workflows show that the proposed approach can (i) quickly identify and prune parameters that are non-influential; (ii) search a small fraction (about 100 points) of the parameter search space with billions to trillions of points and improve the quality of segmentation results (Dice and Jaccard metrics) by as much as 1.42× compared to the results from the default parameters; (iii) attain good scalability on a high performance cluster with several effective optimizations. Conclusions: Our work demonstrates the feasibility of performing sensitivity analyses, parameter studies and auto-tuning with large datasets. The proposed framework can enable the quantification of error estimations and output variations in image segmentation pipelines. Availability and Implementation: Source code: https://github.com/SBU-BMI/region-templates/. Contact: teodoro@unb.br Supplementary information: Supplementary data are available at Bioinformatics online. PMID:28062445

  15. Algorithm sensitivity analysis and parameter tuning for tissue image segmentation pipelines.

    PubMed

    Teodoro, George; Kurç, Tahsin M; Taveira, Luís F R; Melo, Alba C M A; Gao, Yi; Kong, Jun; Saltz, Joel H

    2017-04-01

    Sensitivity analysis and parameter tuning are important processes in large-scale image analysis. They are very costly because the image analysis workflows are required to be executed several times to systematically correlate output variations with parameter changes or to tune parameters. An integrated solution with minimum user interaction that uses effective methodologies and high performance computing is required to scale these studies to large imaging datasets and expensive analysis workflows. The experiments with two segmentation workflows show that the proposed approach can (i) quickly identify and prune parameters that are non-influential; (ii) search a small fraction (about 100 points) of the parameter search space with billions to trillions of points and improve the quality of segmentation results (Dice and Jaccard metrics) by as much as 1.42× compared to the results from the default parameters; (iii) attain good scalability on a high performance cluster with several effective optimizations. Our work demonstrates the feasibility of performing sensitivity analyses, parameter studies and auto-tuning with large datasets. The proposed framework can enable the quantification of error estimations and output variations in image segmentation pipelines. Source code: https://github.com/SBU-BMI/region-templates/ . teodoro@unb.br. Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press.
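
    The Dice and Jaccard metrics used above to score segmentation quality can be computed directly from two binary masks:

```python
def dice_jaccard(mask_a, mask_b):
    """Dice and Jaccard overlap between two equal-length binary (0/1) masks."""
    inter = sum(a & b for a, b in zip(mask_a, mask_b))  # pixels set in both masks
    total = sum(mask_a) + sum(mask_b)
    union = total - inter
    dice = 2.0 * inter / total if total else 1.0        # both empty: perfect overlap
    jaccard = inter / union if union else 1.0
    return dice, jaccard

d, j = dice_jaccard([1, 1, 1, 0, 0], [0, 1, 1, 1, 0])
# inter = 2, so dice = 4/6 and jaccard = 2/4
```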

  16. Exogenous lysophospholipids with large head groups perturb clathrin-mediated endocytosis.

    PubMed

    Ailte, Ieva; Lingelem, Anne Berit D; Kvalvaag, Audun S; Kavaliauskiene, Simona; Brech, Andreas; Koster, Gerbrand; Dommersnes, Paul G; Bergan, Jonas; Skotland, Tore; Sandvig, Kirsten

    2017-03-01

    In this study, we have investigated how clathrin-dependent endocytosis is affected by exogenously added lysophospholipids (LPLs). Addition of LPLs with large head groups strongly inhibits transferrin (Tf) endocytosis in various cell lines, while LPLs with small head groups do not. Electron and total internal reflection fluorescence microscopy (EM and TIRF) reveal that treatment with lysophosphatidylinositol (LPI) with the fatty acyl group C18:0 leads to reduced numbers of invaginated clathrin-coated pits (CCPs) at the plasma membrane, fewer endocytic events per membrane area and increased lifetime of CCPs. Also, endocytosis of Tf becomes dependent on actin upon LPI treatment. Thus, our results demonstrate that one can regulate the kinetics and properties of clathrin-dependent endocytosis by addition of LPLs in a head group size- and fatty acyl-dependent manner. Furthermore, studies performed with optical tweezers show that less force is required to pull membrane tubules outwards from the plasma membrane when LPI is added to the cells. The results are in agreement with the notion that insertion of LPLs with large head groups creates a positive membrane curvature which might have a negative impact on events that require plasma membrane invagination, while it may facilitate membrane bending toward the cell exterior. © 2017 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  17. Use of tandem circulation wells to measure hydraulic conductivity without groundwater extraction

    NASA Astrophysics Data System (ADS)

    Goltz, Mark N.; Huang, Junqi; Close, Murray E.; Flintoft, Mark J.; Pang, Liping

    2008-09-01

    Conventional methods to measure the hydraulic conductivity of an aquifer on a relatively large scale (10-100 m) require extraction of significant quantities of groundwater. This can be expensive, and otherwise problematic, when investigating a contaminated aquifer. In this study, innovative approaches that make use of tandem circulation wells to measure hydraulic conductivity are proposed. These approaches measure conductivity on a relatively large scale, but do not require extraction of groundwater. Two basic approaches for using circulation wells to measure hydraulic conductivity are presented; one approach is based upon the dipole-flow test method, while the other approach relies on a tracer test to measure the flow of water between two recirculating wells. The approaches are tested in a relatively homogeneous and isotropic artificial aquifer, where the conductivities measured by both approaches are compared to each other and to the previously measured hydraulic conductivity of the aquifer. It was shown that both approaches have the potential to accurately measure horizontal and vertical hydraulic conductivity for a relatively large subsurface volume without the need to pump groundwater to the surface. Future work is recommended to evaluate the ability of these tandem circulation wells to accurately measure hydraulic conductivity when anisotropy and heterogeneity are greater than in the artificial aquifer used for these studies.

  18. Implications of Information Technology for Employment, Skills, and Wages: Findings from Sectoral and Case Study Research. Final Report

    ERIC Educational Resources Information Center

    Handel, Michael J.

    2004-01-01

    This paper reviews evidence from industry-specific and case studies that shed light on the extent to which computers and automation eliminate jobs, raise job skill requirements, and, consequently, contribute to increased wage inequality between less- and more skilled workers. This paper complements a previous review of large-scale econometric…

  19. Novel Concept for Flexible and Resilient Large Power Transformers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Upadhyay, Parag; Englebretson, Steven; Ramanan, V. R. R.

    This feasibility study investigates a flexible and adaptable large power transformer (LPT) design solution which can facilitate long-term replacement in the event of both catastrophic failures and scheduled replacements, thereby increasing grid resilience. The scope of this project has been defined based on an initial system study and identification of the transformer requirements from an overall system load flow perspective.

  20. Task Switching in a Hierarchical Task Structure: Evidence for the Fragility of the Task Repetition Benefit

    ERIC Educational Resources Information Center

    Lien, Mei-Ching; Ruthruff, Eric

    2004-01-01

    This study examined how task switching is affected by hierarchical task organization. Traditional task-switching studies, which use a constant temporal and spatial distance between each task element (defined as a stimulus requiring a response), promote a flat task structure. Using this approach, Experiment 1 revealed a large switch cost of 238 ms.…

  1. 49 CFR 37.185 - Fleet accessibility requirement for OTRB fixed-route systems of large operators.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    Title 49 (Transportation), Vol. 1, revised as of 2010-10-01. Fleet accessibility requirement for OTRB fixed-route systems of large operators. § 37.185; Office of the Secretary of Transportation; Transportation Services for Individuals with Disabilities (ADA); Over-the-Road Buses (OTRBs) § 37…

  2. 16 CFR 1701.3 - Applicability of special packaging requirements to hazardous substances in large size containers.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    Title 16 (Commercial Practices), Vol. 2, revised as of 2014-01-01. Applicability of special packaging requirements to hazardous substances in large size containers. § 1701.3; Consumer Product Safety Commission; Poison Prevention Packaging Act of 1970 Regulations; Statements of Policy…

  3. 16 CFR 1701.3 - Applicability of special packaging requirements to hazardous substances in large size containers.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    Title 16 (Commercial Practices), Vol. 2, revised as of 2010-01-01. Applicability of special packaging requirements to hazardous substances in large size containers. § 1701.3; Consumer Product Safety Commission; Poison Prevention Packaging Act of 1970 Regulations; Statements of Policy…

  4. 16 CFR 1701.3 - Applicability of special packaging requirements to hazardous substances in large size containers.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    Title 16 (Commercial Practices), Vol. 2, revised as of 2011-01-01. Applicability of special packaging requirements to hazardous substances in large size containers. § 1701.3; Consumer Product Safety Commission; Poison Prevention Packaging Act of 1970 Regulations; Statements of Policy…

  5. 16 CFR 1701.3 - Applicability of special packaging requirements to hazardous substances in large size containers.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    Title 16 (Commercial Practices), Vol. 2, revised as of 2012-01-01. Applicability of special packaging requirements to hazardous substances in large size containers. § 1701.3; Consumer Product Safety Commission; Poison Prevention Packaging Act of 1970 Regulations; Statements of Policy…

  6. Program Design for Retrospective Searches on Large Data Bases

    ERIC Educational Resources Information Center

    Thiel, L. H.; Heaps, H. S.

    1972-01-01

    Retrospective search of large data bases requires development of special techniques for automatic compression of data and minimization of the number of input-output operations to the computer files. The computer program should require a relatively small amount of internal memory. This paper describes the structure of such a program. (9 references)…

  7. In-space production of large space systems from extraterrestrial materials: A program implementation model

    NASA Technical Reports Server (NTRS)

    Vontiesenhausen, G. F.

    1977-01-01

    A program implementation model is presented which covers the in-space construction of certain large space systems from extraterrestrial materials. The model includes descriptions of major program elements and subelements and their operational requirements and technology readiness requirements. It provides a structure for future analysis and development.

  8. Size matters: large objects capture attention in visual search.

    PubMed

    Proulx, Michael J

    2010-12-23

    Can objects or events ever capture one's attention in a purely stimulus-driven manner? A recent review of the literature set out the criteria required to find stimulus-driven attentional capture independent of goal-directed influences, and concluded that no published study has satisfied those criteria. Here, visual search experiments assessed whether an irrelevantly large object can capture attention. Capture of attention by this static visual feature was found. The results suggest that a large object can indeed capture attention in a stimulus-driven manner, independent of displaywide features of the task that might encourage a goal-directed bias for large items. It is concluded that these results are either consistent with the stimulus-driven criteria published previously or, alternatively, consistent with a flexible, goal-directed mechanism of saliency detection.

  9. Design of pilot studies to inform the construction of composite outcome measures.

    PubMed

    Edland, Steven D; Ard, M Colin; Li, Weiwei; Jiang, Lingjing

    2017-06-01

    Composite scales have recently been proposed as outcome measures for clinical trials. For example, the Preclinical Alzheimer Cognitive Composite (PACC) is the sum of z-score-normed component measures assessing episodic memory, timed executive function, and global cognition. Alternative methods of calculating composite total scores have been proposed, using the weighted sum of the component measures that maximizes the signal-to-noise ratio of the resulting composite score. Optimal weights can be estimated from pilot data, but it is an open question how large a pilot trial is required to reliably estimate optimal weights. In this manuscript, we describe the calculation of optimal weights, and use large-scale computer simulations to investigate how large a pilot study sample is required to inform that calculation. The simulations are informed by the pattern of decline observed in cognitively normal subjects enrolled in the Alzheimer's Disease Cooperative Study (ADCS) Prevention Instrument cohort study, restricting to n = 75 subjects aged 75 and over with an ApoE ε4 risk allele and therefore likely to have an underlying Alzheimer neurodegenerative process. In the context of secondary prevention trials in Alzheimer's disease, and using the components of the PACC, we found that pilot studies as small as 100 subjects are sufficient to meaningfully inform weighting parameters. Regardless of the pilot study sample size used to inform weights, the optimally weighted PACC consistently outperformed the standard PACC in terms of statistical power to detect treatment effects in a clinical trial. Pilot studies of size 300 produced weights that achieved near-optimal statistical power, and reduced the required sample size relative to the standard PACC by more than half. These simulations suggest that modestly sized pilot studies, comparable to a phase 2 clinical trial, are sufficient to inform the construction of composite outcome measures. 
Although these findings apply only to the PACC in the context of prodromal AD, the observation that weights only have to approximate the optimal weights to achieve near-optimal performance should generalize. Performing a pilot study or phase 2 trial to inform the weighting of proposed composite outcome measures is highly cost-effective. The net effect of more efficient outcome measures is that smaller trials will be required to test novel treatments. Alternatively, second generation trials can use prior clinical trial data to inform weighting, so that greater efficiency can be achieved as we move forward.
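    A standard way to formalize "weights that maximize signal-to-noise" — plausibly the idea behind the optimal weighting described above, though the paper's exact estimator may differ — is w ∝ Σ⁻¹μ, where μ is the mean change vector of the components and Σ their covariance; this choice maximizes w′μ / √(w′Σw). A minimal NumPy sketch:

    ```python
    import numpy as np

    def optimal_composite_weights(mean_change, cov_change):
        """Weights proportional to inv(Sigma) @ mu, which maximize the
        signal-to-noise ratio w'mu / sqrt(w' Sigma w) of the composite."""
        w = np.linalg.solve(cov_change, mean_change)  # solves Sigma w = mu
        return w / np.abs(w).sum()  # normalize for interpretability

    # Hypothetical pilot estimates: with uncorrelated, unit-variance
    # components, the weights simply mirror the mean changes.
    w = optimal_composite_weights(np.array([1.0, 2.0, 1.0]), np.eye(3))
    # w == [0.25, 0.5, 0.25]
    ```

    In practice, μ and Σ would be estimated from the pilot data, which is why pilot sample size matters for how reliably the weights are recovered.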

  10. Accounting for patient size in the optimization of dose and image quality of pelvis cone beam CT protocols on the Varian OBI system.

    PubMed

    Wood, Tim J; Moore, Craig S; Horsfield, Carl J; Saunderson, John R; Beavis, Andrew W

    2015-01-01

    The purpose of this study was to develop size-based radiotherapy kilovoltage cone beam CT (CBCT) protocols for the pelvis. Image noise was measured in an elliptical phantom of varying size for a range of exposure factors. Based on a previously defined "small pelvis" reference patient and CBCT protocol, appropriate exposure factors for small, medium, large and extra-large patients were derived which approximate the image noise behaviour observed on a Philips CT scanner (Philips Medical Systems, Best, Netherlands) with automatic exposure control (AEC). Selection criteria, based on maximum tube current-time product per rotation selected during the radiotherapy treatment planning scan, were derived based on an audit of patient size. It has been demonstrated that 110 kVp yields acceptable image noise for reduced patient dose in pelvic CBCT scans of small, medium and large patients, when compared with manufacturer's default settings (125 kVp). Conversely, extra-large patients require increased exposure factors to give acceptable images. 57% of patients in the local population now receive much lower radiation doses, whereas 13% require higher doses (but now yield acceptable images). The implementation of size-based exposure protocols has significantly reduced radiation dose to the majority of patients with no negative impact on image quality. Increased doses are required on the largest patients to give adequate image quality. The development of size-based CBCT protocols that use the planning CT scan (with AEC) to determine which protocol is appropriate ensures adequate image quality whilst minimizing patient radiation dose.

  11. Advanced-technology space station study: Summary of systems and pacing technologies

    NASA Technical Reports Server (NTRS)

    Butterfield, A. J.; Garn, P. A.; King, C. B.; Queijo, M. J.

    1990-01-01

    The principal system features defined for the Advanced Technology Space Station are summarized, and the 21 pacing technologies identified during the course of the study are described. The descriptions of system configurations were extracted from four previous study reports. The technological areas focus on those systems common to all large spacecraft which generate artificial gravity by rotation. The summary includes a listing of the functions, crew requirements, and electrical power demand that led to the studied configuration. The pacing technologies include the benefits of advanced materials, in-orbit assembly requirements, stationkeeping, evaluations of electrical power generation alternatives, and life support systems. The descriptions of systems show the potential for synergies and identify the beneficial interactions that can result from technological advances.

  12. Optimization of a Fluorescence-Based Assay for Large-Scale Drug Screening against Babesia and Theileria Parasites

    PubMed Central

    Terkawi, Mohamed Alaa; Youssef, Mohamed Ahmed; El Said, El Said El Shirbini; Elsayed, Gehad; El-Khodery, Sabry; El-Ashker, Maged; Elsify, Ahmed; Omar, Mosaab; Salama, Akram; Yokoyama, Naoaki; Igarashi, Ikuo

    2015-01-01

    A rapid and accurate assay for evaluating antibabesial drugs on a large scale is required for the discovery of novel chemotherapeutic agents against Babesia parasites. In the current study, we evaluated the usefulness of a fluorescence-based assay for determining the efficacies of antibabesial compounds against bovine and equine hemoparasites in in vitro cultures. Three different hematocrits (HCTs; 2.5%, 5%, and 10%) were used without daily replacement of the medium. The results of a high-throughput screening assay revealed that the best HCT was 2.5% for bovine Babesia parasites and 5% for equine Babesia and Theileria parasites. The IC50 values of diminazene aceturate obtained by fluorescence and microscopy did not differ significantly. Likewise, the IC50 values of luteolin, pyronaridine tetraphosphate, nimbolide, gedunin, and enoxacin did not differ between the two methods. In conclusion, our fluorescence-based assay uses low HCT and does not require daily replacement of culture medium, making it highly suitable for in vitro large-scale drug screening against Babesia and Theileria parasites that infect cattle and horses. PMID:25915529

  13. Optimization of a Fluorescence-Based Assay for Large-Scale Drug Screening against Babesia and Theileria Parasites.

    PubMed

    Rizk, Mohamed Abdo; El-Sayed, Shimaa Abd El-Salam; Terkawi, Mohamed Alaa; Youssef, Mohamed Ahmed; El Said, El Said El Shirbini; Elsayed, Gehad; El-Khodery, Sabry; El-Ashker, Maged; Elsify, Ahmed; Omar, Mosaab; Salama, Akram; Yokoyama, Naoaki; Igarashi, Ikuo

    2015-01-01

    A rapid and accurate assay for evaluating antibabesial drugs on a large scale is required for the discovery of novel chemotherapeutic agents against Babesia parasites. In the current study, we evaluated the usefulness of a fluorescence-based assay for determining the efficacies of antibabesial compounds against bovine and equine hemoparasites in in vitro cultures. Three different hematocrits (HCTs; 2.5%, 5%, and 10%) were used without daily replacement of the medium. The results of a high-throughput screening assay revealed that the best HCT was 2.5% for bovine Babesia parasites and 5% for equine Babesia and Theileria parasites. The IC50 values of diminazene aceturate obtained by fluorescence and microscopy did not differ significantly. Likewise, the IC50 values of luteolin, pyronaridine tetraphosphate, nimbolide, gedunin, and enoxacin did not differ between the two methods. In conclusion, our fluorescence-based assay uses low HCT and does not require daily replacement of culture medium, making it highly suitable for in vitro large-scale drug screening against Babesia and Theileria parasites that infect cattle and horses.
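    IC50 values like those compared in the two abstracts above are read off a dose-response relationship. As an illustration only (not the assay's actual analysis pipeline), a simple log-linear interpolation to the half-maximal response can approximate the IC50 from measured inhibition data, assuming the response decreases monotonically with dose:

    ```python
    import numpy as np

    def ic50_by_interpolation(doses, responses):
        """Approximate the IC50 as the dose at the half-maximal response,
        using log-linear interpolation between the two bracketing points.
        Assumes responses decrease with dose and actually cross the midpoint."""
        order = np.argsort(doses)
        d, r = np.asarray(doses)[order], np.asarray(responses)[order]
        half = (r[0] + r[-1]) / 2.0          # midpoint of observed range
        i = np.argmax(r <= half)             # first point at/below midpoint
        f = (r[i - 1] - half) / (r[i - 1] - r[i])
        return 10 ** (np.log10(d[i - 1]) + f * (np.log10(d[i]) - np.log10(d[i - 1])))
    ```

    Fitting a full four-parameter logistic model would be more robust for noisy assay data; the interpolation above only sketches the underlying idea.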

  14. The impacts of mandatory service on students in service-learning classes.

    PubMed

    Dienhart, Carolyn; Maruyama, Geoffrey; Snyder, Mark; Furco, Andrew; McKay, Monica Siems; Hirt, Laurel; Huesman, Ronald

    2016-01-01

    This naturalistic study examined differences in students' motivations for elective versus required service-learning (SL) classes. Students in two successive academic years' cohorts were surveyed by the SL center at a large Midwestern university. Analyses compared classes differing in their requirements for community-based service. Students for whom community service was required as part of a class within a program required for admission to the university were less likely to want to be involved in future community work, to enroll in another SL class, and to recommend their class, compared to other groups of students, including those from classes in which SL was required as part of the program in which the students were enrolled. These findings suggest that students' motivations to participate in community-engaged activities are not shaped simply by whether or not community engagement is required in SL classes, but also by other factors, including how the engagement opportunity is contextualized.

  15. A valuable animal model of spinal cord injury to study motor dysfunctions, comorbid conditions, and aging associated diseases.

    PubMed

    Rouleau, Pascal; Guertin, Pierre A

    2013-01-01

    Most animal models of contused, compressed, or transected spinal cord injury (SCI) require a laminectomy to be performed. Beyond the advantages and disadvantages associated with each of these models, the laminectomy itself is generally associated with significant problems, including longer surgery and anaesthesia (and related post-operative complications), neuropathic pain, spinal instability, deformities, lordosis, and biomechanical problems. This review provides an overview of findings, obtained mainly in our laboratory, associated with the development and characterization of a novel murine model of spinal cord transection that does not require a laminectomy. A number of studies successfully conducted with this model provided strong evidence that it constitutes a simple, reliable and reproducible transection model of complete paraplegia which is particularly useful for studies on large cohorts of wild-type or mutant animals - e.g., drug screening studies in vivo or studies aimed at characterizing neuronal and non-neuronal adaptive changes post-trauma. It is also highly suitable for studies aimed at identifying and developing new pharmacological treatments against aging-associated comorbid problems and specific SCI-related dysfunctions (e.g., stereotyped motor behaviours such as locomotion, sexual response, defecation and micturition) largely governed by 'command centers' located in lumbosacral areas of the spinal cord.

  16. A fast boosting-based screening method for large-scale association study in complex traits with genetic heterogeneity.

    PubMed

    Wang, Lu-Yong; Fasulo, D

    2006-01-01

    Genome-wide association studies for complex diseases will generate massive amounts of single nucleotide polymorphism (SNP) data. Univariate statistical tests (e.g., Fisher's exact test) are used to single out non-associated SNPs. However, disease-susceptible SNPs may have little marginal effect in the population and are unlikely to be retained after univariate tests. Also, model-based methods are impractical for large-scale datasets. Moreover, genetic heterogeneity makes it harder for traditional methods to identify the genetic causes of diseases. A more recent random forest method provides a more robust way of screening SNPs at the scale of thousands. However, for larger-scale data, i.e., Affymetrix Human Mapping 100K GeneChip data, a faster method is required to screen SNPs in whole-genome large-scale association analysis with genetic heterogeneity. We propose a boosting-based method for rapid screening in large-scale analyses of complex traits in the presence of genetic heterogeneity. It provides a relatively fast and fairly good tool for screening and limiting the candidate SNPs for further, more complex computational modeling tasks.

  17. Sensemaking in a Value Based Context for Large Scale Complex Engineered Systems

    NASA Astrophysics Data System (ADS)

    Sikkandar Basha, Nazareen

    The design and development of Large-Scale Complex Engineered Systems (LSCES) require the involvement of multiple teams across numerous levels of an organization, with interactions among large numbers of people and interdisciplinary departments. Traditionally, requirements-driven Systems Engineering (SE) is used in the design and development of these LSCES. The requirements capture the preferences of the stakeholder for the LSCES. Due to the complexity of the system, multiple levels of interaction are required to elicit the requirements within the organization. Since LSCES involve people and interactions between teams and interdisciplinary departments, they are socio-technical in nature. Requirements elicitation for most large-scale system projects is subject to creep in time and cost because of the uncertainty and ambiguity of requirements during design and development. In an organizational structure, cost and time overruns can occur at any level and iterate back and forth, further increasing cost and time. Past research has shown that rigorous approaches such as value-based design can be used to control such creep. Before these rigorous approaches can be applied, however, the decision maker needs a proper understanding of requirements creep and of the state of the system when the creep occurs. Sensemaking is used to understand the state of the system when the creep occurs and to provide guidance to the decision maker. This research proposes the use of the Cynefin framework, a sensemaking framework, in the design and development of LSCES. It can aid in understanding the system and in decision making, minimizing the value gap due to requirements creep by eliminating the ambiguity that arises during design and development. A sample hierarchical organization is used to demonstrate the state of the system at the occurrence of requirements creep, in terms of cost and time, using the Cynefin framework. 
These trials are repeated for different requirements and at different sub-system levels. The results obtained show that the Cynefin framework can be used to improve the value of the system and for predictive analysis. Decision makers can use these findings together with rigorous approaches to improve the design of Large-Scale Complex Engineered Systems.

  18. How to Effectively Use Bismuth Quadruple Therapy: The Good, the Bad, and the Ugly

    PubMed Central

    Graham, David Y.; Lee, Sun-Young

    2015-01-01

    Bismuth triple therapy was the first truly effective Helicobacter pylori eradication therapy. The addition of a proton pump inhibitor largely overcame the problem of metronidazole resistance. Resistance to its use as the primary first-line therapy has centered on convenience (the large number of tablets required) and on side effects that cause difficulties with patient adherence. Why the regimen is less successful in some regions remains unexplained, in part because of the lack of studies including susceptibility testing. A number of modifications have been proposed, such as twice-a-day therapy, which addresses both major criticisms, but the studies with susceptibility testing required to prove its effectiveness in areas of high metronidazole resistance are lacking. Most publications lack the data required to understand why the regimens were successful or failed (e.g., detailed resistance and adherence data) and are therefore of little value. We discuss and provide recommendations regarding variations, including substitution of doxycycline or amoxicillin and twice-a-day therapy. We describe what is known and unknown and provide suggestions regarding what is needed to use bismuth quadruple therapy rationally and effectively. Its primary use is when penicillin cannot be used or when clarithromycin and metronidazole resistance is common. Durations of therapy less than 14 days are not recommended. PMID:26314667

  19. Pediatric anesthesia after APRICOT (Anaesthesia PRactice In Children Observational Trial): who should do it?

    PubMed

    Habre, Walid

    2018-06-01

    This review highlights the requirements for harmonization of training, certification and continuous professional development and discusses the implications for anesthesia management of children in Europe. A large prospective cohort study, Anaesthesia PRactice In Children Observational Trial (APRICOT), revealed a high incidence of perioperative severe critical events and a large variability of anesthesia practice across 33 European countries. Relevantly, quality improvement programs have been implemented in North America, which precisely define the requirements to manage anesthesia care for children. These programs, with the introduction of an incident-reporting system at local and national levels, could contribute to the improvement of anesthesia care for children in Europe. The main factors that likely contributed to the APRICOT study results are discussed with the goal of defining clear requirement guidelines for anesthetizing children. Emphasis is placed on the importance of an incident-reporting system that can be used for both competency-based curriculum for postgraduate training as well as for continuous professional development. Variability in training as well as in available resources, equipment and facilities limit the generalization of some of the APRICOT results. Finally, the impact on case outcome of the total number of pediatric cases attended by the anesthesiologist should be taken into consideration along with the level of expertise of the anesthesiologist for complex pediatric anesthesia cases.

  20. An Investigation of Rotorcraft Stability-Phase Margin Requirements in Hover

    NASA Technical Reports Server (NTRS)

    Blanken, Chris L.; Lusardi, Jeff A.; Ivler, Christina M.; Tischler, Mark B.; Hoefinger, Marc T.; Decker, William A.; Malpica, Carlos A.; Berger, Tom; Tucker, George E.

    2009-01-01

    A cooperative study was performed to investigate the handling-quality effects of reduced flight control system stability margins, and the trade-offs with higher disturbance rejection bandwidth (DRB). The piloted simulation study, performed on the NASA-Ames Vertical Motion Simulator, included three classes of rotorcraft in four configurations: a utility-class helicopter; a medium-lift helicopter evaluated with and without an external slung load; and a large (heavy-lift) civil tiltrotor aircraft. This large aircraft also allowed an initial assessment of ADS-33 handling-quality requirements for an aircraft of this size. Ten experimental test pilots representing the U.S. Army, Marine Corps, NASA, rotorcraft industry, and the German Aerospace Center (DLR) evaluated the four aircraft configurations, for a range of flight control stability margins and turbulence levels, while primarily performing the ADS-33 Hover and Lateral Reposition MTEs. Pilot comments and aircraft-task performance data were analyzed. The preliminary stability-margin results suggest that higher-DRB, lower-phase-margin cases are preferred as the aircraft increases in size. Extra care will need to be taken to assess the influence of variability when nominal flight control gains start with reduced margins. Phase margins as low as 20-23 degrees resulted in low disturbance-response damping ratios, objectionable oscillations, PIO tendencies, and a perception of an incipient handling qualities cliff. Pilot comments on the disturbance response of the aircraft correlated well with the DRB guidelines provided in the ADS-33 Test Guide. The ADS-33 mid-term response-to-control damping ratio metrics can be measured and applied to the disturbance-response damping ratio. An initial assessment of LCTR yaw bandwidth shows the current Level 1 boundary needs to be relaxed to help account for a large pilot offset from the c.g. 
Future efforts should continue to investigate the applicability/refinement of the current ADS-33 requirements to large vehicles, like an LCTR.

  1. A study of facilities and fixtures for testing of a high speed civil transport wing component

    NASA Technical Reports Server (NTRS)

    Cerro, J. A.; Vause, R. F.; Bowman, L. M.; Jensen, J. K.; Martin, C. J., Jr.; Stockwell, A. E.; Waters, W. A., Jr.

    1996-01-01

    A study was performed to determine the feasibility of testing a large-scale High Speed Civil Transport wing component in the Structures and Materials Testing Laboratory in Building 1148 at NASA Langley Research Center. The report includes a survey of the electrical and hydraulic resources and identifies the backing structure and floor hard points which would be available for reacting the test loads. The backing structure analysis uses a new finite element model of the floor and backstop support system in the Structures Laboratory. Information on the data acquisition system and the thermal power requirements is also presented. The study identified the hardware that would be required to test a typical component, including the number and arrangement of hydraulic actuators required to simulate expected flight loads. Load introduction and reaction structure concepts were analyzed to investigate the effects of experimentally induced boundary conditions.

  2. Study of V/STOL aircraft implementation. Volume 1: Summary

    NASA Technical Reports Server (NTRS)

    Portenier, W. J.; Webb, H. M.

    1973-01-01

    A high-density short haul air market which by 1980 will be large enough to support the introduction of an independent short haul air transportation system is discussed. This system will complement the existing air transportation system and will provide relief of noise and congestion problems at conventional airports. The study found that new aircraft, exploiting V/STOL and quiet engine technology, can be available for implementing these new services, and that they can operate from existing reliever and general aviation airports. The study also found that the major funding requirements for implementing new short haul services could be borne by private capital, and that the government funding requirement would be minimal and/or recovered through the airline ticket tax. In addition, a suitable new short haul aircraft would have a market potential of $3.5 billion in foreign sales. The long lead times needed for aircraft and engine technology development will require timely actions by federal agencies.

  3. Endoclips vs large or small-volume epinephrine in peptic ulcer recurrent bleeding

    PubMed Central

    Ljubicic, Neven; Budimir, Ivan; Biscanin, Alen; Nikolic, Marko; Supanc, Vladimir; Hrabar, Davor; Pavic, Tajana

    2012-01-01

    AIM: To compare the recurrent bleeding after endoscopic injection of different epinephrine volumes with hemoclips in patients with bleeding peptic ulcer. METHODS: Between January 2005 and December 2009, 150 patients with gastric or duodenal bleeding ulcer with major stigmata of hemorrhage and nonbleeding visible vessel in an ulcer bed (Forrest IIa) were included in the study. Patients were randomized to receive a small-volume epinephrine group (15 to 25 mL injection group; Group 1, n = 50), a large-volume epinephrine group (30 to 40 mL injection group; Group 2, n = 50) and a hemoclip group (Group 3, n = 50). The rate of recurrent bleeding, as the primary outcome, was compared between the groups of patients included in the study. Secondary outcomes compared between the groups were primary hemostasis rate, permanent hemostasis, need for emergency surgery, 30 d mortality, bleeding-related deaths, length of hospital stay and transfusion requirements. RESULTS: Initial hemostasis was obtained in all patients. The rate of early recurrent bleeding was 30% (15/50) in the small-volume epinephrine group (Group 1) and 16% (8/50) in the large-volume epinephrine group (Group 2) (P = 0.09). The rate of recurrent bleeding was 4% (2/50) in the hemoclip group (Group 3); the difference was statistically significant with regard to patients treated with either small-volume or large-volume epinephrine solution (P = 0.0005 and P = 0.045, respectively). Duration of hospital stay was significantly shorter among patients treated with hemoclips than among patients treated with epinephrine whereas there were no differences in transfusion requirement or even 30 d mortality between the groups. CONCLUSION: Endoclip is superior to both small and large volume injection of epinephrine in the prevention of recurrent bleeding in patients with peptic ulcer. PMID:22611315

  4. Chapter 15: Preparations of entomopathogens and diseased specimens for more detailed study using microscopy

    USDA-ARS's Scientific Manuscript database

    The science of Insect Pathology encompasses a diverse assemblage of pathogens from a large and varied group of hosts. Microscopy techniques and protocols for these organisms are complex and varied and often require modifications and adaptations of standard procedures. The objective of this chapter...

  5. The role of gtcA in the pathogenesis of gastrointestinal listeriosis

    USDA-ARS's Scientific Manuscript database

    Serotype 4b strains of Listeria monocytogenes have been implicated in most large outbreaks of listeriosis. The reason for this relationship remains unclear. The gtcA gene is required for glycosylation of teichoic acid on serotype 4b L. monocytogenes. In this study, we investigated two different sero...

  6. Calcium and vitamin D for bone health in adults

    USDA-ARS's Scientific Manuscript database

    The calcium intake requirement is challenging to determine, and the IOM recommendations are based largely on calcium balance studies. The IOM recommends a calcium intake of 1000-1200 mg per day for older adults to support the preservation of bone mass. Food sources of calcium are preferred because h...

  7. Student Engagement in Neo-Liberal Times: What Is Missing?

    ERIC Educational Resources Information Center

    Zepke, Nick

    2018-01-01

    Quality teaching is increasingly prioritized in higher education. One reason is that government funding requires students to succeed in their studies and be ready for employment. In response, educators throughout the Western world have generated large quantities of evidence-based, practical, often uncritical research about what works to improve…

  8. SURFACE WATER FLOW IN LANDSCAPE MODELS: 1. EVERGLADES CASE STUDY. (R824766)

    EPA Science Inventory

    Many landscape models require extensive computational effort using a large array of grid cells that represent the landscape. The number of spatial cells may be in the thousands and millions, while the ecological component run in each of the cells to account for landscape dynamics...

  9. Development of near-infrared spectroscopy calibrations to measure quality characteristics in intact Brassicaceae germplasm

    USDA-ARS's Scientific Manuscript database

    Determining seed quality parameters is an integral part of cultivar improvement and germplasm screening. However, quality tests are often time consuming, seed destructive, and can require large seed samples. This study describes the development of near-infrared spectroscopy (NIRS) calibrations to mea...

  10. 32 CFR 256.3 - Criteria.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... assumptions and data upon which these guidelines are based and require individual study. Where it is desirable to restrict the density of development of an area, it is not usually possible to state that one... wind conditions are such that a large percentage (i.e., over 80 percent) of the operations are in one...

  11. 32 CFR 256.3 - Criteria.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... assumptions and data upon which these guidelines are based and require individual study. Where it is desirable to restrict the density of development of an area, it is not usually possible to state that one... wind conditions are such that a large percentage (i.e., over 80 percent) of the operations are in one...

  12. Los Alamos high-power proton linac designs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lawrence, G.P.

    1995-10-01

    Medium-energy high-power proton linear accelerators have been studied at Los Alamos as drivers for spallation neutron applications requiring large amounts of beam power. Reference designs for such accelerators are presented, important design factors are reviewed, and issues and concerns specific to this unprecedented power regime are discussed.

  13. Small- and Large-Effect Quantitative Trait Locus Interactions Underlie Variation in Yeast Sporulation Efficiency

    PubMed Central

    Lorenz, Kim; Cohen, Barak A.

    2012-01-01

    Quantitative trait loci (QTL) with small effects on phenotypic variation can be difficult to detect and analyze. Because of this, a large fraction of the genetic architecture of many complex traits is not well understood. Here we use sporulation efficiency in Saccharomyces cerevisiae as a model complex trait to identify and study small-effect QTL. In crosses where the large-effect quantitative trait nucleotides (QTN) have been genetically fixed, we identify small-effect QTL that explain approximately half of the remaining variation not explained by the major effects. We find that small-effect QTL are often physically linked to large-effect QTL and that there are extensive genetic interactions between small- and large-effect QTL. A more complete understanding of quantitative traits will require a better understanding of the numbers, effect sizes, and genetic interactions of small-effect QTL. PMID:22942125
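The "fraction of variation explained" by a QTL is the between-genotype share of total phenotypic variance. A minimal sketch of that computation, using hypothetical sporulation-efficiency values (not data from the study):

```python
from statistics import mean

def variance_explained(groups):
    """Between-group sum of squares divided by total sum of squares (R^2)."""
    all_vals = [v for g in groups for v in g]
    grand = mean(all_vals)
    ss_total = sum((v - grand) ** 2 for v in all_vals)
    ss_between = sum(len(g) * (mean(g) - grand) ** 2 for g in groups)
    return ss_between / ss_total

# Hypothetical sporulation efficiencies split by genotype at one locus
low_allele = [0.20, 0.25, 0.22, 0.27]
high_allele = [0.60, 0.65, 0.58, 0.63]
r2 = variance_explained([low_allele, high_allele])  # fraction explained by the locus
```

A large-effect QTL yields an `r2` near 1 on data like this; a small-effect QTL contributes only a few percent, which is why such loci are hard to detect without first fixing the major-effect QTN.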

  14. Large deployable antenna program. Phase 1: Technology assessment and mission architecture

    NASA Technical Reports Server (NTRS)

    Rogers, Craig A.; Stutzman, Warren L.

    1991-01-01

    The program was initiated to investigate the availability of critical large deployable antenna technologies that would enable microwave remote sensing missions from geostationary orbit as required for Mission to Planet Earth. Program goals for the large antenna were: 40-meter diameter, offset-fed paraboloid, and surface precision of 0.1 mm rms. Phase 1 goals were: to review the state of the art for large, precise, wide-scanning radiometers up to 60 GHz; to assess critical technologies necessary for selected concepts; to develop mission architecture for these concepts; and to evaluate generic technologies to support the large deployable reflectors necessary for these missions. Selected results of the study show that deployable reflectors using furlable segments are limited by the surface precision goal to 12 meters in diameter, that current launch vehicles can place only a 20-meter-class antenna in geostationary orbit, and that conceptual designs using stiff reflectors are possible with areal densities of 2.4 kg/sq m.
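The quoted areal density implies a back-of-envelope reflector mass, assuming the figure is 2.4 kilograms per square metre (the unit is garbled in some copies of this abstract) and a circular aperture:

```python
import math

def reflector_mass(diameter_m, areal_density_kg_m2):
    """Mass of a circular reflector: areal density times aperture area."""
    area = math.pi * (diameter_m / 2) ** 2
    return areal_density_kg_m2 * area

mass_20m = reflector_mass(20.0, 2.4)  # 20-m class antenna: ~754 kg
mass_40m = reflector_mass(40.0, 2.4)  # 40-m program goal: ~3016 kg
```

The fourfold mass jump from 20 m to 40 m illustrates why launch-vehicle capacity to geostationary orbit, not just surface precision, limits the achievable aperture.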

  15. Developing closed life support systems for large space habitats

    NASA Technical Reports Server (NTRS)

    Phillips, J. M.; Harlan, A. D.; Krumhar, K. C.

    1978-01-01

    In anticipation of possible large-scale, long-duration space missions which may be conducted in the future, NASA has begun to investigate the research and technology development requirements for creating life support systems for large space habitats. An analysis suggests that regeneration of food is feasible for missions exceeding four years in duration. Regeneration of food in space may also be justified for missions of shorter duration when large crews must be supported at remote sites such as lunar bases and space manufacturing facilities. It is thought that biological components consisting principally of traditional crop and livestock species will prove to be the most acceptable means of closing the food cycle. A description is presented of the preliminary results of a study of potential biological components for large space habitats. Attention is given to controlled ecosystems, Russian life support system research, controlled-environment agriculture, and the social aspects of the life support system.

  16. A large high vacuum, high pumping speed space simulation chamber for electric propulsion

    NASA Technical Reports Server (NTRS)

    Grisnik, Stanley P.; Parkes, James E.

    1994-01-01

    Testing high-power electric propulsion devices places unique requirements on space simulation facilities. Very high pumping speeds are required to maintain high vacuum levels while handling large volumes of exhaust products. These pumping speeds are significantly higher than those available in most existing vacuum facilities. There is also a requirement for relatively large vacuum chamber dimensions to minimize facility wall/thruster plume interactions and to accommodate far-field plume diagnostic measurements. A 4.57 m (15 ft) diameter by 19.2 m (63 ft) long vacuum chamber at NASA Lewis Research Center is described. The chamber utilizes oil diffusion pumps in combination with cryopanels to achieve high pumping speeds at high vacuum levels. The facility is computer controlled for all phases of operation from start-up, through testing, to shutdown. The computer control system increases the utilization of the facility and reduces the manpower needed for facility operations.

  17. The trade-off characteristics of acoustic and pressure sensors for the NASP

    NASA Technical Reports Server (NTRS)

    Winkler, Martin; Bush, Chuck

    1992-01-01

    Results of a trade study for the development of pressure and acoustic sensors for use on the National Aerospace Plane (NASP) are summarized. Pressure sensors are needed that operate to 100 psia; acoustic sensors are needed that can give meaningful information in a 200 dB sound pressure level (SPL) environment. Both sensors will have to operate from a high temperature of 2000 F down to absolute zero. The main conclusions of the study are the following: (1) Diaphragm materials limit the minimum size and maximum frequency response attainable. (2) No transduction technique is available that meets all the NASP requirements with existing technology. (3) Capacitive sensors are large relative to the requirement, have limited resolution and frequency response due to noise, and their cable length is limited to approximately 20 feet. (4) Eddy current sensors are large relative to the requirement and have limited cable lengths. (5) Fiber optic sensors offer the possibility of a small sensor, even though present developments do not exhibit that characteristic. The need to use sapphire at high temperature complicates the design, and present high-temperature research sensors suffer from poor resolution; a significant development effort will be required to realize the potential of fiber optics. (6) Short-term development seems to favor eddy current techniques, with the penalty of larger size and reduced dynamic range for acoustic sensors. (7) Long-term development may favor fiber optics, with the penalties of cost, schedule, and uncertainty.

  18. Wuhan large pig roundworm virus identified in human feces in Brazil.

    PubMed

    Luchs, Adriana; Leal, Elcio; Komninakis, Shirley Vasconcelos; de Pádua Milagres, Flavio Augusto; Brustulin, Rafael; da Aparecida Rodrigues Teles, Maria; Gill, Danielle Elise; Deng, Xutao; Delwart, Eric; Sabino, Ester Cerdeira; da Costa, Antonio Charlys

    2018-03-28

    We report here the complete genome sequence of a bipartite virus, herein denoted WLPRV/human/BRA/TO-34/201, from a sample collected in 2015 from a two-year-old child in Brazil presenting with acute gastroenteritis. The virus has 98-99% identity (segments 2 and 1, respectively) with the Wuhan large pig roundworm virus (unclassified RNA virus) that was recently discovered in the stomachs of pigs from China. This is the first report of a Wuhan large pig roundworm virus detected in human specimens, and the second genome described worldwide. However, the generation of more sequence data and further functional studies are required to fully understand the ecology, epidemiology, and evolution of this new unclassified virus.

  19. STEP Experiment Requirements

    NASA Technical Reports Server (NTRS)

    Brumfield, M. L. (Compiler)

    1984-01-01

    A plan to develop a space technology experiments platform (STEP) was examined. NASA Langley Research Center held a STEP Experiment Requirements Workshop on June 29 and 30 and July 1, 1983, at which experiment proposers were invited to present more detailed information on their experiment concepts and requirements. A feasibility and preliminary definition study was conducted; the preliminary definition of STEP capabilities, experiment concepts, and expected requirements for support services is presented, based on a detailed review of potential experiment requirements. Topics discussed include: Shuttle on-orbit dynamics; effects of the space environment on damping materials; an erectable beam experiment; technology for the development of very large solar array deployers; a thermal energy management process experiment; photovoltaic concentrator pointing dynamics and plasma interactions; vibration isolation technology; and flight tests of a synthetic aperture radar antenna using STEP.

  20. Next Generation Heavy-Lift Launch Vehicle: Large Diameter, Hydrocarbon-Fueled Concepts

    NASA Technical Reports Server (NTRS)

    Holliday, Jon; Monk, Timothy; Adams, Charles; Campbell, Ricky

    2012-01-01

    With the passage of the 2010 NASA Authorization Act, NASA was directed to begin the development of the Space Launch System (SLS) as a follow-on to the Space Shuttle Program. The SLS is envisioned as a heavy lift launch vehicle that will provide the foundation for future large-scale, beyond low Earth orbit (LEO) missions. Supporting the Mission Concept Review (MCR) milestone, several teams were formed to conduct an initial Requirements Analysis Cycle (RAC). These teams identified several vehicle concept candidates capable of meeting the preliminary system requirements. One such team, dubbed RAC Team 2, was tasked with identifying launch vehicles that are based on large stage diameters (up to the Saturn V S-IC and S-II stage diameters of 33 ft) and utilize high-thrust liquid oxygen (LOX)/RP engines as a First Stage propulsion system. While the trade space for this class of LOX/RP vehicles is relatively large, recent NASA activities (namely the Heavy Lift Launch Vehicle Study in late 2009 and the Heavy Lift Propulsion Technology Study of 2010) examined specific families within this trade space. Although the findings from these studies were incorporated in the Team 2 activity, additional branches of the trade space were examined and alternative approaches to vehicle development were considered. Furthermore, Team 2 set out to define a highly functional, flexible, and cost-effective launch vehicle concept. Utilizing this approach, a versatile two-stage launch vehicle concept was chosen as a preferred option. The preferred vehicle option has the capability to fly in several different configurations (e.g. engine arrangements) that gives this concept an inherent operational flexibility which allows the vehicle to meet a wide range of performance requirements without the need for costly block upgrades. Even still, this concept preserves the option for evolvability should the need arise in future mission scenarios. 
The foundation of this conceptual design is a focus on low cost and effectiveness rather than efficiency or cutting-edge technology. This paper details the approach and process, as well as the trade space analysis, leading to the preferred vehicle concept.

  1. Comparison of different estimation techniques for biomass concentration in large scale yeast fermentation.

    PubMed

    Hocalar, A; Türker, M; Karakuzu, C; Yüzgeç, U

    2011-04-01

    In this study, five previously developed state estimation methods are examined and compared for estimating biomass concentrations in a production-scale fed-batch bioprocess. These methods are: i. estimation based on a kinetic model of overflow metabolism; ii. estimation based on a metabolic black-box model; iii. estimation based on an observer; iv. estimation based on an artificial neural network; v. estimation based on differential evolution. Biomass concentrations are estimated from available measurements and compared with experimental data obtained from large scale fermentations. The advantages and disadvantages of the presented techniques are discussed with regard to accuracy, reproducibility, number of primary measurements required and adaptation to different working conditions. Among the various techniques, the metabolic black-box method seems to have advantages, although it requires more measurements than the other methods. However, the required extra measurements are based on commonly employed instruments in an industrial environment. This method is used for developing model-based control of fed-batch yeast fermentations. Copyright © 2010 ISA. Published by Elsevier Ltd. All rights reserved.
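Method iii above (observer-based estimation) can be sketched as a discrete Luenberger-style observer that corrects a model prediction with each new measurement. The exponential-growth model, observer gain, and data below are illustrative assumptions, not the models or parameters used in the study:

```python
def observer_estimate(y_measured, mu=0.1, gain=0.5, x0=0.5):
    """Estimate biomass from a sequence of concentration measurements.

    x_hat_{k+1} = (1 + mu) * x_hat_k + gain * (y_k - x_hat_k):
    predict with a simple growth model, then correct toward the measurement.
    """
    x_hat = x0
    estimates = []
    for y in y_measured:
        x_hat = (1 + mu) * x_hat + gain * (y - x_hat)
        estimates.append(x_hat)
    return estimates

# True biomass grows at mu = 0.1 per step; the observer starts from a wrong guess
true_x = [1.0 * 1.1 ** k for k in range(20)]
est = observer_estimate(true_x, x0=0.5)
```

With this gain, the estimation error contracts by a factor of (1 + mu - gain) = 0.6 per step, so the estimate converges to the true trajectory despite the poor initial guess.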

  2. An Exploratory Study of Cost Engineering in Axiomatic Design: Creation of the Cost Model Based on an FR-DP Map

    NASA Technical Reports Server (NTRS)

    Lee, Taesik; Jeziorek, Peter

    2004-01-01

    Large complex projects cost large sums of money throughout their life cycle for a variety of reasons. For such large programs, the credible estimation of the project cost, a quick assessment of the cost of making changes, and the management of the project budget with effective cost reduction determine the viability of the project. Cost engineering that deals with these issues requires a rigorous method and systematic processes. This paper introduces a logical framework to achieve effective cost engineering. The framework is built upon the Axiomatic Design process, whose structure provides a good foundation for closely tying engineering design and cost information together. The cost framework presented in this paper is a systematic link between the functional domain (FRs), physical domain (DPs), cost domain (CUs), and a task/process-based model. The FR-DP map relates a system's functional requirements to design solutions across all levels and branches of the decomposition hierarchy. DPs are mapped into CUs, which provides a means to estimate the cost of design solutions - DPs - from the cost of the physical entities in the system - CUs. The task/process model describes the iterative process of developing each of the CUs, and is used to estimate the cost of CUs. By linking the four domains, this framework provides superior traceability from requirements to cost information.
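The FR -> DP -> CU roll-up the abstract describes can be illustrated with a toy mapping: the cost attributable to a functional requirement is the summed cost of every cost unit reachable through its design parameters. All names and figures here are hypothetical, purely to show the traceability structure:

```python
# Each FR maps to the DPs that satisfy it; each DP maps to the CUs
# (physical cost units) that implement it. Hypothetical example data.
fr_to_dp = {"FR1": ["DP1", "DP2"], "FR2": ["DP2", "DP3"]}
dp_to_cu = {"DP1": ["CU1"], "DP2": ["CU2", "CU3"], "DP3": ["CU3"]}
cu_cost = {"CU1": 100.0, "CU2": 250.0, "CU3": 40.0}

def fr_cost(fr):
    """Sum the cost of every distinct CU reachable from an FR's DPs."""
    cus = {cu for dp in fr_to_dp[fr] for cu in dp_to_cu[dp]}
    return sum(cu_cost[cu] for cu in cus)

# fr_cost("FR1") -> CU1 + CU2 + CU3 = 390.0
# fr_cost("FR2") -> CU2 + CU3 = 290.0 (CU3 counted once, not per DP)
```

Deduplicating shared CUs (as the set comprehension does) matters when a single physical entity realizes several design parameters; otherwise a change-cost estimate would double-count it.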

  3. Energy Contents of Frequently Ordered Restaurant Meals and Comparison with Human Energy Requirements and US Department of Agriculture Database Information: A Multisite Randomized Study

    PubMed Central

    Urban, Lorien E.; Weber, Judith L.; Heyman, Melvin B.; Schichtl, Rachel L.; Verstraete, Sofia; Lowery, Nina S.; Das, Sai Krupa; Schleicher, Molly M.; Rogers, Gail; Economos, Christina; Masters, William A.; Roberts, Susan B.

    2017-01-01

    Background: Excess energy intake from meals consumed away from home is implicated as a major contributor to obesity, and ~50% of US restaurants are individual or small-chain (non-chain) establishments that do not provide nutrition information. Objective: To measure the energy content of frequently ordered meals in non-chain restaurants in three US locations, and compare with the energy content of meals from large-chain restaurants, energy requirements, and food database information. Design: A multisite random-sampling protocol was used to measure the energy contents of the most frequently ordered meals from the most popular cuisines in non-chain restaurants, together with equivalent meals from large-chain restaurants. Setting: Meals were obtained from restaurants in San Francisco, CA; Boston, MA; and Little Rock, AR, between 2011 and 2014. Main outcome measures: Meal energy content determined by bomb calorimetry. Statistical analysis performed: Regional and cuisine differences were assessed using a mixed model with restaurant nested within region×cuisine as the random factor. Paired t tests were used to evaluate differences between non-chain and chain meals, human energy requirements, and food database values. Results: Meals from non-chain restaurants contained 1,205±465 kcal/meal, amounts that were not significantly different from equivalent meals from large-chain restaurants (+5.1%; P=0.41). There was a significant effect of cuisine on non-chain meal energy, and three of the four most popular cuisines (American, Italian, and Chinese) had the highest mean energy (1,495 kcal/meal). Ninety-two percent of meals exceeded typical energy requirements for a single eating occasion. Conclusions: Non-chain restaurants lacking nutrition information serve amounts of energy that are typically far in excess of human energy requirements for single eating occasions, and are equivalent to amounts served by the large-chain restaurants that have previously been criticized for providing excess energy. Restaurants in general, rather than specific categories of restaurant, expose patrons to excessive portions that induce overeating through established biological mechanisms. PMID:26803805
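The paired t test named in the statistical analysis reduces to a simple statistic on per-pair differences. A minimal sketch with illustrative kcal figures (not data from the study):

```python
import math
from statistics import mean, stdev

def paired_t(a, b):
    """Paired t statistic: mean difference over its standard error."""
    diffs = [x - y for x, y in zip(a, b)]
    n = len(diffs)
    return mean(diffs) / (stdev(diffs) / math.sqrt(n))

# Hypothetical energy contents (kcal) of matched non-chain vs chain meals
nonchain = [1100, 1350, 980, 1500, 1250]
chain = [1050, 1400, 1000, 1380, 1200]
t = paired_t(nonchain, chain)  # compare against t distribution with n-1 df
```

Pairing each non-chain meal with its equivalent chain meal removes between-cuisine variation from the comparison, which is why the design samples matched meals rather than independent menus.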

  4. Energy Contents of Frequently Ordered Restaurant Meals and Comparison with Human Energy Requirements and U.S. Department of Agriculture Database Information: A Multisite Randomized Study.

    PubMed

    Urban, Lorien E; Weber, Judith L; Heyman, Melvin B; Schichtl, Rachel L; Verstraete, Sofia; Lowery, Nina S; Das, Sai Krupa; Schleicher, Molly M; Rogers, Gail; Economos, Christina; Masters, William A; Roberts, Susan B

    2016-04-01

    Excess energy intake from meals consumed away from home is implicated as a major contributor to obesity, and ∼50% of US restaurants are individual or small-chain (non-chain) establishments that do not provide nutrition information. To measure the energy content of frequently ordered meals in non-chain restaurants in three US locations, and compare with the energy content of meals from large-chain restaurants, energy requirements, and food database information. A multisite random-sampling protocol was used to measure the energy contents of the most frequently ordered meals from the most popular cuisines in non-chain restaurants, together with equivalent meals from large-chain restaurants. Meals were obtained from restaurants in San Francisco, CA; Boston, MA; and Little Rock, AR, between 2011 and 2014. Meal energy content determined by bomb calorimetry. Regional and cuisine differences were assessed using a mixed model with restaurant nested within region×cuisine as the random factor. Paired t tests were used to evaluate differences between non-chain and chain meals, human energy requirements, and food database values. Meals from non-chain restaurants contained 1,205±465 kcal/meal, amounts that were not significantly different from equivalent meals from large-chain restaurants (+5.1%; P=0.41). There was a significant effect of cuisine on non-chain meal energy, and three of the four most popular cuisines (American, Italian, and Chinese) had the highest mean energy (1,495 kcal/meal). Ninety-two percent of meals exceeded typical energy requirements for a single eating occasion. Non-chain restaurants lacking nutrition information serve amounts of energy that are typically far in excess of human energy requirements for single eating occasions, and are equivalent to amounts served by the large-chain restaurants that have previously been criticized for providing excess energy. 
Restaurants in general, rather than specific categories of restaurant, expose patrons to excessive portions that induce overeating through established biological mechanisms. Copyright © 2016 Academy of Nutrition and Dietetics. Published by Elsevier Inc. All rights reserved.

  5. Experimental study of a constrained vapor bubble fin heat exchanger in the absence of external natural convection.

    PubMed

    Basu, Sumita; Plawsky, Joel L; Wayner, Peter C

    2004-11-01

    In preparation for a microgravity flight experiment on the International Space Station, a constrained vapor bubble fin heat exchanger (CVB) was operated both in a vacuum chamber and in air on Earth to evaluate the effect of the absence of external natural convection. The long-term objective is a general study of a high heat flux, low capillary pressure system with small viscous effects due to the relatively large 3 x 3 x 40 mm dimensions. The current CVB can be viewed as a large-scale version of a micro heat pipe with a large Bond number in the Earth environment but a small Bond number in microgravity. The walls of the CVB are quartz, to allow for image analysis of naturally occurring interference fringes that give the pressure field for liquid flow. The research is synergistic in that the study requires a microgravity environment to obtain a low Bond number and the space program needs thermal control systems, like the CVB, with a large characteristic dimension. In the absence of natural convection, operation of the CVB may be dominated by external radiative losses from its quartz surface. Therefore, an understanding of radiation from the quartz cell is required. All radiative exchange with the surroundings occurs from the outer surface of the CVB when the temperature range renders the quartz walls of the CVB optically thick (lambda > 4 microns). However, for electromagnetic radiation where lambda < 2 microns, the walls are transparent. Experimental results obtained for a cell charged with pentane are compared with those obtained for a dry cell. A numerical model was developed that successfully simulated the behavior and performance of the device observed experimentally.

  6. Spatial considerations during cryopreservation of a large volume sample.

    PubMed

    Kilbride, Peter; Lamb, Stephen; Milne, Stuart; Gibbons, Stephanie; Erro, Eloy; Bundy, James; Selden, Clare; Fuller, Barry; Morris, John

    2016-08-01

    There have been relatively few studies on the implications of the physical conditions experienced by cells during large volume (litres) cryopreservation - most studies have focused on the problem of cryopreservation of smaller volumes, typically up to 2 ml. This study explores the effects of ice growth by progressive solidification, generally seen during larger scale cryopreservation, on encapsulated liver hepatocyte spheroids, and it develops a method to reliably sample different regions across the frozen cores of samples experiencing progressive solidification. These issues are examined in the context of a Bioartificial Liver Device which requires cryopreservation of a 2 L volume in a strict cylindrical geometry for optimal clinical delivery. Progressive solidification cannot be avoided in this arrangement. In such a system optimal cryoprotectant concentrations and cooling rates are known. However, applying these parameters to a large volume is challenging due to the thermal mass and subsequent thermal lag. The specific impact of this to the cryopreservation outcome is required. Under conditions of progressive solidification, the spatial location of Encapsulated Liver Spheroids had a strong impact on post-thaw recovery. Cells in areas first and last to solidify demonstrated significantly impaired post-thaw function, whereas areas solidifying through the majority of the process exhibited higher post-thaw outcome. It was also found that samples where the ice thawed more rapidly had greater post-thaw viability 24 h post-thaw (75.7 ± 3.9% and 62.0 ± 7.2% respectively). These findings have implications for the cryopreservation of large volumes with a rigid shape and for the cryopreservation of a Bioartificial Liver Device. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.

  7. A MBD-seq protocol for large-scale methylome-wide studies with (very) low amounts of DNA.

    PubMed

    Aberg, Karolina A; Chan, Robin F; Shabalin, Andrey A; Zhao, Min; Turecki, Gustavo; Staunstrup, Nicklas Heine; Starnawska, Anna; Mors, Ole; Xie, Lin Y; van den Oord, Edwin Jcg

    2017-09-01

    We recently showed that, after optimization, our methyl-CpG binding domain sequencing (MBD-seq) application approximates the methylome-wide coverage obtained with whole-genome bisulfite sequencing (WGB-seq), but at a cost that enables adequately powered large-scale association studies. A prior drawback of MBD-seq is the relatively large amount of genomic DNA (ideally >1 µg) required to obtain high-quality data. Biomaterials are typically expensive to collect, provide a finite amount of DNA, and may simply not yield sufficient starting material. The ability to use low amounts of DNA will increase the breadth and number of studies that can be conducted. Therefore, we further optimized the enrichment step. With this low starting material protocol, MBD-seq performed equally well, or better, than the protocol requiring ample starting material (>1 µg). Using only 15 ng of DNA as input, there is minimal loss in data quality, achieving 93% of the coverage of WGB-seq (with standard amounts of input DNA) at similar false-positive rates. Furthermore, across a large number of genomic features, the MBD-seq methylation profiles closely tracked those observed for WGB-seq with even slightly larger effect sizes. This suggests that MBD-seq provides similar information about the methylome and classifies methylation status somewhat more accurately. Performance decreases with <15 ng DNA as starting material but, even with as little as 5 ng, MBD-seq still achieves 90% of the coverage of WGB-seq with comparable genome-wide methylation profiles. Thus, the proposed protocol is an attractive option for adequately powered and cost-effective methylome-wide investigations using (very) low amounts of DNA.

  8. Application and research of block caving in Pulang copper mine

    NASA Astrophysics Data System (ADS)

    Ge, Qifa; Fan, Wenlu; Zhu, Weigen; Chen, Xiaowei

    2018-01-01

    The application of block caving in mines shows significant advantages in large scale, low cost and high efficiency; block caving is therefore worth promoting in mines that meet the requirements for natural caving. Owing to the large scale of production and low ore grade at the Pulang copper mine in China, comprehensive analysis and research were conducted on rock mechanics, mining sequence, undercutting and the stability of the bottom structure, with a view to raising mine benefit and maximizing the recovery of mineral resources. The study concludes that block caving is well suited to the Pulang copper mine.

  9. Study of large hemispherical photomultiplier tubes for the ANTARES neutrino telescope

    NASA Astrophysics Data System (ADS)

    Aguilar, J. A.; Albert, A.; Ameli, F.; Amram, P.; Anghinolfi, M.; Anton, G.; Anvar, S.; Ardellier-Desages, F. E.; Aslanides, E.; Aubert, J.-J.; Bailey, D.; Basa, S.; Battaglieri, M.; Becherini, Y.; Bellotti, R.; Beltramelli, J.; Bertin, V.; Billault, M.; Blaes, R.; Blanc, F.; de Botton, N.; Boulesteix, J.; Bouwhuis, M. C.; Brooks, C. B.; Bradbury, S. M.; Bruijn, R.; Brunner, J.; Burgio, G. F.; Cafagna, F.; Calzas, A.; Capone, A.; Caponetto, L.; Carmona, E.; Carr, J.; Cartwright, S. L.; Castorina, E.; Cavasinni, V.; Cecchini, S.; Charvis, P.; Circella, M.; Colnard, C.; Compère, C.; Coniglione, R.; Cooper, S.; Coyle, P.; Cuneo, S.; Damy, G.; van Dantzig, R.; Deschamps, A.; de Marzo, C.; Denans, D.; Destelle, J.-J.; de Vita, R.; Dinkelspiler, B.; Distefano, C.; Drogou, J.-F.; Druillole, F.; Engelen, J.; Ernenwein, J.-P.; Falchini, E.; Favard, S.; Feinstein, F.; Ferry, S.; Festy, D.; Flaminio, V.; Fopma, J.; Fuda, J.-L.; Gallone, J.-M.; Giacomelli, G.; Girard, N.; Goret, P.; Graf, K.; Hallewell, G.; Hartmann, B.; Heijboer, A.; Hello, Y.; Hernández-Rey, J. J.; Herrouin, G.; Hößl, J.; Hoffmann, C.; Hubbard, J. R.; Jaquet, M.; de Jong, M.; Jouvenot, F.; Kappes, A.; Karg, T.; Karkar, S.; Karolak, M.; Katz, U.; Keller, P.; Kooijman, P.; Korolkova, E. V.; Kouchner, A.; Kretschmer, W.; Kuch, S.; Kudryavtsev, V. A.; Lafoux, H.; Lagier, P.; Lahmann, R.; Lamare, P.; Languillat, J.-C.; Laschinsky, H.; Laubier, L.; Legou, T.; Le Guen, Y.; Le Provost, H.; Le van Suu, A.; Lo Nigro, L.; Lo Presti, D.; Loucatos, S.; Louis, F.; Lyashuk, V.; Marcelin, M.; Margiotta, A.; Maron, C.; Massol, A.; Masullo, R.; Mazéas, F.; Mazure, A.; McMillan, J. E.; Migneco, E.; Millot, C.; Milovanovic, A.; Montanet, F.; Montaruli, T.; Morel, J.-P.; Morganti, M.; Moscoso, L.; Musumeci, M.; Naumann, C.; Naumann-Godo, M.; Nezri, E.; Niess, V.; Nooren, G. 
J.; Ogden, P.; Olivetto, C.; Palanque-Delabrouille, N.; Papaleo, R.; Payre, P.; Petta, C.; Piattelli, P.; Pineau, J.-P.; Poinsignon, J.; Popa, V.; Potheau, R.; Pradier, T.; Racca, C.; Raia, G.; Randazzo, N.; Real, D.; van Rens, B. A. P.; Réthoré, F.; Riccobene, G.; Rigaud, V.; Ripani, M.; Roca-Blay, V.; Rolin, J.-F.; Romita, M.; Rose, H. J.; Rostovtsev, A.; Ruppi, M.; Russo, G. V.; Sacquin, Y.; Salesa, F.; Salomon, K.; Saouter, S.; Sapienza, P.; Shanidze, R.; Schuller, J.-P.; Schuster, W.; Sokalski, I.; Spurio, M.; Stolarczyk, T.; Stubert, D.; Taiuti, M.; Thompson, L. F.; Tilav, S.; Valdy, P.; Valente, V.; Vallage, B.; Vernin, P.; Virieux, J.; de Vries, G.; de Witt Huberts, P.; de Wolf, E.; Zaborov, D.; Zaccone, H.; Zakharov, V.; Zornoza, J. D.; Zúñiga, J.

    2005-12-01

The ANTARES neutrino telescope, to be immersed at depth in the Mediterranean Sea, will consist of a three-dimensional matrix of 900 large area photomultiplier tubes housed in pressure-resistant glass spheres. The selection of the optimal photomultiplier was a critical step for the project and required an intensive phase of tests and developments carried out in close collaboration with the main manufacturers worldwide. This paper provides an overview of the tests performed by the collaboration and describes in detail the features of the photomultiplier tube chosen for ANTARES.

  10. An Extension of Holographic Moiré to Micromechanics

    NASA Astrophysics Data System (ADS)

    Sciammarella, C. A.; Sciammarella, F. M.

Electronic Holographic Moiré is an ideal tool for micromechanics studies. It does not require modification of the surface by the introduction of a reference grating. This is a particular advantage when dealing with materials such as solid propellant grains, whose chemical nature and surface finish make the application of a reference grating very difficult. Traditional electronic Holographic Moiré presents some difficult problems when large magnifications are needed and large rigid body motion takes place. This paper presents developments that solve these problems and extend the application of the technique to micromechanics.

  11. Investigation of orbitofrontal sulcogyral pattern in chronic schizophrenia.

    PubMed

    Cropley, Vanessa L; Bartholomeusz, Cali F; Wu, Peter; Wood, Stephen J; Proffitt, Tina; Brewer, Warrick J; Desmond, Patricia M; Velakoulis, Dennis; Pantelis, Christos

    2015-11-30

Abnormalities of orbitofrontal cortex (OFC) pattern type distribution have been associated with schizophrenia-spectrum disorders. We investigated OFC pattern type in a large sample of chronic schizophrenia patients and healthy controls. We found an increased frequency of Type II but no difference in Type I or III folding pattern in the schizophrenia group in comparison to controls. Further large studies are required to investigate the diagnostic specificity of altered OFC pattern type and to confirm the distribution of pattern type in the normal population.

  12. Advanced power sources for space missions

    NASA Technical Reports Server (NTRS)

    Gavin, Joseph G., Jr.; Burkes, Tommy R.; English, Robert E.; Grant, Nicholas J.; Kulcinski, Gerald L.; Mullin, Jerome P.; Peddicord, K. Lee; Purvis, Carolyn K.; Sarjeant, W. James; Vandevender, J. Pace

    1989-01-01

    Approaches to satisfying the power requirements of space-based Strategic Defense Initiative (SDI) missions are studied. The power requirements for non-SDI military space missions and for civil space missions of the National Aeronautics and Space Administration (NASA) are also considered. The more demanding SDI power requirements appear to encompass many, if not all, of the power requirements for those missions. Study results indicate that practical fulfillment of SDI requirements will necessitate substantial advances in the state of the art of power technology. SDI goals include the capability to operate space-based beam weapons, sometimes referred to as directed-energy weapons. Such weapons pose unprecedented power requirements, both during preparation for battle and during battle conditions. The power regimes for these two sets of applications are referred to as alert mode and burst mode, respectively. Alert-mode power requirements are presently stated to range from about 100 kW to a few megawatts for cumulative durations of about a year or more. Burst-mode power requirements are roughly estimated to range from tens to hundreds of megawatts for durations of a few hundred to a few thousand seconds. There are two likely energy sources, chemical and nuclear, for powering SDI directed-energy weapons during the alert and burst modes. The choice between chemical and nuclear space power systems depends in large part on the total duration during which power must be provided. Complete study findings, conclusions, and eight recommendations are reported.

  13. Power and instrument strength requirements for Mendelian randomization studies using multiple genetic variants.

    PubMed

    Pierce, Brandon L; Ahsan, Habibul; Vanderweele, Tyler J

    2011-06-01

Mendelian randomization (MR) studies assess the causality of an exposure-disease association using genetic determinants [i.e., instrumental variables (IVs)] of the exposure. Power and IV strength requirements for MR studies using multiple genetic variants have not been explored. We simulated cohort data sets consisting of a normally distributed disease trait, a normally distributed exposure that affects this trait, and a biallelic genetic variant that affects the exposure. We estimated power to detect an effect of exposure on disease for varying allele frequencies, effect sizes and sample sizes, using two-stage least squares regression on 10,000 data sets (Stage 1 is a regression of exposure on the variant; Stage 2 is a regression of disease on the fitted exposure). Similar analyses were conducted using multiple genetic variants (5, 10, 20) as independent or combined IVs. We assessed IV strength using the first-stage F statistic. Simulations of realistic scenarios indicate that MR studies will require large (n > 1000), often very large (n > 10,000), sample sizes. In many cases, so-called 'weak IV' problems arise when using multiple variants as independent IVs (even with as few as five), resulting in biased effect estimates. Combining genetic factors into fewer IVs results in modest power decreases, but alleviates weak IV problems. Ideal methods for combining genetic factors depend upon knowledge of the genetic architecture underlying the exposure. The feasibility of well-powered, unbiased MR studies will depend upon the amount of variance in the exposure that can be explained by known genetic factors and the 'strength' of the IV set derived from these genetic factors.
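The two-stage least squares procedure described in this abstract can be sketched with a small simulation; the allele frequency, effect sizes, and sample size below are illustrative assumptions, not values taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 20_000          # large sample, in line with the abstract's n > 10,000 finding
maf = 0.3           # assumed minor-allele frequency (hypothetical)

# Simulate a biallelic variant, an exposure it affects, and a disease trait.
g = rng.binomial(2, maf, n)          # genotype coded 0/1/2
x = 0.2 * g + rng.normal(size=n)     # exposure; variant explains a small share of variance
y = 0.1 * x + rng.normal(size=n)     # disease trait; true causal effect = 0.1

# Stage 1: regress exposure on the variant, keep fitted values.
G = np.column_stack([np.ones(n), g])
b1 = np.linalg.lstsq(G, x, rcond=None)[0]
x_hat = G @ b1

# First-stage F statistic (instrument strength; F well above 10 = strong IV).
rss = ((x - x_hat) ** 2).sum()
tss = ((x - x.mean()) ** 2).sum()
F = (tss - rss) / (rss / (n - 2))

# Stage 2: regress the disease trait on the fitted exposure.
X2 = np.column_stack([np.ones(n), x_hat])
b2 = np.linalg.lstsq(X2, y, rcond=None)[0]
print(f"2SLS causal estimate: {b2[1]:.3f} (true 0.1), first-stage F: {F:.0f}")
```

With a strong instrument the Stage-2 coefficient recovers the true causal effect; shrinking n or the variant's effect on the exposure in this sketch reproduces the weak-IV bias the abstract describes.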

  14. Assessing estimation techniques for missing plot observations in the U.S. forest inventory

    Treesearch

    Grant M. Domke; Christopher W. Woodall; Ronald E. McRoberts; James E. Smith; Mark A. Hatfield

    2012-01-01

    The U.S. Forest Service, Forest Inventory and Analysis Program made a transition from state-by-state periodic forest inventories--with reporting standards largely tailored to regional requirements--to a nationally consistent, annual inventory tailored to large-scale strategic requirements. Lack of measurements on all forest land during the periodic inventory, along...

  15. Governance of extended lifecycle in large-scale eHealth initiatives: analyzing variability of enterprise architecture elements.

    PubMed

    Mykkänen, Juha; Virkanen, Hannu; Tuomainen, Mika

    2013-01-01

    The governance of large eHealth initiatives requires traceability of many requirements and design decisions. We provide a model which we use to conceptually analyze variability of several enterprise architecture (EA) elements throughout the extended lifecycle of development goals using interrelated projects related to the national ePrescription in Finland.

  16. Networks and landscapes: a framework for setting goals and evaluating performance at the large landscape scale

    Treesearch

    R Patrick Bixler; Shawn Johnson; Kirk Emerson; Tina Nabatchi; Melly Reuling; Charles Curtin; Michele Romolini; Morgan Grove

    2016-01-01

The objective of large landscape conservation is to mitigate complex ecological problems through interventions at multiple and overlapping scales. Implementation requires coordination among a diverse network of individuals and organizations to integrate local-scale conservation activities with broad-scale goals. This requires an understanding of the governance options...

  17. 16 CFR § 1701.3 - Applicability of special packaging requirements to hazardous substances in large size containers.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

... 16 Commercial Practices 2 2013-01-01 2013-01-01 false Applicability of special packaging requirements to hazardous substances in large size containers. Section § 1701.3 Commercial Practices CONSUMER PRODUCT SAFETY COMMISSION POISON PREVENTION PACKAGING ACT OF 1970 REGULATIONS STATEMENTS OF POLICY...

  18. Large Composite Structures Processing Technologies for Reusable Launch Vehicles

    NASA Technical Reports Server (NTRS)

    Clinton, R. G., Jr.; Vickers, J. H.; McMahon, W. M.; Hulcher, A. B.; Johnston, N. J.; Cano, R. J.; Belvin, H. L.; McIver, K.; Franklin, W.; Sidwell, D.

    2001-01-01

Significant efforts have been devoted to establishing the technology foundation needed to enable progression to large-scale composite structures fabrication. We are not capable today of fabricating many of the composite structures envisioned for the second generation reusable launch vehicle (RLV). Conventional 'aerospace' manufacturing and processing methodologies (fiber placement, autoclave, tooling) will require substantial investment and lead time to scale up. Out-of-autoclave process techniques will require aggressive efforts to mature the selected technologies and to scale up. Focused composite processing technology development and demonstration programs utilizing the building block approach are required to enable the envisioned second generation RLV large composite structures applications. Government/industry partnerships have demonstrated success in this area and represent the best combination of skills and capabilities to achieve this goal.

  19. Large-eddy simulation of the urban boundary layer in the MEGAPOLI Paris Plume experiment

    NASA Astrophysics Data System (ADS)

    Esau, Igor

    2010-05-01

This study presents results from a dedicated large-eddy simulation study of the urban boundary layer in the MEGAPOLI Paris Plume field campaign. We used the LESNIC and PALM codes, the MEGAPOLI city morphology database, nudging to the meteorological conditions observed during the Paris Plume campaign, and some concentration measurements from that campaign to simulate and better understand the nature of the urban boundary layer on scales larger than the street canyon scale. Primary attention was paid to turbulence self-organization and structure-to-surface interaction. The study was aimed at demonstrating feasibility and estimating the resources required for such research. Therefore, at this stage we neither compare the simulation with other relevant studies nor formulate theoretical conclusions.

  20. Trials of large group teaching in Malaysian private universities: a cross sectional study of teaching medicine and other disciplines

    PubMed Central

    2011-01-01

Background This is a pilot cross sectional study using both quantitative and qualitative approaches towards tutors teaching large classes in private universities in the Klang Valley (comprising Kuala Lumpur, its suburbs, and adjoining towns in the State of Selangor) and the State of Negeri Sembilan, Malaysia. The general aim of this study is to determine the difficulties faced by tutors when teaching large groups of students and to outline appropriate recommendations for overcoming them. Findings Thirty-two academics from six private universities, from faculties such as Medical Sciences, Business, Information Technology, and Engineering, participated in this study. SPSS software was used to analyse the data. The results in general indicate that the conventional instructor-student approach has its shortcomings and requires changes. Interestingly, tutors from Medicine and IT less often faced difficulties and had positive experiences in teaching large groups of students. Conclusion Several suggestions were proposed to overcome these difficulties, ranging from breaking into smaller classes to adopting innovative teaching and using interactive learning methods incorporating interactive assessment and creative technology that enhance student learning. Furthermore, the study provides insights into the trials of large group teaching, clearly identified to help tutors realise their impact on teaching. The suggestions to overcome these difficulties and to maximize student learning can serve as a guideline for tutors who face these challenges. PMID:21902839

  1. Trials of large group teaching in Malaysian private universities: a cross sectional study of teaching medicine and other disciplines.

    PubMed

    Thomas, Susan; Subramaniam, Shamini; Abraham, Mathew; Too, Laysan; Beh, Loosee

    2011-09-09

This is a pilot cross sectional study using both quantitative and qualitative approaches towards tutors teaching large classes in private universities in the Klang Valley (comprising Kuala Lumpur, its suburbs, and adjoining towns in the State of Selangor) and the State of Negeri Sembilan, Malaysia. The general aim of this study is to determine the difficulties faced by tutors when teaching large groups of students and to outline appropriate recommendations for overcoming them. Thirty-two academics from six private universities, from faculties such as Medical Sciences, Business, Information Technology, and Engineering, participated in this study. SPSS software was used to analyse the data. The results in general indicate that the conventional instructor-student approach has its shortcomings and requires changes. Interestingly, tutors from Medicine and IT less often faced difficulties and had positive experiences in teaching large groups of students. Several suggestions were proposed to overcome these difficulties, ranging from breaking into smaller classes to adopting innovative teaching and using interactive learning methods incorporating interactive assessment and creative technology that enhance student learning. Furthermore, the study provides insights into the trials of large group teaching, clearly identified to help tutors realise their impact on teaching. The suggestions to overcome these difficulties and to maximize student learning can serve as a guideline for tutors who face these challenges.

  2. Large Format Arrays for Far Infrared and Millimeter Astronomy

    NASA Technical Reports Server (NTRS)

    Moseley, Harvey

    2004-01-01

Some of the most compelling questions in modern astronomy are best addressed with submillimeter and millimeter observations. The question of the role of inflation in the early evolution of the universe is best addressed with large sensitive arrays of millimeter polarimeters. The study of the first generations of galaxies requires sensitive submillimeter imaging, which can help us to understand the history of energy release and nucleosynthesis in the universe. Our ability to address these questions is dramatically increasing, driven by rapid advances in the sensitivity and size of available detector arrays. While the MIPS instrument on the SIRTF mission will revolutionize far infrared astronomy with its 1024 element array of photoconductors, thermal detectors remain the dominant technology for submillimeter and millimeter imaging and polarimetry. The last decade has seen the deployment of increasingly large arrays of bolometers, ranging from the 48 element arrays deployed on the KAO in the late 1980s, to the SHARC and SCUBA arrays in the 1990s. The past years have seen the deployment of a new generation of larger detector arrays in SHARC II (384 channels) and Bolocam (144 channels). These detectors are in operation and are beginning to make significant impacts on the field. Arrays of sensitive submillimeter bolometers on the SPIRE instrument on Herschel will allow the first large area surveys of the sky, providing important insight into the evolution of galaxies. The next generation of detectors, led by SCUBA II, will increase the focal plane scale of these instruments by an order of magnitude. Two major missions are being planned by NASA for which further development of long wavelength detectors is essential. The SAFIR mission, a 10-m class telescope with large arrays of background limited detectors, will extend our reach into the epoch of initial galaxy formation.
A major goal of modern cosmology is to test the inflationary paradigm in the early evolution of the universe. To this end, a mission is planned to detect the imprint of inflation on the CMB by precision measurement of its polarization. This work requires very large arrays of sensitive detectors which can provide unprecedented control of a wide range of systematic errors, given the small amplitude of the signal of interest. We describe the current state of large format detector arrays, the performance requirements set by the new missions, and the different approaches being developed in the community to meet these requirements. We are confident that within a decade, these developments will lead to dramatic advances in our understanding of the evolution of the universe.

  3. Side effects of problem-solving strategies in large-scale nutrition science: towards a diversification of health.

    PubMed

    Penders, Bart; Vos, Rein; Horstman, Klasien

    2009-11-01

    Solving complex problems in large-scale research programmes requires cooperation and division of labour. Simultaneously, large-scale problem solving also gives rise to unintended side effects. Based upon 5 years of researching two large-scale nutrigenomic research programmes, we argue that problems are fragmented in order to be solved. These sub-problems are given priority for practical reasons and in the process of solving them, various changes are introduced in each sub-problem. Combined with additional diversity as a result of interdisciplinarity, this makes reassembling the original and overall goal of the research programme less likely. In the case of nutrigenomics and health, this produces a diversification of health. As a result, the public health goal of contemporary nutrition science is not reached in the large-scale research programmes we studied. Large-scale research programmes are very successful in producing scientific publications and new knowledge; however, in reaching their political goals they often are less successful.

  4. Summing the strokes: energy economy in northern elephant seals during large-scale foraging migrations.

    PubMed

    Maresh, J L; Adachi, T; Takahashi, A; Naito, Y; Crocker, D E; Horning, M; Williams, T M; Costa, D P

    2015-01-01

The energy requirements of free-ranging marine mammals are challenging to measure due to cryptic and far-ranging feeding habits, but are important to quantify given the potential impacts of high-level predators on ecosystems. Given their large body size and carnivorous lifestyle, we would predict that northern elephant seals (Mirounga angustirostris) have elevated field metabolic rates (FMRs) that require high prey intake rates, especially during pregnancy. Disturbance associated with climate change or human activity is predicted to further elevate energy requirements due to an increase in locomotor costs required to accommodate a reduction in prey or time available to forage. In this study, we determined the FMRs, total energy requirements, and energy budgets of adult, female northern elephant seals. We also examined the impact of increased locomotor costs on foraging success in this species. Body size, time spent at sea and reproductive status strongly influenced FMR. During the short foraging migration, FMR averaged 90.1 (SE = 1.7) kJ kg^-1 d^-1, only 36% greater than predicted basal metabolic rate. During the long migration, when seals were pregnant, FMRs averaged 69.4 (±3.0) kJ kg^-1 d^-1, values approaching those predicted to be necessary to support basal metabolism in mammals of this size. Low FMRs in pregnant seals were driven by hypometabolism coupled with a positive feedback loop between improving body condition and reduced flipper stroking frequency. In contrast, three additional seals carrying large, non-streamlined instrumentation saw a four-fold increase in energy partitioned toward locomotion, resulting in elevated FMRs and only half the mass gain of normally swimming study animals. These results highlight the importance of keeping locomotion costs low for successful foraging in this species.
In preparation for lactation and two fasting periods with high demands on energy reserves, migrating elephant seals utilize an economical foraging strategy whereby energy savings from reduced locomotion costs are shuttled towards somatic growth and fetal gestation. Remarkably, the energy requirements of this species, particularly during pregnancy, are 70-80% lower than expected for mammalian carnivores, approaching or even falling below values predicted to be necessary to support basal metabolism in mammals of this size.
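The relationship between the reported FMRs and basal metabolism can be checked with a line of arithmetic; the baseline below is back-calculated from the abstract's own figures (90.1 kJ kg^-1 d^-1 stated as 36% above predicted basal metabolic rate), not a value taken from the paper.

```python
# Short-migration FMR, stated to be 36% above predicted BMR,
# implies a mass-specific basal rate of roughly:
fmr_short = 90.1                  # kJ kg^-1 d^-1, from the abstract
implied_bmr = fmr_short / 1.36    # back-calculated baseline, ~66 kJ kg^-1 d^-1

# The long-migration (pregnant) FMR then sits barely above that baseline:
fmr_long = 69.4                   # kJ kg^-1 d^-1, from the abstract
ratio = fmr_long / implied_bmr    # only a few percent above basal
print(f"implied BMR ~ {implied_bmr:.1f} kJ/kg/d, pregnant FMR/BMR ~ {ratio:.2f}")
```

The pregnant-seal FMR comes out only about 5% above the implied baseline, which is what the abstract means by values "approaching those predicted to be necessary to support basal metabolism".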

  5. ATHENA: system studies and optics accommodation

    NASA Astrophysics Data System (ADS)

    Ayre, M.; Bavdaz, M.; Ferreira, I.; Wille, E.; Fransen, S.; Stefanescu, A.; Linder, M.

    2016-07-01

ATHENA is currently in Phase A, with a view to adoption upon a successful Mission Adoption Review in 2019/2020. After a brief presentation of the reference spacecraft (SC) design, this paper will focus on the functional and environmental requirements, the thermo-mechanical design and the Assembly, Integration, Verification & Test (AIVT) considerations related to housing the Silicon Pore Optics (SPO) Mirror Modules (MM) in the very large Mirror Assembly Module (MAM). Initially functional requirements on the MM accommodation are presented, with the Effective Area and Half Energy Width (HEW) requirements leading to a MAM comprising (depending on final mirror size selected) between 700-1000 MMs, co-aligned with exquisite accuracy to provide a common focus. A preliminary HEW budget allocated across the main error-contributors is presented, and this is then used as a reference to derive subsequent requirements and engineering considerations, including: The procedures and technologies for MM-integration into the Mirror Structure (MS) to achieve the required alignment accuracies in a timely manner; stiffness requirements and handling scheme required to constrain deformation under gravity during x-ray testing; temperature control to constrain thermo-elastic deformation during flight; and the role of the Instrument Switching Mechanism (ISM) in constraining HEW and Effective Area errors. Next, we present the key environmental requirements of the MMs, and the need to minimise shock-loading of the MMs is stressed. Methods to achieve this are presented, including: Selection of a large clamp-band launch vehicle interface (LV I/F); lengthening of the shock-path from the LV I/F to the MAM I/F; modal-tuning of the MAM to act as a low-pass filter during launch shock events; use of low-shock HDRMs for the MAM; and the possibility to deploy a passive vibration solution at the LV I/F to reduce loads.

  6. Flavivirus internalization is regulated by a size-dependent endocytic pathway.

    PubMed

    Hackett, Brent A; Cherry, Sara

    2018-04-17

Flaviviruses enter host cells through the process of clathrin-mediated endocytosis, and the spectrum of host factors required for this process are incompletely understood. Here we found that lymphocyte antigen 6 locus E (LY6E) promotes the internalization of multiple flaviviruses, including West Nile virus, Zika virus, and dengue virus. Perhaps surprisingly, LY6E is dispensable for the internalization of the endogenous cargo transferrin, which is also dependent on clathrin-mediated endocytosis for uptake. Since viruses are substantially larger than transferrin, we reasoned that LY6E may be required for uptake of larger cargoes and tested this using transferrin-coated beads of similar size as flaviviruses. LY6E was indeed required for the internalization of transferrin-coated beads, suggesting that LY6E is selectively required for large cargo. Cell biological studies found that LY6E forms tubules upon viral infection and bead internalization, and we found that tubule formation was dependent on RNASEK, which is also required for flavivirus internalization, but not transferrin uptake. Indeed, we found that RNASEK is also required for the internalization of transferrin-coated beads, suggesting it functions upstream of LY6E. These LY6E tubules resembled microtubules, and we found that microtubule assembly was required for their formation and flavivirus uptake. Since microtubule end-binding proteins link microtubules to downstream activities, we screened the three end-binding proteins and found that EB3 promotes virus uptake and LY6E tubularization. Taken together, these results highlight a specialized pathway required for the uptake of large clathrin-dependent endocytosis cargoes, including flaviviruses.

  7. Study of thermal management for space platform applications: Unmanned modular thermal management and radiator technologies

    NASA Technical Reports Server (NTRS)

    Oren, J. A.

    1981-01-01

    Candidate techniques for thermal management of unmanned modules docked to a large 250 kW platform were evaluated. Both automatically deployed and space constructed radiator systems were studied to identify characteristics and potential problems. Radiator coating requirements and current state-of-the-art were identified. An assessment of the technology needs was made and advancements were recommended.

  8. "We've Got to Keep Meeting Like This": A Pilot Study Comparing Academic Performance in Shifting-Membership Cooperative Groups versus Stable-Membership Cooperative Groups in an Introductory-Level Lab

    ERIC Educational Resources Information Center

    Walker, Alicia; Bush, Amy; Sanchagrin, Ken; Holland, Jonathon

    2017-01-01

    This study examined possible ways to increase student engagement in small sections of a large, introductory-level, required university course. Research shows that cooperative group learning boosts achievement through fostering better interpersonal relationships between students. Cooperative group learning is an evidence-based instructional…

  9. Universities and Fields of Study in Argentina: A Public-Private Comparison from the Supply and Demand Side. PROPHE Working Paper Series. WP No. 15

    ERIC Educational Resources Information Center

    Rabossi, Marcelo

    2010-01-01

    Private higher education literature recognizes large public-private differentiation in terms of field of study. Relative to public counterparts, private universities tend to offer their services in fields that require low initial investments and present at least relatively attractive internal private rates of return. Thus, the main objective of…

  10. Aero-Propulsion Technology (APT) Task V Low Noise ADP Engine Definition Study

    NASA Technical Reports Server (NTRS)

    Holcombe, V.

    2003-01-01

A study was conducted to identify and evaluate noise reduction technologies for advanced ducted prop propulsion systems that would allow increased capacity operation and result in an economically competitive commercial transport. The study investigated the aero/acoustic/structural advancements in fan and nacelle technology required to match or exceed the fuel burn and economic benefits of a constrained diameter large Advanced Ducted Propeller (ADP) compared to an unconstrained ADP propulsion system, with a noise goal of 5 to 10 EPNdB reduction relative to FAR 36 Stage 3 at each of the three measuring stations, namely takeoff (cutback), approach and sideline. A second generation ADP was selected to operate within the maximum nacelle diameter constraint of 160 deg to allow installation under the wing. The impact of the second generation ADP's fan and nacelle technologies on fuel burn and direct operating costs for a typical 3000 nm mission was evaluated through use of a large, twin engine commercial airplane simulation model. The major emphasis of this study focused on fan blade aero/acoustic and structural technology evaluations and advanced nacelle designs. Results of this study have identified the testing required to verify the interactive performance of these components, along with noise characteristics, by wind tunnel testing utilizing an advanced interaction rig.

  11. Power monitoring and control for large scale projects: SKA, a case study

    NASA Astrophysics Data System (ADS)

    Barbosa, Domingos; Barraca, João. Paulo; Maia, Dalmiro; Carvalho, Bruno; Vieira, Jorge; Swart, Paul; Le Roux, Gerhard; Natarajan, Swaminathan; van Ardenne, Arnold; Seca, Luis

    2016-07-01

Large sensor-based science infrastructures for radio astronomy like the SKA will be among the most intensive data-driven projects in the world, facing very demanding computation, storage, management and, above all, power requirements. The geographically wide distribution of the SKA and its associated processing requirements, in the form of tailored High Performance Computing (HPC) facilities, require a greener approach to the Information and Communications Technologies (ICT) adopted for data processing, to enable operational compliance with potentially strict power budgets. Reducing electricity costs and improving system-level power monitoring and the generation and management of electricity are paramount to avoiding future inefficiencies and higher costs, and to enabling fulfillment of the key science cases. Here we outline major characteristics and innovative approaches to address power efficiency and long-term power sustainability for radio astronomy projects, focusing on green ICT for science and smart power monitoring and control.

  12. Terahertz Real-Time Imaging Uncooled Arrays Based on Antenna-Coupled Bolometers or FET Developed at CEA-Leti

    NASA Astrophysics Data System (ADS)

    Simoens, François; Meilhan, Jérôme; Nicolas, Jean-Alain

    2015-10-01

Sensitive and large-format terahertz focal plane arrays (FPAs) integrated in compact, hand-held cameras that deliver real-time terahertz (THz) imaging are required for many application fields, such as non-destructive testing (NDT), security, quality control of food, and the agricultural products industry. Two technologies of uncooled THz arrays being studied at CEA-Leti, i.e., bolometers and complementary metal oxide semiconductor (CMOS) field effect transistors (FETs), are able to meet these requirements. This paper reviews the technological approaches followed and focuses on the latest modeling and performance analysis. The applicability of these arrays to NDT and security is then demonstrated with experimental tests. In particular, the high technological maturity of the THz bolometer camera is illustrated by fast scanning of a large field of view of opaque scenes achieved in a complete body scanner prototype.

  13. Habitat Size Optimization of the O'Neill - Glaser Economic Model for Space Solar Satellite Production

    NASA Technical Reports Server (NTRS)

    Curreri, Peter A.; Detweiler, Michael

    2010-01-01

    Creating large space habitats by launching all materials from Earth is prohibitively expensive. Using space resources and space-based labor to build space solar power satellites can yield extraordinary profits after a few decades, and the economic viability of this program depends on both. To maximize the return on investment, the early use of high-density bolo habitats is required; other shapes do not allow the small initial scale needed for a rapid population increase in space. This study found that 5 man-year, or 384-person, high-density bolo habitats will be the most economically feasible for a program started in 2010, yielding a profit by year 24 of the program, putting over 45,000 people into space, and creating a large system of space infrastructure for the further exploration and development of space.

  14. Advanced space system concepts and their orbital support needs (1980 - 2000). Volume 2: Final report

    NASA Technical Reports Server (NTRS)

    Bekey, I.; Mayer, H. L.; Wolfe, M. G.

    1976-01-01

    The results are presented of a study which identifies over 100 new and highly capable space systems for the 1980-2000 time period: civilian systems which could bring benefits to large numbers of average citizens in everyday life, greatly enhance the kinds and levels of public services, increase the economic motivation for industrial investment in space, and expand scientific horizons; and, in the military area, systems which could materially alter current concepts of tactical and strategic engagements. The requirements for space transportation, orbital support, and technology for these systems are derived, and those requirements likely to be shared between NASA and the DoD in the time period are identified. The high-leverage technologies for the time period are identified as very large microwave antennas and optics, high-energy power subsystems, high-precision and high-power lasers, microelectronic circuit complexes and data processors, mosaic solid-state sensing devices, and long-life cryogenic refrigerators.

  15. Dynamic assembly of brambleberry mediates nuclear envelope fusion during early development.

    PubMed

    Abrams, Elliott W; Zhang, Hong; Marlow, Florence L; Kapp, Lee; Lu, Sumei; Mullins, Mary C

    2012-08-03

    To accommodate the large cells following zygote formation, early blastomeres employ modified cell divisions. Karyomeres are one such modification, mitotic intermediates wherein individual chromatin masses are surrounded by nuclear envelope; the karyomeres then fuse to form a single mononucleus. We identified brambleberry, a maternal-effect zebrafish mutant that disrupts karyomere fusion, resulting in formation of multiple micronuclei. As karyomeres form, Brambleberry protein localizes to the nuclear envelope, with prominent puncta evident near karyomere-karyomere interfaces corresponding to membrane fusion sites. brambleberry corresponds to an unannotated gene with similarity to Kar5p, a protein that participates in nuclear fusion in yeast. We also demonstrate that Brambleberry is required for pronuclear fusion following fertilization in zebrafish. Our studies provide insight into the machinery required for karyomere fusion and suggest that specialized proteins are necessary for proper nuclear division in large dividing blastomeres. Copyright © 2012 Elsevier Inc. All rights reserved.

  16. Studies in Software Cost Model Behavior: Do We Really Understand Cost Model Performance?

    NASA Technical Reports Server (NTRS)

    Lum, Karen; Hihn, Jairus; Menzies, Tim

    2006-01-01

    While there exists extensive literature on software cost estimation techniques, industry practice continues to rely upon standard regression-based algorithms. These software effort models are typically calibrated or tuned to local conditions using local data. This paper cautions that current approaches to model calibration often produce sub-optimal models because of the large variance inherent in cost data and because far more effort multipliers are included than the data support. Building optimal models requires that a wider range of models be considered, while correctly calibrating these models requires rejection rules that prune variables and records and multiple criteria for evaluating model performance. The main contribution of this paper is to document a standard method that integrates formal model identification, estimation, and validation. It also documents what we call the large variance problem, a leading cause of cost model brittleness or instability.
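    The large variance problem can be demonstrated on synthetic data: with only a handful of project records, a regression that includes every candidate effort multiplier predicts worse out of sample than one pruned to the variables the data actually supports. The record counts, predictor counts, and coefficients below are invented for illustration and are not drawn from the paper:

    ```python
    import numpy as np

    # Synthetic "cost data": only 2 of 10 candidate effort multipliers matter.
    rng = np.random.default_rng(0)
    n_train, n_test, p = 15, 200, 10
    X = rng.normal(size=(n_train + n_test, p))
    y = 2.0 * X[:, 0] - 1.5 * X[:, 1] + rng.normal(0, 0.5, n_train + n_test)
    Xtr, Xte, ytr, yte = X[:n_train], X[n_train:], y[:n_train], y[n_train:]

    def test_rmse(cols):
        # Fit ordinary least squares on the chosen columns, score out of sample.
        beta, *_ = np.linalg.lstsq(Xtr[:, cols], ytr, rcond=None)
        resid = yte - Xte[:, cols] @ beta
        return np.sqrt(np.mean(resid ** 2))

    rmse_pruned = test_rmse([0, 1])        # only the supported predictors
    rmse_full = test_rmse(list(range(p)))  # all ten candidate multipliers
    ```

    With 15 training records and 10 predictors, the full model's coefficient estimates are dominated by variance, so the pruned model generalizes better.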

  17. Automatic measurements and computations for radiochemical analyses

    USGS Publications Warehouse

    Rosholt, J.N.; Dooley, J.R.

    1960-01-01

    In natural radioactive sources the most important radioactive daughter products useful for geochemical studies are protactinium-231, the alpha-emitting thorium isotopes, and the radium isotopes. To resolve the abundances of these thorium and radium isotopes by their characteristic decay and growth patterns, a large number of repeated alpha activity measurements on the two chemically separated elements were made over extended periods of time. Alpha scintillation counting with automatic measurements and sample changing is used to obtain the basic count data. Generation of the required theoretical decay and growth functions, varying with time, and the least squares solution of the overdetermined simultaneous count rate equations are done with a digital computer. Examples of the complex count rate equations which may be solved and results of a natural sample containing four α-emitting isotopes of thorium are illustrated. These methods facilitate the determination of the radioactive sources on the large scale required for many geochemical investigations.
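    The least-squares step described above can be sketched in a few lines: each column of the design matrix is a theoretical decay function evaluated at the measurement times, and the overdetermined count-rate system is solved for the initial activities. The two-isotope example, half-lives, and count rates below are illustrative placeholders, not the paper's data:

    ```python
    import numpy as np

    # Hypothetical two-isotope mixture: total alpha count rate is a linear
    # combination of each isotope's decay curve exp(-lambda_i * t).
    half_lives = np.array([3.66, 18.7])      # days (illustrative values)
    lam = np.log(2) / half_lives             # decay constants

    t = np.linspace(0, 60, 30)               # repeated measurement times (days)
    A = np.exp(-np.outer(t, lam))            # design matrix of decay functions

    true_activity = np.array([120.0, 45.0])  # initial count rates (cpm)
    rng = np.random.default_rng(0)
    counts = A @ true_activity + rng.normal(0, 0.5, t.size)  # noisy totals

    # Least-squares solution of the overdetermined system A x ≈ counts
    est, *_ = np.linalg.lstsq(A, counts, rcond=None)
    ```

    With more measurements than unknowns, the least-squares fit averages down the counting noise and recovers the per-isotope activities.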

  18. Effects of relational coordination among colleagues and span of control on work engagement among home-visiting nurses.

    PubMed

    Naruse, Takashi; Sakai, Mahiro; Nagata, Satoko

    2016-04-01

    Home-visiting nursing agencies are required to foster staff nurses' work engagement; thus, the factors related to work engagement require identification. This study examined the effects of relational coordination among colleagues and of agency span of control on the work engagement of home-visiting nurses. Cross-sectional data from 93 staff nurses in 31 home-visiting nursing agencies were collected via a survey and analyzed using mixed linear regression. There was no significant main effect of relational coordination among nurse colleagues on work engagement. In large agencies with a large span of control, however, relational coordination among nursing colleagues predicted work engagement. Nursing managers' relational coordination was found to be positively associated with staff nurse work engagement. Agency span of control thus moderates the positive effect of relational coordination with nursing colleagues on staff nurse work engagement. © 2016 Japan Academy of Nursing Science.

  19. Adapted RF pulse design for SAR reduction in parallel excitation with experimental verification at 9.4 T.

    PubMed

    Wu, Xiaoping; Akgün, Can; Vaughan, J Thomas; Andersen, Peter; Strupp, John; Uğurbil, Kâmil; Van de Moortele, Pierre-François

    2010-07-01

    Parallel excitation holds strong promise to mitigate the impact of large transmit B1 (B+1) distortion at very high magnetic field. Accelerated RF pulses, however, inherently tend to require larger RF peak power, which may result in a substantial increase in the Specific Absorption Rate (SAR) in tissues, a constant concern for patient safety at very high field. In this study, we demonstrate an adapted-rate RF pulse design allowing for SAR reduction while preserving excitation target accuracy. Compared with other proposed implementations of adapted-rate RF pulses, our approach is compatible with any k-space trajectory, does not require an analytical expression for the gradient waveform, and can be used for large flip angle excitation. We demonstrate our method with numerical simulations based on electromagnetic modeling, and we include an experimental verification of transmit pattern accuracy on an 8-channel transmit 9.4 T system.

  20. Ontology-based tools to expedite predictive model construction.

    PubMed

    Haug, Peter; Holmen, John; Wu, Xinzi; Mynam, Kumar; Ebert, Matthew; Ferraro, Jeffrey

    2014-01-01

    Large amounts of medical data are collected electronically during the course of caring for patients using modern medical information systems. This data presents an opportunity to develop clinically useful tools through data mining and observational research studies. However, the work necessary to make sense of this data and to integrate it into a research initiative can require substantial effort from medical experts as well as from experts in medical terminology, data extraction, and data analysis. This slows the process of medical research. To reduce the effort required for the construction of computable, diagnostic predictive models, we have developed a system that hybridizes a medical ontology with a large clinical data warehouse. Here we describe components of this system designed to automate the development of preliminary diagnostic models and to provide visual clues that can assist the researcher in planning for further analysis of the data behind these models.

  1. A machine learning model with human cognitive biases capable of learning from small and biased datasets.

    PubMed

    Taniguchi, Hidetaka; Sato, Hiroshi; Shirakawa, Tomohiro

    2018-05-09

    Human learners can generalize a new concept from a small number of samples. In contrast, conventional machine learning methods require large amounts of data to address the same types of problems. Humans have cognitive biases that promote fast learning. Here, we developed a method to reduce the gap between human beings and machines in this type of inference by utilizing cognitive biases. We implemented a human cognitive model into machine learning algorithms and compared their performance with the currently most popular methods, naïve Bayes, support vector machine, neural networks, logistic regression and random forests. We focused on the task of spam classification, which has been studied for a long time in the field of machine learning and often requires a large amount of data to obtain high accuracy. Our models achieved superior performance with small and biased samples in comparison with other representative machine learning methods.
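    As a point of reference for the baseline methods named above, a multinomial naive Bayes classifier for word-count features fits in a few lines. The three-word vocabulary and toy counts are invented for illustration and are not the study's spam corpus:

    ```python
    import numpy as np

    # Toy word-count data: columns are counts of ["offer", "meeting", "free"].
    X = np.array([[3, 0, 1],
                  [2, 1, 2],
                  [0, 3, 0],
                  [1, 2, 0]])
    y = np.array([1, 1, 0, 0])  # 1 = spam, 0 = ham

    def fit(X, y, alpha=1.0):
        # Log class priors and Laplace-smoothed log word likelihoods per class.
        priors, likes = [], []
        for c in np.unique(y):
            Xc = X[y == c]
            priors.append(np.log(len(Xc) / len(X)))
            counts = Xc.sum(axis=0) + alpha
            likes.append(np.log(counts / counts.sum()))
        return np.array(priors), np.array(likes)

    def predict(X, priors, likes):
        # Posterior log-scores: log prior + sum of count-weighted log likelihoods.
        return np.argmax(priors + X @ likes.T, axis=1)

    priors, likes = fit(X, y)
    ```

    Such count-based baselines typically need many labeled documents to estimate the word likelihoods well, which is the gap the cognitively biased models in the paper aim to close.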

  2. Prototype Development of a Geostationary Synthetic Thinned Aperture Radiometer, GeoSTAR

    NASA Technical Reports Server (NTRS)

    Tanner, Alan B.; Wilson, William J.; Kangaslahti, Pekka P.; Lambrigsten, Bjorn H.; Dinardo, Steven J.; Piepmeier, Jeffrey R.; Ruf, Christopher S.; Rogacki, Steven; Gross, S. M.; Musko, Steve

    2004-01-01

    Preliminary details of a 2-D synthetic aperture radiometer prototype operating from 50 to 58 GHz will be presented. The instrument is being developed as a laboratory testbed, and the goal of this work is to demonstrate the technologies needed to do atmospheric soundings with high spatial resolution from Geostationary orbit. The concept is to deploy a large sparse aperture Y-array from a geostationary satellite, and to use aperture synthesis to obtain images of the earth without the need for a large mechanically scanned antenna. The laboratory prototype consists of a Y-array of 24 horn antennas, MMIC receivers, and a digital cross-correlation sub-system. System studies are discussed, including an error budget which has been derived from numerical simulations. The error budget defines key requirements, such as null offsets, phase calibration, and antenna pattern knowledge. Details of the instrument design are discussed in the context of these requirements.
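    The core of aperture synthesis is that each antenna pair's complex cross-correlation (a visibility) samples one Fourier component of the scene, so an image can be formed without a mechanically scanned dish. A toy single-baseline, single-point-source sketch; the frequency, spacing, and angles are chosen for illustration and are not GeoSTAR's parameters:

    ```python
    import numpy as np

    c = 3e8
    f = 54e9                       # mid-band frequency, Hz (illustrative)
    lam = c / f
    baseline = 0.5                 # antenna spacing, m (illustrative)
    theta = np.radians(0.2)        # point-source direction off boresight

    # Geometric phase between the two antennas for this source direction
    phase = 2 * np.pi * baseline * np.sin(theta) / lam

    t = np.arange(4096) / 1e6
    s1 = np.exp(2j * np.pi * 1e3 * t)        # narrowband signal at antenna 1
    s2 = s1 * np.exp(-1j * phase)            # phase-shifted copy at antenna 2

    visibility = np.mean(s1 * np.conj(s2))   # complex cross-correlation
    recovered = np.angle(visibility)         # equals the geometric phase
    ```

    In a real array such as the 24-element Y-array, one visibility is measured per antenna pair, and the image is recovered from the full set of Fourier samples; calibration errors in these phases are exactly the kind of term the error budget above constrains.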

  3. miRNAFold: a web server for fast miRNA precursor prediction in genomes.

    PubMed

    Tav, Christophe; Tempel, Sébastien; Poligny, Laurent; Tahi, Fariza

    2016-07-08

    Computational methods are required for the prediction of non-coding RNAs (ncRNAs), which are involved in many biological processes, especially at the post-transcriptional level. Among these ncRNAs, miRNAs have been widely studied, and biologists need efficient and fast tools for their identification. In particular, ab initio methods are usually required when predicting novel miRNAs. Here we present a web server dedicated to large-scale identification of miRNA precursors in genomes. It is based on an algorithm called miRNAFold that predicts miRNA hairpin structures quickly and with high sensitivity. miRNAFold is implemented as a web server with an intuitive and user-friendly interface, as well as a standalone version. The web server is freely available at: http://EvryRNA.ibisc.univ-evry.fr/miRNAFold. © The Author(s) 2016. Published by Oxford University Press on behalf of Nucleic Acids Research.

  4. Explicit solution techniques for impact with contact constraints

    NASA Technical Reports Server (NTRS)

    Mccarty, Robert E.

    1993-01-01

    Modern military aircraft transparency systems, windshields and canopies, are complex systems which must meet a large and rapidly growing number of requirements. Many of these transparency system requirements are conflicting, presenting difficult balances which must be achieved. One example of a challenging requirements balance or trade is shaping for stealth versus aircrew vision. The large number of requirements involved may be grouped in a variety of areas including man-machine interface; structural integration with the airframe; combat hazards; environmental exposures; and supportability. Some individual requirements by themselves pose very difficult, severely nonlinear analysis problems. One such complex problem is that associated with the dynamic structural response resulting from high energy bird impact. An improved analytical capability for soft-body impact simulation was developed.

  5. Explicit solution techniques for impact with contact constraints

    NASA Astrophysics Data System (ADS)

    McCarty, Robert E.

    1993-08-01

    Modern military aircraft transparency systems, windshields and canopies, are complex systems which must meet a large and rapidly growing number of requirements. Many of these transparency system requirements are conflicting, presenting difficult balances which must be achieved. One example of a challenging requirements balance or trade is shaping for stealth versus aircrew vision. The large number of requirements involved may be grouped in a variety of areas including man-machine interface; structural integration with the airframe; combat hazards; environmental exposures; and supportability. Some individual requirements by themselves pose very difficult, severely nonlinear analysis problems. One such complex problem is that associated with the dynamic structural response resulting from high energy bird impact. An improved analytical capability for soft-body impact simulation was developed.

  6. Stabilizing canonical-ensemble calculations in the auxiliary-field Monte Carlo method

    NASA Astrophysics Data System (ADS)

    Gilbreth, C. N.; Alhassid, Y.

    2015-03-01

    Quantum Monte Carlo methods are powerful techniques for studying strongly interacting Fermi systems. However, implementing these methods on computers with finite-precision arithmetic requires careful attention to numerical stability. In the auxiliary-field Monte Carlo (AFMC) method, low-temperature or large-model-space calculations require numerically stabilized matrix multiplication. When adapting methods used in the grand-canonical ensemble to the canonical ensemble of fixed particle number, the numerical stabilization increases the number of required floating-point operations for computing observables by a factor of the size of the single-particle model space, and thus can greatly limit the systems that can be studied. We describe an improved method for stabilizing canonical-ensemble calculations in AFMC that exhibits better scaling, and present numerical tests that demonstrate the accuracy and improved performance of the method.
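    The stabilization idea can be illustrated with a QR-factored chain product: after each multiplication the running product is re-orthogonalized, so that the widely separated scales that arise at low temperature accumulate in the triangular factor rather than overflowing the accumulated matrix. This is a generic sketch of the technique, not the canonical-ensemble algorithm of the paper:

    ```python
    import numpy as np

    def stabilized_product(mats):
        # Accumulate M_k ... M_1 as Q @ R, re-orthogonalizing at every step.
        # Scale information collects in R; Q stays well-conditioned.
        Q, R = np.linalg.qr(mats[0])
        for M in mats[1:]:
            Q, R_new = np.linalg.qr(M @ Q)
            R = R_new @ R
        return Q, R

    rng = np.random.default_rng(1)
    mats = [rng.normal(size=(4, 4)) for _ in range(6)]
    Q, R = stabilized_product(mats)
    ```

    For well-scaled matrices the factored product agrees with the naive one; the payoff comes when the chain contains exponentials of one-body Hamiltonians whose naive product would lose precision. The extra cost of maintaining the factorization is what the improved method in the paper reduces.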

  7. Learning Gains from a Recurring "Teach and Question" Homework Assignment in a General Biology Course: Using Reciprocal Peer Tutoring Outside Class.

    PubMed

    Bailey, E G; Baek, D; Meiling, J; Morris, C; Nelson, N; Rice, N S; Rose, S; Stockdale, P

    2018-06-01

    Providing students with one-on-one interaction with instructors is a big challenge in large courses. One solution is to have students interact with their peers during class. Reciprocal peer tutoring (RPT) is a more involved interaction that requires peers to alternate the roles of "teacher" and "student." Theoretically, advantages for peer tutoring include the verbalization and questioning of information and the scaffolded exploration of material through social and cognitive interaction. Studies on RPT vary in their execution, but most require elaborate planning and take up valuable class time. We tested the effectiveness of a "teach and question" (TQ) assignment that required student pairs to engage in RPT regularly outside class. A quasi-experimental design was implemented: one section of a general biology course completed TQ assignments, while another section completed a substitute assignment requiring individuals to review course material. The TQ section outperformed the other section by ∼6% on exams. Session recordings were coded to investigate correlation between TQ quality and student performance. Asking more questions was the characteristic that best predicted exam performance, and this was more predictive than most aspects of the course. We propose the TQ as an easy assignment to implement with large performance gains.

  8. Major QTLs for critical photoperiod and vernalization underlie extensive variation in flowering in the Mimulus guttatus species complex.

    PubMed

    Friedman, Jannice; Willis, John H

    2013-07-01

    Species with extensive ranges experience highly variable environments with respect to temperature, light, and soil moisture. Timing the transition from vegetative to floral growth to coincide with favorable conditions is important for reproduction, and the optimal timing of this transition may differ between semelparous annual plants and iteroparous perennial plants. We studied variation in the critical photoperiod necessary for floral induction and the requirement for a period of cold-chilling (vernalization) in 46 populations of annuals and perennials in the Mimulus guttatus species complex. We then examined critical photoperiod and vernalization QTLs in growth chambers using F(2) progeny from annual and perennial parents that differed in their requirements for flowering. We identify extensive variation in critical photoperiod, with most annual populations requiring substantially shorter day lengths to initiate flowering than perennial populations. We discover a novel type of vernalization requirement in perennial populations that is contingent on plants experiencing short days first. QTL analyses identify two large-effect QTLs which influence critical photoperiod. In two separate vernalization experiments we discover that each set of crosses contains different large-effect QTLs for vernalization. Mimulus guttatus harbors extensive variation in critical photoperiod and vernalization that may be a consequence of local adaptation. © 2013 The Authors. New Phytologist © 2013 New Phytologist Trust.

  9. Design of Phase II Non-inferiority Trials.

    PubMed

    Jung, Sin-Ho

    2017-09-01

    With the development of inexpensive treatment regimens and less invasive surgical procedures, we are increasingly confronted with non-inferiority study objectives. A non-inferiority phase III trial requires roughly four times the sample size of a comparable standard superiority trial. Because of the large required sample size, we often face feasibility issues in opening a non-inferiority trial. Furthermore, for lack of phase II non-inferiority trial design methods, we have no opportunity to investigate the efficacy of the experimental therapy through a phase II trial. As a result, we often fail to open a non-inferiority phase III trial, and a large number of non-inferiority clinical questions remain unanswered. In this paper, we develop designs for non-inferiority randomized phase II trials with feasible sample sizes. First, we review a design method for non-inferiority phase III trials. We then propose three different designs for non-inferiority phase II trials that can be used under different settings. Each method is demonstrated with examples, and each of the proposed designs is shown to require a reasonable sample size for a non-inferiority phase II trial. The three designs address different settings but require similar sample sizes, typical of phase II trials.
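    The "roughly four times larger" figure follows from the standard normal-approximation sample-size formula, in which the per-arm size scales as 1/δ² and a non-inferiority margin δ is typically about half of a superiority effect. A sketch with illustrative numbers, not the paper's designs:

    ```python
    from math import ceil
    from statistics import NormalDist

    def n_per_arm(p, delta, alpha=0.05, beta=0.2, one_sided=True):
        # Normal-approximation sample size per arm for comparing two
        # proportions near p, with effect (or margin) delta. Illustrative only.
        za = NormalDist().inv_cdf(1 - alpha if one_sided else 1 - alpha / 2)
        zb = NormalDist().inv_cdf(1 - beta)
        return ceil((za + zb) ** 2 * 2 * p * (1 - p) / delta ** 2)

    # Halving the margin roughly quadruples the required n, since n ∝ 1/delta².
    n_sup = n_per_arm(0.5, 0.20)   # superiority: detect a 20-point difference
    n_ni  = n_per_arm(0.5, 0.10)   # non-inferiority: 10-point margin
    ```

    With these inputs the superiority design needs 78 patients per arm and the non-inferiority design 310, close to the factor of four cited above; the phase II designs in the paper aim to bring such numbers back into a feasible range.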

  10. Orientation dependence of the stress rupture properties of Nickel-base superalloy single crystals

    NASA Technical Reports Server (NTRS)

    Mackay, R. A.

    1981-01-01

    The influence of orientation on the stress rupture behavior of Mar-M247 single crystals was studied. Stress rupture tests were performed at 724 MPa and 774 C where the effect of anisotropy is prominent. The mechanical behavior of the single crystals was rationalized on the basis of the Schmid factors for the operative slip systems and the lattice rotations which the crystals underwent during deformation. The stress rupture lives were found to be greatly influenced by the lattice rotations required to produce intersecting slip, because steady-state creep does not begin until after the onset of intersecting slip. Crystals which required large rotations to become oriented for intersecting slip exhibited a large primary creep strain, a large effective stress level at the onset of steady-state creep, and consequently a short stress rupture life. A unified analysis was attained for the stress rupture behavior of the Mar-M247 single crystals tested in this study at 774 C and that of the Mar-M200 single crystals tested in a prior study at 760 C. In this analysis, the standard 001-011-111 stereographic triangle was divided into several regions of crystallographic orientation which were rank ordered according to stress rupture life for this temperature regime. This plot indicates that those crystals having orientations within about 25 deg of the 001 exhibited significantly longer lives when their orientations were closer to the 001-011 boundary of the stereographic triangle than to the 001-111 boundary.

  11. 49 CFR 178.980 - Stacking test.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... Packaging unsafe for transportation and no loss of contents. (2) For flexible Large Packagings, there may be no deterioration which renders the Large Packaging unsafe for transportation and no loss of contents... required load, there is no permanent deformation to the Large Packaging which renders the whole Large...

  12. Digitally controlled twelve-pulse firing generator

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Berde, D.; Ferrara, A.A.

    1981-01-01

    Control system studies for the Tokamak Fusion Test Reactor (TFTR) indicate that accurate thyristor firing in the AC-to-DC conversion system is required in order to achieve good regulation of the various field currents. Rapid update and exact firing angle control are required to avoid instabilities, large eddy currents, or parasitic oscillations. The Prototype Firing Generator was designed to satisfy these requirements. To achieve the required ±0.77° firing accuracy, a three-phase-locked-loop reference was designed; otherwise, the Firing Generator employs digital circuitry. The unit, housed in a standard CAMAC crate, operates under microcomputer control. Functions are performed under program control, which resides in nonvolatile read-only memory. Communication with the CICADA control system is provided via an 11-bit parallel interface.

  13. Gas-Centered Swirl Coaxial Liquid Injector Evaluations

    NASA Technical Reports Server (NTRS)

    Cohn, A. K.; Strakey, P. A.; Talley, D. G.

    2005-01-01

    Development of liquid rocket engines is expensive, and extensive testing at large scale is usually required. Verifying engine lifetime demands a large number of tests, yet limited resources are available for development. Sub-scale cold-flow and hot-fire testing is extremely cost effective; it can serve as a necessary (but not sufficient) condition for long engine lifetime and reduces the overall cost and risk of large-scale testing. The goal of this work is to determine what knowledge can be gained from sub-scale cold-flow and hot-fire evaluations of LRE injectors and to establish relationships between cold-flow and hot-fire data.

  14. Increasing older persons' employment in Finland: in search of a new strategy.

    PubMed

    Sihto, M

    1999-01-01

    Using Finland as a case study, it is argued that early retirement will probably no longer be used on a large scale to reduce older-worker labor-force participation and unemployment among older workers. Instead, new strategies are needed to enhance the ability of older workers to remain productive and in the labor force, and to facilitate the reintegration of unemployed older persons back into working life. Both tasks require massive pioneering efforts. Reducing unemployment rates among older workers, particularly, requires completely new kinds of labor-market measures.

  15. Wireless gas detection with a smartphone via rf communication

    PubMed Central

    Azzarelli, Joseph M.; Mirica, Katherine A.; Ravnsbæk, Jens B.; Swager, Timothy M.

    2014-01-01

    Chemical sensing is of critical importance to human health, safety, and security, yet it is not broadly implemented because existing sensors often require trained personnel and expensive, bulky equipment, and have large power requirements. This study reports the development of a smartphone-based sensing strategy that employs chemiresponsive nanomaterials integrated into the circuitry of commercial near-field communication tags to achieve non-line-of-sight, portable, and inexpensive detection and discrimination of gas-phase chemicals (e.g., ammonia, hydrogen peroxide, cyclohexanone, and water) at part-per-thousand and part-per-million concentrations. PMID:25489066

  16. Current Measurements and Overwash Monitoring Using Tilt Current Meters in Three Coastal Environments

    NASA Astrophysics Data System (ADS)

    Lowell, N. S.; Sherwood, C. R.; Decarlo, T. M.; Grant, J. R.

    2014-12-01

    Tilt Current Meters (TCMs) provide accurate, cost effective measurements of near-bottom current velocities. Many studies in coastal environments require current measurements, which are frequently made with Acoustic Doppler Profilers (ADPs). ADPs are expensive, however, and may not be suitable for locations where there is significant risk of damage, loss, or theft, or where a large spatial array of measurements is required. TCMs, by contrast, are smaller, less expensive, and easier to deploy. This study tested TCMs in three sites to determine their suitability for use in research applications. TCMs are based on the drag-tilt principle, where the instrument tilts in response to current. The meter consists of a buoyant float with an onboard accelerometer, three-axis tilt sensor, three-axis magnetometer (compass), and a data logger. Current measurements are derived by post-processing the tilt and compass values and converting them to velocity using empirical calibration data. Large data-storage capacity (4 GB) and low power requirements allow long deployments (many months) at high sample rates (16 Hz). We demonstrate the utility of TCM current measurements on a reef at Dongsha Atoll in the South China Sea, and in Vineyard Sound off Cape Cod, where the TCM performance was evaluated against ADP measurements. We have also used the TCM to record waves during an overwash event on a Cape Cod barrier beach during a winter storm. The TCM recorded waves as they came through the overwash channel, and the data were in agreement with the water-level record used as a reference. These tests demonstrate that TCMs may be used in a variety of nearshore environments and have the potential to significantly increase the density of meters in future studies where current measurements are required.
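    The drag-tilt principle described above amounts to mapping tilt magnitude to current speed through an empirical calibration curve and taking direction from the compass. A minimal post-processing sketch; the calibration table, axis conventions, and function name are invented for illustration:

    ```python
    import numpy as np

    # Hypothetical flume calibration: tilt angle (degrees) vs. speed (m/s).
    cal_tilt  = np.array([0.0, 5.0, 15.0, 30.0, 50.0, 70.0])
    cal_speed = np.array([0.0, 0.05, 0.15, 0.30, 0.55, 0.90])

    def tilt_to_velocity(ax, ay, az, heading_deg):
        # Tilt from vertical out of the 3-axis accelerometer; speed via the
        # empirical calibration curve; direction from the compass heading.
        tilt = np.degrees(np.arctan2(np.hypot(ax, ay), az))
        speed = np.interp(tilt, cal_tilt, cal_speed)
        theta = np.radians(heading_deg)
        return speed * np.sin(theta), speed * np.cos(theta)  # east, north
    ```

    At 16 Hz this conversion is applied sample by sample over the whole record, which is why the simple sensor package can stand in for an ADP in dense spatial arrays.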

  17. Do Bird Friendly® Coffee Criteria Benefit Mammals? Assessment of Mammal Diversity in Chiapas, Mexico

    PubMed Central

    Caudill, S. Amanda; Rice, Robert A.

    2016-01-01

    Biodiversity-friendly coffee certifications offer a viable way to protect wildlife habitat while providing a financial incentive to farmers. Most studies related to these certifications focus on avian habitat requirements, and it is not known whether these standards also apply to other wildlife, such as mammals, that inhabit the coffee landscapes. We assessed the non-volant mammalian fauna and their associated habitat requirements in 23 sites representing forest, Bird Friendly® shade, conventional shade, and sun coffee habitats. We used Sherman trap-grids to measure small mammal abundance and richness, while camera traps were set for medium-sized and large mammals. We detected 17 species of mammals, representing 11 families. This preliminary study indicates that coffee farms in this region provide an important refuge for mammalian wildlife. Mammal species density ranked significantly higher in Bird Friendly® coffee sites than in other coffee habitats, although there was no significant difference in species richness (using the Chao2 estimator) among the habitat types. No significant difference was found in small mammal abundance among the habitat types. We found a higher species density of medium and large mammals in sites with larger, more mature shade trees associated with, but not required by, Bird Friendly® certification standards. However, lower-strata vegetation (5 cm to 1 m tall), the only vegetation parameter found to increase abundance and density for small mammals, is not specified in the Bird Friendly® standards. Our findings suggest that although the standards devised for avian habitat do benefit mammals, further study is needed on requirements specific to mammals that could be included to enhance these coffee landscapes for the mammals that inhabit them. PMID:27880773

  18. Changes, disruption and innovation: An investigation of the introduction of new health information technology in a microbiology laboratory.

    PubMed

    Toouli, George; Georgiou, Andrew; Westbrook, Johanna

    2012-01-01

    It is expected that health information technology (HIT) will deliver a safer, more efficient and effective health care system. The aim of this study was to undertake a qualitative and video-ethnographic examination of the impact of information technologies on work processes in the reception area of a Microbiology Department, to ascertain what changed, how it changed, and the impact of the change. The setting for this study was the microbiology laboratory of a large tertiary hospital in Sydney. The study consisted of qualitative (interview and focus group) data and observation sessions for the period August 2005 to October 2006, along with video footage shot in three sessions covering the original system and the two stages of the Cerner implementation. Data analysis was assisted by NVivo software, and process maps were produced from the video footage. Two laboratory information systems were observed in the video footage, with computerized provider order entry introduced four months later. Process maps highlighted the large number of pre-data-entry steps with the original system, whereas the newer system incorporated many of these steps into the data entry stage. However, any time saved with the new system was offset by the requirement to complete data entry of patient information not previously required. Other changes noted included the change of responsibilities for the reception staff and the physical changes required to accommodate the increased activity around the data entry area. Implementing new HIT is always an exciting time for any environment, but ensuring that the implementation goes smoothly and with minimal trouble requires the administrator and their team to plan well in advance for staff training, physical layout, and possible staff resource reallocation.

  19. Do Bird Friendly® Coffee Criteria Benefit Mammals? Assessment of Mammal Diversity in Chiapas, Mexico.

    PubMed

    Caudill, S Amanda; Rice, Robert A

    2016-01-01

    Biodiversity-friendly coffee certifications offer a viable way to protect wildlife habitat while providing a financial incentive to farmers. Most studies related to these certifications focus on avian habitat requirements and it is not known whether these standards also apply to other wildlife, such as mammals, that inhabit the coffee landscapes. We assessed the non-volant mammalian fauna and their associated habitat requirements in 23 sites representing forest, Bird Friendly® shade, conventional shade, and sun coffee habitats. We used Sherman trap-grids to measure small mammal abundance and richness, while camera traps were set for medium-sized and large mammals. We detected 17 species of mammals, representing 11 families. This preliminary study indicates that coffee farms in this region provide an important refuge for mammalian wildlife. Mammal species density ranked significantly higher in Bird Friendly® coffee sites than other coffee habitats, although there was no significant difference for species richness (using Chao2 estimator) among the habitat types. No significant difference was found in small mammal abundance among the habitat types. We found a higher species density of medium and large mammals in sites with larger, more mature shade trees associated with, but not required by Bird Friendly® certification standards. However, lower strata vegetation (5 cm to 1 m tall), the only vegetation parameter found to increase abundance and density for small mammals, is not specified in the Bird Friendly® standards. Our findings suggest that although the standards devised for avian habitat do benefit mammals, further study is needed on the requirements specific for mammals that could be included to enhance the coffee habitat for mammals that inhabit these coffee landscapes.

  20. A future large-aperture UVOIR space observatory: reference designs

    NASA Astrophysics Data System (ADS)

    Rioux, Norman; Thronson, Harley; Feinberg, Lee; Stahl, H. Philip; Redding, Dave; Jones, Andrew; Sturm, James; Collins, Christine; Liu, Alice

    2015-09-01

    Our joint NASA GSFC/JPL/MSFC/STScI study team has used community-provided science goals to derive mission needs, requirements, and candidate mission architectures for a future large-aperture, non-cryogenic UVOIR space observatory. We describe the feasibility assessment of system thermal and dynamic stability for supporting coronagraphy. The observatory is in a Sun-Earth L2 orbit providing a stable thermal environment and excellent field of regard. Reference designs include a 36-segment 9.2 m aperture telescope that stows within a five meter diameter launch vehicle fairing. Performance needs developed under the study are traceable to a variety of reference designs including options for a monolithic primary mirror.

  1. A Future Large-Aperture UVOIR Space Observatory: Reference Designs

    NASA Technical Reports Server (NTRS)

    Thronson, Harley; Rioux, Norman; Feinberg, Lee; Stahl, H. Philip; Redding, Dave; Jones, Andrew; Sturm, James; Collins, Christine; Liu, Alice

    2015-01-01

    Our joint NASA GSFC/JPL/MSFC/STScI study team has used community-provided science goals to derive mission needs, requirements, and candidate mission architectures for a future large-aperture, non-cryogenic UVOIR space observatory. We describe the feasibility assessment of system thermal and dynamic stability for supporting coronagraphy. The observatory is in a Sun-Earth L2 orbit providing a stable thermal environment and excellent field of regard. Reference designs include a 36-segment 9.2 m aperture telescope that stows within a five meter diameter launch vehicle fairing. Performance needs developed under the study are traceable to a variety of reference designs including options for a monolithic primary mirror.

  2. Extracellular matrix motion and early morphogenesis

    PubMed Central

    Loganathan, Rajprasad; Rongish, Brenda J.; Smith, Christopher M.; Filla, Michael B.; Czirok, Andras; Bénazéraf, Bertrand

    2016-01-01

    For over a century, embryologists who studied cellular motion in early amniotes generally assumed that morphogenetic movement reflected migration relative to a static extracellular matrix (ECM) scaffold. However, as we discuss in this Review, recent investigations reveal that the ECM is also moving during morphogenesis. Time-lapse studies show how convective tissue displacement patterns, as visualized by ECM markers, contribute to morphogenesis and organogenesis. Computational image analysis distinguishes between cell-autonomous (active) displacements and convection caused by large-scale (composite) tissue movements. Modern quantification of large-scale ‘total’ cellular motion and the accompanying ECM motion in the embryo demonstrates that a dynamic ECM is required for generation of the emergent motion patterns that drive amniote morphogenesis. PMID:27302396

  3. Study on zigzag maneuver characteristics of V-U very large crude oil (VLCC) tankers

    NASA Astrophysics Data System (ADS)

    Jaswar, Maimun, A.; Wahid, M. A.; Priyanto, A.; Zamani, Pauzi, Saman

    2012-06-01

The Department of Marine Technology at the Faculty of Mechanical Engineering, Universiti Teknologi Malaysia has recently developed a ship maneuverability tool intended to improve students' understanding of the fluid-dynamic interactions among hull, propeller, and rudder during maneuvering. This paper discusses the zigzag maneuver for conventional Very Large Crude Carrier (VLCC) ships with the same principal dimensions but different stern frame shapes. The 10/10 zigzag maneuver characteristics of U- and V-type VLCC hulls are investigated. Simulation results for the U-type show good agreement with the experimental data, whereas the V-type results agree less well with experiment. Further study of zigzag maneuver characteristics is required.

  4. Developing Software Requirements for a Knowledge Management System That Coordinates Training Programs with Business Processes and Policies in Large Organizations

    ERIC Educational Resources Information Center

    Kiper, J. Richard

    2013-01-01

    For large organizations, updating instructional programs presents a challenge to keep abreast of constantly changing business processes and policies. Each time a process or policy changes, significant resources are required to locate and modify the training materials that convey the new content. Moreover, without the ability to track learning…

  5. Control of large space structures

    NASA Technical Reports Server (NTRS)

    Gran, R.; Rossi, M.; Moyer, H. G.; Austin, F.

    1979-01-01

The control of large space structures was studied to determine what, if any, limitations are imposed on the size of spacecraft which may be controlled using current control system design technology. Using a typical structure in the 35 to 70 meter size category, a control system was designed that used currently available actuators. The amount of control power required to maintain the vehicle in a stabilized gravity gradient pointing orientation, while also damping various structural motions, was determined. The moment of inertia and mass properties of this structure were varied to verify that stability and performance were maintained. The study concludes that the structure's size would have to change by at least a factor of two before any stability problems arise. The stability margin that is lost is due to the scaling of the gravity gradient torques (the rigid body control) and as such can easily be corrected by changing the control gains associated with the rigid body control. A secondary conclusion from the study is that the control design that accommodates the structural motions (to damp them) is somewhat more sensitive than the design that works on attitude control of the rigid body only.

  6. Balloon concepts for scientific investigation of Mars and Jupiter

    NASA Technical Reports Server (NTRS)

    Ash, R. L.

    1979-01-01

Opportunities for scientific investigation of the atmospheric planets using buoyant balloons have been explored. Mars and Jupiter were considered in this study because design requirements at those planets nominally bracket the requirements at Venus, and plans are already underway for a joint Russian-French balloon system at Venus. Viking data has provided quantitative information for definition of specific balloon systems at Mars. Free-flying balloons appear capable of providing valuable scientific support for more sophisticated Martian surface probes, but tethered and powered aerostats are not attractive. The Jovian environment is so extreme that hot-atmosphere balloons may be the only scientific platforms capable of extended operations there. However, the estimated system mass and thermal energy required are very large.

  7. Real-time Bayesian anomaly detection in streaming environmental data

    NASA Astrophysics Data System (ADS)

    Hill, David J.; Minsker, Barbara S.; Amir, Eyal

    2009-04-01

    With large volumes of data arriving in near real time from environmental sensors, there is a need for automated detection of anomalous data caused by sensor or transmission errors or by infrequent system behaviors. This study develops and evaluates three automated anomaly detection methods using dynamic Bayesian networks (DBNs), which perform fast, incremental evaluation of data as they become available, scale to large quantities of data, and require no a priori information regarding process variables or types of anomalies that may be encountered. This study investigates these methods' abilities to identify anomalies in eight meteorological data streams from Corpus Christi, Texas. The results indicate that DBN-based detectors, using either robust Kalman filtering or Rao-Blackwellized particle filtering, outperform a DBN-based detector using Kalman filtering, with the former having false positive/negative rates of less than 2%. These methods were successful at identifying data anomalies caused by two real events: a sensor failure and a large storm.
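The filtering idea described in this record can be sketched for a single univariate stream: a random-walk Kalman filter flags any observation whose innovation exceeds a set number of standard deviations, and skips the state update for flagged points so anomalies do not corrupt the model. This is a minimal illustration, not the authors' DBN implementation; the variances `q` and `r` and the 3-sigma threshold are assumed values.

```python
import math

def kalman_anomaly_flags(stream, q=1e-3, r=0.5, threshold=3.0):
    """Flag points whose innovation exceeds `threshold` standard deviations.

    Random-walk state model: x_t = x_{t-1} + w,  z_t = x_t + v,
    with process variance q and measurement variance r (assumed here).
    """
    x, p = stream[0], 1.0            # initial state estimate and variance
    flags = [False]                  # first point is taken on trust
    for z in stream[1:]:
        p_pred = p + q               # predict: variance grows by q
        s = p_pred + r               # innovation (residual) variance
        innovation = z - x           # predicted measurement is simply x
        is_anomaly = abs(innovation) > threshold * math.sqrt(s)
        flags.append(is_anomaly)
        if not is_anomaly:           # update only on trusted observations
            k = p_pred / s           # Kalman gain
            x = x + k * innovation
            p = (1 - k) * p_pred
        else:
            p = p_pred               # keep the prediction, skip the update
    return flags
```

Because flagged points are excluded from the update, a single spike does not drag the state estimate away from the underlying signal.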

  8. Resolving Properties of Polymers and Nanoparticle Assembly through Coarse-Grained Computational Studies.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Grest, Gary S.

    2017-09-01

Coupled length and time scales determine the dynamic behavior of polymers and polymer nanocomposites and underlie their unique properties. To resolve the properties over large time and length scales it is imperative to develop coarse-grained models which retain the atomistic specificity. Here we probe the degree of coarse graining required to simultaneously retain significant atomistic details and access large length and time scales. The degree of coarse graining in turn sets the minimum length scale instrumental in defining polymer properties and dynamics. Using polyethylene as a model system, we probe how the coarse-graining scale affects the measured dynamics with different numbers of methylene groups per coarse-grained bead. Using these models we simulate polyethylene melts for times over 500 ms to study the viscoelastic properties of well-entangled polymer melts and large nanoparticle assembly as the nanoparticles are driven close enough to form nanostructures.

  9. Some aspects of wind tunnel magnetic suspension systems with special application at large physical scales

    NASA Technical Reports Server (NTRS)

    Britcher, C. P.

    1983-01-01

Wind tunnel magnetic suspension and balance systems (MSBSs) have so far failed to find application at the large physical scales necessary for the majority of aerodynamic testing. Three areas of technology relevant to such application are investigated. Two variants of the Spanwise Magnet roll torque generation scheme are studied. Spanwise Permanent Magnets are shown to be practical and are experimentally demonstrated. Extensive computations of the performance of the Spanwise Iron Magnet scheme indicate powerful capability, limited principally by electromagnet technology. Aerodynamic testing at extreme attitudes is shown to be practical in relatively conventional MSBSs. Preliminary operation of the MSBS over a wide range of angles of attack is demonstrated. The impact of a requirement for highly reliable operation on the overall architecture of large MSBSs is studied, and it is concluded that system cost and complexity need not be seriously increased.

  10. Genetic variation of apolipoproteins, diet and other environmental interactions; an updated review.

    PubMed

    Sotos-Prieto, Mercedes; Peñalvo, José Luis

    2013-01-01

This paper summarizes recent findings from studies investigating the potential environmental modulation of the effects of genetic variation in apolipoprotein genes on metabolic traits. We reviewed nutrigenetic studies evaluating variations in apolipoprotein-related genes and their associated response to nutrients (mostly dietary fatty acids) or any other dietary or environmental component. Most of the reviewed research studied single nucleotide polymorphisms (SNPs) and specific nutrients through small intervention studies, and only a few interactions have been replicated in large and independent populations (as in the case of the -265T > C SNP in the APOA2 gene). Although current knowledge shows that variations in apolipoprotein genes may contribute to differing responses of metabolic traits to dietary interventions, evidence is still scarce and results are inconsistent. Success in this area will require going beyond the limitations of current experimental designs and exploring the hypotheses within large populations. Some of these limitations are being addressed by the rapid advance in high-throughput technologies and large-scale genome-wide association studies. Copyright © AULA MEDICA EDICIONES 2013. Published by AULA MEDICA. All rights reserved.

  11. [Design requirements for clinical trials on re-evaluation of safety and efficacy of post-marketed Chinese herbs].

    PubMed

    Xie, Yanming; Wei, Xu

    2011-10-01

Re-evaluation of post-marketed drugs, based on pharmacoepidemiology, studies and collects clinical safety data in large populations under long-term, real-world use. Re-evaluation of clinical effectiveness is also necessary because of the particularities of traditional Chinese medicine (TCM). Before carrying out clinical trials for re-evaluation of post-marketed TCM, the objective of the study should be determined and the trial conducted in an assessment mode combining disease and syndrome. Special populations, including children and seniors who were excluded from pre-marketing clinical trials, should be brought into drug monitoring. Sample size needs to comply with statistical requirements. Commonly used designs include cohort studies, case-control studies, nested case-control studies, and pragmatic randomized controlled trials.

  12. A Study on: Exploring U.S. Missile Defense Requirements in 2010: What Are the Policy and Technology Challenges?

    DTIC Science & Technology

    1997-04-01

[Figure residue: mid-course phase diagram — warhead and penaid deployment complete following booster burnout; warheads and penaids (if so equipped) are deployed immediately after boost-phase burnout; large deceleration occurs from atmospheric drag upon re-entry.]

  13. Ties That Bind International Research Teams: A Network Multilevel Model of Interdisciplinary Collaboration

    ERIC Educational Resources Information Center

    Kollasch, Aurelia Wiktoria

    2012-01-01

    Today large research projects require substantial involvement of researchers from different organizations, disciplines, or cultures working in groups or teams to accomplish a common goal of producing, sharing, and disseminating scientific knowledge. This study focuses on the international research team that was launched in response to pressing…

  14. Measuring the Impact of a New Faculty Program Using Institutional Data

    ERIC Educational Resources Information Center

    Meizlish, Deborah S.; Wright, Mary C.; Howard, Joseph; Kaplan, Matthew L.

    2018-01-01

    This paper presents a quasi-experimental evaluation of a required, teaching-focused, new faculty program at a large research university. The study makes use of institutional data, including student evaluations of teaching and faculty participation in educational development activities, which are available on many campuses yet rarely used in…

  15. Exploratory Factor Analysis with Small Sample Sizes

    ERIC Educational Resources Information Center

    de Winter, J. C. F.; Dodou, D.; Wieringa, P. A.

    2009-01-01

    Exploratory factor analysis (EFA) is generally regarded as a technique for large sample sizes ("N"), with N = 50 as a reasonable absolute minimum. This study offers a comprehensive overview of the conditions in which EFA can yield good quality results for "N" below 50. Simulations were carried out to estimate the minimum required "N" for different…

  16. Help-Seeking Behavior Following a Community Tragedy: An Application of the Andersen Model

    ERIC Educational Resources Information Center

    Cowart, Brian L.

    2013-01-01

    For healthcare agencies and other professionals to most efficiently provide aid following large scale community tragedies, agencies and professionals must understand the determinants that lead individuals to require and seek various forms of help. This study examined Andersen's Behavioral Model of Healthcare Use and its utility in predicting…

  17. Constructing 21st-Century Teacher Education

    ERIC Educational Resources Information Center

    Darling-Hammond, Linda

    2006-01-01

    Much of what teachers need to know to be successful is invisible to lay observers, leading to the view that teaching requires little formal study and to frequent disdain for teacher education programs. The weakness of traditional program models that are collections of largely unrelated courses reinforce this low regard. This article argues that we…

  18. A Study of Students' Reasoning about Probabilistic Causality: Implications for Understanding Complex Systems and for Instructional Design

    ERIC Educational Resources Information Center

    Grotzer, Tina A.; Solis, S. Lynneth; Tutwiler, M. Shane; Cuzzolino, Megan Powell

    2017-01-01

    Understanding complex systems requires reasoning about causal relationships that behave or appear to behave probabilistically. Features such as distributed agency, large spatial scales, and time delays obscure co-variation relationships and complex interactions can result in non-deterministic relationships between causes and effects that are best…

  19. Family Child Care Licensing Study, 1998.

    ERIC Educational Resources Information Center

    Children's Foundation, Washington, DC.

    This report details a survey of state child care regulatory agencies. Data on both small family child care homes (FCCH) and group or large family child care homes (LCCH or GCCH) are included and organized into 22 categories: (1) number of regulated homes; (2) definitions and regulatory requirements; (3) unannounced inspection procedure; (4)…

  20. Designing an Integrated System of Databases: A Workstation for Information Seekers.

    ERIC Educational Resources Information Center

    Micco, Mary; Smith, Irma

    1987-01-01

    Proposes a framework for the design of a full function workstation for information retrieval based on study of information seeking behavior. A large amount of local storage of the CD-ROM jukebox variety and full networking capability to both local and external databases are identified as requirements of the prototype. (MES)

  1. Moving from Pathology to Possibility: Integrating Strengths-Based Interventions in Child Welfare Provision

    ERIC Educational Resources Information Center

    Sabalauskas, Kara L.; Ortolani, Charles L.; McCall, Matthew J.

    2014-01-01

    Child welfare providers are increasingly required to demonstrate that strengths-based, evidence-informed practices are central to their intervention methodology. This case study describes how a large child welfare agency instituted cognitive behavioural therapy (CBT) as the core component of a strength-based practice model with the goal of…

  2. Team-Based Learning Exercise Efficiently Teaches Brief Intervention Skills to Medicine Residents

    ERIC Educational Resources Information Center

    Wamsley, Maria A.; Julian, Katherine A.; O'Sullivan, Patricia; McCance-Katz, Elinore F.; Batki, Steven L.; Satre, Derek D.; Satterfield, Jason

    2013-01-01

    Background: Evaluations of substance use screening and brief intervention (SBI) curricula typically focus on learner attitudes and knowledge, although effects on clinical skills are of greater interest and utility. Moreover, these curricula often require large amounts of training time and teaching resources. This study examined whether a 3-hour…

  3. Knowledge Production within the Innovation System: A Case Study from the United Kingdom

    ERIC Educational Resources Information Center

    Wilson-Medhurst, Sarah

    2010-01-01

    This paper focuses on a key issue for university managers, educational developers and teaching practitioners: that of producing new operational knowledge in the innovation system. More specifically, it explores the knowledge required to guide individual and institutional styles of teaching and learning in a large multi-disciplinary faculty. The…

  4. Age of Parental Concern, Diagnosis, and Service Initiation among Children with Autism Spectrum Disorder

    ERIC Educational Resources Information Center

    Zablotsky, Benjamin; Colpe, Lisa J.; Pringle, Beverly A.; Kogan, Michael D.; Rice, Catherine; Blumberg, Stephen J.

    2017-01-01

    Children with autism spectrum disorder (ASD) require substantial support to address the core symptoms of ASD and co-occurring behavioral/developmental conditions. This study explores the early diagnostic experiences of school-aged children with ASD using survey data from a large probability-based national sample. Multivariate linear regressions…

  5. Managing Student Behavior in an Elementary School Music Classroom: A Study of Class-Wide Function-Related Intervention Teams

    ERIC Educational Resources Information Center

    Caldarella, Paul; Williams, Leslie; Jolstead, Krystine A.; Wills, Howard P.

    2017-01-01

    Classroom management is a common concern for teachers. Music teachers in particular experience unique behavior challenges because of large class sizes, uncommon pacing requirements, and performance-based outcomes. Positive behavior support is an evidence-based framework for preventing or eliminating challenging behaviors by teaching and…

  6. NITRIFICATION, AND IRON AND ARSENIC REMOVAL IN BIOLOGICALLY ACTIVE FILTERS: A CASE STUDY

    EPA Science Inventory

    The effectiveness of arsenic removal from water is largely dependent on the oxidation state of the arsenic. As (III) is much more difficult to remove relative to the oxidized As(V) form. Unlike Fe(II) that can be oxidized by oxygen, efficient As(III) oxidation requires a strong...

  7. Projecting Enrollment in Rural Schools: A Study of Three Vermont School Districts

    ERIC Educational Resources Information Center

    Grip, Richard S.

    2004-01-01

    Large numbers of rural districts have experienced sharp declines in enrollment, unlike their suburban counterparts. Accurate enrollment projections are required, whether a district needs to build new schools or consolidate existing ones. For school districts having more than 600 students, a quantitative method such as the Cohort-Survival Ratio…

  8. Affective Experiences of International and Home Students during the Information Search Process

    ERIC Educational Resources Information Center

    Haley, Adele Nicole; Clough, Paul

    2017-01-01

    An increasing number of students are studying abroad requiring that they interact with information in languages other than their mother tongue. The UK in particular has seen a large growth in international students within Higher Education. These nonnative English speaking students present a distinct user group for university information services,…

  9. Seasonal habitat associations of the wolverine in central Idaho

    Treesearch

    Jeffrey P. Copeland; James M. Peek; Craig R. Groves; Wayne E. Melquist; Kevin S. Mckelvey; Gregory W. McDaniel; Clinton D. Long; Charles E. Harris

    2007-01-01

    Although understanding habitat relationships remains fundamental to guiding wildlife management, these basic prerequisites remain vague and largely unstudied for the wolverine. Currently, a study of wolverine ecology conducted in Montana, USA, in the 1970s is the sole source of information on habitat requirements of wolverines in the conterminous United States. The...

  10. 7.0 Monitoring the status and impacts of forest fragmentation and urbanization

    Treesearch

    Rachel Riemann; Karen Riva-Murray; Peter S. Murdoch

    2008-01-01

    The geographic expansion of urban and suburban development and the influx of residential and recreational development into previously forested areas are growing concerns for natural resource managers. This project sought to: identify and characterize urbanization and forest fragmentation over large areas with the detail and accuracy required for studies of wildlife...

  11. Organizational Agility and Complex Enterprise System Innovations: A Mixed Methods Study of the Effects of Enterprise Systems on Organizational Agility

    ERIC Educational Resources Information Center

    Kharabe, Amol T.

    2012-01-01

    Over the last two decades, firms have operated in "increasingly" accelerated "high-velocity" dynamic markets, which require them to become "agile." During the same time frame, firms have increasingly deployed complex enterprise systems--large-scale packaged software "innovations" that integrate and automate…

  12. Striatal Degeneration Impairs Language Learning: Evidence from Huntington's Disease

    ERIC Educational Resources Information Center

    De Diego-Balaguer, R.; Couette, M.; Dolbeau, G.; Durr, A.; Youssov, K.; Bachoud-Levi, A.-C.

    2008-01-01

    Although the role of the striatum in language processing is still largely unclear, a number of recent proposals have outlined its specific contribution. Different studies report evidence converging to a picture where the striatum may be involved in those aspects of rule-application requiring non-automatized behaviour. This is the main…

  13. U.S. Accounting Education: Misalignment with the Needs of Small and Medium Companies

    ERIC Educational Resources Information Center

    Burke, Megan M.; Gandolfi, William R.

    2014-01-01

    This study looks to answer the question, "Does the current accounting educational system in the United States focus too heavily on the requirements of large (and SEC registered) companies at the expense of small companies and individuals who comprise the primary clientele of most practicing CPAs?" This investigation surveys CPAs…

  14. Mental Health Workforce Change through Social Work Education: A California Case Study

    ERIC Educational Resources Information Center

    Foster, Gwen; Morris, Meghan Brenna; Sirojudin, Sirojudin

    2013-01-01

    The 2004 California Mental Health Services Act requires large-scale system change in the public mental health system through a shift to recovery-oriented services for diverse populations. This article describes an innovative strategy for workforce recruitment and retention to create and sustain these systemic changes. The California Social Work…

  15. Effects of Teacher Evaluation on Teacher Job Satisfaction in Ohio

    ERIC Educational Resources Information Center

    Downing, Pamela R.

    2016-01-01

    The purpose of this quantitative study was to explore whether or not increased accountability measures found in the Ohio Teacher Evaluation System (OTES) impacted teacher job satisfaction. Student growth measures required by the OTES increased teacher accountability. Today, teachers are largely evaluated based on the results of what they do in the…

  16. Telecommunication market research processing

    NASA Astrophysics Data System (ADS)

    Dupont, J. F.

    1983-06-01

    The data processing in two telecommunication market investigations is described. One of the studies concerns the office applications of communication and the other the experiences with a videotex terminal. Statistical factorial analysis was performed on a large mass of data. A comparison between utilization intentions and effective utilization is made. Extensive rewriting of statistical analysis computer programs was required.

  17. An informal paper on large-scale dynamic systems

    NASA Technical Reports Server (NTRS)

    Ho, Y. C.

    1975-01-01

    Large scale systems are defined as systems requiring more than one decision maker to control the system. Decentralized control and decomposition are discussed for large scale dynamic systems. Information and many-person decision problems are analyzed.

  18. AICAR induces AMPK-independent programmed necrosis in prostate cancer cells.

    PubMed

    Guo, Feng; Liu, Shuang-Qing; Gao, Xing-Hua; Zhang, Long-Yang

    2016-05-27

AICAR (5-Aminoimidazole-4-carboxamide riboside or acadesine) is an AMP-activated protein kinase (AMPK) agonist, which induces cytotoxic effects in several cancer cell types. Its potential activity in prostate cancer cells and the underlying signaling mechanisms have not been extensively studied. Here, we showed that AICAR primarily induced programmed necrosis, but not apoptosis, in prostate cancer cells (LNCaP, PC-3 and PC-82 lines). AICAR's cytotoxicity to prostate cancer cells was largely attenuated by the necrosis inhibitor necrostatin-1. Mitochondrial protein cyclophilin-D (CYPD) is required for AICAR-induced programmed necrosis. CYPD inhibitors (cyclosporin A and sanglifehrin A) as well as CYPD shRNAs dramatically attenuated AICAR-induced prostate cancer cell necrosis and cytotoxicity. Notably, AICAR-induced cell necrosis appeared independent of AMPK, yet required reactive oxygen species (ROS) production. ROS scavengers (N-acetylcysteine and MnTBAP), but not AMPKα shRNAs, largely inhibited prostate cancer cell necrosis and cytotoxicity by AICAR. In summary, the results of the present study provide mechanistic evidence that AMPK-independent programmed necrosis contributes to AICAR's cytotoxicity in prostate cancer cells. Copyright © 2016 Elsevier Inc. All rights reserved.

  19. An automated approach to mapping corn from Landsat imagery

    USGS Publications Warehouse

    Maxwell, S.K.; Nuckols, J.R.; Ward, M.H.; Hoffer, R.M.

    2004-01-01

    Most land cover maps generated from Landsat imagery involve classification of a wide variety of land cover types, whereas some studies may only need spatial information on a single cover type. For example, we required a map of corn in order to estimate exposure to agricultural chemicals for an environmental epidemiology study. Traditional classification techniques, which require the collection and processing of costly ground reference data, were not feasible for our application because of the large number of images to be analyzed. We present a new method that has the potential to automate the classification of corn from Landsat satellite imagery, resulting in a more timely product for applications covering large geographical regions. Our approach uses readily available agricultural areal estimates to enable automation of the classification process resulting in a map identifying land cover as ‘highly likely corn,’ ‘likely corn’ or ‘unlikely corn.’ To demonstrate the feasibility of this approach, we produced a map consisting of the three corn likelihood classes using a Landsat image in south central Nebraska. Overall classification accuracy of the map was 92.2% when compared to ground reference data.
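The use of readily available areal estimates to automate labeling can be illustrated with a small sketch (hypothetical, not the authors' exact procedure): spectral clusters are ranked by a corn-likeness score and labeled in descending order until their cumulative area reaches the reported corn acreage; clusters within a tolerance band beyond that become "likely corn". The score, the tolerance band, and the cluster inputs are all assumptions for illustration.

```python
def label_corn_likelihood(clusters, reported_corn_area, band=0.15):
    """Assign corn-likelihood labels to spectral clusters.

    `clusters`: list of (cluster_id, area, corn_score) tuples, where
    corn_score is a hypothetical spectral similarity to a corn signature.
    Clusters are consumed in descending score order: those needed to reach
    the reported areal estimate are "highly likely corn"; those within
    `band` * reported area beyond it are "likely corn"; the rest are
    "unlikely corn".
    """
    labels = {}
    cumulative = 0.0
    for cid, area, score in sorted(clusters, key=lambda c: -c[2]):
        if cumulative < reported_corn_area:
            labels[cid] = "highly likely corn"
        elif cumulative < reported_corn_area * (1 + band):
            labels[cid] = "likely corn"
        else:
            labels[cid] = "unlikely corn"
        cumulative += area
    return labels
```

Because only published acreage totals drive the labeling, no ground reference data are needed at classification time.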

  20. Quality Assessments of Long-Term Quantitative Proteomic Analysis of Breast Cancer Xenograft Tissues

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhou, Jian-Ying; Chen, Lijun; Zhang, Bai

    The identification of protein biomarkers requires large-scale analysis of human specimens to achieve statistical significance. In this study, we evaluated the long-term reproducibility of an iTRAQ (isobaric tags for relative and absolute quantification) based quantitative proteomics strategy using one channel for universal normalization across all samples. A total of 307 liquid chromatography tandem mass spectrometric (LC-MS/MS) analyses were completed, generating 107 one-dimensional (1D) LC-MS/MS datasets and 8 offline two-dimensional (2D) LC-MS/MS datasets (25 fractions for each set) for human-in-mouse breast cancer xenograft tissues representative of basal and luminal subtypes. Such large-scale studies require the implementation of robust metrics to assess the contributions of technical and biological variability in the qualitative and quantitative data. Accordingly, we developed a quantification confidence score based on the quality of each peptide-spectrum match (PSM) to remove quantification outliers from each analysis. After combining confidence score filtering and statistical analysis, reproducible protein identification and quantitative results were achieved from LC-MS/MS datasets collected over a 16 month period.
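
    A minimal sketch of the two quantification steps described above, under assumed data structures: each PSM record carries a reporter-ion intensity, the universal-reference-channel intensity, and a quantification-confidence score. The field names and the 0.75 cutoff are illustrative placeholders, not values from the study.

```python
def normalize_and_filter(psms, min_conf=0.75):
    """Sketch of confidence-score filtering plus reference-channel
    normalization. (1) Drop PSMs whose quantification-confidence score
    falls below a cutoff, removing quantification outliers. (2) Express
    each remaining PSM's intensity as a ratio to the universal reference
    channel so runs acquired months apart remain comparable.
    'intensity', 'reference' and 'conf_score' are hypothetical keys.
    """
    kept = [p for p in psms if p['conf_score'] >= min_conf]
    return [dict(p, ratio=p['intensity'] / p['reference']) for p in kept]
```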

  1. Quantifying aggregated uncertainty in Plasmodium falciparum malaria prevalence and populations at risk via efficient space-time geostatistical joint simulation.

    PubMed

    Gething, Peter W; Patil, Anand P; Hay, Simon I

    2010-04-01

    Risk maps estimating the spatial distribution of infectious diseases are required to guide public health policy from local to global scales. The advent of model-based geostatistics (MBG) has allowed these maps to be generated in a formal statistical framework, providing robust metrics of map uncertainty that enhance their utility for decision-makers. In many settings, decision-makers require spatially aggregated measures over large regions, such as the mean prevalence within a country or administrative region, or national populations living under different levels of risk. Existing MBG mapping approaches provide suitable metrics of local uncertainty (the fidelity of predictions at each mapped pixel) but have not been adapted for measuring uncertainty over large areas, due largely to a series of fundamental computational constraints. Here the authors present a new efficient approximating algorithm that can generate, for the first time, the joint simulation of prevalence values across the very large prediction spaces needed for global-scale mapping. This new approach is implemented in conjunction with an established model for P. falciparum, allowing robust estimates of mean prevalence at any specified level of spatial aggregation. The model is used to provide estimates of national populations at risk under three policy-relevant prevalence thresholds, along with accompanying model-based measures of uncertainty. By overcoming previously unchallenged computational barriers, this study illustrates how MBG approaches, already at the forefront of infectious disease mapping, can be extended to provide large-scale aggregate measures appropriate for decision-makers.
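
    The core of the joint-simulation approach (aggregate each spatially correlated realization, then summarize across realizations) can be illustrated with a toy example. The covariance model, scale parameters, and populations below are arbitrary stand-ins for the posterior of a real MBG model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy joint simulation: correlated prevalence-field draws over P pixels.
# In MBG these draws would come from the posterior of a geostatistical
# model; here a multivariate normal on the logit scale stands in for it.
P, n_draws = 50, 1000
dist = np.abs(np.subtract.outer(np.arange(P), np.arange(P)))
cov = 0.5 * np.exp(-dist / 10.0)             # assumed spatial covariance
logit_draws = rng.multivariate_normal(np.full(P, -1.0), cov, size=n_draws)
prev_draws = 1 / (1 + np.exp(-logit_draws))  # back-transform to prevalence

pop = rng.integers(100, 1000, size=P)        # assumed pixel populations

# Population-weighted national mean prevalence: one value per joint draw.
nat_mean = prev_draws @ pop / pop.sum()
estimate = nat_mean.mean()
lo, hi = np.quantile(nat_mean, [0.025, 0.975])
```

    Because each draw preserves the spatial correlation between pixels, the interval (lo, hi) is a valid credible interval for the aggregate; summarizing per-pixel marginal uncertainties independently would misstate it.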

  2. Coupled SWAT-MODFLOW Model Development for Large Basins

    NASA Astrophysics Data System (ADS)

    Aliyari, F.; Bailey, R. T.; Tasdighi, A.

    2017-12-01

    Water management in semi-arid river basins requires allocating water resources between urban, industrial, energy, and agricultural sectors, with the latter competing for the irrigation water necessary to sustain crop yield. Competition between these sectors will intensify due to changes in climate and population growth. In this study, the recently developed SWAT-MODFLOW coupled hydrologic model is modified for application in a large managed river basin that provides both surface water and groundwater resources for urban and agricultural areas. Specific modifications include the linkage of groundwater pumping and irrigation practices, and code changes to allow for the large number of SWAT hydrologic response units (HRUs) required for a large river basin. The model is applied to the South Platte River Basin (SPRB), a 56,980 km2 basin in northeastern Colorado dominated by large urban areas along the front range of the Rocky Mountains and agricultural regions to the east. Irregular seasonal and annual precipitation and 150 years of urban and agricultural water management history in the basin provide an ideal test case for the SWAT-MODFLOW model. SWAT handles land surface and soil zone processes, whereas MODFLOW handles groundwater flow and all sources and sinks (pumping, injection, bedrock inflow, canal seepage, recharge areas, groundwater/surface water interaction), with recharge and stream stage provided by SWAT. The model is tested against groundwater levels, deep percolation estimates, and stream discharge. The model will be used to quantify spatial groundwater vulnerability in the basin under scenarios of climate change and population growth.

  3. A Functional Model for Management of Large Scale Assessments.

    ERIC Educational Resources Information Center

    Banta, Trudy W.; And Others

    This functional model for managing large-scale program evaluations was developed and validated in connection with the assessment of Tennessee's Nutrition Education and Training Program. Management of such a large-scale assessment requires the development of a structure for the organization; distribution and recovery of large quantities of…

  4. Species-Specific Elements in the Large T-Antigen J Domain Are Required for Cellular Transformation and DNA Replication by Simian Virus 40

    PubMed Central

    Sullivan, Christopher S.; Tremblay, James D.; Fewell, Sheara W.; Lewis, John A.; Brodsky, Jeffrey L.; Pipas, James M.

    2000-01-01

    The J domain of simian virus 40 (SV40) large T antigen is required for efficient DNA replication and transformation. Despite previous reports demonstrating the promiscuity of J domains in heterologous systems, results presented here show the requirement for specific J-domain sequences in SV40 large-T-antigen-mediated activities. In particular, chimeric-T-antigen constructs in which the SV40 T-antigen J domain was replaced with that from the yeast Ydj1p or Escherichia coli DnaJ proteins failed to replicate in BSC40 cells and did not transform REF52 cells. However, T antigen containing the JC virus J domain was functional in these assays, although it was less efficient than the wild type. The inability of some large-T-antigen chimeras to promote DNA replication and elicit cellular transformation was not due to a failure to interact with hsc70, since a nonfunctional chimera, containing the DnaJ J domain, bound hsc70. However, this nonfunctional chimeric T antigen was reduced in its ability to stimulate hsc70 ATPase activity and unable to liberate E2F from p130, indicating that transcriptional activation of factors required for cell growth and DNA replication may be compromised. Our data suggest that the T-antigen J domain harbors species-specific elements required for viral activities in vivo. PMID:10891510

  5. Engineering large-scale agent-based systems with consensus

    NASA Technical Reports Server (NTRS)

    Bokma, A.; Slade, A.; Kerridge, S.; Johnson, K.

    1994-01-01

    The paper presents the consensus method for the development of large-scale agent-based systems. Systems can be developed as networks of knowledge based agents (KBA) which engage in a collaborative problem solving effort. The method provides a comprehensive and integrated approach to the development of this type of system. This includes a systematic analysis of user requirements as well as a structured approach to generating a system design which exhibits the desired functionality. There is a direct correspondence between system requirements and design components. The benefits of this approach are that requirements are traceable into design components and code thus facilitating verification. The use of the consensus method with two major test applications showed it to be successful and also provided valuable insight into problems typically associated with the development of large systems.

  6. Emergency egress requirements for Space Station Freedom

    NASA Technical Reports Server (NTRS)

    Ray, Paul S.

    1990-01-01

    There is a real concern regarding the requirements for safe emergency egress from the Space Station Freedom (SSF). The possible causes of emergency are depressurization due to breach of the station hull by space debris, meteoroids, seal failure, or vent failure; chemical toxicity; and a large fire. The objectives of the current study are to identify the tasks required to be performed in emergencies, establish the time required to perform these tasks, and to review the human equipment interface in emergencies. It was found that a fixed time value specified for egress has shifted focus from the basic requirements of safe egress, that in some situations the crew members may not be able to complete the emergency egress tasks in three minutes without sacrificing more than half of the station, and that increased focus should be given to human factors aspects of space station design.

  7. Accounting for patient size in the optimization of dose and image quality of pelvis cone beam CT protocols on the Varian OBI system

    PubMed Central

    Moore, Craig S; Horsfield, Carl J; Saunderson, John R; Beavis, Andrew W

    2015-01-01

    Objective: The purpose of this study was to develop size-based radiotherapy kilovoltage cone beam CT (CBCT) protocols for the pelvis. Methods: Image noise was measured in an elliptical phantom of varying size for a range of exposure factors. Based on a previously defined “small pelvis” reference patient and CBCT protocol, appropriate exposure factors for small, medium, large and extra-large patients were derived which approximate the image noise behaviour observed on a Philips CT scanner (Philips Medical Systems, Best, Netherlands) with automatic exposure control (AEC). Selection criteria, based on maximum tube current–time product per rotation selected during the radiotherapy treatment planning scan, were derived based on an audit of patient size. Results: It has been demonstrated that 110 kVp yields acceptable image noise for reduced patient dose in pelvic CBCT scans of small, medium and large patients, when compared with manufacturer's default settings (125 kVp). Conversely, extra-large patients require increased exposure factors to give acceptable images. 57% of patients in the local population now receive much lower radiation doses, whereas 13% require higher doses (but now yield acceptable images). Conclusion: The implementation of size-based exposure protocols has significantly reduced radiation dose to the majority of patients with no negative impact on image quality. Increased doses are required on the largest patients to give adequate image quality. Advances in knowledge: The development of size-based CBCT protocols that use the planning CT scan (with AEC) to determine which protocol is appropriate ensures adequate image quality whilst minimizing patient radiation dose. PMID:26419892
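
    The selection rule described above (using the maximum tube current-time product per rotation recorded during the AEC-driven planning CT to pick a size-based CBCT protocol) amounts to a simple threshold lookup. The threshold values below are placeholders, not those derived in this study.

```python
def select_cbct_protocol(max_mas, thresholds=(200, 300, 400)):
    """Map the maximum tube current-time product per rotation (mAs)
    selected by the planning CT's automatic exposure control to a
    patient-size category for the CBCT protocol. The mAs thresholds
    are illustrative assumptions, not the audited values."""
    small, medium, large = thresholds
    if max_mas <= small:
        return 'small'
    if max_mas <= medium:
        return 'medium'
    if max_mas <= large:
        return 'large'
    return 'extra-large'
```

    In this scheme the planning scan, rather than a manual size measurement, drives protocol selection, which is what lets the reduced-dose settings be applied automatically to the majority of patients.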

  8. Automatic analysis of nuclear-magnetic-resonance-spectroscopy clinical research data

    NASA Astrophysics Data System (ADS)

    Scott, Katherine N.; Wilson, David C.; Bruner, Angela P.; Lyles, Teresa A.; Underhill, Brandon; Geiser, Edward A.; Ballinger, J. Ray; Scott, James D.; Stopka, Christine B.

    1998-03-01

    A major problem of in vivo P-31 nuclear magnetic resonance spectroscopy (MRS) applications is that when large data sets are acquired, the time invested in data reduction and analysis with currently available technologies may totally overshadow the time required for data acquisition. An example is our MRS monitoring of exercise therapy for patients with peripheral vascular disease, in which spectral acquisition requires 90 minutes per patient study, whereas data analysis and reduction requires 6-8 hours. Our laboratory currently uses the proprietary software SA/GE developed by General Electric; other software packages have similar limitations. When data analysis takes this long, the researcher has neither the rapid feedback required to ascertain the quality of the acquired data nor the result of the study. This is highly undesirable even in a research environment, and becomes intolerable in the clinical setting. The purpose of this report is to outline progress towards the development of an automated method for eliminating the spectral-analysis burden on the researcher working in the clinical setting.

  9. Transportation node space station conceptual design

    NASA Technical Reports Server (NTRS)

    1988-01-01

    A number of recent studies have addressed the problem of a transportation node space station. How things would change, and what additional facilities would be needed to support a major lunar or Mars initiative, is an often asked question. The support of a lunar base, requiring stacks on the order of 200 metric tons each to land 25 metric tons on the lunar surface with reusable vehicles, is addressed. The problem of maintaining and reusing large single-stage Orbit Transfer Vehicles (OTVs) and single-stage lander/launchers in space is examined. The people and equipment required to maintain these vehicles are only vaguely known at present; they depend on how well the OTV and lander/launcher can be designed for easy reuse. Since the OTV and lander/launcher are only conceptually defined at present, the real maintenance and refurbishment requirements are unobtainable. An estimate of what is needed, based on previous studies and obvious requirements, was therefore made. An attempt was made to err on the conservative side.

  10. Conceptual Design of a Two Spool Compressor for the NASA Large Civil Tilt Rotor Engine

    NASA Technical Reports Server (NTRS)

    Veres, Joseph P.; Thurman, Douglas R.

    2010-01-01

    This paper focuses on the conceptual design of a two-spool compressor for the NASA Large Civil Tilt Rotor engine, which has a design-point pressure ratio goal of 30:1 and an inlet weight flow of 30.0 lbm/sec. The notional compressor design requirements of pressure ratio, and of work split between the low-pressure compressor (LPC) and high-pressure compressor (HPC), were based on a previous engine system study to meet the mission requirements of the NASA Subsonic Rotary Wing Project's Large Civil Tilt Rotor vehicle concept. Three mean-line compressor design and flow analysis codes were utilized for the conceptual design of a two-spool compressor configuration. This study assesses the technical challenges of designing various compressor configuration options to meet the given engine cycle results. In the process of sizing, the technical challenges of the compressor became apparent as the aerodynamics were taken into consideration. Mechanical constraints such as maximum rotor tip speeds and conceptual sizing of rotor disks and shafts were considered in the study. The rotor clearance-to-span ratio is 1.5% in the last stage of the LPC and 2.8% in the last stage of the HPC. Four different configurations to meet the HPC requirements were studied, ranging from a single-stage centrifugal, to two axi-centrifugal variants, to all-axial stages. Challenges of the HPC design include the high exit temperature (1,560 deg R), which could limit the maximum allowable peripheral tip speed for centrifugal stages and is dependent on material selection. The mean-line design also resulted in the definition of the flow-path geometry of the axial and centrifugal compressor stages, rotor and stator vane angles, velocity components, and flow conditions at the leading and trailing edges of each blade row at the hub, mean and tip. A mean-line compressor analysis code was used to estimate the compressor performance maps at off-design speeds and to determine the required variable-geometry reset schedules of the inlet guide vane and variable stators that would result in the transonic stages being aerodynamically matched with high efficiency and acceptable stall margins, based on user-specified maximum levels of rotor diffusion factor and relative velocity ratio.

  11. Silicon nitride ceramic development in Thales Alenia Space : qualification achievement and further developments for future applications

    NASA Astrophysics Data System (ADS)

    Cornillon, L.; Devilliers, C.; Behar-Lafenetre, S.; Ait-Zaid, S.; Berroth, K.; Bravo, A. C.

    2017-11-01

    Dealing with ceramic materials for more than two decades, Thales Alenia Space - France has identified silicon nitride (Si3N4) as a high-potential material for the manufacturing of stiff, stable and lightweight truss structures for future large telescopes. Indeed, for Earth observation or astronomical observation, space missions require telescopes with ever higher spatial resolution, which leads to the use of large primary mirrors and a long distance between primary and secondary mirrors. Current and future large space telescopes therefore require a large truss structure to hold the mirrors and locate them precisely. Such a large structure requires very strong materials with high specific stiffness and a low coefficient of thermal expansion (CTE). Based on the performance of silicon nitride and on the know-how of FCT Ingenieurkeramik in manufacturing complex parts, Thales Alenia Space (TAS) has engaged, in cooperation with FCT, activities to develop and qualify silicon nitride parts for other applications for space projects.

  13. Fabrication of near-net shape graphite/magnesium composites for large mirrors

    NASA Astrophysics Data System (ADS)

    Wendt, Robert; Misra, Mohan

    1990-10-01

    Successful development of space-based surveillance and laser systems will require large precision mirrors which are dimensionally stable under thermal, static, and dynamic (i.e., structural vibrations and retargeting) loading conditions. Among the advanced composites under consideration for large space mirrors, graphite fiber reinforced magnesium (Gr/Mg) is an ideal candidate material that can be tailored to obtain an optimum combination of properties, including a high modulus of elasticity, zero coefficient of thermal expansion, low density, and high thermal conductivity. In addition, an innovative technique, combining conventional filament winding and vacuum casting has been developed to produce near-net shape Gr/Mg composites. This approach can significantly reduce the cost of fabricating large mirrors by decreasing required machining. However, since Gr/Mg cannot be polished to a reflective surface, plating is required. This paper will review research at Martin Marietta Astronautics Group on Gr/Mg mirror blank fabrication and measured mechanical and thermal properties. Also, copper plating and polishing methods, and optical surface characteristics will be presented.

  14. Large Animal Models of an In Vivo Bioreactor for Engineering Vascularized Bone.

    PubMed

    Akar, Banu; Tatara, Alexander M; Sutradhar, Alok; Hsiao, Hui-Yi; Miller, Michael; Cheng, Ming-Huei; Mikos, Antonios G; Brey, Eric M

    2018-04-12

    Reconstruction of large skeletal defects is challenging due to the requirement for large volumes of donor tissue and the often complex surgical procedures. Tissue engineering has the potential to serve as a new source of tissue for bone reconstruction, but current techniques are often limited with regard to the size and complexity of tissue that can be formed. Building tissue using an in vivo bioreactor approach may enable the production of appropriate amounts of specialized tissue, while reducing issues of donor site morbidity and infection. Large animals are required to screen and optimize new strategies for growing clinically appropriate volumes of tissues in vivo. In this article, we review both ovine and porcine models that serve as models of the technique proposed for clinical engineering of bone tissue in vivo. Recent findings with these systems are discussed, along with a description of the next steps required for using these models to develop clinically applicable tissue engineering applications.

  15. ARIES: Acquisition of Requirements and Incremental Evolution of Specifications

    NASA Technical Reports Server (NTRS)

    Roberts, Nancy A.

    1993-01-01

    This paper describes a requirements/specification environment specifically designed for large-scale software systems. This environment is called ARIES (Acquisition of Requirements and Incremental Evolution of Specifications). ARIES provides assistance to requirements analysts for developing operational specifications of systems. This development begins with the acquisition of informal system requirements. The requirements are then formalized and gradually elaborated (transformed) into formal and complete specifications. ARIES provides guidance to the user in validating formal requirements by translating them into natural language representations and graphical diagrams. ARIES also provides ways of analyzing the specification to ensure that it is correct, e.g., testing the specification against a running simulation of the system to be built. Another important ARIES feature, especially when developing large systems, is the sharing and reuse of requirements knowledge. This leads to much less duplication of effort. ARIES combines all of its features in a single environment that makes the process of capturing a formal specification quicker and easier.

  16. Light-Drag Enhancement by a Highly Dispersive Rubidium Vapor.

    PubMed

    Safari, Akbar; De Leon, Israel; Mirhosseini, Mohammad; Magaña-Loaiza, Omar S; Boyd, Robert W

    2016-01-08

    The change in the speed of light as it propagates through a moving material has been a subject of study for almost two centuries. This phenomenon, known as the Fresnel light-drag effect, is quite small and usually requires a large interaction path length and/or a large velocity of the moving medium to be observed. Here, we show experimentally that the observed drag effect can be enhanced by over 2 orders of magnitude when the light beam propagates through a moving slow-light medium. Our results are in good agreement with the theoretical prediction, which indicates that, in the limit of large group indices, the strength of the light-drag effect is proportional to the group index of the moving medium.

  17. A Geosynchronous Synthetic Aperture Provides for Disaster Management, Measurement of Soil Moisture, and Measurement of Earth-Surface Dynamics

    NASA Technical Reports Server (NTRS)

    Madsen, Soren; Komar, George (Technical Monitor)

    2001-01-01

    A GEO-based Synthetic Aperture Radar (SAR) could provide daily coverage of essentially all of North and South America, with very good temporal coverage within the mapped area. This affords a key capability for disaster management, tectonic mapping and modeling, and vegetation mapping. The fine temporal sampling makes this system particularly useful for disaster management of flooding, hurricanes, and earthquakes. By using a fairly long wavelength, changing water boundaries caused by storms or flooding could be monitored in near real time. This coverage would also provide revolutionary capabilities in the field of radar interferometry, including the capability to study the interferometric signature immediately before and after an earthquake, allowing unprecedented studies of Earth-surface dynamics. Pre-eruptive volcano dynamics could be studied, as well as pre-seismic deformation, one of the most controversial and elusive aspects of earthquakes. Interferometric correlation would similarly allow near real-time mapping of surface changes caused by volcanic eruptions, mud slides, or fires. Finally, a GEO SAR provides an optimum configuration for soil moisture measurement, which requires a high temporal sampling rate (1-2 days) with moderate spatial resolution (1 km or better). From a technological point of view, the largest challenges in developing a geosynchronous SAR capability relate to the very large slant range from the radar to the mapped area. This leads to requirements for high power or, alternatively, a very large antenna, the ability to steer the mapping area to the left and right of the satellite, and control of the elevation and azimuth angles. The weight of this system is estimated to be 2750 kg, and it would require 20 kW of DC power. Such a system would provide up to a 600 km ground swath in a strip-mapping mode and 4000 km dual-sided mapping in a scan-SAR mode.

  18. Zero-gravity cloud physics.

    NASA Technical Reports Server (NTRS)

    Hollinden, A. B.; Eaton, L. R.; Vaughan, W. W.

    1972-01-01

    The first results of an ongoing preliminary-concept and detailed-feasibility study of a zero-gravity earth-orbital cloud physics research facility are reviewed. Current planning and thinking are being shaped by two major conclusions of this study: (1) important and significant research in a zero-gravity cloud physics facility is both needed and feasible; and (2) some very important experiments can be accomplished with 'off-the-shelf' hardware by astronauts who have no cloud-physics background, while the most complicated experiments may require sophisticated observation and motion subsystems and graduate-level cloud physics training for the astronaut; a large number of experiments fall between these two extremes of complexity.

  19. Molecular epidemiology biomarkers-Sample collection and processing considerations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Holland, Nina T.; Pfleger, Laura; Berger, Eileen

    2005-08-07

    Biomarker studies require processing and storage of numerous biological samples with the goals of obtaining a large amount of information and minimizing future research costs. An efficient study design includes provisions for processing of the original samples, such as cryopreservation, DNA isolation, and preparation of specimens for exposure assessment. Use of standard, two-dimensional and nanobarcodes and customized electronic databases assure efficient management of large sample collections and tracking results of data analyses. Standard operating procedures and quality control plans help to protect sample quality and to assure validity of the biomarker data. Specific state, federal and international regulations are in place regarding research with human samples, governing areas including custody, safety of handling, and transport of human samples. Appropriate informed consent must be obtained from the study subjects prior to sample collection and confidentiality of results maintained. Finally, examples of three biorepositories of different scale (European Cancer Study, National Cancer Institute and School of Public Health Biorepository, University of California, Berkeley) are used to illustrate challenges faced by investigators and the ways to overcome them. New software and biorepository technologies are being developed by many companies that will help to bring biological banking to a new level required by molecular epidemiology of the 21st century.

  20. Fetal metabolic influences of neonatal anthropometry and adiposity.

    PubMed

    Donnelly, Jean M; Lindsay, Karen L; Walsh, Jennifer M; Horan, Mary; Molloy, Eleanor J; McAuliffe, Fionnuala M

    2015-11-10

    Large for gestational age infants have an increased risk of obesity, cardiovascular and metabolic complications during life. Knowledge of the key predictive factors of neonatal adiposity is required to devise targeted antenatal interventions. Our objective was to determine the fetal metabolic factors that influence regional neonatal adiposity in a cohort of women with previous large for gestational age offspring. Data from the ROLO [Randomised COntrol Trial of LOw Glycaemic Index in Pregnancy] study were analysed in the ROLO Kids study. Neonatal anthropometric and skinfold measurements were compared with fetal leptin and C-peptide results from cord blood in 185 cases. Analyses were performed to examine the association between these metabolic factors and birthweight, anthropometry and markers of central and generalised adiposity. Fetal leptin was found to correlate with birthweight, general adiposity and multiple anthropometric measurements. On multiple regression analysis, fetal leptin remained significantly associated with adiposity, independent of gender, maternal BMI, gestational age or study group assignment, while fetal C-peptide was no longer significant. Fetal leptin may be an important predictor of regional neonatal adiposity. Interventional studies are required to assess the impact of neonatal adiposity on the subsequent risk of childhood obesity and to determine whether interventions which reduce circulating leptin levels have a role to play in improving neonatal adiposity measures.

  1. Decoder calibration with ultra small current sample set for intracortical brain-machine interface

    NASA Astrophysics Data System (ADS)

    Zhang, Peng; Ma, Xuan; Chen, Luyao; Zhou, Jin; Wang, Changyong; Li, Wei; He, Jiping

    2018-04-01

    Objective. Intracortical brain-machine interfaces (iBMIs) aim to restore efficient communication and movement ability for paralyzed patients. However, frequent recalibration is required for consistency and reliability, and every recalibration will require relatively large most current sample set. The aim in this study is to develop an effective decoder calibration method that can achieve good performance while minimizing recalibration time. Approach. Two rhesus macaques implanted with intracortical microelectrode arrays were trained separately on movement and sensory paradigm. Neural signals were recorded to decode reaching positions or grasping postures. A novel principal component analysis-based domain adaptation (PDA) method was proposed to recalibrate the decoder with only ultra small current sample set by taking advantage of large historical data, and the decoding performance was compared with other three calibration methods for evaluation. Main results. The PDA method closed the gap between historical and current data effectively, and made it possible to take advantage of large historical data for decoder recalibration in current data decoding. Using only ultra small current sample set (five trials of each category), the decoder calibrated using the PDA method could achieve much better and more robust performance in all sessions than using other three calibration methods in both monkeys. Significance. (1) By this study, transfer learning theory was brought into iBMIs decoder calibration for the first time. (2) Different from most transfer learning studies, the target data in this study were ultra small sample set and were transferred to the source data. (3) By taking advantage of historical data, the PDA method was demonstrated to be effective in reducing recalibration time for both movement paradigm and sensory paradigm, indicating a viable generalization. 
By reducing the demand for large current training data, this new method may facilitate the application of intracortical brain-machine interfaces in clinical practice.
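
    The abstract does not spell out the PDA algorithm itself. As a rough illustration of the general idea, aligning a small current sample set with a large historical set through their principal subspaces, here is a minimal subspace-alignment sketch in Python. The function names and the specific alignment (project current data through its own PCA basis into the historical basis) are assumptions for illustration, not the paper's actual method.

```python
import numpy as np

def pca_basis(X, k):
    # principal directions of the data X (n_samples x n_features),
    # returned as an (n_features x k) column-orthonormal matrix
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Vt[:k].T

def align_current_to_historical(X_hist, X_curr, k=3):
    # subspace-alignment-style mapping (illustrative, not the paper's PDA):
    # send current features through their own subspace into the historical one
    Ph = pca_basis(X_hist, k)
    Pc = pca_basis(X_curr, k)
    M = Pc @ Pc.T @ Ph   # maps current features toward the historical subspace
    Z_curr = (X_curr - X_curr.mean(axis=0)) @ M
    Z_hist = (X_hist - X_hist.mean(axis=0)) @ Ph
    return Z_curr, Z_hist
```

    A decoder trained on `Z_hist` (large historical data) can then be applied to `Z_curr`, which is the sense in which only a handful of current trials are needed for recalibration.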

  2. RadioAstron and millimetron space observatories: Multiverse models and the search for life

    NASA Astrophysics Data System (ADS)

    Kardashev, N. S.

    2017-04-01

The transition from the radio to the millimeter and submillimeter ranges is very promising for studies of galactic nuclei, as well as detailed studies of processes related to supermassive black holes, wormholes, and possible manifestations of multi-element Universe (Multiverse) models. This is shown by observations with the largest interferometer available, the RadioAstron observatory, whose results will be used for the scientific program of the Millimetron observatory. Observations have also shown the promise of this range for studies of the formation and evolution of planetary systems and searches for manifestations of intelligent life. This is caused by the requirements to use a large amount of condensed matter and energy in large-scale technological activities. This range can also be used efficiently in the organisation of optimal channels for the transmission of information.

  3. Automated and Autonomous Systems for Combat Service Support: Scoping Study and Technology Prioritisation

    DTIC Science & Technology

    2016-10-01

    workshop, and use case development for automated and autonomous systems for CSS. The scoping study covers key concepts and trends, a technology scan, and...requirements and delimiters for the selected technologies. The report goes on to present detailed use cases for two technologies of interest: semi...selected use cases . As a result of the workshop, the large list of technologies and applications from the scoping study was narrowed down to the top

  4. DETECTION OF SMALL LESIONS OF THE LARGE BOWEL—Barium Enema Versus Double Contrast

    PubMed Central

    Robinson, J. Maurice

    1954-01-01

    Roentgen study with the so-called opaque barium enema with some modifications is superior to double contrast study as the primary means of demonstrating polyps in the colon as well as other lesions. The method described combines fluoroscopy, high kilovoltage radiography, fluoroscopically aimed “spot films” taken with compression, suction and evacuation studies. In this way unsuspected as well as suspected polyps can be demonstrated, particularly if attention is directed to the region where polyps are most likely to be found—namely, the distal third of the large bowel. Double contrast study is quite valuable as a supplement to the modified “single contrast” barium enema, but it has not been sufficiently perfected to replace the modified opaque barium enema as a primary procedure. In many instances a combination of methods will, of course, be required. PMID:13209360

  5. Active Mirror Predictive and Requirements Verification Software (AMP-ReVS)

    NASA Technical Reports Server (NTRS)

    Basinger, Scott A.

    2012-01-01

    This software is designed to predict large active mirror performance at various stages in the fabrication lifecycle of the mirror. It was developed for 1-meter class powered mirrors for astronomical purposes, but is extensible to other geometries. The package accepts finite element model (FEM) inputs and laboratory measured data for large optical-quality mirrors with active figure control. It computes phenomenological contributions to the surface figure error using several built-in optimization techniques. These phenomena include stresses induced in the mirror by the manufacturing process and the support structure, the test procedure, high spatial frequency errors introduced by the polishing process, and other process-dependent deleterious effects due to light-weighting of the mirror. Then, depending on the maturity of the mirror, it either predicts the best surface figure error that the mirror will attain, or it verifies that the requirements for the error sources have been met once the best surface figure error has been measured. The unique feature of this software is that it ties together physical phenomenology with wavefront sensing and control techniques and various optimization methods including convex optimization, Kalman filtering, and quadratic programming to both generate predictive models and to do requirements verification. This software combines three distinct disciplines: wavefront control, predictive models based on FEM, and requirements verification using measured data in a robust, reusable code that is applicable to any large optics for ground and space telescopes. The software also includes state-of-the-art wavefront control algorithms that allow closed-loop performance to be computed. It allows for quantitative trade studies to be performed for optical systems engineering, including computing the best surface figure error under various testing and operating conditions. 
After the mirror manufacturing process and testing have been completed, the software package can be used to verify that the underlying requirements have been met.
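
    One building block such a package needs is the decomposition of a measured surface-figure error onto a basis of error-source modes (manufacturing stress, mount, polishing residuals). The sketch below shows a plain least-squares version of that step; the function name, the generic mode basis, and the residual-RMS metric are illustrative assumptions, not the actual AMP-ReVS interface.

```python
import numpy as np

def decompose_surface_error(surface, modes):
    # least-squares fit of a measured surface-figure error (length-n vector)
    # onto error-source modes (columns of the n x m matrix `modes`);
    # returns per-mode amplitudes and the RMS of the unexplained residual
    coeffs, *_ = np.linalg.lstsq(modes, surface, rcond=None)
    residual = surface - modes @ coeffs
    return coeffs, np.sqrt(np.mean(residual**2))
```

    Requirements verification then amounts to checking each fitted amplitude (and the residual RMS) against its allocated error budget.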

  6. A new analytical method for characterizing nonlinear visual processes with stimuli of arbitrary distribution: Theory and applications.

    PubMed

    Hayashi, Ryusuke; Watanabe, Osamu; Yokoyama, Hiroki; Nishida, Shin'ya

    2017-06-01

Characterization of the functional relationship between sensory inputs and neuronal or observers' perceptual responses is one of the fundamental goals of systems neuroscience and psychophysics. Conventional methods, such as reverse correlation and spike-triggered data analyses, are limited in their ability to resolve complex and inherently nonlinear neuronal/perceptual processes because these methods require input stimuli to be Gaussian with a zero mean. Recent studies have shown that analyses based on a generalized linear model (GLM) do not require such specific input characteristics and have advantages over conventional methods. GLM, however, relies on iterative optimization algorithms, and its calculation costs become very expensive when estimating the nonlinear parameters of a large-scale system using large volumes of data. In this paper, we introduce a new analytical method for identifying a nonlinear system without relying on iterative calculations and yet also not requiring any specific stimulus distribution. We demonstrate the results of numerical simulations, showing that our noniterative method is as accurate as GLM in estimating nonlinear parameters in many cases and outperforms conventional, spike-triggered data analyses. As an example of the application of our method to actual psychophysical data, we investigated how different spatiotemporal frequency channels interact in assessments of motion direction. The nonlinear interaction estimated by our method was consistent with findings from previous vision studies and supports the validity of our method for nonlinear system identification.
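
    The conventional baseline the abstract refers to, spike-triggered averaging (reverse correlation), is simple enough to sketch. The snippet below is a generic textbook version, not the authors' new method; it illustrates why the Gaussian zero-mean stimulus assumption matters, since the averaged windows only recover the filter without bias under that assumption.

```python
import numpy as np

def spike_triggered_average(stim, spikes, lags=5):
    # classic reverse correlation: average the stimulus windows that
    # precede each spike; an unbiased linear-filter estimate only when
    # the stimulus is zero-mean Gaussian (the limitation noted above)
    windows = [stim[t - lags:t] for t in range(lags, len(stim)) if spikes[t]]
    return np.mean(windows, axis=0)
```

    For a simulated linear-nonlinear neuron driven by Gaussian noise, the STA comes out proportional to the true filter; for non-Gaussian stimuli that guarantee disappears, which is what motivates GLM-style and noniterative alternatives.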

  7. Variability extraction and modeling for product variants.

    PubMed

    Linsbauer, Lukas; Lopez-Herrejon, Roberto Erick; Egyed, Alexander

    2017-01-01

    Fast-changing hardware and software technologies in addition to larger and more specialized customer bases demand software tailored to meet very diverse requirements. Software development approaches that aim at capturing this diversity on a single consolidated platform often require large upfront investments, e.g., time or budget. Alternatively, companies resort to developing one variant of a software product at a time by reusing as much as possible from already-existing product variants. However, identifying and extracting the parts to reuse is an error-prone and inefficient task compounded by the typically large number of product variants. Hence, more disciplined and systematic approaches are needed to cope with the complexity of developing and maintaining sets of product variants. Such approaches require detailed information about the product variants, the features they provide and their relations. In this paper, we present an approach to extract such variability information from product variants. It identifies traces from features and feature interactions to their implementation artifacts, and computes their dependencies. This work can be useful in many scenarios ranging from ad hoc development approaches such as clone-and-own to systematic reuse approaches such as software product lines. We applied our variability extraction approach to six case studies and provide a detailed evaluation. The results show that the extracted variability information is consistent with the variability in our six case study systems given by their variability models and available product variants.
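
    The core idea, locating the implementation artifacts that trace to a feature by comparing the variants that do and do not provide it, can be sketched with plain set operations. This toy version (all names hypothetical) ignores feature interactions and dependencies, which the actual approach also computes.

```python
def trace_feature(variants, feature):
    # variants: list of (features, artifacts) pairs, one per product variant.
    # A feature's trace is approximated as the artifacts shared by every
    # variant that has the feature, minus artifacts that also appear in
    # variants without it (simplified clone-and-own diffing, not the
    # paper's full algorithm).
    with_f = [arts for feats, arts in variants if feature in feats]
    without_f = [arts for feats, arts in variants if feature not in feats]
    common = set.intersection(*with_f) if with_f else set()
    elsewhere = set.union(*without_f) if without_f else set()
    return common - elsewhere
```

    Artifacts that survive in no single feature's trace but appear only when two features co-occur would, in the same spirit, be assigned to the interaction of those features.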

  8. 77 FR 24843 - Approval and Promulgation of Air Quality Implementation Plans; Virginia; Removal of...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-04-26

    ... requirements for large stationary internal combustion engines under the NO X SIP Call. Transco Station 175 has...), large stationary internal combustion engines, and large cement kilns. The NO X SIP Call was challenged... internal combustion engines and large cement kilns. EPA approved Virginia's Phase I NO X SIP Call...

  9. A successful trap design for capturing large terrestrial snakes

    Treesearch

    Shirley J. Burgdorf; D. Craig Rudolph; Richard N. Conner; Daniel Saenz; Richard R. Schaefer

    2005-01-01

    Large scale trapping protocols for snakes can be expensive and require large investments of personnel and time. Typical methods, such as pitfall and small funnel traps, are not useful or suitable for capturing large snakes. A method was needed to survey multiple blocks of habitat for the Louisiana Pine Snake (Pituophis ruthveni), throughout its...

  10. Infrastructure for large space telescopes

    NASA Astrophysics Data System (ADS)

    MacEwen, Howard A.; Lillie, Charles F.

    2016-10-01

    It is generally recognized (e.g., in the National Aeronautics and Space Administration response to recent congressional appropriations) that future space observatories must be serviceable, even if they are orbiting in deep space (e.g., around the Sun-Earth libration point, SEL2). On the basis of this legislation, we believe that budgetary considerations throughout the foreseeable future will require that large, long-lived astrophysics missions must be designed as evolvable semipermanent observatories that will be serviced using an operational, in-space infrastructure. We believe that the development of this infrastructure will include the design and development of a small to mid-sized servicing vehicle (MiniServ) as a key element of an affordable infrastructure for in-space assembly and servicing of future space vehicles. This can be accomplished by the adaptation of technology developed over the past half-century into a vehicle approximately the size of the ascent stage of the Apollo Lunar Module to provide some of the servicing capabilities that will be needed by very large telescopes located in deep space in the near future (2020s and 2030s). We specifically address the need for a detailed study of these servicing requirements and the current proposals for using presently available technologies to provide the appropriate infrastructure.

  11. Scalable Parameter Estimation for Genome-Scale Biochemical Reaction Networks

    PubMed Central

    Kaltenbacher, Barbara; Hasenauer, Jan

    2017-01-01

Mechanistic mathematical modeling of biochemical reaction networks using ordinary differential equation (ODE) models has improved our understanding of small- and medium-scale biological processes. While the same should in principle hold for large- and genome-scale processes, the computational methods for the analysis of ODE models which describe hundreds or thousands of biochemical species and reactions are missing so far. While individual simulations are feasible, the inference of the model parameters from experimental data is computationally too intensive. In this manuscript, we evaluate adjoint sensitivity analysis for parameter estimation in large-scale biochemical reaction networks. We present the approach for time-discrete measurements and compare it to state-of-the-art methods used in systems and computational biology. Our comparison reveals a significantly improved computational efficiency and a superior scalability of adjoint sensitivity analysis. The computational complexity is effectively independent of the number of parameters, enabling the analysis of large- and genome-scale models. Our study of a comprehensive kinetic model of ErbB signaling shows that parameter estimation using adjoint sensitivity analysis requires a fraction of the computation time of established methods. The proposed method will facilitate mechanistic modeling of genome-scale cellular processes, as required in the age of omics. PMID:28114351
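
    The sensitivity-based fitting the paper scales up can be shown on a one-parameter toy ODE. The paper uses adjoint sensitivities (cost independent of the parameter count); the sketch below uses the forward-sensitivity variant, which is the natural choice for a single parameter, and the model and function names are hypothetical.

```python
import numpy as np

def simulate_decay(k, x0=1.0, dt=0.01, n=200):
    # forward Euler for dx/dt = -k*x, integrated together with the
    # forward sensitivity s = dx/dk, which obeys ds/dt = -x - k*s
    xs, ss = np.empty(n), np.empty(n)
    x, s = x0, 0.0
    for i in range(n):
        xs[i], ss[i] = x, s
        x, s = x + dt * (-k * x), s + dt * (-x - k * s)
    return xs, ss

def fit_decay_rate(y, k0=0.5, steps=50):
    # Gauss-Newton on J(k) = 0.5 * sum((x(t;k) - y)^2): the sensitivity s
    # supplies both the gradient r.s and the curvature estimate s.s
    k = k0
    for _ in range(steps):
        x, s = simulate_decay(k)
        r = x - y
        k -= (r @ s) / (s @ s + 1e-12)
    return k
```

    For many parameters, forward sensitivities require one extra ODE system per parameter, which is exactly the cost the adjoint formulation avoids by solving a single backward equation for the gradient.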

  12. Large-scale lattice-Boltzmann simulations over lambda networks

    NASA Astrophysics Data System (ADS)

    Saksena, R.; Coveney, P. V.; Pinning, R.; Booth, S.

Amphiphilic molecules are of immense industrial importance, mainly due to their tendency to align at interfaces in a solution of immiscible species, e.g., oil and water, thereby reducing surface tension. Depending on the concentration of amphiphiles in the solution, they may assemble into a variety of morphologies, such as lamellae, micelles, sponge and cubic bicontinuous structures exhibiting non-trivial rheological properties. The main objective of this work is to study the rheological properties of very large, defect-containing gyroidal systems (of up to 1024³ lattice sites) using the lattice-Boltzmann method. Memory requirements for the simulation of such large lattices exceed that available to us on most supercomputers, and so we use MPICH-G2/MPIg to investigate geographically distributed domain decomposition simulations across HPCx in the UK and TeraGrid in the US. Use of MPICH-G2/MPIg requires the port-forwarder to work with the grid middleware on HPCx. Data from the simulations is streamed to a high-performance visualisation resource at UCL (London) for rendering and visualisation.

    Lighting the Blue Touchpaper for UK e-Science: Closing Conference of the ESLEA Project, March 26-28, 2007, The George Hotel, Edinburgh, UK
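
    The lattice-Boltzmann method itself reduces to a collide-and-stream update on particle populations. The sketch below is a minimal single-node D1Q3 diffusion step, vastly simpler than the amphiphilic multicomponent model above; in the distributed setting described, each MPI rank would run this update on its subdomain and exchange the boundary populations (a halo exchange) before every streaming step. All names here are illustrative.

```python
import numpy as np

def lbm_diffusion_step(f, tau=1.0):
    # one BGK collision + streaming step of a D1Q3 lattice-Boltzmann model
    # for a diffusive scalar; f has shape (3, n): rest, +x and -x populations
    w = np.array([2/3, 1/6, 1/6])
    rho = f.sum(axis=0)               # conserved density at each site
    feq = w[:, None] * rho            # equilibrium populations
    f = f + (feq - f) / tau           # BGK relaxation toward equilibrium
    f[1] = np.roll(f[1], 1)           # stream the +x population right
    f[2] = np.roll(f[2], -1)          # stream the -x population left
    return f
```

    Because collision is purely local and streaming only touches nearest neighbours, the method decomposes naturally over domains, which is what makes the geographically distributed runs described above feasible at all.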

  13. EvArnoldi: A New Algorithm for Large-Scale Eigenvalue Problems.

    PubMed

    Tal-Ezer, Hillel

    2016-05-19

Eigenvalues and eigenvectors are an essential theme in numerical linear algebra. Their study is mainly motivated by their high importance in a wide range of applications. Knowledge of eigenvalues is essential in quantum molecular science. Solutions of the Schrödinger equation for the electrons composing the molecule are the basis of electronic structure theory. Electronic eigenvalues compose the potential energy surfaces for nuclear motion. The eigenvectors allow calculation of dipole transition matrix elements, the core of spectroscopy. The vibrational dynamics of the molecule also require knowledge of the eigenvalues of the vibrational Hamiltonian. Typically in these problems, the dimension of Hilbert space is huge. Practically, only a small subset of eigenvalues is required. In this paper, we present a highly efficient algorithm, named EvArnoldi, for solving the large-scale eigenvalue problem. The algorithm, in its basic formulation, is mathematically equivalent to ARPACK ( Sorensen , D. C. Implicitly Restarted Arnoldi/Lanczos Methods for Large Scale Eigenvalue Calculations ; Springer , 1997 ; Lehoucq , R. B. ; Sorensen , D. C. SIAM Journal on Matrix Analysis and Applications 1996 , 17 , 789 ; Calvetti , D. ; Reichel , L. ; Sorensen , D. C. Electronic Transactions on Numerical Analysis 1994 , 2 , 21 ) (or Eigs of Matlab) but significantly simpler.
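
    ARPACK-family methods are built on the Arnoldi iteration (Lanczos, in the symmetric case): project the huge matrix onto a small Krylov subspace and take the eigenvalues of the projected tridiagonal matrix (Ritz values) as estimates, which converge fastest at the extremes of the spectrum, exactly the "small subset of eigenvalues" quantum applications need. A minimal symmetric Lanczos sketch (not EvArnoldi itself, whose specifics the abstract does not give):

```python
import numpy as np

def lanczos_extreme_eigs(A, k=3, m=40, seed=0):
    # basic m-step Lanczos for a symmetric matrix A: build a Krylov
    # tridiagonalization and return the k largest Ritz values
    n = A.shape[0]
    rng = np.random.default_rng(seed)
    q = rng.normal(size=n)
    q /= np.linalg.norm(q)
    Q = np.zeros((n, m))
    alpha, beta = np.zeros(m), np.zeros(m - 1)
    Q[:, 0] = q
    for j in range(m):
        w = A @ Q[:, j]
        alpha[j] = Q[:, j] @ w
        w -= alpha[j] * Q[:, j]
        if j > 0:
            w -= beta[j - 1] * Q[:, j - 1]
        # full reorthogonalization for numerical stability (costly but simple)
        w -= Q[:, :j + 1] @ (Q[:, :j + 1].T @ w)
        if j < m - 1:
            beta[j] = np.linalg.norm(w)
            Q[:, j + 1] = w / beta[j]
    T = np.diag(alpha) + np.diag(beta, 1) + np.diag(beta, -1)
    return np.linalg.eigvalsh(T)[-k:]
```

    The point of ARPACK's implicit restarting (and, per the abstract, of EvArnoldi's simpler formulation) is to keep m small while still converging the wanted eigenvalues.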

  14. Systems definition study for shuttle demonstration flights of large space structures, Volume 2: Technical Report

    NASA Technical Reports Server (NTRS)

    1979-01-01

    The development of large space structure (LSS) technology is discussed, with emphasis on space fabricated structures which are automatically manufactured in space from sheet-strip materials and assembled on-orbit. It is concluded that an LSS flight demonstration using an Automated Beam Builder and the orbiter as a construction base, could be performed in the 1983-1984 time period. The estimated cost is $24 million exclusive of shuttle launch costs. During the mission, a simple space platform could be constructed in-orbit to accommodate user requirements associated with earth viewing and materials exposure experiments needs.

  15. Dislocation Multiplication by Single Cross Slip for FCC at Submicron Scales

    NASA Astrophysics Data System (ADS)

    Cui, Yi-Nan; Liu, Zhan-Li; Zhuang, Zhuo

    2013-04-01

The operation mechanism of single cross slip multiplication (SCSM) is investigated by studying the response of one dislocation loop expanding in a face-centered-cubic (FCC) single crystal using three-dimensional discrete dislocation dynamics (3D-DDD) simulation. The results show that SCSM can trigger highly correlated dislocation generation in a short time, which may shed some light on understanding the large strain bursts observed experimentally. Furthermore, we find that there is a critical stress and material size for the operation of SCSM, which agrees with that required to trigger large strain bursts in the compression tests of FCC micropillars.

  16. Upper bounds on asymmetric dark matter self annihilation cross sections

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ellwanger, Ulrich; Mitropoulos, Pantelis, E-mail: ulrich.ellwanger@th.u-psud.fr, E-mail: pantelis.mitropoulos@th.u-psud.fr

    2012-07-01

    Most models for asymmetric dark matter allow for dark matter self annihilation processes, which can wash out the asymmetry at temperatures near and below the dark matter mass. We study the coupled set of Boltzmann equations for the symmetric and antisymmetric dark matter number densities, and derive conditions applicable to a large class of models for the absence of a significant wash-out of an asymmetry. These constraints are applied to various existing scenarios. In the case of left- or right-handed sneutrinos, very large electroweak gaugino masses, or very small mixing angles are required.

  17. Design of a practical model-observer-based image quality assessment method for x-ray computed tomography imaging systems

    PubMed Central

    Tseng, Hsin-Wu; Fan, Jiahua; Kupinski, Matthew A.

    2016-01-01

    Abstract. The use of a channelization mechanism on model observers not only makes mimicking human visual behavior possible, but also reduces the amount of image data needed to estimate the model observer parameters. The channelized Hotelling observer (CHO) and channelized scanning linear observer (CSLO) have recently been used to assess CT image quality for detection tasks and combined detection/estimation tasks, respectively. Although the use of channels substantially reduces the amount of data required to compute image quality, the number of scans required for CT imaging is still not practical for routine use. It is our desire to further reduce the number of scans required to make CHO or CSLO an image quality tool for routine and frequent system validations and evaluations. This work explores different data-reduction schemes and designs an approach that requires only a few CT scans. Three different kinds of approaches are included in this study: a conventional CHO/CSLO technique with a large sample size, a conventional CHO/CSLO technique with fewer samples, and an approach that we will show requires fewer samples to mimic conventional performance with a large sample size. The mean value and standard deviation of areas under ROC/EROC curve were estimated using the well-validated shuffle approach. The results indicate that an 80% data reduction can be achieved without loss of accuracy. This substantial data reduction is a step toward a practical tool for routine-task-based QA/QC CT system assessment. PMID:27493982
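
    Once images have been reduced to channel outputs, the channelized Hotelling observer is a small linear computation. The sketch below shows that final step from channel outputs to a detectability index; the channel functions themselves (e.g., Gabor or Laguerre-Gauss profiles) and the data-reduction schemes the paper evaluates are not reproduced here, and the function name is illustrative.

```python
import numpy as np

def cho_detectability(v_sig, v_bkg):
    # channelized Hotelling observer: v_sig / v_bkg are (n_samples, n_channels)
    # channel outputs for signal-present and signal-absent images
    dv = v_sig.mean(axis=0) - v_bkg.mean(axis=0)
    S = 0.5 * (np.cov(v_sig.T) + np.cov(v_bkg.T))  # average intraclass scatter
    w_hot = np.linalg.solve(S, dv)                 # Hotelling template
    t_sig, t_bkg = v_sig @ w_hot, v_bkg @ w_hot    # decision-variable samples
    return np.sqrt(dv @ w_hot), t_sig, t_bkg       # d' plus test statistics
```

    The decision-variable samples `t_sig` and `t_bkg` are what an ROC/EROC analysis (estimated in the paper with a shuffle approach) is built from, and the number of scans needed to estimate `S` reliably is exactly what the proposed data-reduction schemes attack.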

  18. Thermal control requirements for large space structures

    NASA Technical Reports Server (NTRS)

    Manoff, M.

    1978-01-01

Performance capabilities and weight requirements of large space structure systems will be significantly influenced by thermal response characteristics. Analyses have been performed to determine temperature levels and gradients for structural configurations and elemental concepts proposed for advanced system applications ranging from relatively small, low-power communication antennas to extremely large, high-power Satellite Power Systems (SPS). Results are presented for selected platform configurations, candidate strut elements, and potential mission environments. The analyses also incorporate material and surface optical property variation. The results illustrate many of the thermal problems which may be encountered in the development of these systems.

  19. Machine learning for large-scale wearable sensor data in Parkinson's disease: Concepts, promises, pitfalls, and futures.

    PubMed

    Kubota, Ken J; Chen, Jason A; Little, Max A

    2016-09-01

For the treatment and monitoring of Parkinson's disease (PD) to be scientific, a key requirement is that measurement of disease stages and severity is quantitative, reliable, and repeatable. The last 50 years in PD research have been dominated by qualitative, subjective ratings obtained by human interpretation of the presentation of disease signs and symptoms at clinical visits. More recently, "wearable," sensor-based, quantitative, objective, and easy-to-use systems for quantifying PD signs for large numbers of participants over extended durations have been developed. This technology has the potential to significantly improve both clinical diagnosis and management in PD and the conduct of clinical studies. However, the large-scale, high-dimensional character of the data captured by these wearable sensors requires sophisticated signal processing and machine-learning algorithms to transform it into scientifically and clinically meaningful information. Such algorithms that "learn" from data have shown remarkable success in making accurate predictions for complex problems in which human skill has been required to date, but they are challenging to evaluate and apply without a basic understanding of the underlying logic on which they are based. This article contains a nontechnical tutorial review of relevant machine-learning algorithms, also describing their limitations and how these can be overcome. It discusses implications of this technology and a practical road map for realizing the full potential of this technology in PD research and practice. © 2016 International Parkinson and Movement Disorder Society.

  20. Heads, Shoulders, Elbows, Knees, and Toes: Modular Gdf5 Enhancers Control Different Joints in the Vertebrate Skeleton

    PubMed Central

    Schoor, Michael; Mortlock, Doug P.; Reddi, A. Hari; Kingsley, David M.

    2016-01-01

    Synovial joints are crucial for support and locomotion in vertebrates, and are the frequent site of serious skeletal defects and degenerative diseases in humans. Growth and differentiation factor 5 (Gdf5) is one of the earliest markers of joint formation, is required for normal joint development in both mice and humans, and has been genetically linked to risk of common osteoarthritis in Eurasian populations. Here, we systematically survey the mouse Gdf5 gene for regulatory elements controlling expression in synovial joints. We identify separate regions of the locus that control expression in axial tissues, in proximal versus distal joints in the limbs, and in remarkably specific sub-sets of composite joints like the elbow. Predicted transcription factor binding sites within Gdf5 regulatory enhancers are required for expression in particular joints. The multiple enhancers that control Gdf5 expression in different joints are distributed over a hundred kilobases of DNA, including regions both upstream and downstream of Gdf5 coding exons. Functional rescue tests in mice confirm that the large flanking regions are required to restore normal joint formation and patterning. Orthologs of these enhancers are located throughout the large genomic region previously associated with common osteoarthritis risk in humans. The large array of modular enhancers for Gdf5 provide a new foundation for studying the spatial specificity of joint patterning in vertebrates, as well as new candidates for regulatory regions that may also influence osteoarthritis risk in human populations. PMID:27902701
