Sample records for large scale operations

  1. Recent Human Factors Contributions to Improve Military Operations (Human Factors and Ergonomics Society Bulletin. Volume 46, Number 12, December 2003)

    DTIC Science & Technology

    2003-12-01

    Operations run the full gamut from large-scale, theater-wide combat, as witnessed in Operation Iraqi Freedom, to small-scale operations against terrorists, to operations…

  2. Aquatic Plant Control Research Program. Large-Scale Operations Management Test of Use of the White Amur for Control of Problem Aquatic Plants. Report 5. Synthesis Report.

    DTIC Science & Technology

    1984-06-01

    Aquatic Plant Control Research Program, Large-Scale Operations Management…(U) Army Engineer Waterways Experiment Station, Vicksburg, MS… Technical Report A-78-2, Large-Scale Operations Management Test of Use of the White Amur for Control of Problem Aquatic Plants, Report 5: Synthesis Report, by Andrew… Corps of Engineers, Washington, DC 20314… Large-Scale Operations Management Test of Use of the White Amur for Control of Problem Aquatic Plants

  3. Large-Scale Operations Management Test of Use of The White Amur for Control of Problem Aquatic Plants. Report 1. Baseline Studies. Volume V. The Herpetofauna of Lake Conway, Florida.

    DTIC Science & Technology

    1981-06-01

    University of South Florida, Tampa, Dept. of Biology… Large-Scale Operations Management Test of Use of the White Amur…(U) Jun 81… Army Engineer Waterways Experiment Station, P.O. Box 631, Vicksburg, Miss. 39180… Large-Scale Operations Management Test of Use of the White Amur for… Large-Scale Operations Management Test of Use of the White Amur, Report 1 of a series…

  4. Large-Scale Operations Management Test of Use of the White Amur for Control of Problem Aquatic Plants. Report 2. First Year Poststocking Results. Volume II. The Fish, Mammals, and Waterfowl of Lake Conway, Florida.

    DTIC Science & Technology

    1982-02-01

    Florida Game and Fresh Water Fish Commission, Orlando… Large-Scale Operations Management Test of Use of the White Amur…(U)… of a series of reports documenting a large-scale operations management test of use of the white amur for control of problem aquatic plants in Lake… M. 1982. "Large-Scale Operations Management Test of Use of the White Amur for Control of Problem Aquatic Plants; Report 2, First Year Poststocking…"

  5. Large-Scale Operations Management Test of Use of the White Amur for Control of Problem Aquatic Plants. Report 2. First Year Poststocking Results. Volume VI. The Water and Sediment Quality of Lake Conway, Florida.

    DTIC Science & Technology

    1982-02-01

    Orange County Pollution Control Dept., Orlando, FL… Large-Scale Operations Management Test of Use of the White Amur…(U) Feb 82, H. D.… Large-Scale Operations Management Test of use of the white amur for control of problem aquatic plants in Lake Conway, Fla. Report 1 of the series presents… as follows: Miller, D. 1982. "Large-Scale Operations Management Test of Use of the White Amur for Control of Problem Aquatic Plants; Report 2, First…"

  6. Large-Scale Operations Management Test of Use of the White Amur for Control of Problem Aquatic Plants. Report 4. Third Year Poststocking Results. Volume VI. The Water and Sediment Quality of Lake Conway, Florida.

    DTIC Science & Technology

    1983-01-01

    Large-Scale Operations Management Test of Use of the White Amur for Contr…(U) Miller and Miller Inc., Orlando, FL, H. D. Miller et al.… Large-Scale Operations Management Test of Use of the White Amur for Control of Problem Aquatic Plants, Report 1: Baseline Studies, Volume I… Boyd, J. 1983. "Large-Scale Operations Management Test of Use of the White Amur for Control of Problem Aquatic Plants; Report 4, Third Year Poststocking…"

  7. Large-Scale Operations Management Test of Use of the White Amur for Control of Problem Aquatic Plants. Report 2. First Year Poststocking Results. Volume IV. Nitrogen and Phosphorus Dynamics of the Lake Conway Ecosystem: Loading Budgets and a Dynamic Hydrologic Phosphorus Model.

    DTIC Science & Technology

    1982-08-01

    Florida Univ., Gainesville, Dept. of Environmental Engineering… Large-Scale Operations Management Test of Use of the White Amur… Conway ecosystem and is part of the Large-Scale Operations Management Test (LSOMT) of the Aquatic Plant Control Research Program (APCRP) at the WES… should be cited as follows: Blancher, E. C., II, and Fellows, C. R. 1982. "Large-Scale Operations Management Test of Use of the White Amur for Control…"

  8. Aquatic Plant Control Research Program. Large-Scale Operations Management Test of Use of the White Amur for Control of Problem Aquatic Plants. Reports 2 and 3. First and Second Year Poststocking Results. Volume 5. The Herpetofauna of Lake Conway, Florida: Community Analysis.

    DTIC Science & Technology

    1983-07-01

    Aquatic Plant Control Research Program, Technical Report A-78-2, Large-Scale Operations Management Test of… Waterways Experiment Station, P.O. Box 631, Vicksburg, Miss. 39180… Large-Scale Operations Management Test of Use of the… Technical Report A-78-2… Large-Scale Operations Management Test…

  9. Proceedings of the Annual Meeting (14th) Aquatic Plant Control Research Planning and Operations Review, Held at Lake Eufaula, Oklahoma on 26-29 November 1979.

    DTIC Science & Technology

    1980-10-01

    Development; Problem Identification and Assessment for Aquatic Plant Management; Natural Succession of Aquatic Plants; Large-Scale Operations Management Test...of Insects and Pathogens for Control of Waterhyacinth in Louisiana; Large-Scale Operations Management Test to Evaluate Prevention Methodology for...Control of Eurasian Watermilfoil in Washington; Large-Scale Operations Management Test Using the White Amur at Lake Conway, Florida; and Aquatic Plant Control Activities in the Panama Canal Zone.

  10. Efficient On-Demand Operations in Large-Scale Infrastructures

    ERIC Educational Resources Information Center

    Ko, Steven Y.

    2009-01-01

    In large-scale distributed infrastructures such as clouds, Grids, peer-to-peer systems, and wide-area testbeds, users and administrators typically desire to perform "on-demand operations" that deal with the most up-to-date state of the infrastructure. However, the scale and dynamism present in the operating environment make it challenging to…

  11. Why small-scale cannabis growers stay small: five mechanisms that prevent small-scale growers from going large scale.

    PubMed

    Hammersvik, Eirik; Sandberg, Sveinung; Pedersen, Willy

    2012-11-01

    Over the past 15-20 years, domestic cultivation of cannabis has been established in a number of European countries. New techniques have made such cultivation easier; however, the bulk of growers remain small-scale. In this study, we explore the factors that prevent small-scale growers from increasing their production. The study is based on 1 year of ethnographic fieldwork and qualitative interviews conducted with 45 Norwegian cannabis growers, 10 of whom were growing on a large scale and 35 on a small scale. The study identifies five mechanisms that prevent small-scale indoor growers from going large-scale. First, large-scale operations involve a number of people, large sums of money, a high workload and a high risk of detection, and thus demand a higher level of organizational skills than do small growing operations. Second, financial assets are needed to start a large 'grow-site'. Housing rent, electricity, equipment and nutrients are expensive. Third, to be able to sell large quantities of cannabis, growers need access to an illegal distribution network and knowledge of how to act according to black market norms and structures. Fourth, large-scale operations require advanced horticultural skills to maximize yield and quality, which demands greater skills and knowledge than does small-scale cultivation. Fifth, small-scale growers are often embedded in the 'cannabis culture', which emphasizes anti-commercialism, anti-violence and ecological and community values. Hence, starting up large-scale production will imply having to renegotiate or abandon these values. Going from small- to large-scale cannabis production is a demanding task: ideologically, technically, economically and personally. The many obstacles that small-scale growers face and the lack of interest and motivation for going large-scale suggest that the risk of a 'slippery slope' from small-scale to large-scale growing is limited. Possible political implications of the findings are discussed. Copyright © 2012 Elsevier B.V. All rights reserved.

  12. Large-Scale Operations Management Test of Use of the White Amur for Control of Problem Aquatic Plants. Report 2. First Year Poststocking Results. Volume III. The Plankton and Benthos of Lake Conway, Florida,

    DTIC Science & Technology

    1981-11-01

    Florida Univ., Gainesville, Dept. of Environmental Engineering… Large-Scale Operations Management Test of Use of the White Amur…(U)… Operations Management Test of Use of the White Amur for Control of Problem Aquatic Plants, Report 1: Baseline Studies, Volume I: The Aquatic Macrophytes of… Large-Scale Operations Management Test of Use of the White Amur for Control of Problem Aquatic Plants, Report 2 of a series (in 7 volumes)…

  13. Large-Scale Operations Management Test of Use of the White Amur for Control of Problem Aquatic Plants. Report 3. Second Year Poststocking Results. Volume VI. The Water and Sediment Quality of Lake Conway, Florida.

    DTIC Science & Technology

    1982-08-01

    Orange County Pollution Control Dept., Orlando, FL… Large-Scale Operations Management Test of Use of the White Amur…(U) Aug 82, H.… Operations Management Test of Use of the White Amur for Control of Problem Aquatic Plants, Second Year Poststocking Results, Volume VI: The Water… Large-Scale Operations Management Test of Use of the White Amur for Control of…, Report 3 of a series…

  14. Large-Scale Operations Management Test of Use of the White Amur for Control of Problem Aquatic Plants. Report 2. First Year Poststocking Results. Volume VII. A Model for Evaluation of the Response of the Lake Conway, Florida, Ecosystem to Introduction of the White Amur.

    DTIC Science & Technology

    1981-11-01

    Large-Scale Operations Management Test of the Use of the White Amur for Control of Problem Aquatic Plants, Report 2 of a series… 1981. "Large-Scale Operations Management Test of the Use of the White Amur for Control of Problem Aquatic Plants; Report 2, First Year Poststocking…" … Large-Scale Operations Management Test of Use of the White Amur for Control of Problem Aquatic Plants: A Model for Evaluation of…

  15. Coupling large scale hydrologic-reservoir-hydraulic models for impact studies in data sparse regions

    NASA Astrophysics Data System (ADS)

    O'Loughlin, Fiachra; Neal, Jeff; Wagener, Thorsten; Bates, Paul; Freer, Jim; Woods, Ross; Pianosi, Francesca; Sheffield, Justin

    2017-04-01

    As hydraulic modelling moves to increasingly large spatial domains, it has become essential to take reservoirs and their operations into account. Large-scale hydrological models have included reservoirs for at least the past two decades, yet they cannot explicitly model variations in the spatial extent of reservoirs, and many reservoir operations in hydrological models are not undertaken at run time. This requires a hydraulic model, yet to date no continental-scale hydraulic model has directly simulated reservoirs and their operations. In addition to the need to include reservoirs and their operations in hydraulic models as they move to global coverage, there is also a need to link such models to large-scale hydrology models or land surface schemes. This is especially true for Africa, where the number of river gauges has consistently declined since the middle of the twentieth century. In this study we address these two major issues by developing: 1) a coupling methodology for the VIC large-scale hydrological model and the LISFLOOD-FP hydraulic model, and 2) a reservoir module for the LISFLOOD-FP model, which currently includes four sets of reservoir operating rules taken from the major large-scale hydrological models. The Volta Basin, West Africa, was chosen to demonstrate the capability of the modelling framework as it is a large river basin (~400,000 km2) and contains the largest man-made lake in terms of area (8,482 km2), Lake Volta, created by the Akosombo dam. Lake Volta also experiences a seasonal variation in water levels of between two and six metres that creates a dynamic shoreline. In this study, we first run our coupled VIC and LISFLOOD-FP model without explicitly modelling Lake Volta and then compare these results with those from model runs where the dam operations and Lake Volta are included. The results show that we are able to reproduce the variation in Lake Volta water levels and that including the dam operations and Lake Volta has significant impacts on water levels across the domain.
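
    The coupling described above boils down to passing hydrologic runoff through a rule-governed reservoir before it becomes the hydraulic model's upstream boundary inflow. The sketch below illustrates only that pattern; the piecewise-linear release rule, all parameter values, and the function names are illustrative assumptions, not the actual VIC or LISFLOOD-FP interfaces.

    ```python
    # Hypothetical sketch of one-way hydrologic-to-hydraulic coupling with a
    # simple reservoir operating rule; names and numbers are illustrative.
    import numpy as np

    def reservoir_release(storage, s_min, s_max, r_min, r_max):
        """Piecewise-linear rule: release scales with relative storage."""
        frac = np.clip((storage - s_min) / (s_max - s_min), 0.0, 1.0)
        return r_min + frac * (r_max - r_min)

    def couple_step(inflow, storage, dt, s_min, s_max, r_min, r_max):
        """One coupling step: hydrologic inflow updates reservoir storage;
        the rule-based release becomes the hydraulic model's boundary flow."""
        release = reservoir_release(storage, s_min, s_max, r_min, r_max)
        storage = storage + (inflow - release) * dt   # simple water balance
        return storage, release

    # Daily driver loop (synthetic inflow in m^3/s; storage in m^3)
    storage = 5.0e10
    for inflow in 90.0 + 40.0 * np.sin(np.arange(365) / 58.0):
        storage, release = couple_step(inflow, storage, dt=86400.0,
                                       s_min=2.0e10, s_max=1.48e11,
                                       r_min=30.0, r_max=120.0)
        # `release` would be handed to the hydraulic model's boundary here
    ```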

  16. Research on the Construction Management and Sustainable Development of Large-Scale Scientific Facilities in China

    NASA Astrophysics Data System (ADS)

    Guiquan, Xi; Lin, Cong; Xuehui, Jin

    2018-05-01

    As an important platform for scientific and technological development, large-scale scientific facilities are the cornerstone of technological innovation and a guarantee for economic and social development. Research on the management of large-scale scientific facilities can play a key role in scientific research, sociology and key national strategy. This paper reviews the characteristics of large-scale scientific facilities and summarizes the development status of China's large-scale scientific facilities. Finally, the construction, management, operation and evaluation of large-scale scientific facilities are analyzed from the perspective of sustainable development.

  17. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Trujillo, Angelina Michelle

    Strategy, Planning, Acquiring: very large-scale computing platforms come and go, and planning for immensely scalable machines often precedes actual procurement by 3 years. Procurement can take another year or more. Integration: after acquisition, machines must be integrated into the computing environments at LANL and connected to scalable storage via large-scale storage networking, assuring correct and secure operations. Management and Utilization: ongoing operations, maintenance, and troubleshooting of the hardware and systems software at massive scale are required.

  18. IS THE SMALL-SCALE MAGNETIC FIELD CORRELATED WITH THE DYNAMO CYCLE?

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Karak, Bidya Binay; Brandenburg, Axel, E-mail: bbkarak@nordita.org

    2016-01-01

    The small-scale magnetic field is ubiquitous at the solar surface—even at high latitudes. From observations we know that this field is uncorrelated (or perhaps even weakly anticorrelated) with the global sunspot cycle. Our aim is to explore the origin, and particularly the cycle dependence, of such a phenomenon using three-dimensional dynamo simulations. We adopt a simple model of a turbulent dynamo in a shearing box driven by helically forced turbulence. Depending on the dynamo parameters, large-scale (global) and small-scale (local) dynamos can be excited independently in this model. Based on simulations in different parameter regimes, we find that, when only the large-scale dynamo is operating in the system, the small-scale magnetic field generated through shredding and tangling of the large-scale magnetic field is positively correlated with the global magnetic cycle. However, when both dynamos are operating, the small-scale field is produced from both the small-scale dynamo and the tangling of the large-scale field. In this situation, when the large-scale field is weaker than the equipartition value of the turbulence, the small-scale field is almost uncorrelated with the large-scale magnetic cycle. On the other hand, when the large-scale field is stronger than the equipartition value, we observe an anticorrelation between the small-scale field and the large-scale magnetic cycle. This anticorrelation can be interpreted as a suppression of the small-scale dynamo. Based on our studies we conclude that the observed small-scale magnetic field in the Sun is generated by the combined mechanisms of a small-scale dynamo and tangling of the large-scale field.

  19. On the large eddy simulation of turbulent flows in complex geometry

    NASA Technical Reports Server (NTRS)

    Ghosal, Sandip

    1993-01-01

    Application of the method of Large Eddy Simulation (LES) to a turbulent flow consists of three separate steps. First, a filtering operation is performed on the Navier-Stokes equations to remove the small spatial scales. The resulting equations that describe the space time evolution of the 'large eddies' contain the subgrid-scale (sgs) stress tensor that describes the effect of the unresolved small scales on the resolved scales. The second step is the replacement of the sgs stress tensor by some expression involving the large scales - this is the problem of 'subgrid-scale modeling'. The final step is the numerical simulation of the resulting 'closed' equations for the large scale fields on a grid small enough to resolve the smallest of the large eddies, but still much larger than the fine scale structures at the Kolmogorov length. In dividing a turbulent flow field into 'large' and 'small' eddies, one presumes that a cut-off length delta can be sensibly chosen such that all fluctuations on a scale larger than delta are 'large eddies' and the remainder constitute the 'small scale' fluctuations. Typically, delta would be a length scale characterizing the smallest structures of interest in the flow. In an inhomogeneous flow, the 'sensible choice' for delta may vary significantly over the flow domain. For example, in a wall bounded turbulent flow, most statistical averages of interest vary much more rapidly with position near the wall than far away from it. Further, there are dynamically important organized structures near the wall on a scale much smaller than the boundary layer thickness. Therefore, the minimum size of eddies that need to be resolved is smaller near the wall. In general, for the LES of inhomogeneous flows, the width of the filtering kernel delta must be considered to be a function of position. If a filtering operation with a nonuniform filter width is performed on the Navier-Stokes equations, one does not in general get the standard large eddy equations. The complication is caused by the fact that a filtering operation with a nonuniform filter width in general does not commute with the operation of differentiation. This is one of the issues that we have looked at in detail as it is basic to any attempt at applying LES to complex geometry flows. Our principal findings are summarized.
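
    The commutation problem raised at the end of this abstract can be stated compactly. A minimal form, assuming a filter kernel G parameterized by a position-dependent width and vanishing boundary terms, is:

    ```latex
    % Filtering with a position-dependent width \Delta(x):
    %   \bar{u}(x) = \int G\big(x - y;\, \Delta(x)\big)\, u(y)\, dy .
    % Differentiating \bar{u} picks up an extra term through \Delta(x), so
    % filtering and differentiation no longer commute:
    \begin{equation}
      \frac{\partial \bar{u}}{\partial x}
      = \overline{\frac{\partial u}{\partial x}}
      + \frac{d\Delta}{dx}
        \int \frac{\partial G}{\partial \Delta}\big(x - y;\, \Delta(x)\big)\, u(y)\, dy ,
    \end{equation}
    % so the filtered Navier-Stokes equations acquire commutation-error
    % terms wherever \Delta varies, e.g. near walls where finer
    % resolution is required.
    ```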

  20. The Application of Large-Scale Hypermedia Information Systems to Training.

    ERIC Educational Resources Information Center

    Crowder, Richard; And Others

    1995-01-01

    Discusses the use of hypermedia in electronic information systems that support maintenance operations in large-scale industrial plants. Findings show that after establishing an information system, the same resource base can be used to train personnel how to use the computer system and how to perform operational and maintenance tasks. (Author/JMV)

  21. Large-scale flow experiments for managing river systems

    USGS Publications Warehouse

    Konrad, Christopher P.; Olden, Julian D.; Lytle, David A.; Melis, Theodore S.; Schmidt, John C.; Bray, Erin N.; Freeman, Mary C.; Gido, Keith B.; Hemphill, Nina P.; Kennard, Mark J.; McMullen, Laura E.; Mims, Meryl C.; Pyron, Mark; Robinson, Christopher T.; Williams, John G.

    2011-01-01

    Experimental manipulations of streamflow have been used globally in recent decades to mitigate the impacts of dam operations on river systems. Rivers are challenging subjects for experimentation, because they are open systems that cannot be isolated from their social context. We identify principles to address the challenges of conducting effective large-scale flow experiments. Flow experiments have both scientific and social value when they help to resolve specific questions about the ecological action of flow with a clear nexus to water policies and decisions. Water managers must integrate new information into operating policies for large-scale experiments to be effective. Modeling and monitoring can be integrated with experiments to analyze long-term ecological responses. Experimental design should include spatially extensive observations and well-defined, repeated treatments. Large-scale flow manipulations are only a part of dam operations that affect river systems. Scientists can ensure that experimental manipulations continue to be a valuable approach for the scientifically based management of river systems.

  22. Towards Large-area Field-scale Operational Evapotranspiration for Water Use Mapping

    NASA Astrophysics Data System (ADS)

    Senay, G. B.; Friedrichs, M.; Morton, C.; Huntington, J. L.; Verdin, J.

    2017-12-01

    Field-scale evapotranspiration (ET) estimates are needed for improving surface and groundwater use and water budget studies. Ideally, field-scale ET estimates would be at regional to national levels and cover long time periods. As a result of large data storage and computational requirements associated with processing field-scale satellite imagery such as Landsat, numerous challenges remain to develop operational ET estimates over large areas for detailed water use and availability studies. However, the combination of new science, data availability, and cloud computing technology is enabling unprecedented capabilities for ET mapping. To demonstrate this capability, we used Google's Earth Engine cloud computing platform to create nationwide annual ET estimates with 30-meter resolution Landsat (~16,000 images) and gridded weather data using the Operational Simplified Surface Energy Balance (SSEBop) model in support of the National Water Census, a USGS research program designed to build decision support capacity for water management agencies and other natural resource managers. By leveraging Google's Earth Engine Application Programming Interface (API) and developing software in a collaborative, open-platform environment, we rapidly advance from research towards applications for large-area field-scale ET mapping. Cloud computing of the Landsat image archive, combined with other satellite, climate, and weather data, is creating never-imagined opportunities for assessing ET model behavior and uncertainty, and ultimately providing the ability for more robust operational monitoring and assessment of water use at field scales.
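
    The processing pattern described, filtering the Landsat archive and reducing it server-side rather than downloading ~16,000 scenes, can be sketched with the public Earth Engine Python API. This is not the SSEBop model itself; the dataset ID and band name are real Earth Engine identifiers, but the cloud-cover threshold, region, and overall workflow are illustrative assumptions.

    ```python
    # Minimal sketch of the collection-filter-reduce pattern used for
    # large-area Landsat processing in Google Earth Engine; not SSEBop.
    import ee

    ee.Initialize()  # assumes authenticated Earth Engine credentials

    # Landsat 8 Collection 2 Level-2 imagery over one year
    landsat = (ee.ImageCollection('LANDSAT/LC08/C02/T1_L2')
               .filterDate('2016-01-01', '2017-01-01')
               .filter(ee.Filter.lt('CLOUD_COVER', 20)))

    # Annual-mean thermal band, a stand-in for the land-surface-temperature
    # input an energy-balance ET model such as SSEBop would consume
    lst_annual = landsat.select('ST_B10').mean()

    # Reduce to a regional statistic entirely on the server side
    region = ee.Geometry.Rectangle([-104.1, 37.0, -102.0, 38.5])  # example AOI
    stats = lst_annual.reduceRegion(reducer=ee.Reducer.mean(),
                                    geometry=region, scale=30,
                                    maxPixels=1e9)
    print(stats.getInfo())
    ```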

  23. A novel artificial fish swarm algorithm for solving large-scale reliability-redundancy application problem.

    PubMed

    He, Qiang; Hu, Xiangtao; Ren, Hong; Zhang, Hongqi

    2015-11-01

    A novel artificial fish swarm algorithm (NAFSA) is proposed for solving the large-scale reliability-redundancy allocation problem (RAP). In NAFSA, the social behaviors of the fish swarm are classified in three ways: foraging behavior, reproductive behavior, and random behavior. The foraging behavior uses two position-updating strategies, and the selection and crossover operators are applied to define the reproductive ability of an artificial fish. For the random behavior, which is essentially a mutation strategy, the basic cloud generator is used as the mutation operator. Finally, numerical results of four benchmark problems and a large-scale RAP are reported and compared. NAFSA shows good performance in terms of computational accuracy and computational efficiency for the large-scale RAP. Copyright © 2015 ISA. Published by Elsevier Ltd. All rights reserved.
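
    A schematic sketch of the three behavior classes the abstract names is given below. The placeholder objective, parameter values, and the Gaussian stand-in for the "basic cloud generator" mutation are assumptions, not the authors' exact operators.

    ```python
    # Schematic sketch of NAFSA's behavior classes: foraging, reproduction
    # (selection + crossover), and random mutation. Illustrative only.
    import numpy as np

    rng = np.random.default_rng(0)

    def sphere(x):                      # placeholder objective (minimize)
        return float(np.sum(x ** 2))

    def forage(x, best, step=0.1):
        """Try a random visual position; keep it if better, else move
        toward the swarm-best fish."""
        trial = x + step * rng.standard_normal(x.size)
        return trial if sphere(trial) < sphere(x) else x + step * (best - x)

    def reproduce(x1, x2):
        """Selection + uniform crossover between two parent fish."""
        mask = rng.random(x1.size) < 0.5
        return np.where(mask, x1, x2)

    def mutate(x, scale=0.05):
        """Random behavior: cloud-generator-like perturbation (Gaussian
        used here as a stand-in)."""
        return x + scale * rng.standard_normal(x.size)

    swarm = rng.uniform(-5, 5, size=(30, 10))
    for _ in range(200):
        best = min(swarm, key=sphere)
        i, j = rng.integers(0, len(swarm), size=2)
        swarm[i] = forage(swarm[i], best)
        swarm[j] = mutate(reproduce(swarm[i], swarm[j]))
    print(sphere(min(swarm, key=sphere)))   # best objective value found
    ```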

  24. EFFECTS OF LARGE-SCALE POULTRY FARMS ON AQUATIC MICROBIAL COMMUNITIES: A MOLECULAR INVESTIGATION.

    EPA Science Inventory

    The effects of large-scale poultry production operations on water quality and human health are largely unknown. Poultry litter is frequently applied as fertilizer to agricultural lands adjacent to large poultry farms. Run-off from the land introduces a variety of stressors into t...

  25. The atmospheric implications of radiation belt remediation

    NASA Astrophysics Data System (ADS)

    Rodger, C. J.; Clilverd, M. A.; Ulich, Th.; Verronen, P. T.; Turunen, E.; Thomson, N. R.

    2006-08-01

    High altitude nuclear explosions (HANEs) and geomagnetic storms can produce large scale injections of relativistic particles into the inner radiation belts. It is recognised that these large increases in >1 MeV trapped electron fluxes can shorten the operational lifetime of low Earth orbiting satellites, threatening a large, valuable population. Therefore, studies are being undertaken to bring about practical human control of the radiation belts, termed "Radiation Belt Remediation" (RBR). Here we consider the upper atmospheric consequences of an RBR system operating over either 1 or 10 days. The RBR-forced neutral chemistry changes, leading to NOx enhancements and Ox depletions, are significant during the timescale of the precipitation but are generally not long-lasting. The magnitudes, time-scales, and altitudes of these changes are no more significant than those observed during large solar proton events. In contrast, RBR-operation will lead to unusually intense HF blackouts for about the first half of the operation time, producing large scale disruptions to radio communication and navigation systems. While the neutral atmosphere changes are not particularly important, HF disruptions could be an important area for policy makers to consider, particularly for the remediation of natural injections.

  26. NREL, California Independent System Operator, and First Solar

    Science.gov Websites

    NREL, California Independent System Operator, and First Solar Demonstrate Essential Reliability Services with Utility-Scale Solar: NREL, the California Independent System Operator (CAISO), and First Solar conducted a demonstration project on a large utility-scale photovoltaic (PV) power plant in California to…

  27. Implementation of the Large-Scale Operations Management Test in the State of Washington.

    DTIC Science & Technology

    1982-12-01

    During FY 79, the U.S. Army Engineer Waterways Experiment Station (WES), Vicksburg, Miss., completed the first phase of its 3-year Large-Scale Operations Management Test (LSOMT). The LSOMT was designed to develop an operational plan to identify methodologies that can be implemented by the U.S. Army Engineer District, Seattle (NPS), to prevent the exotic aquatic macrophyte Eurasian watermilfoil (Myriophyllum spicatum L.) from reaching problem-level proportions in water bodies in the state of Washington. The WES developed specific plans as integral elements…

  28. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gallarno, George; Rogers, James H; Maxwell, Don E

    The high computational capability of graphics processing units (GPUs) is enabling and driving the scientific discovery process at large scale. The world's second-fastest supercomputer for open science, Titan, has more than 18,000 GPUs that computational scientists use to perform scientific simulations and data analysis. Understanding of GPU reliability characteristics, however, is still in its nascent stage since GPUs have only recently been deployed at large scale. This paper presents a detailed study of GPU errors and their impact on system operations and applications, describing experiences with the 18,688 GPUs on the Titan supercomputer as well as lessons learned in the process of efficient operation of GPUs at scale. These experiences are helpful to HPC sites which already have large-scale GPU clusters or plan to deploy GPUs in the future.

  29. Study on Thermal Decomposition Characteristics of Ammonium Nitrate Emulsion Explosive in Different Scales

    NASA Astrophysics Data System (ADS)

    Wu, Qiujie; Tan, Liu; Xu, Sen; Liu, Dabin; Min, Li

    2018-04-01

    Numerous accidents involving emulsion explosives (EE) are attributed to uncontrolled thermal decomposition of ammonium nitrate emulsion (ANE, the intermediate of EE) and EE at large scale. To study the thermal decomposition characteristics of ANE and EE at different scales, a large-scale test, the modified vented pipe test (MVPT), and two laboratory-scale tests, differential scanning calorimetry (DSC) and accelerating rate calorimetry (ARC), were applied in the present study. The scale effect and the water effect both play an important role in the thermal stability of ANE and EE. The measured decomposition temperatures of ANE and EE in MVPT are 146°C and 144°C, respectively, much lower than those in DSC and ARC. As the size of the same sample in DSC, ARC, and MVPT successively increases, the onset temperatures decrease. In the same test, the measured onset temperature of ANE is higher than that of EE. The water content of the sample stabilizes it. The large-scale MVPT can provide information for real-life operations. Large-scale operations carry more risk, and continuous overheating should be avoided.

  30. A unifying framework for systems modeling, control systems design, and system operation

    NASA Technical Reports Server (NTRS)

    Dvorak, Daniel L.; Indictor, Mark B.; Ingham, Michel D.; Rasmussen, Robert D.; Stringfellow, Margaret V.

    2005-01-01

    Current engineering practice in the analysis and design of large-scale multi-disciplinary control systems is typified by some form of decomposition, whether functional, physical, or discipline-based, that enables multiple teams to work in parallel and in relative isolation. Too often, the resulting system after integration is an awkward marriage of different control and data mechanisms with poor end-to-end accountability. System-of-systems engineering, which faces this problem on a large scale, cries out for a unifying framework to guide analysis, design, and operation. This paper describes such a framework based on a state-, model-, and goal-based architecture for semi-autonomous control systems that guides analysis and modeling, shapes control system software design, and directly specifies operational intent. This paper illustrates the key concepts in the context of a large-scale, concurrent, globally distributed system of systems: NASA's proposed Array-based Deep Space Network.

  31. Private School Chains in Chile: Do Better Schools Scale Up? Policy Analysis. No. 682

    ERIC Educational Resources Information Center

    Elacqua, Gregory; Contreras, Dante; Salazar, Felipe; Santos, Humberto

    2011-01-01

    There is a persistent debate over the role of scale of operations in education. Some argue that school franchises offer educational services more effectively than do small independent schools. Skeptics counter that large, centralized operations create hard-to-manage bureaucracies and foster diseconomies of scale and that small schools are more…

  32. Large-scale data analysis of power grid resilience across multiple US service regions

    NASA Astrophysics Data System (ADS)

    Ji, Chuanyi; Wei, Yun; Mei, Henry; Calzada, Jorge; Carey, Matthew; Church, Steve; Hayes, Timothy; Nugent, Brian; Stella, Gregory; Wallace, Matthew; White, Joe; Wilcox, Robert

    2016-05-01

    Severe weather events frequently result in large-scale power failures, affecting millions of people for extended durations. However, the lack of comprehensive, detailed failure and recovery data has impeded large-scale resilience studies. Here, we analyse data from four major service regions representing Upstate New York during Super Storm Sandy and daily operations. Using non-stationary spatiotemporal random processes that relate infrastructural failures to recoveries and cost, our data analysis shows that local power failures have a disproportionally large non-local impact on people (that is, the top 20% of failures interrupted 84% of services to customers). A large number (89%) of small failures, represented by the bottom 34% of customers and commonplace devices, resulted in 56% of the total cost of 28 million customer interruption hours. Our study shows that extreme weather does not cause, but rather exacerbates, existing vulnerabilities, which are obscured in daily operations.
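
    The headline concentration statistic quoted above (the top 20% of failures interrupting 84% of services) is a simple top-share computation. The sketch below reproduces that calculation on synthetic heavy-tailed data standing in for the proprietary outage records.

    ```python
    # Sketch of a top-share concentration statistic over outage records;
    # the Pareto-like synthetic impacts are an assumption for illustration.
    import numpy as np

    rng = np.random.default_rng(1)
    # customers affected per failure event (heavy-tailed distribution)
    customers = (rng.pareto(a=1.2, size=10_000) + 1.0) * 10.0

    def top_share(impacts, frac=0.2):
        """Fraction of total impact carried by the largest `frac` of events."""
        order = np.sort(impacts)[::-1]
        k = int(np.ceil(frac * len(order)))
        return order[:k].sum() / order.sum()

    print(f"top 20% of failures -> {top_share(customers):.0%} of interruptions")
    ```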

  33. Summation-by-Parts operators with minimal dispersion error for coarse grid flow calculations

    NASA Astrophysics Data System (ADS)

    Linders, Viktor; Kupiainen, Marco; Nordström, Jan

    2017-07-01

    We present a procedure for constructing Summation-by-Parts operators with minimal dispersion error both near and far from numerical interfaces. Examples of such operators are constructed and compared with a higher order non-optimised Summation-by-Parts operator. Experiments show that the optimised operators are superior for wave propagation and turbulent flows involving large wavenumbers, long solution times and large ranges of resolution scales.
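
    For readers unfamiliar with the framework: a Summation-by-Parts first-derivative operator D satisfies H D + (H D)^T = B with a positive-definite norm matrix H, mimicking integration by parts discretely. The sketch below builds the classical second-order operator (not the optimised operators of this paper, whose coefficients are not reproduced here) and verifies the property.

    ```python
    # Classical second-order SBP first-derivative operator with a check of
    # the SBP property  H D + (H D)^T = B = diag(-1, 0, ..., 0, 1).
    import numpy as np

    def sbp_d1(n, h):
        """Second-order SBP first derivative on n points with spacing h."""
        D = np.zeros((n, n))
        for i in range(1, n - 1):                 # central interior stencil
            D[i, i - 1], D[i, i + 1] = -0.5, 0.5
        D[0, 0], D[0, 1] = -1.0, 1.0              # one-sided boundary closures
        D[-1, -2], D[-1, -1] = -1.0, 1.0
        H = np.eye(n); H[0, 0] = H[-1, -1] = 0.5  # diagonal norm matrix
        return D / h, H * h

    n, h = 21, 0.05
    D, H = sbp_d1(n, h)
    B = np.zeros((n, n)); B[0, 0], B[-1, -1] = -1.0, 1.0
    assert np.allclose(H @ D + (H @ D).T, B)      # SBP property holds
    x = np.linspace(0, 1, n)
    print(np.max(np.abs(D @ np.sin(x) - np.cos(x))))  # boundary-dominated error
    ```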

  34. Exploring network operations for data and information networks

    NASA Astrophysics Data System (ADS)

    Yao, Bing; Su, Jing; Ma, Fei; Wang, Xiaomin; Zhao, Xiyang; Yao, Ming

    2017-01-01

    Barabási and Albert, in 1999, formulated scale-free models based on real networks: the World-Wide Web, the Internet, metabolic and protein networks, and language and sexual networks. Scale-free networks not only appear all around us but also tend to be of high quality. Since high-quality information networks must transfer data feasibly and efficiently, their topological structures are very important for data safety. We build up network operations for constructing large-scale dynamic networks from smaller network models that have good properties and high quality. We focus on the simplest operators that can be composed into complex operations, and we are interested in how closely the resulting operations approach desired network properties.
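
    The paper does not specify its operators in this snippet; as a hedged illustration of the general idea, standard graph operations such as the Cartesian product grow a large network from small seed models while preserving the seed's local structure (networkx shown; the seed and backbone choices are arbitrary assumptions).

    ```python
    # Illustration of building a larger network from small models via a
    # standard graph operation; stands in for the paper's own operators.
    import networkx as nx

    core = nx.complete_graph(4)      # small, high-quality seed model
    ring = nx.cycle_graph(25)        # simple large-scale backbone

    # Cartesian product: every backbone node is expanded into a core copy
    big = nx.cartesian_product(ring, core)
    print(big.number_of_nodes(), big.number_of_edges())  # 100 nodes, 250 edges

    # Diameter shows how the operation composes the two models' distances
    print(nx.diameter(big))          # ring diameter (12) + core diameter (1)
    ```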

  35. Ascertaining Validity in the Abstract Realm of PMESII Simulation Models: An Analysis of the Peace Support Operations Model (PSOM)

    DTIC Science & Technology

    2009-06-01

    simulation is the campaign-level Peace Support Operations Model (PSOM). This thesis provides a quantitative analysis of PSOM. The results are based… multiple potential outcomes; further development and analysis is required before the model is used for large-scale analysis.

  36. Large-scale Eucalyptus energy farms and power cogeneration

    Treesearch

    Robert C. Noroña

    1983-01-01

    A thorough evaluation of all factors possibly affecting a large-scale planting of eucalyptus is foremost in determining the cost effectiveness of the planned operation. Seven basic areas of concern must be analyzed: (1) species selection; (2) site preparation; (3) planting; (4) weed control; (5)…

  37. Opportunities for Breakthroughs in Large-Scale Computational Simulation and Design

    NASA Technical Reports Server (NTRS)

    Alexandrov, Natalia; Alter, Stephen J.; Atkins, Harold L.; Bey, Kim S.; Bibb, Karen L.; Biedron, Robert T.; Carpenter, Mark H.; Cheatwood, F. McNeil; Drummond, Philip J.; Gnoffo, Peter A.

    2002-01-01

    Opportunities for breakthroughs in the large-scale computational simulation and design of aerospace vehicles are presented. Computational fluid dynamics tools to be used within multidisciplinary analysis and design methods are emphasized. The opportunities stem from speedups and robustness improvements in the underlying unit operations associated with simulation (geometry modeling, grid generation, physical modeling, analysis, etc.). Further, an improved programming environment can synergistically integrate these unit operations to leverage the gains. The speedups result from reducing the problem setup time through geometry modeling and grid generation operations, and reducing the solution time through the operation counts associated with solving the discretized equations to a sufficient accuracy. The opportunities are addressed only at a general level here, but an extensive list of references containing further details is included. The opportunities discussed are being addressed through the Fast Adaptive Aerospace Tools (FAAST) element of the Advanced Systems Concept to Test (ASCoT) and the third Generation Reusable Launch Vehicles (RLV) projects at NASA Langley Research Center. The overall goal is to enable greater inroads into the design process with large-scale simulations.

  38. Vulnerability of China's nearshore ecosystems under intensive mariculture development.

    PubMed

    Liu, Hui; Su, Jilan

    2017-04-01

    Rapid economic development and an increasing population in China have exerted tremendous pressures on coastal ecosystems. In addition to land-based pollutants and reclamation, the fast expansion of large-scale intensive mariculture activities has also brought about additional effects. So far, the ecological impact of rapid mariculture development and its large-scale operations has not drawn enough attention. In this paper, the rapid development of mariculture in China is reviewed, China's effort in the application of ecological mariculture is examined, and the vulnerability of the marine ecosystem to mariculture impact is evaluated through a number of examples. Removal or reduction of large and forage fish, due both to habitat loss from reclamation/mariculture and to overfishing for food or fishmeal, may have far-reaching effects on the coastal and shelf ecosystems in the long run. Large-scale intensive mariculture operations carry with them undesirable biological and biochemical characteristics, which may have consequences on natural ecosystems beyond normally perceived spatial and temporal boundaries. As our understanding of the possible impacts of large-scale intensive mariculture is lagging far behind its development, much research is urgently needed.

  39. Computing the universe: how large-scale simulations illuminate galaxies and dark energy

    NASA Astrophysics Data System (ADS)

    O'Shea, Brian

    2015-04-01

    High-performance and large-scale computing is absolutely essential to understanding astronomical objects such as stars, galaxies, and the cosmic web. This is because these are structures that operate on physical, temporal, and energy scales that cannot be reasonably approximated in the laboratory, and whose complexity and nonlinearity often defy analytic modeling. In this talk, I show how the growth of computing platforms over time has facilitated our understanding of astrophysical and cosmological phenomena, focusing primarily on galaxies and large-scale structure in the Universe.

  40. The statistical power to detect cross-scale interactions at macroscales

    USGS Publications Warehouse

    Wagner, Tyler; Fergus, C. Emi; Stow, Craig A.; Cheruvelil, Kendra S.; Soranno, Patricia A.

    2016-01-01

    Macroscale studies of ecological phenomena are increasingly common because stressors such as climate and land-use change operate at large spatial and temporal scales. Cross-scale interactions (CSIs), where ecological processes operating at one spatial or temporal scale interact with processes operating at another scale, have been documented in a variety of ecosystems and contribute to complex system dynamics. However, studies investigating CSIs are often dependent on compiling multiple data sets from different sources to create multithematic, multiscaled data sets, which results in structurally complex, and sometimes incomplete data sets. The statistical power to detect CSIs needs to be evaluated because of their importance and the challenge of quantifying CSIs using data sets with complex structures and missing observations. We studied this problem using a spatially hierarchical model that measures CSIs between regional agriculture and its effects on the relationship between lake nutrients and lake productivity. We used an existing large multithematic, multiscaled database, LAke multiscaled GeOSpatial, and temporal database (LAGOS), to parameterize the power analysis simulations. We found that the power to detect CSIs was more strongly related to the number of regions in the study rather than the number of lakes nested within each region. CSI power analyses will not only help ecologists design large-scale studies aimed at detecting CSIs, but will also focus attention on CSI effect sizes and the degree to which they are ecologically relevant and detectable with large data sets.
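
    The power analysis described is simulation-based; the sketch below mirrors the design question (number of regions versus lakes per region) with a deliberately crude pooled regression and z-test that ignores the hierarchical error structure the authors model. Effect sizes and noise levels are made-up assumptions, not LAGOS values.

    ```python
    # Crude simulation-based power check for a cross-scale interaction:
    # a regional covariate z modulates the lake-level slope of y on x.
    import numpy as np

    rng = np.random.default_rng(2)

    def detects_csi(n_regions, n_lakes, gamma=0.3):
        z = rng.standard_normal(n_regions)               # regional covariate
        region = np.repeat(np.arange(n_regions), n_lakes)
        # per-region slope: CSI effect gamma plus region-level noise,
        # which is why adding regions helps more than adding lakes
        slope = 1.0 + gamma * z + 0.3 * rng.standard_normal(n_regions)
        x = rng.standard_normal(n_regions * n_lakes)     # lake predictor
        y = slope[region] * x + rng.standard_normal(x.size)
        # CSI enters as an x * z interaction in a pooled regression
        X = np.column_stack([np.ones_like(x), x, z[region], x * z[region]])
        beta, res, *_ = np.linalg.lstsq(X, y, rcond=None)
        sigma2 = res[0] / (len(y) - X.shape[1])
        se = np.sqrt(sigma2 * np.linalg.inv(X.T @ X)[3, 3])
        return abs(beta[3] / se) > 1.96                  # crude z-test

    for n_regions in (10, 20, 40):
        power = np.mean([detects_csi(n_regions, n_lakes=15)
                         for _ in range(300)])
        print(f"{n_regions} regions: power ~ {power:.2f}")
    ```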

  41. Aquatic Plant Control Research Program. Large-Scale Operations Management Test (LSOMT) of Insects and Pathogens for Control of Waterhyacinth in Louisiana. Volume 1. Results for 1979-1981.

    DTIC Science & Technology

    1985-01-01

    Aquatic Plant Control Research Program, Large-Scale Operations Management…(U) Army Engineer Waterways Experiment Station, Vicksburg, MS… PO Box 631, Vicksburg, Mississippi 39180-0631, and University of Tennessee-Chattanooga, Chattanooga… Aquatic Plant Control Research Program… Key words: aquatic plant control; Louisiana; biological control; plant…

  42. Development of a superconductor magnetic suspension and balance prototype facility for studying the feasibility of applying this technique to large scale aerodynamic testing

    NASA Technical Reports Server (NTRS)

    Zapata, R. N.; Humphris, R. R.; Henderson, K. C.

    1975-01-01

    The basic research and development work towards proving the feasibility of operating an all-superconductor magnetic suspension and balance device for aerodynamic testing is presented. The feasibility of applying a quasi-six-degree-of freedom free support technique to dynamic stability research was studied along with the design concepts and parameters for applying magnetic suspension techniques to large-scale aerodynamic facilities. A prototype aerodynamic test facility was implemented. Relevant aspects of the development of the prototype facility are described in three sections: (1) design characteristics; (2) operational characteristics; and (3) scaling to larger facilities.

  43. Cortical circuitry implementing graphical models.

    PubMed

    Litvak, Shai; Ullman, Shimon

    2009-11-01

    In this letter, we develop and simulate a large-scale network of spiking neurons that approximates the inference computations performed by graphical models. Unlike previous related schemes, which used sum and product operations in either the log or linear domains, the current model uses an inference scheme based on the sum and maximization operations in the log domain. Simulations show that using these operations, a large-scale circuit, which combines populations of spiking neurons as basic building blocks, is capable of finding close approximations to the full mathematical computations performed by graphical models within a few hundred milliseconds. The circuit is general in the sense that it can be wired for any graph structure, it supports multistate variables, and it uses standard leaky integrate-and-fire neuronal units. Following previous work, which proposed relations between graphical models and the large-scale cortical anatomy, we focus on the cortical microcircuitry and propose how anatomical and physiological aspects of the local circuitry may map onto elements of the graphical model implementation. We discuss in particular the roles of three major types of inhibitory neurons (small fast-spiking basket cells, large layer 2/3 basket cells, and double-bouquet neurons), subpopulations of strongly interconnected neurons with their unique connectivity patterns in different cortical layers, and the possible role of minicolumns in the realization of the population-based maximum operation.
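
    The sum-and-maximization inference the circuit approximates is ordinary max-sum message passing (max-product in the log domain). A minimal sketch on a chain-structured model with arbitrary random potentials follows.

    ```python
    # Max-sum (log-domain max-product) message passing on a chain model;
    # the potentials here are arbitrary examples, not the paper's network.
    import numpy as np

    rng = np.random.default_rng(3)
    T, K = 6, 4                                  # chain length, states per node
    log_phi = rng.standard_normal((T, K))        # unary log-potentials
    log_psi = rng.standard_normal((T - 1, K, K)) # pairwise log-potentials

    # Forward pass: only sums and maximizations, as in the spiking circuit
    msg = np.zeros((T, K))
    back = np.zeros((T - 1, K), dtype=int)
    for t in range(1, T):
        scores = msg[t - 1][:, None] + log_phi[t - 1][:, None] + log_psi[t - 1]
        back[t - 1] = np.argmax(scores, axis=0)  # best predecessor per state
        msg[t] = np.max(scores, axis=0)

    # Backtrack to recover the MAP assignment
    states = np.empty(T, dtype=int)
    states[-1] = np.argmax(msg[-1] + log_phi[-1])
    for t in range(T - 2, -1, -1):
        states[t] = back[t, states[t + 1]]
    print("MAP assignment:", states)
    ```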

  44. The Effectiveness of Private School Franchises in Chile's National Voucher Program

    ERIC Educational Resources Information Center

    Elacqua, Gregory; Contreras, Dante; Salazar, Felipe; Santos, Humberto

    2011-01-01

    There is persistent debate over the role of scale of operations in education. Some argue that school franchises offer educational services more effectively than small independent schools. Skeptics counter that large centralized operations create hard-to-manage bureaucracies and foster diseconomies of scale and that small schools are more effective…

  45. Assessing the weighted multi-objective adaptive surrogate model optimization to derive large-scale reservoir operating rules with sensitivity analysis

    NASA Astrophysics Data System (ADS)

    Zhang, Jingwen; Wang, Xu; Liu, Pan; Lei, Xiaohui; Li, Zejun; Gong, Wei; Duan, Qingyun; Wang, Hao

    2017-01-01

    The optimization of a large-scale reservoir system is time-consuming due to its intrinsic characteristics of non-commensurable objectives and high dimensionality. One way to solve the problem is to employ an efficient multi-objective optimization algorithm in the derivation of large-scale reservoir operating rules. In this study, the Weighted Multi-Objective Adaptive Surrogate Model Optimization (WMO-ASMO) algorithm is used. It consists of three steps: (1) simplifying the large-scale reservoir operating rules by the aggregation-decomposition model, (2) identifying the most sensitive parameters through multivariate adaptive regression splines (MARS) for dimensional reduction, and (3) reducing computational cost and speeding up the search process by WMO-ASMO, embedded with the weighted non-dominated sorting genetic algorithm II (WNSGAII). An intercomparison of the non-dominated sorting genetic algorithm (NSGAII), WNSGAII and WMO-ASMO is conducted in the large-scale reservoir system of the Xijiang river basin in China. Results indicate that: (1) WNSGAII surpasses NSGAII in the median of annual power generation, increased by 1.03% (from 523.29 to 528.67 billion kW h), and in the median of the ecological index, improved by 3.87% (from 1.879 to 1.809) with 500 simulations, because of the weighted crowding distance, and (2) WMO-ASMO outperforms NSGAII and WNSGAII in terms of better solutions (annual power generation of 530.032 billion kW h and ecological index of 1.675) with 1000 simulations, and computational time is reduced by 25% (from 10 h to 8 h) with 500 simulations. Therefore, the proposed method proves more efficient and can provide a better Pareto frontier.
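
    The abstract attributes WNSGAII's gain to a weighted crowding distance. A hedged sketch of one plausible weighting of the standard crowding-distance computation follows; the objective values and weights are illustrative assumptions, not the authors' settings.

    ```python
    # One plausible weighted crowding distance (stand-in for WNSGAII's):
    # per-objective normalized gaps are weighted before being summed.
    import numpy as np

    def weighted_crowding_distance(F, w):
        """F: (n_points, n_objectives) objective values within one front;
        w: per-objective weights. Larger distance = less crowded."""
        n, m = F.shape
        d = np.zeros(n)
        for j in range(m):
            order = np.argsort(F[:, j])
            span = F[order[-1], j] - F[order[0], j] or 1.0
            d[order[0]] = d[order[-1]] = np.inf   # keep boundary solutions
            gaps = (F[order[2:], j] - F[order[:-2], j]) / span
            d[order[1:-1]] += w[j] * gaps         # weight objective j's gap
        return d

    # Illustrative front: (power generation, ecological index)
    front = np.array([[520.1, 1.90], [525.4, 1.85],
                      [528.7, 1.81], [530.0, 1.68]])
    print(weighted_crowding_distance(front, w=np.array([0.7, 0.3])))
    ```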

  46. The JEDBURGHS: Combat Operations Conducted in the Finistere Region of Brittany, France from July-September 1944

    DTIC Science & Technology

    1990-06-01

    commence large-scale operations on 2 August 1944. Napoleon's hat was the local name of a famous rose-red granite rock at the holiday resort of Perros… SFHQ that the BBC message authorizing large-scale attacks on the Germans in Brittany be "Le Chapeau de Napoleon est-il toujours a Perros-Guirec?"… Francis, along with teams Hilary, Horace, and Gilbert (discussed later in Chapters 7, 6, and 5…

  47. Teachers' Perceptions of Teaching in Workplace Simulations in Vocational Education

    ERIC Educational Resources Information Center

    Jossberger, Helen; Brand-Gruwel, Saskia; van de Wiel, Margje W.; Boshuizen, Henny P.

    2015-01-01

    In a large-scale top-down innovation operation in the Netherlands, workplace simulations have been implemented in vocational schools, where students are required to work independently and self-direct their learning. However, research has shown that the success of such large-scale top-down innovations depends on how well their execution in schools…

  48. An Alternative Way to Model Population Ability Distributions in Large-Scale Educational Surveys

    ERIC Educational Resources Information Center

    Wetzel, Eunike; Xu, Xueli; von Davier, Matthias

    2015-01-01

    In large-scale educational surveys, a latent regression model is used to compensate for the shortage of cognitive information. Conventionally, the covariates in the latent regression model are principal components extracted from background data. This operational method has several important disadvantages, such as the handling of missing data and…

  49. Finite Mixture Multilevel Multidimensional Ordinal IRT Models for Large Scale Cross-Cultural Research

    ERIC Educational Resources Information Center

    de Jong, Martijn G.; Steenkamp, Jan-Benedict E. M.

    2010-01-01

    We present a class of finite mixture multilevel multidimensional ordinal IRT models for large scale cross-cultural research. Our model is proposed for confirmatory research settings. Our prior for item parameters is a mixture distribution to accommodate situations where different groups of countries have different measurement operations, while…

  50. Integrating weather and geotechnical monitoring data for assessing the stability of large scale surface mining operations

    NASA Astrophysics Data System (ADS)

    Steiakakis, Chrysanthos; Agioutantis, Zacharias; Apostolou, Evangelia; Papavgeri, Georgia; Tripolitsiotis, Achilles

    2016-01-01

    The geotechnical challenges for safe slope design in large-scale surface mining operations are enormous. Sometimes one degree of slope inclination can significantly reduce the overburden-to-ore ratio and therefore dramatically improve the economics of the operation, while large-scale slope failures may have a significant impact on human lives. Furthermore, adverse weather conditions, such as high precipitation rates, may unfavorably affect the already delicate balance between operations and safety. Geotechnical, weather and production parameters should be systematically monitored and evaluated in order to safely operate such pits. Appropriate data management, processing and storage are critical to ensure timely and informed decisions. This paper presents an integrated data management system that was developed over a number of years and illustrates its advantages through a specific application. The case study shows how the high-production slopes of a mine exceeding depths of 100-120 m were successfully mined with an average displacement rate of 10-20 mm/day, approaching an almost slow-to-moderate landslide velocity. Monitoring data from the past four years are included in the database and can be analyzed to produce valuable results. Time-series correlations of movements, precipitation records, etc. are evaluated and presented in this case study. The results can be used to successfully manage mine operations and ensure the safety of the mine and the workforce.
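
    A time-series correlation of slope displacement against precipitation, the kind of analysis the case study reports, can be sketched with pandas. The column names, synthetic records, and 90-day window are assumptions, not the mine's actual database schema.

    ```python
    # Sketch of a rolling displacement-vs-precipitation correlation on
    # synthetic daily monitoring records (illustrative magnitudes only).
    import numpy as np
    import pandas as pd

    rng = np.random.default_rng(4)
    idx = pd.date_range("2012-01-01", periods=4 * 365, freq="D")
    rain = rng.gamma(shape=0.4, scale=8.0, size=idx.size)          # mm/day
    # displacement tracks smoothed precipitation plus noise (~10-20 mm/day)
    disp = (12.0 + 0.6 * pd.Series(rain, idx).rolling(30).mean().fillna(0)
            + rng.standard_normal(idx.size))

    df = pd.DataFrame({"precip_mm": rain, "disp_mm_day": disp.values},
                      index=idx)
    # 90-day rolling correlation flags periods when rainfall drives movement
    df["corr_90d"] = df["precip_mm"].rolling(90).corr(df["disp_mm_day"])
    print(df["corr_90d"].describe())
    ```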

  51. Artificial intelligence issues related to automated computing operations

    NASA Technical Reports Server (NTRS)

    Hornfeck, William A.

    1989-01-01

    Large data processing installations represent target systems for effective applications of artificial intelligence (AI) constructs. The system organization of a large data processing facility at the NASA Marshall Space Flight Center is presented. The methodology and the issues which are related to AI application to automated operations within a large-scale computing facility are described. Problems to be addressed and initial goals are outlined.

  52. Fine resolution probabilistic land cover classification of landscapes in the southeastern United States

    Treesearch

    Joseph St. Peter; John Hogland; Nathaniel Anderson; Jason Drake; Paul Medley

    2018-01-01

    Land cover classification provides valuable information for prioritizing management and conservation operations across large landscapes. Current regional scale land cover geospatial products within the United States have a spatial resolution that is too coarse to provide the necessary information for operations at the local and project scales. This paper describes a...

  53. Impact of Utility-Scale Distributed Wind on Transmission-Level System Operations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brancucci Martinez-Anido, C.; Hodge, B. M.

    2014-09-01

    This report presents a new renewable integration study that aims to assess the potential for adding distributed wind to the current power system with minimal or no upgrades to the distribution or transmission electricity systems. It investigates the impacts of integrating large amounts of utility-scale distributed wind power on bulk system operations by performing a case study on the power system of the Independent System Operator-New England (ISO-NE).

  54. Overview of Opportunities for Co-Location of Solar Energy Technologies and Vegetation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Macknick, Jordan; Beatty, Brenda; Hill, Graham

    2013-12-01

    Large-scale solar facilities have the potential to contribute significantly to national electricity production. Many solar installations are large-scale or utility-scale, with a capacity over 1 MW and connected directly to the electric grid. Large-scale solar facilities offer an opportunity to achieve economies of scale in solar deployment, yet there have been concerns about the amount of land required for solar projects and the impact of solar projects on local habitat. During the site preparation phase for utility-scale solar facilities, developers often grade land and remove all vegetation to minimize installation and operational costs, prevent plants from shading panels, and minimize potential fire or wildlife risks. However, the common site preparation practice of removing vegetation can be avoided in certain circumstances, and there have been successful examples where solar facilities have been co-located with agricultural operations or have native vegetation growing beneath the panels. In this study we outline some of the impacts that large-scale solar facilities can have on the local environment, provide examples of installations where impacts have been minimized through co-location with vegetation, characterize the types of co-location, and give an overview of the potential benefits from co-location of solar energy projects and vegetation. The varieties of co-location can be replicated or modified for site-specific use at other solar energy installations around the world. We conclude with opportunities to improve upon our understanding of ways to reduce the environmental impacts of large-scale solar installations.

  55. Use of large-scale silvicultural studies to evaluate management options in Pacific Northwest forests of the United States.

    Treesearch

    Stephen E. Reutebuch; Constance A. Harrington; David D. Marshall; Leslie C. Brodie

    2004-01-01

    A suite of large-scale silvicultural experiments has been established to develop and assess operational silviculture options for Pacific Northwest Douglas-fir (Pseudotsuga menziesii [Mirb.] Franco var. menziesii) forests. This paper summarizes three such studies that focus on three major stages in the life of managed stands…

  56. Standard Errors for National Trends in International Large-Scale Assessments in the Case of Cross-National Differential Item Functioning

    ERIC Educational Resources Information Center

    Sachse, Karoline A.; Haag, Nicole

    2017-01-01

    Standard errors computed according to the operational practices of international large-scale assessment studies such as the Programme for International Student Assessment's (PISA) or the Trends in International Mathematics and Science Study (TIMSS) may be biased when cross-national differential item functioning (DIF) and item parameter drift are…

  57. Aqueous Two-Phase Systems at Large Scale: Challenges and Opportunities.

    PubMed

    Torres-Acosta, Mario A; Mayolo-Deloisa, Karla; González-Valdez, José; Rito-Palomares, Marco

    2018-06-07

    Aqueous two-phase systems (ATPS) have proved to be an efficient and integrative operation for enhancing the recovery of industrially relevant bioproducts. Since the discovery of ATPS, a variety of works have been published on their scaling from 10 to 1000 L. Although ATPS have achieved high recovery and purity yields, there is still a gap between their bench-scale use and potential industrial applications. In this context, this review critically analyzes ATPS scale-up strategies to enhance their potential for industrial adoption. In particular, it discusses large-scale operation considerations, different phase separation procedures, the optimization techniques available to maximize recovery and purity (univariate, response surface methodology, and genetic algorithms), and economic modeling to predict large-scale costs. ATPS intensification to increase the amount of sample processed per system, the development of recycling strategies, and the creation of highly efficient predictive models remain areas of great significance that can be further exploited with the use of high-throughput techniques. Moreover, the development of novel ATPS can maximize their specificity, increasing the possibilities for future industrial adoption. This review attempts to present the areas of opportunity for increasing the attractiveness of ATPS at industrial levels. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
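
    As a hedged illustration of the response-surface branch of the optimization toolbox surveyed above, the Python sketch below fits a quadratic surface to hypothetical recovery measurements over two ATPS factors and locates the predicted optimum; the factor names, ranges, and all data are invented for the example and are not taken from the review.

      # Minimal response-surface sketch for ATPS recovery optimization.
      # Factors, ranges, and recovery data are hypothetical.
      import numpy as np
      from scipy.optimize import minimize

      # Hypothetical design points: PEG % (w/w), salt % (w/w) -> recovery %.
      X = np.array([[12, 8], [12, 12], [16, 8], [16, 12], [14, 10],
                    [10, 10], [18, 10], [14, 6], [14, 14]], dtype=float)
      y = np.array([71, 78, 74, 83, 88, 69, 76, 72, 80], dtype=float)

      def quad_features(x):
          """Second-order model terms: 1, p, s, p*s, p^2, s^2."""
          p, s = x[..., 0], x[..., 1]
          return np.stack([np.ones_like(p), p, s, p * s, p**2, s**2], axis=-1)

      beta, *_ = np.linalg.lstsq(quad_features(X), y, rcond=None)  # fit surface

      # Maximize the fitted surface inside the experimental region.
      res = minimize(lambda x: -(quad_features(np.asarray(x)) @ beta),
                     x0=[14, 10], bounds=[(10, 18), (6, 14)])
      print("predicted optimum (PEG%%, salt%%): %s, recovery: %.1f"
            % (np.round(res.x, 2), -res.fun))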

  18. GENASIS Mathematics : Object-oriented manifolds, operations, and solvers for large-scale physics simulations

    NASA Astrophysics Data System (ADS)

    Cardall, Christian Y.; Budiardja, Reuben D.

    2018-01-01

    The large-scale computer simulation of a system of physical fields governed by partial differential equations requires some means of approximating the mathematical limit of continuity. For example, conservation laws are often treated with a 'finite-volume' approach in which space is partitioned into a large number of small 'cells,' with fluxes through cell faces providing an intuitive discretization modeled on the mathematical definition of the divergence operator. Here we describe and make available Fortran 2003 classes furnishing extensible object-oriented implementations of simple meshes and the evolution of generic conserved currents thereon, along with individual 'unit test' programs and larger example problems demonstrating their use. These classes inaugurate the Mathematics division of our developing astrophysics simulation code GENASIS (General Astrophysical Simulation System), which will be expanded over time to include additional meshing options, mathematical operations, solver types, and solver variations appropriate for many multiphysics applications.
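
    The flux-through-faces discretization described in this abstract can be illustrated in one dimension; the short Python sketch below (an analogue of the idea, not the GENASIS Fortran 2003 classes themselves) advects a conserved density with upwind face fluxes, so each cell is updated by the difference of the fluxes through its two faces.

      # 1-D finite-volume sketch of a conservation law du/dt + d(a*u)/dx = 0,
      # illustrating the flux-through-faces update the abstract describes.
      import numpy as np

      n, a = 200, 1.0                        # number of cells, advection speed
      dx = 1.0 / n
      dt = 0.5 * dx / a                      # CFL-limited time step
      x = (np.arange(n) + 0.5) * dx
      u = np.exp(-200.0 * (x - 0.3) ** 2)    # cell-averaged conserved density

      for _ in range(200):
          # For a > 0 the upwind flux through a face is a * (upwind cell value);
          # periodic boundaries via np.roll.
          flux_left = a * np.roll(u, 1)      # flux entering through left faces
          flux_right = a * u                 # flux leaving through right faces
          u += dt / dx * (flux_left - flux_right)  # divergence = face-flux difference

      print("total conserved quantity:", u.sum() * dx)  # unchanged by the update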

  19. Research on the impacts of large-scale electric vehicles integration into power grid

    NASA Astrophysics Data System (ADS)

    Su, Chuankun; Zhang, Jian

    2018-06-01

    Because of their distinctive energy supply mode, electric vehicles can improve the efficiency of energy utilization and reduce environmental pollution, and they are therefore receiving increasing attention. However, the charging behavior of electric vehicles is random and intermittent. Large-scale, uncoordinated charging places great stress on the structure and operation of the power grid and affects its safe and economical operation. With the development of V2G technology for electric vehicles, the study of the charging and discharging characteristics of electric vehicles is of great significance for improving the safe operation of the power grid and the efficiency of energy utilization.
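
    A toy numerical illustration of the contrast this abstract draws between uncoordinated and coordinated charging; the fleet size, charger rating, energy need, and load curve below are all invented for the example.

      # Sketch: uncoordinated vs. valley-filling EV charging on a feeder
      # load curve. All numbers are illustrative, not from the paper.
      import numpy as np

      rng = np.random.default_rng(0)
      hours = np.arange(24)
      base = 60 + 25 * np.sin((hours - 9) * np.pi / 12)   # MW, stylized demand
      n_ev, p_charge, e_need = 20000, 0.007, 0.021        # fleet, MW, MWh per EV

      # Uncoordinated: each EV starts charging on arrival (~18:00) for 3 h.
      uncoord = np.zeros(24)
      starts = rng.normal(18, 1.5, n_ev).astype(int) % 24
      for s in starts:
          uncoord[[s, (s + 1) % 24, (s + 2) % 24]] += p_charge

      # Coordinated: fill the overnight valley first (greedy water-filling).
      coord = np.zeros(24)
      for _ in range(round(n_ev * e_need / p_charge)):
          h = np.argmin(base + coord)        # lowest-load hour so far
          coord[h] += p_charge

      print("peak load, uncoordinated: %.0f MW" % (base + uncoord).max())
      print("peak load, coordinated:   %.0f MW" % (base + coord).max())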

  20. Operations

    ERIC Educational Resources Information Center

    Wilkins, Jesse L. M.; Norton, Anderson; Boyce, Steven J.

    2013-01-01

    Previous research has documented schemes and operations that undergird students' understanding of fractions. This prior research was based, in large part, on small-group teaching experiments. However, written assessments are needed in order for teachers and researchers to assess students' ways of operating on a whole-class scale. In this study,…

  1. Universal Batch Steganalysis

    DTIC Science & Technology

    2014-06-30

    steganalysis) in large-scale datasets such as might be obtained by monitoring a corporate network or social network. Identifying guilty actors...floating point operations (1 TFLOP) for a 1-megapixel image. We designed a new implementation using Compute Unified Device Architecture (CUDA) on NVIDIA

  2. Large-scale thermal storage systems. Possibilities of operation and state of the art

    NASA Astrophysics Data System (ADS)

    Jank, R.

    1983-05-01

    The state of the art of large-scale thermal energy storage concepts is reviewed. For earth-pit storage, effort must concentrate on the materials question. The use of container storage in conventional long-distance thermal networks should be stimulated. Aquifer storage should be tested in a pilot plant to gain experience with natural aquifer use.

  3. Understanding and Controlling Sialylation in a CHO Fc-Fusion Process

    PubMed Central

    Lewis, Amanda M.; Croughan, William D.; Aranibar, Nelly; Lee, Alison G.; Warrack, Bethanne; Abu-Absi, Nicholas R.; Patel, Rutva; Drew, Barry; Borys, Michael C.; Reily, Michael D.; Li, Zheng Jian

    2016-01-01

    A Chinese hamster ovary (CHO) bioprocess, where the product is a sialylated Fc-fusion protein, was operated at pilot and manufacturing scale and significant variation of sialylation level was observed. In order to more tightly control glycosylation profiles, we sought to identify the cause of variability. Untargeted metabolomics and transcriptomics methods were applied to select samples from the large scale runs. Lower sialylation was correlated with elevated mannose levels, a shift in glucose metabolism, and increased oxidative stress response. Using a 5-L scale model operated with a reduced dissolved oxygen set point, we were able to reproduce the phenotypic profiles observed at manufacturing scale including lower sialylation, higher lactate and lower ammonia levels. Targeted transcriptomics and metabolomics confirmed that reduced oxygen levels resulted in increased mannose levels, a shift towards glycolysis, and increased oxidative stress response similar to the manufacturing scale. Finally, we propose a biological mechanism linking large-scale operation and sialylation variation. Oxidative stress results from gas transfer limitations at large scale and the presence of oxygen dead-zones inducing upregulation of glycolysis and mannose biosynthesis, and downregulation of hexosamine biosynthesis and acetyl-CoA formation. The lower flux through the hexosamine pathway and reduced intracellular pools of acetyl-CoA led to reduced formation of N-acetylglucosamine and N-acetylneuraminic acid, both key building blocks of N-glycan structures. This study reports for the first time a link between oxidative stress and mammalian protein sialylation. In this study, process, analytical, metabolomic, and transcriptomic data at manufacturing, pilot, and laboratory scales were taken together to develop a systems level understanding of the process and identify oxygen limitation as the root cause of glycosylation variability. PMID:27310468

  4. A large-scale forest fragmentation experiment: the Stability of Altered Forest Ecosystems Project.

    PubMed

    Ewers, Robert M; Didham, Raphael K; Fahrig, Lenore; Ferraz, Gonçalo; Hector, Andy; Holt, Robert D; Kapos, Valerie; Reynolds, Glen; Sinun, Waidi; Snaddon, Jake L; Turner, Edgar C

    2011-11-27

    Opportunities to conduct large-scale field experiments are rare, but provide a unique opportunity to reveal the complex processes that operate within natural ecosystems. Here, we review the design of existing, large-scale forest fragmentation experiments. Based on this review, we develop a design for the Stability of Altered Forest Ecosystems (SAFE) Project, a new forest fragmentation experiment to be located in the lowland tropical forests of Borneo (Sabah, Malaysia). The SAFE Project represents an advance on existing experiments in that it: (i) allows discrimination of the effects of landscape-level forest cover from patch-level processes; (ii) is designed to facilitate the unification of a wide range of data types on ecological patterns and processes that operate over a wide range of spatial scales; (iii) has greater replication than existing experiments; (iv) incorporates an experimental manipulation of riparian corridors; and (v) embeds the experimentally fragmented landscape within a wider gradient of land-use intensity than do existing projects. The SAFE Project represents an opportunity for ecologists across disciplines to participate in a large initiative designed to generate a broad understanding of the ecological impacts of tropical forest modification.

  5. A large-scale forest fragmentation experiment: the Stability of Altered Forest Ecosystems Project

    PubMed Central

    Ewers, Robert M.; Didham, Raphael K.; Fahrig, Lenore; Ferraz, Gonçalo; Hector, Andy; Holt, Robert D.; Kapos, Valerie; Reynolds, Glen; Sinun, Waidi; Snaddon, Jake L.; Turner, Edgar C.

    2011-01-01

    Opportunities to conduct large-scale field experiments are rare, but provide a unique opportunity to reveal the complex processes that operate within natural ecosystems. Here, we review the design of existing, large-scale forest fragmentation experiments. Based on this review, we develop a design for the Stability of Altered Forest Ecosystems (SAFE) Project, a new forest fragmentation experiment to be located in the lowland tropical forests of Borneo (Sabah, Malaysia). The SAFE Project represents an advance on existing experiments in that it: (i) allows discrimination of the effects of landscape-level forest cover from patch-level processes; (ii) is designed to facilitate the unification of a wide range of data types on ecological patterns and processes that operate over a wide range of spatial scales; (iii) has greater replication than existing experiments; (iv) incorporates an experimental manipulation of riparian corridors; and (v) embeds the experimentally fragmented landscape within a wider gradient of land-use intensity than do existing projects. The SAFE Project represents an opportunity for ecologists across disciplines to participate in a large initiative designed to generate a broad understanding of the ecological impacts of tropical forest modification. PMID:22006969

  6. Operating Reserves and Wind Power Integration: An International Comparison; Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Milligan, M.; Donohoo, P.; Lew, D.

    2010-10-01

    This paper provides a high-level international comparison of methods and key results from both operating practice and integration analysis, based on work under the informal International Energy Agency Task 25 on large-scale wind integration.

  7. Topical report on sources and systems for aquatic plant biomass as an energy resource

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Goldman, J.C.; Ryther, J.H.; Waaland, R.

    1977-10-21

    Background information is documented on the mass cultivation of aquatic plants and systems design that is available from the literature and through consultation with active research scientists and engineers. The biology of microalgae, macroalgae, and aquatic angiosperms is discussed in terms of morphology, life history, mode of existence, and ecological significance, as they relate to cultivation. The requirements for growth of these plants, which are outlined in the text, suggest that productivity rates depend primarily on the availability of light and nutrients. It is concluded that the systems should be run with an excess of nutrients and with light as the limiting factor. A historical review of the mass cultivation of aquatic plants describes the techniques used in commercial large-scale operations throughout the world and recent small-scale research efforts. This review presents information on the biomass yields that have been attained to date in various geographical locations with different plant species and culture conditions, emphasizing the contrast between high yields in small-scale operations and lower yields in large-scale operations.

  8. Very large scale monoclonal antibody purification: the case for conventional unit operations.

    PubMed

    Kelley, Brian

    2007-01-01

    Technology development initiatives targeted for monoclonal antibody purification may be motivated by manufacturing limitations and are often aimed at solving current and future process bottlenecks. A subject under debate in many biotechnology companies is whether conventional unit operations such as chromatography will eventually become limiting for the production of recombinant protein therapeutics. An evaluation of the potential limitations of process chromatography and filtration using today's commercially available resins and membranes was conducted for a conceptual process scaled to produce 10 tons of monoclonal antibody per year from a single manufacturing plant, a scale representing one of the world's largest single-plant capacities for cGMP protein production. The process employs a simple, efficient purification train using only two chromatographic and two ultrafiltration steps, modeled after a platform antibody purification train that has generated 10 kg batches in clinical production. Based on analyses of cost of goods and the production capacity of this very large scale purification process, it is unlikely that non-conventional downstream unit operations would be needed to replace conventional chromatographic and filtration separation steps, at least for recombinant antibodies.

  9. Strengths amidst vulnerabilities: the paradox of resistance in a mining-affected community in Guatemala.

    PubMed

    Caxaj, C Susana; Berman, Helene; Ray, Susan L; Restoule, Jean-Paul; Varcoe, Coleen

    2014-11-01

    The influence of large-scale mining on the psychosocial wellbeing and mental health of diverse Indigenous communities has attracted increased attention. In previous reports, we have discussed the influence of a gold mining operation on the health of a community in the Western highlands of Guatemala. Here, we discuss the community strengths and acts of resistance of this community, that is, community processes that promoted mental health amidst this context. Using an anti-colonial narrative methodology that incorporated participatory action research principles, we developed a research design in collaboration with community leaders and participants. Data collection involved focus groups, individual interviews and photo-sharing with 54 men and women between the ages of 18 and 67. Data analysis was guided by iterative and ongoing conversations with participants and McCormack's narrative lenses. Study findings revealed key mechanisms and sources of resistance, including a shared cultural identity, a spiritual knowing and being, 'defending our rights, defending our territory,' and speaking truth to power. These overlapping strengths were identified by participants as key protective factors in facing challenges and adversity. Yet ultimately, these same strengths were often the most eroded or endangered due to the influence of large-scale mining operations in the region. These community strengths and acts of resistance reveal important priorities for promoting mental health and wellbeing for populations impacted by large-scale mining operations. Mental health practitioners must attend to both the strengths and parallel vulnerabilities that may be occasioned by large-scale projects of this nature.

  10. Large-scale modular biofiltration system for effective odor removal in a composting facility.

    PubMed

    Lin, Yueh-Hsien; Chen, Yu-Pei; Ho, Kuo-Ling; Lee, Tsung-Yih; Tseng, Ching-Ping

    2013-01-01

    Several different foul odors, such as nitrogen-containing compounds, sulfur-containing compounds, and short-chain fatty acids, are commonly emitted from composting facilities. In this study, an experimental laboratory-scale bioreactor was scaled up to build a large-scale modular biofiltration system that can process 34 m(3) min(-1) of waste gases. This modular reactor system proved effective in eliminating odors, with a 97% removal efficiency for 96 ppm ammonia, a 98% removal efficiency for 220 ppm amines, and a 100% removal efficiency for other odorous substances. The operational parameters indicate that this modular biofiltration system offers long-term operational stability. Specifically, a low pressure drop (<45 mmH2O m(-1)) was observed, indicating that the packing carrier in the bioreactor units does not require frequent replacement. Thus, this modular biofiltration system can be used in field applications to eliminate various odors within a compact working volume.

  11. Output Control Technologies for a Large-scale PV System Considering Impacts on a Power Grid

    NASA Astrophysics Data System (ADS)

    Kuwayama, Akira

    The mega-solar demonstration project named “Verification of Grid Stabilization with Large-scale PV Power Generation Systems” was completed in March 2011 at Wakkanai, the northernmost city of Japan. The major objectives of this project were to evaluate the adverse impacts of large-scale PV power generation systems connected to the power grid and to develop output control technologies with an integrated battery storage system. This paper describes the outline and results of the project. The results show the effectiveness of the battery storage system and of the proposed output control methods for a large-scale PV system in ensuring stable operation of power grids. NEDO, the New Energy and Industrial Technology Development Organization of Japan, conducted this project, and HEPCO, Hokkaido Electric Power Co., Inc., managed the overall project.

  12. The Cosmology Large Angular Scale Surveyor (CLASS): 38 GHz Detector Array of Bolometric Polarimeters

    NASA Technical Reports Server (NTRS)

    Appel, John W.; Ali, Aamir; Amiri, Mandana; Araujo, Derek; Bennett, Charles L.; Boone, Fletcher; Chan, Manwei; Cho, Hsiao-Mei; Chuss, David T.; Colazo, Felipe; et al.

    2014-01-01

    The Cosmology Large Angular Scale Surveyor (CLASS) experiment aims to map the polarization of the Cosmic Microwave Background (CMB) at angular scales larger than a few degrees. Operating from Cerro Toco in the Atacama Desert of Chile, it will observe over 65% of the sky at 38, 93, 148, and 217 GHz. In this paper we discuss the design, construction, and characterization of the CLASS 38 GHz detector focal plane, the first ever Q-band bolometric polarimeter array.

  13. The cosmology large angular scale surveyor (CLASS): 38-GHz detector array of bolometric polarimeters

    NASA Astrophysics Data System (ADS)

    Appel, John W.; Ali, Aamir; Amiri, Mandana; Araujo, Derek; Bennett, Charles L.; Boone, Fletcher; Chan, Manwei; Cho, Hsiao-Mei; Chuss, David T.; Colazo, Felipe; Crowe, Erik; Denis, Kevin; Dünner, Rolando; Eimer, Joseph; Essinger-Hileman, Thomas; Gothe, Dominik; Halpern, Mark; Harrington, Kathleen; Hilton, Gene; Hinshaw, Gary F.; Huang, Caroline; Irwin, Kent; Jones, Glenn; Karakula, John; Kogut, Alan J.; Larson, David; Limon, Michele; Lowry, Lindsay; Marriage, Tobias; Mehrle, Nicholas; Miller, Amber D.; Miller, Nathan; Moseley, Samuel H.; Novak, Giles; Reintsema, Carl; Rostem, Karwan; Stevenson, Thomas; Towner, Deborah; U-Yen, Kongpop; Wagner, Emily; Watts, Duncan; Wollack, Edward; Xu, Zhilei; Zeng, Lingzhen

    2014-07-01

    The Cosmology Large Angular Scale Surveyor (CLASS) experiment aims to map the polarization of the Cosmic Microwave Background (CMB) at angular scales larger than a few degrees. Operating from Cerro Toco in the Atacama Desert of Chile, it will observe over 65% of the sky at 38, 93, 148, and 217 GHz. In this paper we discuss the design, construction, and characterization of the CLASS 38 GHz detector focal plane, the first ever Q-band bolometric polarimeter array.

  14. A Systematic Multi-Time Scale Solution for Regional Power Grid Operation

    NASA Astrophysics Data System (ADS)

    Zhu, W. J.; Liu, Z. G.; Cheng, T.; Hu, B. Q.; Liu, X. Z.; Zhou, Y. F.

    2017-10-01

    Many aspects need to be taken into consideration when making schedule plans for a regional grid. In this paper, a systematic multi-time-scale solution for regional power grid operation is proposed, considering large-scale renewable energy integration and Ultra High Voltage (UHV) power transmission. On the time-scale axis, the problem is discussed from monthly, weekly, day-ahead, and within-day scheduling through to day-behind analysis, and the system also covers multiple generator types, including thermal units, hydro plants, wind turbines, and pumped-storage stations. The nine subsystems of the scheduling system are described, and their functions and relationships are elaborated. The proposed system has been deployed in a provincial power grid in Central China, and the operating results further verify its effectiveness.

  15. Scale and modeling issues in water resources planning

    USGS Publications Warehouse

    Lins, H.F.; Wolock, D.M.; McCabe, G.J.

    1997-01-01

    Resource planners and managers interested in utilizing climate model output as part of their operational activities immediately confront the dilemma of scale discordance. Their functional responsibilities cover relatively small geographical areas and necessarily require data of relatively high spatial resolution. Climate models cover a large geographical, i.e. global, domain and produce data at comparatively low spatial resolution. Although the scale differences between model output and planning input are large, several techniques have been developed for disaggregating climate model output to a scale appropriate for use in water resource planning and management applications. With techniques in hand to reduce the limitations imposed by scale discordance, water resource professionals must now confront a more fundamental constraint on the use of climate models: the inability to produce accurate representations and forecasts of regional climate. Given the current capabilities of climate models, and the likelihood that the uncertainty associated with long-term climate model forecasts will remain high for some years to come, the water resources planning community may find it impractical to utilize such forecasts operationally.

  16. How much a galaxy knows about its large-scale environment?: An information theoretic perspective

    NASA Astrophysics Data System (ADS)

    Pandey, Biswajit; Sarkar, Suman

    2017-05-01

    The small-scale environment characterized by the local density is known to play a crucial role in deciding galaxy properties, but the role of the large-scale environment in galaxy formation and evolution remains less clear. We propose an information theoretic framework to investigate the influence of the large-scale environment on galaxy properties and apply it to data from the Galaxy Zoo project, which provides visual morphological classifications of ˜1 million galaxies from the Sloan Digital Sky Survey. We find a non-zero mutual information between morphology and environment that decreases with increasing length-scale but persists throughout the entire range of length-scales probed. We estimate the conditional mutual information and the interaction information between morphology and environment by conditioning the environment on different length-scales and find a synergic interaction between them that operates up to length-scales of at least ˜30 h-1 Mpc. Our analysis indicates that these interactions largely arise from the mutual information shared between the environments on different length-scales.
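
    A plug-in (histogram) estimate of the mutual information between a morphology label and a binned environmental density, in the spirit of the framework described above; the synthetic catalogue below merely stands in for the Galaxy Zoo data, and the density-morphology relation is invented.

      # Plug-in estimate of I(morphology; environment) from binned counts.
      # Synthetic data; not the Galaxy Zoo catalogue itself.
      import numpy as np

      rng = np.random.default_rng(1)
      n = 100_000
      density = rng.lognormal(0.0, 1.0, n)              # local density proxy
      # Toy label: "ellipticals" more likely in dense environments.
      p_ell = 1.0 / (1.0 + np.exp(-np.log(density)))
      morph = (rng.random(n) < p_ell).astype(int)

      edges = np.quantile(density, np.linspace(0, 1, 9)[1:-1])
      env_bin = np.digitize(density, edges)             # 8 environment bins

      joint = np.zeros((2, 8))
      np.add.at(joint, (morph, env_bin), 1.0)           # joint counts
      joint /= joint.sum()                              # joint probability
      px = joint.sum(axis=1, keepdims=True)             # P(morphology)
      py = joint.sum(axis=0, keepdims=True)             # P(environment)
      nz = joint > 0
      mi = np.sum(joint[nz] * np.log2(joint[nz] / (px @ py)[nz]))
      print("I(morphology; environment) = %.3f bits" % mi)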

  17. Measuring large scale space perception in literary texts

    NASA Astrophysics Data System (ADS)

    Rossi, Paolo

    2007-07-01

    A center and radius of “perception” (in the sense of environmental cognition) can be formally associated with a written text and operationally defined. Simple algorithms for their computation are presented, and indicators for anisotropy in large scale space perception are introduced. The relevance of these notions for the analysis of literary and historical records is briefly discussed and illustrated with an example taken from medieval historiography.
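
    One plausible operationalization of such a center and radius (not necessarily the algorithm the paper presents) is the mention-weighted centroid of the places a text names and the RMS great-circle distance from it; the places and mention counts below are invented.

      # Hypothetical "center and radius of perception" for a text:
      # weighted centroid and RMS radius of the places it mentions.
      import math

      # (place, latitude, longitude, number of mentions in the text)
      mentions = [("Pisa", 43.72, 10.40, 12), ("Rome", 41.90, 12.50, 5),
                  ("Florence", 43.77, 11.26, 8), ("Jerusalem", 31.78, 35.22, 1)]

      w = sum(m for *_, m in mentions)
      lat_c = sum(lat * m for _, lat, _, m in mentions) / w
      lon_c = sum(lon * m for _, _, lon, m in mentions) / w

      def dist_km(lat1, lon1, lat2, lon2):
          """Great-circle distance (haversine), in km."""
          p1, p2 = math.radians(lat1), math.radians(lat2)
          dp, dl = p2 - p1, math.radians(lon2 - lon1)
          a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
          return 2 * 6371 * math.asin(math.sqrt(a))

      r = math.sqrt(sum(m * dist_km(lat, lon, lat_c, lon_c) ** 2
                        for _, lat, lon, m in mentions) / w)
      print("center: (%.2f, %.2f), radius of perception: %.0f km" % (lat_c, lon_c, r))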

  18. Fuzzy-based propagation of prior knowledge to improve large-scale image analysis pipelines

    PubMed Central

    Mikut, Ralf

    2017-01-01

    Many automatically analyzable scientific questions are well-posed and a variety of information about expected outcomes is available a priori. Although often neglected, this prior knowledge can be systematically exploited to make automated analysis operations sensitive to a desired phenomenon or to evaluate extracted content with respect to this prior knowledge. For instance, the performance of processing operators can be greatly enhanced by a more focused detection strategy and by direct information about the ambiguity inherent in the extracted data. We present a new concept that increases the result quality awareness of image analysis operators by estimating and distributing the degree of uncertainty involved in their output based on prior knowledge. This allows the use of simple processing operators that are suitable for analyzing large-scale spatiotemporal (3D+t) microscopy images without compromising result quality. On the foundation of fuzzy set theory, we transform available prior knowledge into a mathematical representation and extensively use it to enhance the result quality of various processing operators. These concepts are illustrated on a typical bioimage analysis pipeline comprised of seed point detection, segmentation, multiview fusion and tracking. The functionality of the proposed approach is further validated on a comprehensive simulated 3D+t benchmark data set that mimics embryonic development and on large-scale light-sheet microscopy data of a zebrafish embryo. The general concept introduced in this contribution represents a new approach to efficiently exploit prior knowledge to improve the result quality of image analysis pipelines. The generality of the concept makes it applicable to practically any field with processing strategies that are arranged as linear pipelines. The automated analysis of terabyte-scale microscopy data will especially benefit from sophisticated and efficient algorithms that enable a quantitative and fast readout. PMID:29095927
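
    One way to picture the fuzzy encoding of prior knowledge described above is a trapezoidal membership function that converts an expected object-size range into a per-detection plausibility score; the membership parameters and detections below are invented for illustration, not taken from the paper's pipeline.

      # Sketch of fuzzy prior-knowledge weighting for detected objects:
      # a trapezoidal membership over object size yields a plausibility
      # score per detection. All parameters are illustrative.
      import numpy as np

      def trapezoid(x, a, b, c, d):
          """Membership: 0 below a, ramps up to 1 on [b, c], back to 0 above d."""
          x = np.asarray(x, dtype=float)
          return np.clip(np.minimum((x - a) / (b - a), (d - x) / (d - c)), 0.0, 1.0)

      # Prior: nuclei are expected to span roughly 8-20 px in diameter.
      sizes = np.array([3.0, 9.0, 14.0, 22.0, 40.0])     # detected diameters
      size_ok = trapezoid(sizes, a=5, b=8, c=20, d=30)   # fuzzy plausibility

      # Several fuzzy criteria can be combined conjunctively (minimum),
      # and downstream operators can discard low-confidence detections.
      intensity_ok = trapezoid([0.2, 0.7, 0.9, 0.6, 0.1], 0.1, 0.3, 1.0, 1.1)
      combined = np.minimum(size_ok, intensity_ok)
      print("per-detection confidence:", np.round(combined, 2))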

  19. Utility-Scale Solar 2014. An Empirical Analysis of Project Cost, Performance, and Pricing Trends in the United States

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bolinger, Mark; Seel, Joachim

    2015-09-01

    Other than the nine Solar Energy Generation Systems (“SEGS”) parabolic trough projects built in the 1980s, virtually no large-scale or “utility-scale” solar projects – defined here to include any groundmounted photovoltaic (“PV”), concentrating photovoltaic (“CPV”), or concentrating solar thermal power (“CSP”) project larger than 5 MW AC – existed in the United States prior to 2007. By 2012 – just five years later – utility-scale had become the largest sector of the overall PV market in the United States, a distinction that was repeated in both 2013 and 2014 and that is expected to continue for at least the next few years. Over this same short period, CSP also experienced a bit of a renaissance in the United States, with a number of large new parabolic trough and power tower systems – some including thermal storage – achieving commercial operation. With this critical mass of new utility-scale projects now online and in some cases having operated for a number of years (generating not only electricity, but also empirical data that can be mined), the rapidly growing utility-scale sector is ripe for analysis. This report, the third edition in an ongoing annual series, meets this need through in-depth, annually updated, data-driven analysis of not just installed project costs or prices – i.e., the traditional realm of solar economics analyses – but also operating costs, capacity factors, and power purchase agreement (“PPA”) prices from a large sample of utility-scale solar projects in the United States. Given its current dominance in the market, utility-scale PV also dominates much of this report, though data from CPV and CSP projects are presented where appropriate.

  20. Avoiding and tolerating latency in large-scale next-generation shared-memory multiprocessors

    NASA Technical Reports Server (NTRS)

    Probst, David K.

    1993-01-01

    A scalable solution to the memory-latency problem is necessary to prevent the large latencies of synchronization and memory operations inherent in large-scale shared-memory multiprocessors from degrading performance. We distinguish latency avoidance and latency tolerance. Latency is avoided when data is brought to nearby locales for future reference. Latency is tolerated when references are overlapped with other computation. Latency-avoiding locales include: processor registers, data caches used temporally, and nearby memory modules. Tolerating communication latency requires parallelism, allowing the overlap of communication and computation. Latency-tolerating techniques include: vector pipelining, data caches used spatially, prefetching in various forms, and multithreading in various forms. Relaxing the consistency model permits increased use of avoidance and tolerance techniques. Each model is a mapping from the program text to sets of partial orders on program operations; it is a convention about which temporal precedences among program operations are necessary. Information about temporal locality and parallelism constrains the use of avoidance and tolerance techniques. Suitable architectural primitives and compiler technology are required to exploit the increased freedom to reorder and overlap operations in relaxed models.
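
    A small sketch of latency tolerance by overlap, in the spirit of the prefetching and multithreading techniques listed above: a background thread prefetches the next data block while the current one is processed. The fetch and compute latencies are simulated with sleeps and are purely illustrative.

      # Latency tolerance by overlap: double-buffered prefetch that hides
      # (simulated) remote-memory latency behind computation.
      import threading, time

      def fetch(block_id):
          time.sleep(0.05)                  # simulated fetch latency
          return [block_id] * 1000          # the "data block"

      def compute(block):
          time.sleep(0.05)                  # simulated useful computation
          return sum(block)

      def process(n_blocks):
          result, holder = 0, {}
          t = threading.Thread(target=lambda: holder.update(d=fetch(0)))
          t.start()
          for i in range(n_blocks):
              t.join()                      # wait for the prefetched block
              block = holder["d"]
              if i + 1 < n_blocks:          # start fetching the next block now
                  t = threading.Thread(
                      target=lambda i=i: holder.update(d=fetch(i + 1)))
                  t.start()
              result += compute(block)      # overlaps with the fetch above
          return result

      start = time.time()
      process(10)
      print("overlapped: %.2f s (fully serial would be ~%.2f s)"
            % (time.time() - start, 10 * 0.1))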

  1. Environmental performance evaluation of large-scale municipal solid waste incinerators using data envelopment analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, H.-W.; Chang, N.-B., E-mail: nchang@mail.ucf.ed; Chen, J.-C.

    2010-07-15

    Owing to limited land resources, incinerators are considered in many countries such as Japan and Germany as the major technology for a waste management scheme capable of dealing with the increasing demand for municipal and industrial solid waste treatment in urban regions. The evaluation of these municipal incinerators in terms of secondary pollution potential, cost-effectiveness, and operational efficiency has become a new focus in the highly interdisciplinary area of production economics, systems analysis, and waste management. This paper aims to demonstrate the application of data envelopment analysis (DEA) - a production economics tool - to evaluate performance-based efficiencies of 19 large-scale municipal incinerators in Taiwan with different operational conditions. A 4-year operational data set from 2002 to 2005 was collected in support of DEA modeling using Monte Carlo simulation to outline the possibility distributions of operational efficiency of these incinerators. Uncertainty analysis using the Monte Carlo simulation provides a balance between simplifications of our analysis and the soundness of capturing the essential random features that complicate solid waste management systems. To cope with future challenges, efforts in the DEA modeling, systems analysis, and prediction of the performance of large-scale municipal solid waste incinerators under normal operation and special conditions were directed toward generating a compromised assessment procedure. Our research findings will eventually lead to the identification of the optimal management strategies for promoting the quality of solid waste incineration, not only in Taiwan, but also elsewhere in the world.
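
    As a hedged sketch of the DEA machinery this record describes, the Python below computes input-oriented CCR efficiency scores for a handful of hypothetical incinerators (two inputs, one output) with a linear program; the data, and the choice of the CCR variant, are illustrative assumptions rather than the study's actual model.

      # Input-oriented CCR DEA efficiency via linear programming.
      # Hypothetical data: inputs = (cost, labour); output = tonnes treated.
      import numpy as np
      from scipy.optimize import linprog

      X = np.array([[120, 30], [150, 45], [100, 40], [180, 35]], float).T  # inputs x DMUs
      Y = np.array([[900], [1100], [800], [1000]], float).T                # outputs x DMUs
      m, n = X.shape            # number of inputs, number of DMUs
      s = Y.shape[0]            # number of outputs

      for o in range(n):
          # Variables: [theta, lambda_1 .. lambda_n]; minimize theta.
          c = np.r_[1.0, np.zeros(n)]
          # sum_j lambda_j * x_ij - theta * x_io <= 0   (each input i)
          A_in = np.hstack([-X[:, [o]], X])
          # -sum_j lambda_j * y_rj <= -y_ro             (each output r)
          A_out = np.hstack([np.zeros((s, 1)), -Y])
          res = linprog(c, A_ub=np.vstack([A_in, A_out]),
                        b_ub=np.r_[np.zeros(m), -Y[:, o]],
                        bounds=[(0, None)] * (n + 1))
          print("DMU %d efficiency: %.3f" % (o + 1, res.x[0]))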

  2. Exploring Unidimensional Proficiency Classification Accuracy from Multidimensional Data in a Vertical Scaling Context

    ERIC Educational Resources Information Center

    Kroopnick, Marc Howard

    2010-01-01

    When Item Response Theory (IRT) is operationally applied in large-scale assessments, unidimensionality is typically assumed. This assumption requires that the test measure a single latent trait. Furthermore, when tests are vertically scaled using IRT, the assumption of unidimensionality would require that the battery of tests across grades…

  3. An Expected Value Air Combat Model Simulation Algorithm to Predict Missions Performance in Tactical Air Operations.

    DTIC Science & Technology

    1983-09-01

    ABSTRACT: This thesis intends to create the basic...a need for a small-scale model which allows a student analyst of tactical air operations to create his own battles and to test his own strategies with...an iconic model is a large- or small-scale representation of states, objects, or events. For example, a scale-model airplane resembles the system under the...

  4. Process model comparison and transferability across bioreactor scales and modes of operation for a mammalian cell bioprocess.

    PubMed

    Craven, Stephen; Shirsat, Nishikant; Whelan, Jessica; Glennon, Brian

    2013-01-01

    A Monod kinetic model, logistic equation model, and statistical regression model were developed for a Chinese hamster ovary cell bioprocess operated under three different modes of operation (batch, bolus fed-batch, and continuous fed-batch) and grown at two different bioreactor scales (3 L bench-top and 15 L pilot-scale). The Monod kinetic model was developed for all modes of operation under study and predicted cell density and glucose, glutamine, lactate, and ammonia concentrations well for the bioprocess. However, it was computationally demanding due to the large number of parameters necessary to produce a good model fit. The transferability of the Monod kinetic model structure and parameter set across bioreactor scales and modes of operation was investigated and a parameter sensitivity analysis performed. The experimentally determined parameters had the greatest influence on model performance. They changed with scale and mode of operation, but were easily calculated. The remaining parameters, which were fitted using a differential evolutionary algorithm, were not as crucial. Logistic equation and statistical regression models were investigated as alternatives to the Monod kinetic model. They were less computationally intensive to develop due to the absence of a large parameter set. However, modeling of the nutrient and metabolite concentrations proved to be troublesome due to the logistic equation model structure and the inability of both models to incorporate a feed. The complexity, computational load, and effort required for model development has to be balanced with the necessary level of model sophistication when choosing which model type to develop for a particular application. Copyright © 2012 American Institute of Chemical Engineers (AIChE).
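
    A minimal sketch of the Monod-kinetic model class discussed above, assuming a simple batch culture with biomass, glucose, and lactate as state variables; the parameter values and yield coefficients are invented for illustration, not the paper's fitted set.

      # Batch Monod-kinetics sketch: biomass X, glucose S, lactate L.
      # Parameters are hypothetical placeholders.
      from scipy.integrate import solve_ivp

      mu_max, Ks = 0.045, 0.5        # max growth rate (1/h), half-saturation (mM)
      Y_xs = 2.0e8                   # biomass yield (cells per mmol glucose)
      Y_ls = 1.6                     # lactate produced per glucose consumed (mol/mol)

      def rhs(t, y):
          X, S, L = y
          mu = mu_max * S / (Ks + S)     # Monod specific growth rate
          dX = mu * X                    # cell growth
          dS = -mu * X / Y_xs            # glucose consumption
          dL = -Y_ls * dS                # lactate production from glucose
          return [dX, dS, dL]

      # Initial state: 2e8 cells/L, 30 mM glucose, no lactate; 200 h horizon.
      sol = solve_ivp(rhs, (0, 200), [2e8, 30.0, 0.0])
      X_end, S_end, L_end = sol.y[:, -1]
      print("final: %.2e cells/L, %.1f mM glucose, %.1f mM lactate"
            % (X_end, S_end, L_end))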

  5. Breaking barriers through collaboration: the example of the Cell Migration Consortium.

    PubMed

    Horwitz, Alan Rick; Watson, Nikki; Parsons, J Thomas

    2002-10-15

    Understanding complex integrated biological processes, such as cell migration, requires interdisciplinary approaches. The Cell Migration Consortium, funded by a Large-Scale Collaborative Project Award from the National Institute of General Medical Sciences, develops and disseminates new technologies, data, reagents, and shared information to a wide audience. The development and operation of this Consortium may provide useful insights for those who plan similarly large-scale, interdisciplinary approaches.

  6. Large-Scale Aerosol Modeling and Analysis

    DTIC Science & Technology

    2007-09-30

    deserts of the world: Arabian Gulf, Sea of Japan, China Sea, Mediterranean Sea, and the Tropical Atlantic Ocean. NAAPS also accurately predicts the...fate of large-scale smoke and pollution plumes. With its global and continuous coverage,...origin of dust plumes impacting naval operations in the Red Sea, Mediterranean, eastern Atlantic, Gulf of Guinea, Sea of Japan, Yellow Sea, and East

  7. A Bayesian Hierarchical Model for Large-Scale Educational Surveys: An Application to the National Assessment of Educational Progress. Research Report. ETS RR-04-38

    ERIC Educational Resources Information Center

    Johnson, Matthew S.; Jenkins, Frank

    2005-01-01

    Large-scale educational assessments such as the National Assessment of Educational Progress (NAEP) sample examinees to whom an exam will be administered. In most situations the sampling design is not a simple random sample and must be accounted for in the estimating model. After reviewing the current operational estimation procedure for NAEP, this…

  8. Hierarchical Engine for Large-scale Infrastructure Co-Simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2017-04-24

    HELICS is designed to support very-large-scale (100,000+ federates) co-simulations with off-the-shelf power-system, communication, market, and end-use tools. Other key features include cross-platform operating system support, the integration of both event-driven (e.g., packetized communication) and time-series (e.g., power flow) simulations, and the ability to co-iterate among federates to ensure physical model convergence at each time step.
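
    The co-iteration idea mentioned above can be sketched without the real HELICS API: two toy "federates" exchange boundary values at each time step until they reach a fixed point. The models, values, and tolerance below are illustrative assumptions, not HELICS calls.

      # Generic co-iteration sketch: a power-flow "federate" and a load
      # "federate" iterate to agreement at each time step. Illustrative
      # of the concept only; not the HELICS API.
      def power_federate(load_mw):
          """Toy grid model: voltage sags as load rises."""
          return 1.05 - 0.002 * load_mw          # per-unit voltage

      def load_federate(voltage_pu):
          """Toy voltage-dependent (constant-impedance-like) load."""
          return 80.0 * voltage_pu ** 2          # MW

      for step in range(3):                      # three simulated time steps
          load = 80.0                            # initial guess for this step
          for it in range(50):                   # co-iterate to a fixed point
              volt = power_federate(load)
              new_load = load_federate(volt)
              if abs(new_load - load) < 1e-9:    # converged: models agree
                  break
              load = new_load
          print("t=%d: %d iterations, V=%.4f pu, P=%.2f MW"
                % (step, it + 1, volt, load))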

  9. Concurrent Schedules of Positive and Negative Reinforcement: Differential-Impact and Differential-Outcomes Hypotheses

    ERIC Educational Resources Information Center

    Magoon, Michael A.; Critchfield, Thomas S.

    2008-01-01

    Considerable evidence from outside of operant psychology suggests that aversive events exert greater influence over behavior than equal-sized positive-reinforcement events. Operant theory is largely moot on this point, and most operant research is uninformative because of a scaling problem that prevents aversive events and those based on positive…

  10. Small organic molecule based flow battery

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Huskinson, Brian; Marshak, Michael; Aziz, Michael J.

    The invention provides an electrochemical cell based on a new chemistry for a flow battery for large-scale (e.g., grid-scale) electrical energy storage. Electrical energy is stored chemically at an electrochemical electrode by the protonation of small organic molecules called quinones to hydroquinones. The proton is provided by a complementary electrochemical reaction at the other electrode. These reactions are reversed to deliver electrical energy. A flow battery based on this concept can operate as a closed system. The flow battery architecture has scaling advantages over solid-electrode batteries for large-scale energy storage.

  11. Study on safety operation for large hydroelectric generator unit

    NASA Astrophysics Data System (ADS)

    Yan, Z. G.; Cui, T.; Zhou, L. J.; Zhi, F. L.; Wang, Z. W.

    2012-11-01

    A hydroelectric generating unit is a complex mechanical system composed of a hydraulic turbine and an electric generator. The rotating system is supported by the bearing bracket and reinforced-concrete structures, and vibration problems cannot be avoided during operation. Many large-scale hydroelectric units have been damaged by vibration in recent years. As unit capacities and water heads increase, the safe operation of hydraulic turbines has become a research focus in many countries. The operating characteristics of a hydraulic turbine differ markedly between working conditions. Combining field measurements with theoretical calculation, this paper presents an in-depth study of the safe operation of a large-scale Francis turbine unit. First, measurements of vibration, swing, pressure fluctuation, and noise were carried out at four different heads, and the relationships between vibrations and pressure fluctuations at different heads and working conditions were analyzed in depth. Then, predictions of safe operation of the unit at high head were made based on CFD numerical calculation. Finally, the paper presents the division of the unit's operating range: according to the experimental results (vibrations, swings, pressure fluctuations, and noise) as well as the theoretical results, the operating range has been divided into three zones: a prohibited operating zone, a transition operating zone, and a safe operating zone. After this research was applied in the hydropower station, the security and economic efficiency of the unit increased greatly, and substantial economic and social benefits have been obtained.

  12. Developing Renewable Energy Projects Larger Than 10 MWs at Federal Facilities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    2013-03-01

    To accomplish Federal goals for renewable energy, sustainability, and energy security, large-scale renewable energy projects must be developed and constructed on Federal sites at a significant scale with significant private investment. For the purposes of this Guide, large-scale Federal renewable energy projects are defined as renewable energy facilities larger than 10 megawatts (MW) that are sited on Federal property and lands and typically financed and owned by third parties. The U.S. Department of Energy’s Federal Energy Management Program (FEMP) helps Federal agencies meet these goals and assists agency personnel in navigating the complexities of developing such projects and attracting the necessary private capital to complete them. This Guide is intended to provide a general resource that will begin to develop the Federal employee’s awareness and understanding of the project developer’s operating environment and the private sector’s awareness and understanding of the Federal environment. Because the vast majority of the investment that is required to meet the goals for large-scale renewable energy projects will come from the private sector, this Guide has been organized to match Federal processes with typical phases of commercial project development. FEMP collaborated with the National Renewable Energy Laboratory (NREL) and professional project developers on this Guide to ensure that Federal projects have key elements recognizable to private sector developers and investors. The main purpose of this Guide is to provide a project development framework to allow the Federal Government, private developers, and investors to work in a coordinated fashion on large-scale renewable energy projects. The framework includes key elements that describe a successful, financially attractive large-scale renewable energy project. This framework begins the translation between the Federal and private sector operating environments.

  13. Unmanned Aircraft Systems Traffic Management (UTM)

    NASA Technical Reports Server (NTRS)

    Johnson, Ronald D.

    2018-01-01

    UTM is an 'air traffic management' ecosystem for uncontrolled operations. UTM utilizes industry's ability to supply services under FAA's regulatory authority where these services do not exist. UTM development will ultimately enable the management of large-scale, low-altitude UAS operations. The operational concept will address beyond-visual-line-of-sight UAS operations under 400 ft AGL, covering the information architecture, data exchange protocols, software functions, the roles and responsibilities of the FAA and operators, and performance requirements.

  14. Central Asia: Regional Developments and Implications for U.S. Interests

    DTIC Science & Technology

    2007-07-05

    Cooperation Organization (SCO; see below, Regional Tensions) that stated that “as large-scale military operations against terrorism have come to an end...the world’s top producers of low enriched uranium. Kazakhstan had a fast breeder reactor at Aktau that was the world’s only nuclear desalinization...Central Asia, Afghanistan, and eventually Pakistan and India.56 All the states of the region possess large-scale resources that could contribute to the

  15. Central Asia: Regional Developments and Implications for U.S. Interests

    DTIC Science & Technology

    2009-11-20

    Central Asia. Other officials have stated that a large-scale influx has not yet occurred. Actions of the IMU and IJU in Germany and Elsewhere Officials...Independence: Regional Tensions and Conflicts”) that stated that “as large-scale military operations against terrorism have come to an end in...breeder reactor at Aktau that was the world’s only nuclear desalinization facility. In 1997 and 1999, U.S.-Kazakh accords were signed on

  16. Central Asia: Regional Developments and Implications for U.S. Interests

    DTIC Science & Technology

    2009-09-21

    Service 25 to Peace and Independence: Regional Tensions and Conflicts”) that stated that “as large-scale military operations against terrorism have...Kazakhstan had a fast breeder reactor at Aktau that was the world’s only nuclear desalinization facility. In 1997 and 1999, U.S.-Kazakh accords...Trade and Investment. All the states of the region possess large-scale resources that could contribute to the region becoming a “new silk road” of

  17. Central Asia: Regional Developments and Implications for U.S. Interests

    DTIC Science & Technology

    2008-07-10

    United Press International, December 13, 2005. Organization (SCO; see above, Regional Tensions) that stated that “as large-scale military operations...reserves, and Kazakhstan and Uzbekistan have been among the world’s top producers of low enriched uranium. Kazakhstan had a fast breeder reactor at Aktau...Asia, Afghanistan, and eventually Pakistan and India.72 All the states of the region possess large-scale resources that could contribute to the region

  18. Equipment characterization to mitigate risks during transfers of cell culture manufacturing processes.

    PubMed

    Sieblist, Christian; Jenzsch, Marco; Pohlscheidt, Michael

    2016-08-01

    The production of monoclonal antibodies by mammalian cell culture in bioreactors of up to 25,000 L is state-of-the-art technology in the biotech industry. During the lifecycle of a product, several scale-up activities and technology transfers are typically executed to enable the supply chain strategy of a global pharmaceutical company. Given the sensitivity of mammalian cells to physicochemical culture conditions, process and equipment knowledge are critical to avoid impacts on timelines, product quantity, and quality. In particular, the fluid dynamics of large-scale bioreactors versus small-scale models need to be described, and similarity demonstrated, in light of the Quality by Design approach promoted by the FDA. This approach comprises an associated design space which is established during process characterization and validation in bench-scale bioreactors. Therefore, the establishment of predictive models and simulation tools for the major operating conditions of stirred vessels (mixing, mass transfer, and shear forces), based on fundamental engineering principles, has experienced a renaissance in recent years. This work illustrates the systematic characterization of a large variety of bioreactor designs deployed in a global manufacturing network, ranging from small bench-scale equipment to large-scale production equipment (25,000 L). Several traditional methods to determine power input, mixing, mass transfer, and shear forces have been used to create a database and identify differences for various impeller types and configurations in operating ranges typically applied in cell culture processes at manufacturing scale. In addition, the extrapolation of different empirical models, e.g. Cooke et al. (Paper presented at the proceedings of the 2nd international conference of bioreactor fluid dynamics, Cranfield, UK, 1988), has been assessed for validity in these operational ranges. Results for selected designs are shown and serve as examples of structured characterization to enable fast and agile process transfers, scale-up, and troubleshooting.
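
    A back-of-the-envelope version of the stirred-vessel characterization described above, using the classic ungassed power draw P = Np * rho * N^3 * D^5 and a van 't Riet-type kLa correlation; the power number and correlation constants are generic literature-style assumptions, not values measured in the study.

      # Stirred-vessel sketch: ungassed power input and a van 't Riet-type
      # kLa correlation. Constants are assumed, not from the paper.
      rho = 1000.0      # broth density, kg/m^3
      N_p = 5.0         # impeller power number, Rushton-like (assumed)
      N = 1.5           # agitation rate, 1/s
      D = 0.8           # impeller diameter, m
      V = 10.0          # working volume, m^3
      v_s = 0.003       # superficial gas velocity, m/s

      P = N_p * rho * N**3 * D**5          # ungassed power draw, W
      pv = P / V                           # specific power input, W/m^3

      # kLa = a * (P/V)^alpha * v_s^beta  (coalescing-broth constants, assumed)
      a, alpha, beta = 0.026, 0.4, 0.5
      kla = a * pv**alpha * v_s**beta      # 1/s

      print("P = %.0f W, P/V = %.0f W/m^3, kLa = %.1f 1/h"
            % (P, pv, kla * 3600))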

  19. Seismic safety in conducting large-scale blasts

    NASA Astrophysics Data System (ADS)

    Mashukov, I. V.; Chaplygin, V. V.; Domanov, V. P.; Semin, A. A.; Klimkin, M. A.

    2017-09-01

    In mining enterprises, a drilling-and-blasting method is used to prepare hard rock for excavation. As mining operations approach settlements, the negative effects of large-scale blasts increase. To assess the level of seismic impact of large-scale blasts, the scientific staff of Siberian State Industrial University carried out expert assessments for coal mines and iron-ore enterprises. The magnitude of surface seismic vibrations caused by mass explosions was determined using seismic receivers and an analog-to-digital converter recording to a laptop. The registration results of surface seismic vibrations during more than 280 large-scale blasts at 17 mining enterprises in 22 settlements are presented. The maximum velocity values of the Earth’s surface vibrations are determined. The safety evaluation of the seismic effect was carried out according to the permissible value of vibration velocity. For cases exceeding the permissible values, recommendations were developed to reduce the level of seismic impact.

  20. ENSO detection and use to inform the operation of large scale water systems

    NASA Astrophysics Data System (ADS)

    Pham, Vuong; Giuliani, Matteo; Castelletti, Andrea

    2016-04-01

    El Nino Southern Oscillation (ENSO) is a large-scale, coupled ocean-atmosphere phenomenon occurring in the tropical Pacific Ocean, and is considered one of the most significant factors causing hydro-climatic anomalies throughout the world. Water systems operations could benefit from a better understanding of this global phenomenon, which has the potential to enhance the accuracy and lead time of long-range streamflow predictions. In turn, these are key to designing interannual water transfers in large-scale water systems to counteract the increasingly frequent extremes induced by a changing climate. Although the ENSO teleconnection is well defined in some locations, such as the Western USA and Australia, there is no consensus on how it can be detected and used in other river basins, particularly in Europe, Africa, and Asia. In this work, we contribute a general framework relying on Input Variable Selection techniques for detecting the ENSO teleconnection and using this information to improve water reservoir operations. The core of our procedure is the Iterative Input variable Selection (IIS) algorithm, which is employed to find the most relevant determinants of streamflow variability, to derive predictive models based on the selected inputs, and to find the most valuable information for conditioning operating decisions. Our framework is applied to the multipurpose operations of the Hoa Binh reservoir in the Red River basin (Vietnam), taking into account hydropower production, water supply for irrigation, and flood mitigation during the monsoon season. Numerical results show that our framework is able to quantify the relationship between ENSO fluctuations and the Red River basin hydrology. Moreover, we demonstrate that this ENSO teleconnection represents valuable information for improving the operations of the Hoa Binh reservoir.
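
    A simplified stand-in for the IIS algorithm named above: greedy forward selection that, at each pass, adds the candidate input giving the largest gain in cross-validated skill of a regression model, stopping when no candidate still helps. The candidate names and data are synthetic, and the stopping rule is an assumption for the sketch.

      # Greedy forward input-variable selection (a simplified IIS-style loop).
      # Synthetic candidate climate indices vs. a streamflow target.
      import numpy as np
      from sklearn.ensemble import RandomForestRegressor
      from sklearn.model_selection import cross_val_score

      rng = np.random.default_rng(2)
      n = 400
      candidates = ["NINO3.4", "SOI", "localP", "noise1", "noise2"]
      X = rng.normal(size=(n, len(candidates)))
      y = 0.8 * X[:, 0] + 0.5 * X[:, 2] + 0.2 * rng.normal(size=n)

      selected, remaining, best_skill = [], list(range(len(candidates))), -np.inf
      while remaining:
          scores = {j: cross_val_score(
                        RandomForestRegressor(n_estimators=50, random_state=0),
                        X[:, selected + [j]], y, cv=5).mean()
                    for j in remaining}
          j_best = max(scores, key=scores.get)
          if scores[j_best] <= best_skill + 1e-3:   # stop when no real gain
              break
          best_skill = scores[j_best]
          selected.append(j_best)
          remaining.remove(j_best)

      print("selected inputs:", [candidates[j] for j in selected])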

  1. Environmental performance evaluation of large-scale municipal solid waste incinerators using data envelopment analysis.

    PubMed

    Chen, Ho-Wen; Chang, Ni-Bin; Chen, Jeng-Chung; Tsai, Shu-Ju

    2010-07-01

    Owing to limited land resources, incinerators are considered in many countries such as Japan and Germany as the major technology for a waste management scheme capable of dealing with the increasing demand for municipal and industrial solid waste treatment in urban regions. The evaluation of these municipal incinerators in terms of secondary pollution potential, cost-effectiveness, and operational efficiency has become a new focus in the highly interdisciplinary area of production economics, systems analysis, and waste management. This paper aims to demonstrate the application of data envelopment analysis (DEA), a production economics tool, to evaluate performance-based efficiencies of 19 large-scale municipal incinerators in Taiwan with different operational conditions. A 4-year operational data set from 2002 to 2005 was collected in support of DEA modeling using Monte Carlo simulation to outline the possibility distributions of operational efficiency of these incinerators. Uncertainty analysis using the Monte Carlo simulation provides a balance between simplifications of our analysis and the soundness of capturing the essential random features that complicate solid waste management systems. To cope with future challenges, efforts in the DEA modeling, systems analysis, and prediction of the performance of large-scale municipal solid waste incinerators under normal operation and special conditions were directed toward generating a compromised assessment procedure. Our research findings will eventually lead to the identification of the optimal management strategies for promoting the quality of solid waste incineration, not only in Taiwan, but also elsewhere in the world. Copyright (c) 2010 Elsevier Ltd. All rights reserved.

  2. A new resource for developing and strengthening large-scale community health worker programs.

    PubMed

    Perry, Henry; Crigler, Lauren; Lewin, Simon; Glenton, Claire; LeBan, Karen; Hodgins, Steve

    2017-01-12

    Large-scale community health worker programs are now growing in importance around the world in response to the resurgence of interest and growing evidence of the importance of community-based primary health care for improving the health of populations in resource-constrained, high-mortality settings. These programs, because of their scale and operational challenges, merit special consideration by the global health community, national policy-makers, and program implementers. A new online resource is now available to assist in that effort: Developing and Strengthening Community Health Worker Programs at Scale: A Reference Guide and Case Studies for Program Managers and Policymakers ( http://www.mchip.net/CHWReferenceGuide ). This CHW Reference Guide is the product of 27 different collaborators who, collectively, have a formidable breadth and depth of experience and knowledge about CHW programming around the world. It provides a thoughtful discussion about the many operational issues that large-scale CHW programs need to address as they undergo the process of development, expansion or strengthening. Detailed case studies of 12 national CHW programs are included in the Appendix-the most current and complete cases studies as a group that are currently available. Future articles in this journal will highlight many of the themes in the CHW Reference Guide and provide an update of recent advances and experiences. These articles will serve, we hope, to (1) increase awareness about the CHW Reference Guide and its usefulness and (2) connect a broader audience to the critical importance of strengthening large-scale CHW programs for the health benefits that they can bring to underserved populations around the world.

  3. Guide for preparing active solar heating systems operation and maintenance manuals

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1991-01-01

    This book presents a systematic and standardized approach to the preparation of operation and maintenance manuals for active solar heating systems. It provides an industry consensus on the best operating and maintenance procedures for large commercial-scale solar service water and space heating systems. A sample O&M manual is included. 3-ring binder included.

  4. Development of a Two-Stage Microalgae Dewatering Process – A Life Cycle Assessment Approach

    PubMed Central

    Soomro, Rizwan R.; Zeng, Xianhai; Lu, Yinghua; Lin, Lu; Danquah, Michael K.

    2016-01-01

    Even though microalgal biomass is leading third-generation biofuel research, significant effort is required to establish an economically viable commercial-scale microalgal biofuel production system. Whilst a significant amount of work has been reported on large-scale cultivation of microalgae using photo-bioreactors and pond systems, research focused on establishing high-performance downstream dewatering operations for large-scale processing under optimal economy is limited. The enormous amount of energy and associated cost required for dewatering large-volume microalgal cultures has been the primary hindrance to the development of the biomass quantity needed for industrial-scale microalgal biofuels production. The extremely dilute nature of large-volume microalgal suspensions and the small size of microalgae cells create a significant processing cost during dewatering, and this has raised major concerns about the economic success of commercial-scale microalgal biofuel production as an alternative to conventional petroleum fuels. This article reports an effective framework to assess the performance of different dewatering technologies as the basis for establishing an effective two-stage dewatering system. Bioflocculation coupled with tangential flow filtration (TFF) emerged as a promising technique, with a total energy input of 0.041 kWh, 0.05 kg CO2 emissions, and a cost of $0.0043 for producing 1 kg of microalgae biomass. A streamlined process for operational analysis of the two-stage microalgae dewatering technique, encompassing energy input, carbon dioxide emission, and process cost, is presented. PMID:26904075

  5. Transmission Technologies and Operational Characteristic Analysis of Hybrid UHV AC/DC Power Grids in China

    NASA Astrophysics Data System (ADS)

    Tian, Zhang; Yanfeng, Gong

    2017-05-01

    In order to resolve the mismatch between the demand for and the distribution of primary energy resources, Ultra High Voltage (UHV) power grids should be developed rapidly to support large energy bases and the integration of large-scale renewable energy. This paper reviews the latest research on AC/DC transmission technologies and summarizes the characteristics of AC/DC power grids, concluding that China’s power grids are entering a new period of large-scale hybrid UHV AC/DC operation in which the characteristics of “strong DC and weak AC” become increasingly prominent. Possible problems in the operation of AC/DC power grids are discussed, and the interactions between the AC and DC systems are studied in depth. To address these problems, a preliminary scheme is summarized as follows: strengthening backbone structures, enhancing AC/DC transmission technologies, improving protection measures for the grid integration of clean energy, and taking actions to solve voltage and frequency stability problems. This is valuable for adapting hybrid UHV AC/DC power grids to the operating mode of large power grids, thus guaranteeing the security and stability of the power system.

  6. Wind-Tunnel Experiments for Gas Dispersion in an Atmospheric Boundary Layer with Large-Scale Turbulent Motion

    NASA Astrophysics Data System (ADS)

    Michioka, Takenobu; Sato, Ayumu; Sada, Koichi

    2011-10-01

    Large-scale turbulent motions enhancing horizontal gas spread in an atmospheric boundary layer are simulated in a wind-tunnel experiment. The large-scale turbulent motions can be generated using an active grid installed at the front of the test section in the wind tunnel, when appropriate parameters for the angular deflection and the rotation speed are chosen. The power spectra of vertical velocity fluctuations are unchanged with and without the active grid because they are strongly affected by the surface. The power spectra of both streamwise and lateral velocity fluctuations with the active grid increase in the low frequency region, and are closer to the empirical relations inferred from field observations. The large-scale turbulent motions do not affect the Reynolds shear stress, but change the balance of the processes involved. The relative contributions of ejections to sweeps are suppressed by large-scale turbulent motions, indicating that the motions behave as sweep events. The lateral gas spread is enhanced by the lateral large-scale turbulent motions generated by the active grid. The large-scale motions, however, do not affect the vertical velocity fluctuations near the surface, resulting in their having a minimal effect on the vertical gas spread. The peak concentration normalized using the root-mean-squared value of concentration fluctuation is remarkably constant over most regions of the plume irrespective of the operation of the active grid.

  7. Evaluation of fuel preparation systems for lean premixing-prevaporizing combustors

    NASA Technical Reports Server (NTRS)

    Dodds, W. J.; Ekstedt, E. E.

    1985-01-01

    A series of experiments was carried out to produce design data for a premixing, prevaporizing fuel-air mixture preparation system for aircraft gas turbine engine combustors. The fuel-air mixture uniformity of four different system design concepts was evaluated over a range of conditions representing the cruise operation of a modern commercial turbofan engine. Operating conditions, including pressure, temperature, fuel-to-air ratio, and velocity, exhibited no clear effect on the mixture uniformity of systems using pressure-atomizing fuel nozzles and large-scale mixing devices. However, the performance of systems using atomizing fuel nozzles and large-scale mixing devices was found to be sensitive to operating conditions. Variations in system design variables were also evaluated and correlated. Mixing uniformity was found to improve with system length, pressure drop, and the number of fuel injection points per unit area. A premixing system capable of providing mixing uniformity to within 15 percent over a typical range of cruise operating conditions is demonstrated.

  8. High Temperature Electrolysis 4 kW Experiment Design, Operation, and Results

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    J.E. O'Brien; X. Zhang; K. DeWall

    2012-09-01

    This report provides results of long-term stack testing completed in the new high-temperature steam electrolysis multi-kW test facility recently developed at INL. The report includes detailed descriptions of the piping layout, steam generation and delivery system, test fixture, heat recuperation system, hot zone, instrumentation, and operating conditions. This facility has provided a demonstration of high-temperature steam electrolysis operation at the 4 kW scale with advanced cell and stack technology. This successful large-scale demonstration of high-temperature steam electrolysis will help to advance the technology toward near-term commercialization.

  9. Users matter : multi-agent systems model of high performance computing cluster users.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    North, M. J.; Hood, C. S.; Decision and Information Sciences

    2005-01-01

    High performance computing clusters have been a critical resource for computational science for over a decade and have more recently become integral to large-scale industrial analysis. Despite their well-specified components, the aggregate behavior of clusters is poorly understood. The difficulties arise from complicated interactions between cluster components during operation. These interactions have been studied by many researchers, some of whom have identified the need for holistic multi-scale modeling that simultaneously includes network level, operating system level, process level, and user level behaviors. Each of these levels presents its own modeling challenges, but the user level is the most complex due to the adaptability of human beings. In this vein, there are several major user modeling goals, namely descriptive modeling, predictive modeling and automated weakness discovery. This study shows how multi-agent techniques were used to simulate a large-scale computing cluster at each of these levels.
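
    A flavor of the user-level modeling can be given in a few lines. The sketch below is a deliberately simplified agent-based loop, not the authors' model: each simulated user adapts its job request to the recent queue delay, a crude stand-in for human adaptability, and all parameters and rules are assumptions.

        import random

        class User:
            """Cluster user who adapts job size to recent queue delay (toy rule)."""
            def __init__(self, rng):
                self.rng = rng
                self.job_size = 10                  # requested node-hours
            def submit(self, recent_wait):
                if recent_wait > 5:                 # long waits -> smaller requests
                    self.job_size = max(1, self.job_size - 1)
                else:                               # short waits -> drift larger
                    self.job_size += self.rng.choice([0, 1])
                return self.job_size

        def simulate(n_users=50, capacity=300, steps=100, seed=0):
            rng = random.Random(seed)
            users = [User(rng) for _ in range(n_users)]
            wait = 0
            for _ in range(steps):
                demand = sum(u.submit(wait) for u in users)
                wait = max(0, wait + (demand - capacity) // capacity)  # toy delay
            return demand, wait

        print(simulate())   # aggregate demand self-organizes around capacity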

  10. Performance of ceramic superconductors in magnetic bearings

    NASA Technical Reports Server (NTRS)

    Kirtley, James L., Jr.; Downer, James R.

    1993-01-01

    Magnetic bearings are large-scale applications of magnet technology, quite similar in certain ways to synchronous machinery. They require substantial flux density over relatively large volumes of space. Large flux density is required to achieve satisfactory force density. Satisfactory dynamic response requires that magnetic circuit permeances not be too large, implying large air gaps. Superconductors, which offer large magnetomotive forces and high flux density in low-permeance circuits, appear to be desirable in these situations. Flux densities substantially in excess of those possible with iron can be produced, and no ferromagnetic material is required. Thus the inductance of active coils can be made low, indicating good dynamic response of the bearing system. The principal difficulty in using superconductors is, of course, the deep cryogenic temperatures at which they must operate. Because of the difficulties of working with liquid helium, superconductors that can be operated in liquid nitrogen are expected to extend the number and range of applications of superconductivity. Critical temperatures of about 98 K have been demonstrated in a class of materials which are, in fact, ceramics, and these new materials have attracted considerable public attention. There is, however, a difficulty with the ceramic superconducting materials developed to date: current densities sufficient for use in large-scale applications have not been demonstrated. In order to be useful, superconductors must be capable of carrying substantial currents in the presence of large magnetic fields. The possible use of ceramic superconductors in magnetic bearings is investigated and discussed, and the requirements that superconductors operating at liquid nitrogen temperatures must achieve to be comparable with niobium-titanium superconductors operating at liquid helium temperatures are identified.

  11. Southern Argentina Agile Meteor Radar: System design and initial measurements of large-scale winds and tides

    NASA Astrophysics Data System (ADS)

    Fritts, D. C.; Janches, D.; Iimura, H.; Hocking, W. K.; Mitchell, N. J.; Stockwell, R. G.; Fuller, B.; Vandepeer, B.; Hormaechea, J.; Brunini, C.; Levato, H.

    2010-09-01

    The Southern Argentina Agile Meteor Radar (SAAMER) was installed at Rio Grande on Tierra del Fuego (53.8°S, 67.8°W) in May 2008 and has been operational for ~24 months. This paper describes the motivations for the radar design and its placement at the southern tip of South America, its operating modes and capabilities, and observations of the mean winds, planetary waves, and tides during its first ~20 months of operation. SAAMER was specifically designed to provide very high resolution of large-scale motions and hopefully enable direct measurements of the vertical momentum flux by gravity waves, which have only been possible previously with dual- or multiple-beam radars and lidars or in situ measurements. SAAMER was placed on Tierra del Fuego because it was a region devoid of similar measurements, the latitude was anticipated to provide high sensitivity to an expected large semidiurnal tide, and the region is now recognized to be a "hot spot" of small-scale gravity wave activity extending from the troposphere into the mesosphere and lower thermosphere, perhaps the most dynamically active location on Earth. SAAMER was also intended to permit simultaneous enhanced meteor studies, including "head echo" and "nonspecular" measurements, which were previously possible only with high-power large-aperture radars. Initial measurements have defined the mean circulation and structure, exhibited planetary waves at various periods, and revealed large semidiurnal tide amplitudes and variability, with maximum amplitudes at higher altitudes often exceeding 60 m s⁻¹ and amplitude modulations at periods from a few to ~30 days.
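
    For readers unfamiliar with how such tidal amplitudes are extracted, the sketch below fits a semidiurnal (12-hour) harmonic to a wind time series by linear least squares, the standard harmonic-analysis step behind amplitude estimates of this kind. The data are synthetic; the method, not SAAMER's actual pipeline, is the point.

        import numpy as np

        t = np.arange(0, 10 * 24, 1.0)           # 10 days of hourly winds (hours)
        wind = 60.0 * np.cos(2 * np.pi * t / 12.0 + 0.7) \
               + 5.0 * np.random.randn(t.size)   # synthetic semidiurnal signal

        # Design matrix: mean plus cos/sin pair at the semidiurnal frequency
        omega = 2 * np.pi / 12.0
        A = np.column_stack([np.ones_like(t), np.cos(omega * t), np.sin(omega * t)])
        coef, *_ = np.linalg.lstsq(A, wind, rcond=None)
        amplitude = np.hypot(coef[1], coef[2])
        print(f"fitted semidiurnal amplitude: {amplitude:.1f} m/s")   # ~60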

  12. Advances in Multi-Sensor Scanning and Visualization of Complex Plants: the Utmost Case of a Reactor Building

    NASA Astrophysics Data System (ADS)

    Hullo, J.-F.; Thibault, G.; Boucheny, C.

    2015-02-01

    In a context of increased maintenance operations and workers' generational renewal, a nuclear owner and operator like Electricité de France (EDF) is interested in scaling up tools and methods of "as-built virtual reality" for larger buildings and wider audiences. However, acquisition and sharing of as-built data on a large scale (large and complex multi-floored buildings) challenge current scientific and technical capacities. In this paper, we first present a state of the art of scanning tools and methods for industrial plants with very complex architecture. Then, we introduce the inner characteristics of the multi-sensor scanning and visualization of the interior of the most complex building of a power plant: a nuclear reactor building. We introduce several developments that made possible a first complete survey of such a large building, covering the acquisition, processing and fusion of multiple data sources (3D laser scans, total-station survey, RGB panoramics, 2D floor plans, 3D CAD as-built models). In addition, we present the concepts of a smart application developed for the painless exploration of the whole dataset. The goal of this application is to help professionals, unfamiliar with the manipulation of such datasets, take into account the spatial constraints induced by the building's complexity while preparing maintenance operations. Finally, we discuss the main feedback from this large experiment, the remaining issues for the generalization of such large-scale surveys, and the future technical and scientific challenges in the field of industrial "virtual reality".

  13. Biology-Inspired Distributed Consensus in Massively-Deployed Sensor Networks

    NASA Technical Reports Server (NTRS)

    Jones, Kennie H.; Lodding, Kenneth N.; Olariu, Stephan; Wilson, Larry; Xin, Chunsheng

    2005-01-01

    Promises of ubiquitous control of the physical environment by large-scale wireless sensor networks open avenues for new applications that are expected to redefine the way we live and work. Most recent research has concentrated on developing techniques for performing relatively simple tasks in small-scale sensor networks assuming some form of centralized control. The main contribution of this work is to propose a new way of looking at large-scale sensor networks, motivated by lessons learned from the way biological ecosystems are organized. Indeed, we believe that techniques used in small-scale sensor networks are not likely to scale to large networks; such large-scale networks must be viewed as an ecosystem in which the sensors/effectors are organisms whose autonomous actions, based on local information, combine in a communal way to produce global results. As an example of a useful function, we demonstrate that fully distributed consensus can be attained in a scalable fashion in massively deployed sensor networks where individual motes operate based on local information, making local decisions that are aggregated across the network to achieve globally meaningful effects.
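
    The classic instance of such a communal, local-information computation is distributed average consensus, sketched below: each mote repeatedly averages with its neighbors and every value converges to the global mean. The ring topology, gain, and iteration count are illustrative assumptions, not the paper's protocol.

        import numpy as np

        n = 100
        rng = np.random.default_rng(1)
        x = rng.uniform(0.0, 30.0, n)        # local sensor readings
        global_mean = x.mean()

        eps = 0.3                            # gain; stable for eps < 1/(max degree)
        for _ in range(20000):               # ring diffusion is slow: O(n^2) steps
            left, right = np.roll(x, 1), np.roll(x, -1)   # ring neighbors
            x = x + eps * ((left - x) + (right - x))      # purely local update

        print(np.abs(x - global_mean).max())  # -> ~0: every mote holds the mean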

  14. Quantifying the Impacts of Large Scale Integration of Renewables in Indian Power Sector

    NASA Astrophysics Data System (ADS)

    Kumar, P.; Mishra, T.; Banerjee, R.

    2017-12-01

    India's power sector is responsible for nearly 37 percent of India's greenhouse gas emissions. For a fast-emerging economy like India, whose population and energy consumption are poised to rise rapidly in the coming decades, renewable energy can play a vital role in decarbonizing the power sector. In this context, India has targeted a 33-35 percent reduction in emission intensity (with respect to 2005 levels) along with large-scale renewable energy targets (100GW solar, 60GW wind, and 10GW biomass energy by 2022) in the INDCs submitted under the Paris agreement. But large-scale integration of renewable energy is a complex process that faces a number of challenges, including capital intensiveness, matching intermittent generation to load with minimal storage capacity, and reliability. In this context, this study attempts to assess the technical feasibility of integrating renewables into the Indian electricity mix by 2022 and to analyze the implications for power sector operations. This study uses TIMES, a bottom-up energy optimization model with unit commitment and dispatch features. We model coal- and gas-fired units discretely, with region-wise representation of wind and solar resources. The dispatch features are used for operational analysis of power plant units under ramp-rate and minimum-generation constraints. The study analyzes India's electricity sector transition for the year 2022 with three scenarios. The base case scenario (no RE addition), the INDC scenario (100GW solar, 60GW wind, 10GW biomass), and the low RE scenario (50GW solar, 30GW wind) have been created to analyze the implications of large-scale integration of variable renewable energy. The results provide insights into the trade-offs involved in achieving mitigation targets and the investment decisions required. The study also examines the operational reliability and flexibility requirements of the system for integrating renewables.
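
    The kind of operational constraint at issue is easy to illustrate. The toy dispatch below meets a swinging net load (load minus renewables) in merit order subject to ramp-rate and minimum-generation limits; all unit data are hypothetical, and the greedy method is a simplified stand-in for the model's unit commitment and dispatch features.

        # unit: (name, capacity MW, min gen MW, ramp MW/step, cost $/MWh)
        UNITS = [("coal", 600, 300, 60, 25.0), ("gas", 300, 60, 150, 45.0)]

        def dispatch(net_load, prev):
            """Greedy cost-order dispatch under ramp and min-gen constraints."""
            out, remaining = {}, net_load
            for name, cap, gmin, ramp, cost in sorted(UNITS, key=lambda u: u[4]):
                lo = max(gmin, prev[name] - ramp)      # cannot ramp down faster
                hi = min(cap, prev[name] + ramp)       # cannot ramp up faster
                g = min(hi, max(lo, remaining))
                out[name] = g
                remaining -= g
            # remaining > 0: unserved load; < 0: must-run surplus (curtailment)
            return out, remaining

        prev = {"coal": 400, "gas": 100}
        for net in [500, 350, 200, 420]:               # steep swings mimic solar
            prev, imbalance = dispatch(net, prev)
            print(prev, "imbalance:", round(imbalance, 1))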

  15. Central Asia: Regional Developments and Implications for U.S. Interests

    DTIC Science & Technology

    2007-08-30

    Regional Tensions) that stated that “as large-scale military operations against terrorism have come to an end in Afghanistan, the SCO member states...been among the world’s top producers of low enriched uranium. Kazakhstan had a fast breeder reactor at Aktau that was the world’s only nuclear...eventually Pakistan and India.58 All the states of the region possess large-scale resources that could contribute to the region becoming a “new silk

  16. On a Game of Large-Scale Projects Competition

    NASA Astrophysics Data System (ADS)

    Nikonov, Oleg I.; Medvedeva, Marina A.

    2009-09-01

    The paper is devoted to game-theoretic control problems motivated by economic decision-making situations arising in the realization of large-scale projects, such as designing and putting into operation new gas or oil pipelines. A non-cooperative two-player game is considered, with payoff functions of a special type for which standard existence theorems and algorithms for finding Nash equilibrium solutions are not applicable. The paper builds on and develops the results obtained in [1]-[5].
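
    For orientation, the standard numerical approach such payoffs defeat is best-response iteration on a discretized strategy set, sketched below with placeholder quadratic payoffs; for well-behaved payoffs it converges quickly, which is exactly what cannot be taken for granted for the special-type payoff functions considered in the paper.

        import numpy as np

        s = np.linspace(0, 10, 201)          # discretized investment levels

        def payoff1(x, y):                   # placeholder payoffs (assumed)
            return (10 + 0.5 * y - x) * x

        def payoff2(x, y):
            return (8 + 0.3 * x - y) * y

        x, y = 5.0, 5.0
        for _ in range(100):                 # alternating best responses
            x = s[np.argmax(payoff1(s, y))]
            y = s[np.argmax(payoff2(x, s))]

        print(f"approximate Nash equilibrium: x={x:.2f}, y={y:.2f}")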

  17. Central Asia: Regional Developments and Implications for U.S. Interests

    DTIC Science & Technology

    2008-08-06

    during a meeting of the Shanghai Cooperation Organization (SCO; see above, Regional Tensions) that stated that “as large-scale military operations...Uzbekistan have been among the world’s top producers of low enriched uranium. Kazakhstan had a fast breeder reactor at Aktau that was the world’s only...the states of the region possess large-scale resources that could contribute to the region becoming a “new silk road” of trade and commerce. The

  18. Central Asia: Regional Developments and Implications for U.S. Interests

    DTIC Science & Technology

    2009-04-17

    see above, “Obstacles to Peace and Independence: Regional Tensions and Conflicts”) that stated that “as large-scale military operations against...and Kazakhstan and Uzbekistan have been among the world’s top producers of low enriched uranium. Kazakhstan had a fast breeder reactor at Aktau that...climate.86 All the states of the region possess large-scale resources that could contribute to the region becoming a “new silk road” of trade and

  19. 5 CFR 9301.7 - Definitions.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... and that operates solely for the purpose of conducting scientific research the results of which are... employees who perform the work and costs of conducting large-scale computer searches. (c) Duplicate means to... education, that operates a program or programs of scholarly research. (e) Fee category means one of the...

  20. 5 CFR 9301.7 - Definitions.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... and that operates solely for the purpose of conducting scientific research the results of which are... employees who perform the work and costs of conducting large-scale computer searches. (c) Duplicate means to... education, that operates a program or programs of scholarly research. (e) Fee category means one of the...

  1. SHEAR-DRIVEN DYNAMO WAVES IN THE FULLY NONLINEAR REGIME

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pongkitiwanichakul, P.; Nigro, G.; Cattaneo, F.

    2016-07-01

    Large-scale dynamo action is well understood when the magnetic Reynolds number ( Rm ) is small, but becomes problematic in the astrophysically relevant large Rm limit since the fluctuations may control the operation of the dynamo, obscuring the large-scale behavior. Recent works by Tobias and Cattaneo demonstrated numerically the existence of large-scale dynamo action in the form of dynamo waves driven by strongly helical turbulence and shear. Their calculations were carried out in the kinematic regime in which the back-reaction of the Lorentz force on the flow is neglected. Here, we have undertaken a systematic extension of their work to the fully nonlinear regime. Helical turbulence and large-scale shear are produced self-consistently by prescribing body forces that, in the kinematic regime, drive flows that resemble the original velocity used by Tobias and Cattaneo. We have found four different solution types in the nonlinear regime for various ratios of the fluctuating velocity to the shear and Reynolds numbers. Some of the solutions are in the form of propagating waves. Some solutions show large-scale helical magnetic structure. Both waves and structures are permanent only when the kinetic helicity is non-zero on average.

  2. Pretest predictions for the response of a 1:8-scale steel LWR containment building model to static overpressurization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Clauss, D.B.

    The analyses used to predict the behavior of a 1:8-scale model of a steel LWR containment building to static overpressurization are described and results are presented. Finite strain, large displacement, and nonlinear material properties were accounted for using finite element methods. Three-dimensional models were needed to analyze the penetrations, which included operable equipment hatches, personnel lock representations, and a constrained pipe. It was concluded that the scale model would fail due to leakage caused by large deformations of the equipment hatch sleeves. 13 refs., 34 figs., 1 tab.

  3. Cytology of DNA Replication Reveals Dynamic Plasticity of Large-Scale Chromatin Fibers.

    PubMed

    Deng, Xiang; Zhironkina, Oxana A; Cherepanynets, Varvara D; Strelkova, Olga S; Kireev, Igor I; Belmont, Andrew S

    2016-09-26

    In higher eukaryotic interphase nuclei, the 100- to >1,000-fold linear compaction of chromatin is difficult to reconcile with its function as a template for transcription, replication, and repair. It is challenging to imagine how DNA and RNA polymerases with their associated molecular machinery would move along the DNA template without transient decondensation of observed large-scale chromatin "chromonema" fibers [1]. Transcription or "replication factory" models [2], in which polymerases remain fixed while DNA is reeled through, are similarly difficult to conceptualize without transient decondensation of these chromonema fibers. Here, we show how a dynamic plasticity of chromatin folding within large-scale chromatin fibers allows DNA replication to take place without significant changes in the global large-scale chromatin compaction or shape of these large-scale chromatin fibers. Time-lapse imaging of lac-operator-tagged chromosome regions shows no major change in the overall compaction of these chromosome regions during their DNA replication. Improved pulse-chase labeling of endogenous interphase chromosomes yields a model in which the global compaction and shape of large-Mbp chromatin domains remains largely invariant during DNA replication, with DNA within these domains undergoing significant movements and redistribution as they move into and then out of adjacent replication foci. In contrast to hierarchical folding models, this dynamic plasticity of large-scale chromatin organization explains how localized changes in DNA topology allow DNA replication to take place without an accompanying global unfolding of large-scale chromatin fibers while suggesting a possible mechanism for maintaining epigenetic programming of large-scale chromatin domains throughout DNA replication. Copyright © 2016 Elsevier Ltd. All rights reserved.

  4. Some aspects of wind tunnel magnetic suspension systems with special application at large physical scales

    NASA Technical Reports Server (NTRS)

    Britcher, C. P.

    1983-01-01

    Wind tunnel magnetic suspension and balance systems (MSBSs) have so far failed to find application at the large physical scales necessary for the majority of aerodynamic testing. Three areas of technology relevant to such application are investigated. Two variants of the Spanwise Magnet roll torque generation scheme are studied. Spanwise Permanent Magnets are shown to be practical and are experimentally demonstrated. Extensive computations of the performance of the Spanwise Iron Magnet scheme indicate powerful capability, limited principally by electromagnet technology. Aerodynamic testing at extreme attitudes is shown to be practical in relatively conventional MSBSs. Preliminary operation of the MSBS over a wide range of angles of attack is demonstrated. The impact of a requirement for highly reliable operation on the overall architecture of large MSBSs is studied, and it is concluded that system cost and complexity need not be seriously increased.

  5. Modeling Relevant to Safe Operations of U.S. Navy Vessels in Arctic Conditions: Physical Modeling of Ice Loads

    DTIC Science & Technology

    2016-06-01

    zones with ice concentrations up to 40%. To achieve this goal, the Navy must determine safe operational speeds as a function of ice concentration...and full-scale experience with ice-capable hull forms that have shallow entry angles to promote flexural ice failure preferentially over crushing...plan view) of the proposed large-scale ice–hull impact experiment to be conducted in CRREL’s refrigerated towing basin. Shown here is a side-panel

  6. Flexible, High-Speed CdSe Nanocrystal Integrated Circuits.

    PubMed

    Stinner, F Scott; Lai, Yuming; Straus, Daniel B; Diroll, Benjamin T; Kim, David K; Murray, Christopher B; Kagan, Cherie R

    2015-10-14

    We report large-area, flexible, high-speed analog and digital colloidal CdSe nanocrystal integrated circuits operating at low voltages. Using photolithography and a newly developed process to fabricate vertical interconnect access holes, we scale down device dimensions, reducing parasitic capacitances and increasing the frequency of circuit operation, and scale up device fabrication over 4 in. flexible substrates. We demonstrate amplifiers with ∼7 kHz bandwidth, ring oscillators with <10 μs stage delays, and NAND and NOR logic gates.
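
    As a quick plausibility check on those figures, an N-stage ring oscillator oscillates at f = 1/(2·N·t_d), so a stage delay just under 10 μs implies oscillation in the tens-of-kilohertz range. The five-stage count below is an assumed example, since the abstract reports only the stage delay.

        n_stages = 5                 # assumed ring length (not given above)
        t_d = 10e-6                  # stage delay in seconds (reported bound)
        f = 1.0 / (2 * n_stages * t_d)
        print(f"ring oscillator frequency: {f / 1e3:.0f} kHz")   # -> 10 kHz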

  7. Wafer-scale pixelated detector system

    DOEpatents

    Fahim, Farah; Deptuch, Grzegorz; Zimmerman, Tom

    2017-10-17

    A large-area, gapless detection system comprises at least one sensor; an interposer operably connected to the at least one sensor; and at least one application-specific integrated circuit operably connected to the sensor via the interposer, wherein the detection system provides high dynamic range while maintaining small pixel area and low power dissipation. The invention thereby provides methods and systems for wafer-scale gapless and seamless detector systems with small pixels that offer both high dynamic range and low power dissipation.

  8. Modelling Situation Awareness Information for Naval Decision Support Design

    DTIC Science & Technology

    2003-10-01

    Modelling Situation Awareness Information for Naval Decision Support Design Dr.-Ing. Bernhard Doering, Dipl.-Ing. Gert Doerfel, Dipl.-Ing... knowledge-based user interfaces. For developing such interfaces information of the three different SA levels which operators need in performing their...large scale on situation awareness of operators which is defined as the state of operator knowledge about the external environment resulting from

  9. Identification of critical equipment and determination of operational limits in helium refrigerators under pulsed heat load

    NASA Astrophysics Data System (ADS)

    Dutta, Rohan; Ghosh, Parthasarathi; Chowdhury, Kanchan

    2014-01-01

    Large-scale helium refrigerators are subjected to pulsed heat loads from tokamaks. As these plants are designed for constant heat loads, operation under such varying loads may lead to instability, thereby tripping different equipment. To understand the behavior of plants subjected to pulsed heat loads, an existing plant of 120 W at 4.2 K and a large-scale plant of 18 kW at 4.2 K have been analyzed using the commercial process simulator Aspen Hysys®. A similar heat load characteristic has been applied in both quasi-steady-state and dynamic analyses to determine the critical stages and equipment of these plants from an operational point of view. It was found that the coldest part of both cycles, consisting of the JT stage and its preceding reverse Brayton stage, is the most affected. Further analysis of these stages and their constituent equipment revealed operational limits with respect to the variation of return-stream flow rate resulting from such heat load variations. These observations can be used to devise techniques for the steady operation of plants subjected to pulsed heat loads.

  10. A distributed parallel storage architecture and its potential application within EOSDIS

    NASA Technical Reports Server (NTRS)

    Johnston, William E.; Tierney, Brian; Feuquay, Jay; Butzer, Tony

    1994-01-01

    We describe the architecture, implementation, and use of a scalable, high-performance, distributed-parallel data storage system developed in the ARPA-funded MAGIC gigabit testbed. A collection of wide-area distributed disk servers operates in parallel to provide logical block-level access to large data sets. Operated primarily as a network-based cache, the architecture supports cooperation among independently owned resources to provide fast, large-scale, on-demand storage to support data handling, simulation, and computation.
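
    The core idea, logical blocks striped round-robin across disk servers so that one large read fans out in parallel, can be sketched in a few lines; the server names, block size, and layout function below are illustrative assumptions, not the MAGIC system's actual scheme.

        SERVERS = ["disk-srv-a", "disk-srv-b", "disk-srv-c", "disk-srv-d"]
        BLOCK_SIZE = 64 * 1024              # bytes per logical block (assumed)

        def locate(logical_block):
            """Round-robin striping: (server, block index on that server)."""
            return (SERVERS[logical_block % len(SERVERS)],
                    logical_block // len(SERVERS))

        def plan_read(offset, length):
            """Per-server requests needed to read [offset, offset + length)."""
            first = offset // BLOCK_SIZE
            last = (offset + length - 1) // BLOCK_SIZE
            return [locate(b) for b in range(first, last + 1)]

        # A 1 MB read spans 16 blocks spread evenly over the four servers.
        for server, block in plan_read(0, 1024 * 1024):
            print(server, block)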

  11. Tools for Large-Scale Mobile Malware Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bierma, Michael

    Analyzing mobile applications for malicious behavior is an important area of research, and is made difficult, in part, by the increasingly large number of applications available for the major operating systems. There are currently over 1.2 million apps available in both the Google Play and Apple App stores (the respective official marketplaces for the Android and iOS operating systems)[1, 2]. Our research provides two large-scale analysis tools to aid in the detection and analysis of mobile malware. The first tool we present, Andlantis, is a scalable dynamic analysis system capable of processing over 3000 Android applications per hour. Traditionally, Android dynamic analysis techniques have been relatively limited in scale due to the computational resources required to emulate the full Android system to achieve accurate execution. Andlantis is the most scalable Android dynamic analysis framework to date, and is able to collect valuable forensic data, which helps reverse-engineers and malware researchers identify and understand anomalous application behavior. We discuss the results of running 1261 malware samples through the system, and provide examples of malware analysis performed with the resulting data. While techniques exist to perform static analysis on a large number of applications, large-scale analysis of iOS applications has been relatively small scale due to the closed nature of the iOS ecosystem and the difficulty of acquiring applications for analysis. The second tool we present, iClone, addresses the challenges associated with iOS research in order to detect application clones within a dataset of over 20,000 iOS applications.

  12. Non-linear scale interactions in a forced turbulent boundary layer

    NASA Astrophysics Data System (ADS)

    Duvvuri, Subrahmanyam; McKeon, Beverley

    2015-11-01

    A strong phase-organizing influence exerted by a single synthetic large-scale spatio-temporal mode on directly coupled (through triadic interactions) small scales in a turbulent boundary layer forced by a spatially impulsive dynamic wall-roughness patch was previously demonstrated by the authors (J. Fluid Mech. 2015, vol. 767, R4). The experimental set-up was later enhanced to allow simultaneous forcing of multiple scales in the flow. Results and analysis are presented from a new set of experiments in which two distinct large scales are forced in the flow by a dynamic wall-roughness patch. The internal non-linear forcing of two other scales with triadic consistency to the artificially forced large scales, corresponding to the sum and difference of their wavenumbers, is dominated by the latter. This allows a forcing-response (input-output) type analysis of the two triadic scales, and naturally lends itself to a resolvent-operator-based model (e.g. McKeon & Sharma, J. Fluid Mech. 2010, vol. 658, pp. 336-382) of the governing Navier-Stokes equations. The support of AFOSR (grant #FA 9550-12-1-0469, program manager D. Smith) is gratefully acknowledged.
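
    The triadic bookkeeping itself is simple arithmetic: forcing two modes k1 and k2 drives the sum and difference modes through the quadratic nonlinearity of the Navier-Stokes equations. The wavenumbers below are arbitrary illustrative values.

        k1, k2 = 3, 5                         # forced large-scale wavenumbers
        triadic = {k1 + k2, abs(k1 - k2)}     # modes driven by the quadratic term
        print("internally forced triadic modes:", sorted(triadic))   # [2, 8]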

  13. Infrastructure for Large-Scale Tests in Marine Autonomy

    DTIC Science & Technology

    2012-02-01


  14. RAID-2: Design and implementation of a large scale disk array controller

    NASA Technical Reports Server (NTRS)

    Katz, R. H.; Chen, P. M.; Drapeau, A. L.; Lee, E. K.; Lutz, K.; Miller, E. L.; Seshan, S.; Patterson, D. A.

    1992-01-01

    We describe the implementation of a large scale disk array controller and subsystem incorporating over 100 high performance 3.5 inch disk drives. It is designed to provide 40 MB/s sustained performance and 40 GB capacity in three 19 inch racks. The array controller forms an integral part of a file server that attaches to a Gb/s local area network. The controller implements a high bandwidth interconnect between an interleaved memory, an XOR calculation engine, the network interface (HIPPI), and the disk interfaces (SCSI). The system is now functionally operational, and we are tuning its performance. We review the design decisions, history, and lessons learned from this three year university implementation effort to construct a truly large scale system assembly.
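
    The XOR calculation engine mentioned above implements the standard RAID parity relation: parity is the byte-wise XOR of the data blocks, and any single lost block is the XOR of the survivors with the parity. A minimal sketch of that relation (not the controller's implementation):

        from functools import reduce

        def xor_blocks(blocks):
            """Byte-wise XOR of equal-length blocks."""
            return bytes(reduce(lambda a, b: a ^ b, col) for col in zip(*blocks))

        data = [b"ABCD", b"EFGH", b"IJKL"]        # one stripe of data blocks
        parity = xor_blocks(data)

        survivors = [data[0], data[2], parity]    # pretend disk 1 failed
        recovered = xor_blocks(survivors)
        assert recovered == data[1]               # lost block reconstructed
        print("recovered:", recovered)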

  15. Scale-invariant streamline equations and strings of singular vorticity for perturbed anisotropic solutions of the Navier-Stokes equation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Libin, A., E-mail: a_libin@netvision.net.il

    2012-12-15

    A linear combination of a pair of dual anisotropic decaying Beltrami flows with spatially constant amplitudes (the Trkal solutions), with the same eigenvalue of the curl operator, and of a constant velocity vector orthogonal to the Beltrami pair yields a triplet solution of the force-free Navier-Stokes equation. Amplitudes that vary slightly in space (large-scale perturbations) lead to the emergence of a time-dependent phase between the dual Beltrami flows and of an upward velocity, which are unstable at large values of the Reynolds number. They also lead to the formation of large-scale curved prisms of streamlines with edges that are strings of singular vorticity.

  16. A convenient method for large-scale STM mapping of freestanding atomically thin conductive membranes

    NASA Astrophysics Data System (ADS)

    Uder, B.; Hartmann, U.

    2017-06-01

    Two-dimensional atomically flat sheets with a high flexibility are very attractive as ultrathin membranes but are also inherently challenging for microscopic investigations. We report on a method using Scanning Tunneling Microscopy (STM) under ultra-high vacuum conditions for large-scale mapping of several-micrometer-sized freestanding single and multilayer graphene membranes. This is achieved by operating the STM at unusual parameters. We found that large-scale scanning on atomically thin membranes delivers valuable results using very high tip-scan speeds combined with high feedback-loop gain and low tunneling currents. The method ultimately relies on the particular behavior of the freestanding membrane in the STM which is much different from that of a solid substrate.

  17. Cosmological constant implementing Mach principle in general relativity

    NASA Astrophysics Data System (ADS)

    Namavarian, Nadereh; Farhoudi, Mehrdad

    2016-10-01

    We begin from the fact that attention to the operational meaning of physical concepts played an impetus role in the emergence of general relativity (GR). We therefore pay particular attention to the operational definition of the gravitational coupling constant in this theory as a dimensional constant that is obtained through experiment. However, since all available experiments provide the value of this constant only locally, the coupling constant is operationally meaningful only in a local area. With this in mind, to obtain an extension of GR for the large scale, we replace it by a conformally invariant model and then reduce this model to a theory for the cosmological scale by breaking the conformal symmetry, singling out a specific conformal frame that is characterized by the large-scale characteristics of the universe. Finally, we arrive at the same field equations that Einstein historically proposed for the cosmological scale (GR plus the cosmological constant) as the result of his endeavor to make GR consistent with the Mach principle. However, we argue that the field equations obtained in this alternative approach do not carry the problem of the equations proposed by Einstein for consistency with Mach's principle (i.e., the existence of the de Sitter solution), and can also be considered compatible with this principle in the Sciama view.

  18. PROBLEM OF FORMING IN A MAN-OPERATOR A HABIT OF TRACKING A MOVING TARGET,

    DTIC Science & Technology

    Cybernetics stimulated the large-scale use of the method of functional analogy which makes it possible to compare technical and human activity systems...interesting and highly efficient human activity because of the psychological control factor involved in its operation. The human tracking system is

  19. Applying Bayesian Modeling and Receiver Operating Characteristic Methodologies for Test Utility Analysis

    ERIC Educational Resources Information Center

    Wang, Qiu; Diemer, Matthew A.; Maier, Kimberly S.

    2013-01-01

    This study integrated Bayesian hierarchical modeling and receiver operating characteristic analysis (BROCA) to evaluate how interest strength (IS) and interest differentiation (ID) predicted low–socioeconomic status (SES) youth's interest-major congruence (IMC). Using large-scale Kuder Career Search online-assessment data, this study fit three…

  20. Theoretical Definition of Instructor Role in Computer-Managed Instruction.

    ERIC Educational Resources Information Center

    McCombs, Barbara L.; Dobrovolny, Jacqueline L.

    This report describes the results of a theoretical analysis of the ideal role functions of the Computer Managed Instruction (CMI) instructor. Concepts relevant to instructor behavior are synthesized from both cognitive and operant learning theory perspectives, and the roles allocated to instructors by seven large-scale operational CMI systems are…

  1. Photonic crystal lasers using wavelength-scale embedded active region

    NASA Astrophysics Data System (ADS)

    Matsuo, Shinji; Sato, Tomonari; Takeda, Koji; Shinya, Akihiko; Nozaki, Kengo; Kuramochi, Eiichi; Taniyama, Hideaki; Notomi, Masaya; Fujii, Takuro; Hasebe, Koichi; Kakitsuka, Takaaki

    2014-01-01

    Lasers with ultra-low operating energy are desired for use in chip-to-chip and on-chip optical interconnects. If we are to reduce the operating energy, we must reduce the active volume. A photonic crystal (PhC) laser with a wavelength-scale cavity has therefore attracted much attention, because a PhC provides a large Q-factor with a small volume. To improve this device's performance, we employ an embedded active region structure in which the wavelength-scale active region is buried within an InP PhC slab. This structure enables effective confinement of both carriers and photons and improves the thermal resistance of the device. We have thus obtained a large external differential quantum efficiency of 55% and an output power of -10 dBm by optical pumping. For electrical pumping, we use a lateral p-i-n structure that employs Zn diffusion and Si ion implantation for p-type and n-type doping, respectively. We have achieved room-temperature continuous-wave operation with a threshold current of 7.8 µA and a maximum 3 dB bandwidth of 16.2 GHz. An experimental bit error rate measurement with a 10 Gbit s⁻¹ NRZ signal reveals a minimum operating energy of 5.5 fJ for transferring a single bit. These results show the potential of this laser for very-short-reach interconnects. We also describe the optimal design of the cavity quality (Q) factor for achieving large output power with low operating energy, using a calculation based on rate equations. When we assume an internal absorption loss of 20 cm⁻¹, the optimized coupling Q-factor is 2000.
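
    The headline energy figure is easy to check from the bit rate: at rate B, the energy per bit is E = P/B, so 5.5 fJ per bit at 10 Gbit s⁻¹ corresponds to about 55 μW of drive power. A one-line verification:

        bit_rate = 10e9               # bit/s, NRZ signal
        energy_per_bit = 5.5e-15      # J, reported minimum
        power_w = energy_per_bit * bit_rate
        print(f"implied drive power: {power_w * 1e6:.0f} uW")   # -> 55 uW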

  2. Development of a Spot-Application Tool for Rapid, High-Resolution Simulation of Wave-Driven Nearshore Hydrodynamics

    DTIC Science & Technology

    2013-09-30

    flow models, such as Delft3D, with our developed Boussinesq-type model. The vision of this project is to develop an operational tool for the...situ measurements or large-scale wave models. This information will be used to drive the offshore wave boundary condition. • Execute the Boussinesq ...model to match with the Boussinesq-type theory would be one which can simulate sheared and stratified currents due to large-scale (non-wave) forcings

  3. Central Asia: Regional Developments and Implications for U.S. Interests

    DTIC Science & Technology

    2008-11-13

    Regional Tensions) that stated that “as large-scale military operations against terrorism have come to an end in Afghanistan, the SCO member states... breeder reactor at Aktau that was the world’s only nuclear desalinization facility. Shut down in 1999, it had nearly 300 metric tons of uranium and...investment climate.77 All the states of the region possess large-scale resources that could contribute to the region becoming a “new silk road” of trade and

  4. Foster Wheeler's Solutions for Large Scale CFB Boiler Technology: Features and Operational Performance of Łagisza 460 MWe CFB Boiler

    NASA Astrophysics Data System (ADS)

    Hotta, Arto

    During recent years, once-through supercritical (OTSC) CFB technology has been developed, enabling CFB technology to proceed to medium-scale (500 MWe) utility projects such as the Łagisza Power Plant in Poland, owned by Poludniowy Koncern Energetyczny SA. (PKE), with net efficiency of nearly 44%. The Łagisza power plant is currently being commissioned and reached full-load operation in March 2009. Initial operation shows very good performance and confirms that the CFB process scales to this size without problems. The once-through steam cycle utilizing Siemens' vertical-tube Benson technology has also performed as predicted in the CFB process. Foster Wheeler has developed the CFB design further, up to 800 MWe with a net efficiency of ≥45%.

  5. Transitioning a home telehealth project into a sustainable, large-scale service: a qualitative study.

    PubMed

    Wade, Victoria A; Taylor, Alan D; Kidd, Michael R; Carati, Colin

    2016-05-16

    This study was a component of the Flinders Telehealth in the Home project, which tested adding home telehealth to existing rehabilitation, palliative care and geriatric outreach services. Due to the known difficulty of transitioning telehealth projects into ongoing services, a qualitative study was conducted to produce a preferred implementation approach for sustainable and large-scale operations, and a process model that offers practical advice for achieving this goal. Initially, semi-structured interviews were conducted with senior clinicians, health service managers and policy makers, and a thematic analysis of the interview transcripts was undertaken to identify the range of options for ongoing operations, plus the factors affecting sustainability. Subsequently, the interviewees and other decision makers attended a deliberative forum in which participants were asked to select a preferred model for future implementation. Finally, all data from the study were synthesised by the researchers to produce a process model. Nineteen interviews with senior clinicians, managers, and service development staff were conducted, finding strong support for home telehealth but a wide diversity of views on governance, models of clinical care, technical infrastructure operations, and data management. The deliberative forum worked through these options and recommended a collaborative consortium approach for large-scale implementation. The process model proposes that the key factor for large-scale implementation is leadership support, which is enabled by (1) showing solutions to the problems of service demand, budgetary pressure and the relationship between hospital and primary care, (2) demonstrating how home telehealth aligns with health service policies, and (3) achieving clinician acceptance through providing evidence of benefit and developing new models of clinical care. Two key actions to enable change were marketing telehealth to patients, clinicians and policy-makers, and building a community of practice. The implementation of home telehealth services is still at an early stage. Change agents and a community of practice can contribute by marketing telehealth, demonstrating policy alignment and providing potential solutions for difficult health services problems. This should assist health leaders to move from trials to large-scale services.

  6. Large-scale modeling of rain fields from a rain cell deterministic model

    NASA Astrophysics Data System (ADS)

    Féral, Laurent; Sauvageot, Henri; Castanet, Laurent; Lemorton, Joël; Cornet, Frédéric; Leconte, Katia

    2006-04-01

    A methodology to simulate two-dimensional rain rate fields at large scale (1000 × 1000 km², the scale of a satellite telecommunication beam or a terrestrial fixed broadband wireless access network) is proposed. It relies on a rain rate field cellular decomposition. At small scale (~20 × 20 km²), the rain field is split up into its macroscopic components, the rain cells, described by the Hybrid Cell (HYCELL) cellular model. At midscale (~150 × 150 km²), the rain field results from the conglomeration of rain cells modeled by HYCELL. To account for the rain cell spatial distribution at midscale, the latter is modeled by a doubly aggregative isotropic random walk, the optimal parameterization of which is derived from radar observations at midscale. The extension of the simulation area from the midscale to the large scale (1000 × 1000 km²) requires the modeling of the weather frontal area. The latter is first modeled by a Gaussian field with anisotropic covariance function. The Gaussian field is then turned into a binary field, giving the large-scale locations over which it is raining. This transformation requires the definition of the rain occupation rate over large-scale areas. Its probability distribution is determined from observations by the French operational radar network ARAMIS. The coupling with the rain field modeling at midscale is immediate whenever the large-scale field is split up into midscale subareas. The rain field thus generated accounts for the local CDF at each point, defining a structure spatially correlated at small scale, midscale, and large scale. It is then suggested that this approach be used by system designers to evaluate diversity gain, terrestrial path attenuation, or slant path attenuation for different azimuth and elevation angle directions.
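
    The step described above of turning a correlated Gaussian field into a binary raining/not-raining mask with a prescribed occupation rate can be sketched directly; the isotropic spectral filter and the 10% occupancy below are illustrative choices, not the paper's calibrated (anisotropic) parameters.

        import numpy as np

        rng = np.random.default_rng(0)
        n, corr_len, occupancy = 256, 20.0, 0.10

        # Correlated Gaussian field: filter white noise with a Gaussian spectrum
        white = rng.standard_normal((n, n))
        kx = 2 * np.pi * np.fft.fftfreq(n)[:, None]
        ky = 2 * np.pi * np.fft.fftfreq(n)[None, :]
        filt = np.exp(-0.5 * corr_len**2 * (kx**2 + ky**2))
        field = np.real(np.fft.ifft2(np.fft.fft2(white) * filt))

        # Threshold at the (1 - occupancy) quantile: 10% of the area is raining
        mask = field > np.quantile(field, 1 - occupancy)
        print("raining fraction:", mask.mean())    # ~0.10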

  7. An Integrated Assessment of Location-Dependent Scaling for Microalgae Biofuel Production Facilities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Coleman, Andre M.; Abodeely, Jared; Skaggs, Richard

    Successful development of a large-scale microalgae-based biofuels industry requires comprehensive analysis and understanding of the feedstock supply chain—from facility siting/design through processing/upgrading of the feedstock to a fuel product. The evolution from pilot-scale production facilities to energy-scale operations presents many multi-disciplinary challenges, including a sustainable supply of water and nutrients, operational and infrastructure logistics, and economic competitiveness with petroleum-based fuels. These challenges are addressed in part by applying the Integrated Assessment Framework (IAF)—an integrated multi-scale modeling, analysis, and data management suite—to address key issues in developing and operating an open-pond facility by analyzing how variability and uncertainty in space and time affect algal feedstock production rates, and determining the site-specific “optimum” facility scale to minimize capital and operational expenses. This approach explicitly and systematically assesses the interdependence of biofuel production potential, associated resource requirements, and production system design trade-offs. The IAF was applied to a set of sites previously identified as having the potential to cumulatively produce 5 billion-gallons/year in the southeastern U.S. and results indicate costs can be reduced by selecting the most effective processing technology pathway and scaling downstream processing capabilities to fit site-specific growing conditions, available resources, and algal strains.

  8. Consultancy on Large-Scale Submerged Aerobic Cultivation Process Design - Final Technical Report: February 1, 2016 -- June 30, 2016

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Crater, Jason; Galleher, Connor; Lievense, Jeff

    NREL is developing an advanced aerobic bubble column model using Aspen Custom Modeler (ACM). The objective of this work is to integrate the new fermentor model with existing techno-economic models in Aspen Plus and Excel to establish a new methodology for guiding process design. To assist this effort, NREL has contracted Genomatica to critique and make recommendations for improving NREL's bioreactor model and large-scale aerobic bioreactor design for biologically producing lipids at commercial scale. Genomatica has highlighted a few areas for improving the functionality and effectiveness of the model. Genomatica recommends using a compartment model approach with an integrated black-box kinetic model of the production microbe. We also suggest including calculations for stirred tank reactors to extend the model's functionality and adaptability for future process designs. Genomatica also suggests making several modifications to NREL's large-scale lipid production process design. The recommended process modifications are based on Genomatica's internal techno-economic assessment experience and are focused primarily on minimizing capital and operating costs. These recommendations include selecting/engineering a thermotolerant yeast strain with lipid excretion; using bubble column fermentors; increasing the size of production fermentors; reducing the number of vessels; employing semi-continuous operation; and recycling cell mass.

  9. Theory based scaling of edge turbulence and implications for the scrape-off layer width

    NASA Astrophysics Data System (ADS)

    Myra, J. R.; Russell, D. A.; Zweben, S. J.

    2016-11-01

    Turbulence and plasma parameter data from the National Spherical Torus Experiment (NSTX) [Ono et al., Nucl. Fusion 40, 557 (2000)] is examined and interpreted based on various theoretical estimates. In particular, quantities of interest for assessing the role of turbulent transport on the midplane scrape-off layer heat flux width are assessed. Because most turbulence quantities exhibit large scatter and little scaling within a given operation mode, this paper focuses on length and time scales and dimensionless parameters between operational modes including Ohmic, low (L), and high (H) modes using a large NSTX edge turbulence database [Zweben et al., Nucl. Fusion 55, 093035 (2015)]. These are compared with theoretical estimates for drift and interchange rates, profile modification saturation levels, a resistive ballooning condition, and dimensionless parameters characterizing L and H mode conditions. It is argued that the underlying instability physics governing edge turbulence in different operational modes is, in fact, similar, and is consistent with curvature-driven drift ballooning. Saturation physics, however, is dependent on the operational mode. Five dimensionless parameters for drift-interchange turbulence are obtained and employed to assess the importance of turbulence in setting the scrape-off layer heat flux width λq and its scaling. An explicit proportionality of the width λq to the safety factor and major radius (qR) is obtained under these conditions. Quantitative estimates and reduced model numerical simulations suggest that the turbulence mechanism is not negligible in determining λq in NSTX, at least for high plasma current discharges.

  10. Theory based scaling of edge turbulence and implications for the scrape-off layer width

    DOE PAGES

    Myra, J. R.; Russell, D. A.; Zweben, S. J.

    2016-11-01

    Turbulence and plasma parameter data from the National Spherical Torus Experiment (NSTX) is examined and interpreted based on various theoretical estimates. In particular, quantities of interest for assessing the role of turbulent transport on the midplane scrape-off layer heat flux width are assessed. Because most turbulence quantities exhibit large scatter and little scaling within a given operation mode, this paper focuses on length and time scales and dimensionless parameters between operational modes including Ohmic, low (L), and high (H) modes using a large NSTX edge turbulence database. These are compared with theoretical estimates for drift and interchange rates, profile modification saturation levels, a resistive ballooning condition, and dimensionless parameters characterizing L and H mode conditions. It is argued that the underlying instability physics governing edge turbulence in different operational modes is, in fact, similar, and is consistent with curvature-driven drift ballooning. Saturation physics, however, is dependent on the operational mode. Five dimensionless parameters for drift-interchange turbulence are obtained and employed to assess the importance of turbulence in setting the scrape-off layer heat flux width λq and its scaling. An explicit proportionality of the width λq to the safety factor and major radius (qR) is obtained under these conditions. Lastly, quantitative estimates and reduced model numerical simulations suggest that the turbulence mechanism is not negligible in determining λq in NSTX, at least for high plasma current discharges.
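
    Taken at face value, the proportionality λq ∝ qR lets one scale the width between discharges by ratio once a reference point is fixed; the reference values in the sketch below are hypothetical, chosen only to show the arithmetic.

        def lamq_scaled(q, R, q_ref=8.0, R_ref=0.85, lamq_ref=0.006):
            """Scrape-off layer width (m) scaled from a hypothetical reference."""
            return lamq_ref * (q * R) / (q_ref * R_ref)

        # Raising the safety factor at fixed major radius widens the layer.
        print(f"{lamq_scaled(q=12.0, R=0.85) * 1000:.1f} mm")   # -> 9.0 mm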

  11. Antimicrobial residues in animal waste and water resources proximal to large-scale swine and poultry feeding operations

    USGS Publications Warehouse

    Campagnolo, E.R.; Johnson, K.R.; Karpati, A.; Rubin, C.S.; Kolpin, D.W.; Meyer, M.T.; Esteban, J. Emilio; Currier, R.W.; Smith, K.; Thu, K.M.; McGeehin, M.

    2002-01-01

    Expansion and intensification of large-scale animal feeding operations (AFOs) in the United States has resulted in concern about environmental contamination and its potential public health impacts. The objective of this investigation was to obtain background data on a broad profile of antimicrobial residues in animal wastes and surface water and groundwater proximal to large-scale swine and poultry operations. The samples were measured for antimicrobial compounds using both radioimmunoassay and liquid chromatography/electrospray ionization-mass spectrometry (LC/ESI-MS) techniques. Multiple classes of antimicrobial compounds (commonly at concentrations of >100 μg/l) were detected in swine waste storage lagoons. In addition, multiple classes of antimicrobial compounds were detected in surface and groundwater samples collected proximal to the swine and poultry farms. This information indicates that animal waste used as fertilizer for crops may serve as a source of antimicrobial residues for the environment. Further research is required to determine if the levels of antimicrobials detected in this study are of consequence to human and/or environmental ecosystems. A comparison of the radioimmunoassay and LC/ESI-MS analytical methods documented that radioimmunoassay techniques were only appropriate for measuring residues in animal waste samples likely to contain high levels of antimicrobials. More sensitive LC/ESI-MS techniques are required in environmental samples, where low levels of antimicrobial residues are more likely.

  12. Die-off rates of Cryptosporidium parvum oocysts in a swine lagoon and in a spray field

    USDA-ARS?s Scientific Manuscript database

    Background: Because of several large-scale outbreaks of cryptosporidiosis in humans, Cryptosporidium has become a public health concern. Commercial swine operations apply large volumes of effluent from lagoons to spray fields as a waste management practice. This effluent is a source of Cryptosporidi...

  13. Environmental management of small-scale and artisanal mining: the Portovelo-Zaruma goldmining area, southern Ecuador.

    PubMed

    Tarras-Wahlberg, N H

    2002-06-01

    This paper considers technical measures and policy initiatives needed to improve environmental management in the Portovelo-Zaruma mining district of southern Ecuador. In this area, gold is mined by a large number of small-scale and artisanal operators, and discharges of cyanide and metal-laden tailings have had a severe impact on the shared Ecuadorian-Peruvian Puyango river system. It is shown to be technically possible to confine mining waste and tailings at a reasonable cost. However, the complex topography of the mining district forces tailings management to be communal, where all operators are connected to one central tailings impoundment. This, in turn, implies two things: (i) that a large number of operators must agree to pool resources to bring such a facility into reality; and (ii) that miners must move away from rudimentary operations that survive on a day-to-day basis, towards bigger, mechanized and longer-term sustainable operations that are based on proven ore reserves. It is deemed unlikely that existing environmental regulations and the provision of technical solutions will be sufficient to resolve the environmental problems. Important impediments relate to the limited financial resources available to each individual miner and the problems of pooling these resources, and to the fact that the main impacts of pollution are suffered downstream of the mining district and, hence, do not affect the miners themselves. Three policy measures are therefore suggested. First, the enforcement of existing regulations must be improved, and this may be achieved by the strengthening of the central authority charged with supervision and control of mining activities. Second, local government involvement and local public participation in environmental management needs to be promoted. Third, a clear policy should be defined which promotes the reorganisation of small operations into larger units that are strong enough to sustain rational exploration and environmental obligations. The case study suggests that mining policy in lesser-developed countries should develop to enable small-scale and artisanal miners to form entities that are of a sufficiently large scale to allow adequate and cost-effective environmental protection.

  14. Investigating the dependence of SCM simulated precipitation and clouds on the spatial scale of large-scale forcing at SGP [Investigating the scale dependence of SCM simulated precipitation and cloud by using gridded forcing data at SGP

    DOE PAGES

    Tang, Shuaiqi; Zhang, Minghua; Xie, Shaocheng

    2017-08-05

    Large-scale forcing data, such as vertical velocity and advective tendencies, are required to drive single-column models (SCMs), cloud-resolving models, and large-eddy simulations. Previous studies suggest that some errors of these model simulations could be attributed to the lack of spatial variability in the specified domain-mean large-scale forcing. This study investigates the spatial variability of the forcing and explores its impact on SCM simulated precipitation and clouds. Gridded large-scale forcing data from the March 2000 Cloud Intensive Operational Period at the Atmospheric Radiation Measurement program's Southern Great Plains site are used for analysis and to drive the single-column version of the Community Atmospheric Model Version 5 (SCAM5). When the gridded forcing data show large spatial variability, such as during a frontal passage, SCAM5 with the domain-mean forcing is not able to capture the convective systems that are partly located in the domain or that only occupy part of the domain. This problem is largely reduced by using the gridded forcing data, which allow running SCAM5 on each subcolumn and then averaging the results within the domain, because the subcolumns have a better chance of capturing the timing of the frontal propagation and the small-scale systems. Other potential uses of the gridded forcing data, such as understanding and testing scale-aware parameterizations, are also discussed.
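
    A minimal sketch of the subcolumn strategy described above; run_scam5() is a hypothetical stand-in for one single-column run, and the gridded forcing is reduced to an array of per-subcolumn profiles (neither reflects the actual SCAM5/ARM interfaces):

      # Sketch only: contrast one SCM run on the domain-mean forcing with
      # per-subcolumn runs that are averaged afterwards, as described above.
      import numpy as np

      def run_scam5(forcing):
          """Placeholder for a single-column simulation driven by `forcing`."""
          return forcing * 0.9  # hypothetical model response

      def domain_mean_via_subcolumns(gridded_forcing):
          """Run the SCM on every subcolumn, then average over the domain."""
          results = [run_scam5(f) for f in gridded_forcing]
          return np.mean(results, axis=0)

      gridded_forcing = np.random.rand(25, 40)  # 25 subcolumns x 40 levels
      conventional = run_scam5(gridded_forcing.mean(axis=0))
      subcolumn_avg = domain_mean_via_subcolumns(gridded_forcing)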

  15. First results of the ITER-relevant negative ion beam test facility ELISE (invited).

    PubMed

    Fantz, U; Franzen, P; Heinemann, B; Wünderlich, D

    2014-02-01

    An important step in the European R&D roadmap towards the neutral beam heating systems of ITER is the new test facility ELISE (Extraction from a Large Ion Source Experiment) for large-scale extraction from a half-size ITER RF source. The test facility was constructed over the past several years at Max-Planck-Institut für Plasmaphysik Garching and is now operational. ELISE is gaining early experience of the performance and operation of large RF-driven negative hydrogen ion sources with plasma illumination of a source area of 1 × 0.9 m² and an extraction area of 0.1 m² using 640 apertures. First results in volume operation, i.e., without caesium seeding, are presented.

  16. A Hybrid, Large-Scale Wireless Sensor Network for Real-Time Acquisition and Tracking

    DTIC Science & Technology

    2007-06-01

    A multicolor, Quantum Well Infrared Photodetector (QWIP), step-stare, large-format Focal Plane Array (FPA) is proposed and evaluated through performance analysis.

  17. Project BALLOTS: Bibliographic Automation of Large Library Operations Using a Time-Sharing System. Progress Report (3/27/69 - 6/26/69).

    ERIC Educational Resources Information Center

    Veaner, Allen B.

    Project BALLOTS is a large-scale library automation development project of the Stanford University Libraries which has demonstrated the feasibility of conducting on-line interactive searches of complex bibliographic files, with a large number of users working simultaneously in the same or different files. This report documents the continuing…

  18. Fast Combinatorial Algorithm for the Solution of Linearly Constrained Least Squares Problems

    DOEpatents

    Van Benthem, Mark H.; Keenan, Michael R.

    2008-11-11

    A fast combinatorial algorithm can significantly reduce the computational burden when solving general equality and inequality constrained least squares problems with large numbers of observation vectors. The combinatorial algorithm provides a mathematically rigorous solution and operates at great speed by reorganizing the calculations to take advantage of the combinatorial nature of the problems to be solved. The combinatorial algorithm exploits the structure that exists in large-scale problems in order to minimize the number of arithmetic operations required to obtain a solution.
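
    A sketch of the reorganization idea: when many observation vectors share the same passive (unconstrained) variable set, one factorization can serve all of them. Names and data here are illustrative, not the patented implementation, and the active-set bookkeeping that produces the passive sets is assumed to have happened elsewhere:

      # Group right-hand sides by identical passive set so that one
      # pseudo-inverse (or QR factorization) is computed per group,
      # not per observation vector.
      import numpy as np
      from collections import defaultdict

      def grouped_least_squares(A, B, passive_sets):
          """Solve min ||A[:, P] x - b|| for each column b of B, reusing
          the factorization of A[:, P] across columns sharing P."""
          X = np.zeros((A.shape[1], B.shape[1]))
          groups = defaultdict(list)
          for j, P in enumerate(passive_sets):
              groups[tuple(sorted(P))].append(j)
          for P, cols in groups.items():
              P = list(P)
              Ap_pinv = np.linalg.pinv(A[:, P])   # one factorization per group
              X[np.ix_(P, cols)] = Ap_pinv @ B[:, cols]
          return X

      A = np.random.rand(50, 4)
      B = np.random.rand(50, 1000)                # 1000 observation vectors
      # Hypothetical passive sets, e.g. from an active-set NNLS iteration.
      passive_sets = [[0, 1, 2]] * 600 + [[1, 3]] * 400
      X = grouped_least_squares(A, B, passive_sets)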

  19. The impact of large-scale, long-term optical surveys on pulsating star research

    NASA Astrophysics Data System (ADS)

    Soszyński, Igor

    2017-09-01

    The era of large-scale photometric variability surveys began a quarter of a century ago, when three microlensing projects - EROS, MACHO, and OGLE - started their operation. These surveys initiated a revolution in the field of variable stars and in the next years they inspired many new observational projects. Large-scale optical surveys multiplied the number of variable stars known in the Universe. The huge, homogeneous and complete catalogs of pulsating stars, such as Cepheids, RR Lyrae stars, or long-period variables, offer an unprecedented opportunity to calibrate and test the accuracy of various distance indicators, to trace the three-dimensional structure of the Milky Way and other galaxies, to discover exotic types of intrinsically variable stars, or to study previously unknown features and behaviors of pulsators. We present historical and recent findings on various types of pulsating stars obtained from the optical large-scale surveys, with particular emphasis on the OGLE project which currently offers the largest photometric database among surveys for stellar variability.

  20. Economically viable large-scale hydrogen liquefaction

    NASA Astrophysics Data System (ADS)

    Cardella, U.; Decker, L.; Klein, H.

    2017-02-01

    The liquid hydrogen demand, particularly driven by clean energy applications, will rise in the near future. As industrial large scale liquefiers will play a major role within the hydrogen supply chain, production capacity will have to increase by a multiple of today’s typical sizes. The main goal is to reduce the total cost of ownership for these plants by increasing energy efficiency with innovative and simple process designs, optimized in capital expenditure. New concepts must ensure a manageable plant complexity and flexible operability. In the phase of process development and selection, a dimensioning of key equipment for large scale liquefiers, such as turbines and compressors as well as heat exchangers, must be performed iteratively to ensure technological feasibility and maturity. Further critical aspects related to hydrogen liquefaction, e.g. fluid properties, ortho-para hydrogen conversion, and coldbox configuration, must be analysed in detail. This paper provides an overview on the approach, challenges and preliminary results in the development of efficient as well as economically viable concepts for large-scale hydrogen liquefaction.

  1. Studies of Sub-Synchronous Oscillations in Large-Scale Wind Farm Integrated System

    NASA Astrophysics Data System (ADS)

    Yue, Liu; Hang, Mend

    2018-01-01

    With the rapid development and construction of large-scale wind farms and their grid-connected operation, series-compensated AC transmission is gradually becoming the main way to deliver wind power and to improve wind power availability and grid stability, but the integration of wind farms changes the SSO (sub-synchronous oscillation) damping characteristics of the synchronous generator system. Regarding this SSO problem, this paper focuses on doubly fed induction generator (DFIG) based wind farms and summarizes the SSO mechanisms in large-scale wind power integrated systems with series compensation, which can be classified into three types: sub-synchronous control interaction (SSCI), sub-synchronous torsional interaction (SSTI), and sub-synchronous resonance (SSR). Then, SSO modelling and analysis methods are categorized and compared by their applicable areas. Furthermore, this paper summarizes the suppression measures of actual SSO projects based on different control objectives. Finally, research prospects in this field are explored.

  2. Charter Operators Spell Out Barriers to "Scaling Up"

    ERIC Educational Resources Information Center

    Zehr, Mary Ann

    2011-01-01

    The pace at which the highest-performing charter-management organizations (CMOs) are "scaling up" is being determined largely by how rapidly they can develop and hire strong leaders and acquire physical space, and by the level of support they receive for growth from city or state policies, say leaders from some charter organizations…

  3. Applications of Magnetic Suspension Technology to Large Scale Facilities: Progress, Problems and Promises

    NASA Technical Reports Server (NTRS)

    Britcher, Colin P.

    1997-01-01

    This paper will briefly review previous work in wind tunnel Magnetic Suspension and Balance Systems (MSBS) and will examine the handful of systems around the world currently known to be in operational condition or undergoing recommissioning. Technical developments emerging from research programs at NASA and elsewhere will be reviewed briefly, where there is potential impact on large-scale MSBSS. The likely aerodynamic applications for large MSBSs will be addressed, since these applications should properly drive system designs. A recently proposed application to ultra-high Reynolds number testing will then be addressed in some detail. Finally, some opinions on the technical feasibility and usefulness of a large MSBS will be given.

  4. Large-scale behaviour of local and entanglement entropy of the free Fermi gas at any temperature

    NASA Astrophysics Data System (ADS)

    Leschke, Hajo; Sobolev, Alexander V.; Spitzer, Wolfgang

    2016-07-01

    The leading asymptotic large-scale behaviour of the spatially bipartite entanglement entropy (EE) of the free Fermi gas infinitely extended in multidimensional Euclidean space at zero absolute temperature, T = 0, is by now well understood. Here, we present and discuss the first rigorous results for the corresponding EE of thermal equilibrium states at T > 0. The leading large-scale term of this thermal EE turns out to be twice the first-order finite-size correction to the infinite-volume thermal entropy (density). Not surprisingly, this correction is just the thermal entropy on the interface of the bipartition. However, it is given by a rather complicated integral derived from a semiclassical trace formula for a certain operator on the underlying one-particle Hilbert space. But in the zero-temperature limit T ↓ 0, the leading large-scale term of the thermal EE considerably simplifies and displays a ln(1/T) singularity, which one may identify with the known logarithmic enhancement at T = 0 of the so-called area-law scaling.

  5. Large-scale machine learning and evaluation platform for real-time traffic surveillance

    NASA Astrophysics Data System (ADS)

    Eichel, Justin A.; Mishra, Akshaya; Miller, Nicholas; Jankovic, Nicholas; Thomas, Mohan A.; Abbott, Tyler; Swanson, Douglas; Keller, Joel

    2016-09-01

    In traffic engineering, vehicle detectors are trained on limited datasets, resulting in poor accuracy when deployed in real-world surveillance applications. Annotating large-scale, high-quality datasets is challenging. Typically, these datasets have limited diversity; they do not reflect the real-world operating environment. There is a need for a large-scale, cloud-based positive and negative mining process and a large-scale learning and evaluation system for the application of automatic traffic measurements and classification. The proposed positive and negative mining process addresses the quality of crowd-sourced ground truth data through machine learning review and human feedback mechanisms. The proposed learning and evaluation system uses a distributed cloud computing framework to handle the data-scaling issues associated with large numbers of samples and a high-dimensional feature space. The system is trained using AdaBoost on 1,000,000 Haar-like features extracted from 70,000 annotated video frames. The trained real-time vehicle detector achieves an accuracy of at least 95% half of the time, and about 78% 19/20 of the time, when tested on ~7,500,000 video frames. At the end of 2016, the dataset is expected to have over 1 billion annotated video frames.
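
    A toy rendering of the training pipeline described above, assuming hand-rolled two-rectangle Haar-like responses and scikit-learn's stock AdaBoost with decision stumps; the real system trains 1,000,000 features on 70,000 frames in a distributed cloud framework, which this sketch does not attempt:

      # Two-rectangle Haar-like features from integral images, fed to
      # AdaBoost. Patches and labels are random stand-ins for vehicle data.
      import numpy as np
      from sklearn.ensemble import AdaBoostClassifier

      def haar_two_rect(img):
          """Left-right and top-bottom two-rectangle Haar-like responses."""
          ii = img.cumsum(0).cumsum(1)              # integral image
          h, w = img.shape
          left = ii[h - 1, w // 2 - 1]
          right = ii[h - 1, w - 1] - left
          top = ii[h // 2 - 1, w - 1]
          bottom = ii[h - 1, w - 1] - top
          return [left - right, top - bottom]

      rng = np.random.default_rng(0)
      patches = rng.random((200, 24, 24))           # stand-in image patches
      labels = rng.integers(0, 2, 200)              # stand-in vehicle labels
      X = np.array([haar_two_rect(p) for p in patches])
      clf = AdaBoostClassifier(n_estimators=50).fit(X, labels)  # stumps by default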

  6. MANGO Imager Network Observations of Geomagnetic Storm Impact on Midlatitude 630 nm Airglow Emissions

    NASA Astrophysics Data System (ADS)

    Kendall, E. A.; Bhatt, A.

    2017-12-01

    The Midlatitude Allsky-imaging Network for GeoSpace Observations (MANGO) is a network of imagers filtered at 630 nm spread across the continental United States. MANGO is used to image large-scale airglow and aurora features and observes the generation, propagation, and dissipation of medium and large-scale wave activity in the subauroral, mid and low-latitude thermosphere. This network consists of seven all-sky imagers providing continuous coverage over the United States and extending south into Mexico. This network sees high levels of medium and large scale wave activity due to both neutral and geomagnetic storm forcing. The geomagnetic storm observations largely fall into two categories: Stable Auroral Red (SAR) arcs and Large-scale traveling ionospheric disturbances (LSTIDs). In addition, less-often observed effects include anomalous airglow brightening, bright swirls, and frozen-in traveling structures. We will present an analysis of multiple events observed over four years of MANGO network operation. We will provide both statistics on the cumulative observations and a case study of the "Memorial Day Storm" on May 27, 2017.

  7. Visualization of the Eastern Renewable Generation Integration Study: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gruchalla, Kenny; Novacheck, Joshua; Bloom, Aaron

    The Eastern Renewable Generation Integration Study (ERGIS) explores the operational impacts of the widespread adoption of wind and solar photovoltaic (PV) resources in the U.S. Eastern Interconnection and Quebec Interconnection (collectively, EI). In order to understand some of the economic and reliability challenges of managing hundreds of gigawatts of wind and PV generation, we developed state-of-the-art tools, data, and models for simulating power system operations using hourly unit commitment and 5-minute economic dispatch over an entire year. Using NREL's high-performance computing capabilities and new methodologies to model operations, we found that the EI, as simulated with evolutionary change in 2026, could balance the variability and uncertainty of wind and PV at a 5-minute level under a variety of conditions. A large-scale display and a combination of multiple coordinated views and small multiples were used to visually analyze the four large, highly multivariate scenarios with high spatial and temporal resolutions.
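
    A thumbnail of the two-stage scheduling the study simulates: hourly unit commitment fixes which units are on, and a 5-minute economic dispatch then stacks the committed units in merit order against net load (load minus wind and PV). The greedy dispatch and numbers below are illustrative only:

      # Merit-order dispatch of committed units for two 5-minute intervals.
      def economic_dispatch(net_load_mw, units):
          """Greedily dispatch committed units in ascending cost order."""
          dispatch, remaining = {}, net_load_mw
          for name, capacity, cost in sorted(units, key=lambda u: u[2]):
              take = min(capacity, max(remaining, 0.0))
              dispatch[name] = take
              remaining -= take
          return dispatch, remaining   # remaining > 0 means unserved energy

      committed = [("coal", 400, 22.0), ("ccgt", 300, 35.0), ("ct", 150, 80.0)]
      for net_load in (520.0, 900.0):   # MW; second interval exceeds capacity
          print(economic_dispatch(net_load, committed))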

  8. Designing, Evaluating, and Deploying Automated Scoring Systems with Validity in Mind: Methodological Design Decisions

    ERIC Educational Resources Information Center

    Rupp, André A.

    2018-01-01

    This article discusses critical methodological design decisions for collecting, interpreting, and synthesizing empirical evidence during the design, deployment, and operational quality-control phases for automated scoring systems. The discussion is inspired by work on operational large-scale systems for automated essay scoring but many of the…

  9. Process and Learning Outcomes from Remotely-Operated, Simulated, and Hands-on Student Laboratories

    ERIC Educational Resources Information Center

    Corter, James E.; Esche, Sven K.; Chassapis, Constantin; Ma, Jing; Nickerson, Jeffrey V.

    2011-01-01

    A large-scale, multi-year, randomized study compared learning activities and outcomes for hands-on, remotely-operated, and simulation-based educational laboratories in an undergraduate engineering course. Students (N = 458) worked in small-group lab teams to perform two experiments involving stress on a cantilever beam. Each team conducted the…

  10. Extreme Makeover: IT Edition

    ERIC Educational Resources Information Center

    Schaffhauser, Dian

    2011-01-01

    Air-traffic controller might be a relaxing second career for anyone who's coordinated IT operations at a large research university. Just ask administrators at the University of Michigan in Ann Arbor. As at most big universities, IT operations on the academic side are decentralized on a major scale. When a faculty member in one of Michigan's 19…

  11. Prediction of Large Vessel Occlusions in Acute Stroke: National Institute of Health Stroke Scale Is Hard to Beat.

    PubMed

    Vanacker, Peter; Heldner, Mirjam R; Amiguet, Michael; Faouzi, Mohamed; Cras, Patrick; Ntaios, George; Arnold, Marcel; Mattle, Heinrich P; Gralla, Jan; Fischer, Urs; Michel, Patrik

    2016-06-01

    Endovascular treatment for acute ischemic stroke with a large vessel occlusion was recently shown to be effective. We aimed to develop a score capable of predicting large vessel occlusions eligible for endovascular treatment during early hospital management. Design: retrospective cohort study. Setting: two tertiary Swiss stroke centers. Patients: consecutive acute ischemic stroke patients (1,645 patients; Acute STroke Registry and Analysis of Lausanne registry) who had CT angiography within 6 and 12 hours of symptom onset, categorized according to the occlusion site. Demographic and clinical information was used in logistic regression analysis to derive predictors of large vessel occlusion (defined as intracranial carotid, basilar, and M1 segment of middle cerebral artery occlusions). Based on logistic regression coefficients, an integer score was created and validated internally and externally (848 patients; Bernese Stroke Registry). Interventions: none. Large vessel occlusions were present in 316 patients (21%) in the derivation cohort and 566 (28%) in the external validation cohort. Five predictors added significantly to the score: National Institute of Health Stroke Scale at admission, hemineglect, female sex, atrial fibrillation, and no history of stroke or prestroke handicap (modified Rankin Scale score < 2). Diagnostic accuracy in the internal and external validation cohorts was excellent (area under the receiver operating characteristic curve, 0.84 in both). The score performed slightly better than the National Institute of Health Stroke Scale alone regarding prediction error (Wilcoxon signed rank test, p < 0.001) and regarding discriminatory power in the derivation and pooled cohorts (area under the receiver operating characteristic curve, 0.81 vs 0.80; DeLong test, p = 0.02). Our score accurately predicts the presence of emergent large vessel occlusions, which are eligible for endovascular treatment. However, incorporation of additional demographic and historical information available on hospital arrival provides minimal incremental predictive value compared with the National Institute of Health Stroke Scale alone.
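
    A hedged sketch of the score-construction step reported above: fit a logistic model, then scale and round its coefficients to integer point values. The predictor list follows the abstract, but the data below are synthetic, so the resulting points are meaningless except as a demonstration:

      # Derive integer point values from logistic regression coefficients.
      import numpy as np
      from sklearn.linear_model import LogisticRegression

      rng = np.random.default_rng(1)
      n = 1000
      X = np.column_stack([
          rng.integers(0, 25, n),   # NIHSS at admission
          rng.integers(0, 2, n),    # hemineglect
          rng.integers(0, 2, n),    # female sex
          rng.integers(0, 2, n),    # atrial fibrillation
          rng.integers(0, 2, n),    # no prior stroke/handicap (mRS < 2)
      ])
      y = rng.integers(0, 2, n)     # large vessel occlusion (synthetic)

      model = LogisticRegression(max_iter=1000).fit(X, y)
      # Scale by the smallest coefficient magnitude, then round to integers.
      scale = max(np.abs(model.coef_[0]).min(), 1e-6)
      points = np.round(model.coef_[0] / scale).astype(int)
      risk_score = X @ points       # one integer score per patient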

  12. Volunteers Oriented Interface Design for the Remote Navigation of Rescue Robots at Large-Scale Disaster Sites

    NASA Astrophysics Data System (ADS)

    Yang, Zhixiao; Ito, Kazuyuki; Saijo, Kazuhiko; Hirotsune, Kazuyuki; Gofuku, Akio; Matsuno, Fumitoshi

    This paper aims at constructing an efficient interface, similar to those widely used in daily life, to meet the needs of the many volunteer rescuers operating rescue robots at large-scale disaster sites. The developed system includes a force-feedback steering wheel interface and an artificial neural network (ANN) based mouse-screen interface. The former consists of a force-feedback steering control and a wall of six monitors; it provides manual operation, like driving a car, to navigate a rescue robot. The latter consists of a mouse and a camera view displayed on a monitor; it provides semi-autonomous operation, navigating the rescue robot by mouse clicks. Experimental results show that a novice volunteer can skillfully navigate a tank rescue robot through either interface after 20 to 30 minutes of learning its operation. The steering wheel interface offers high navigation speed in open areas, without restriction by the terrain and surface conditions of a disaster site. The mouse-screen interface is good at precise navigation in complex structures, while placing little strain on operators. The two interfaces can be switched into each other at any time to provide a combined, efficient navigation method.

  13. Large-scale Advanced Prop-fan (LAP) technology assessment report

    NASA Technical Reports Server (NTRS)

    Degeorge, C. L.

    1988-01-01

    The technologically significant findings and accomplishments of the Large Scale Advanced Prop-Fan (LAP) program in the areas of aerodynamics, aeroelasticity, acoustics, and materials and fabrication are described. The extent to which the program goals related to these disciplines were achieved is discussed, and recommendations for additional research are presented. The LAP program consisted of the design, manufacture, and testing of a near full-scale Prop-Fan, or advanced turboprop, capable of operating efficiently at speeds up to Mach 0.8. An aeroelastically scaled model of the LAP was also designed and fabricated. The goal of the program was to acquire data on Prop-Fan performance that would indicate the technology readiness of Prop-Fans for practical applications in commercial and military aviation.

  14. Large scale, synchronous variability of marine fish populations driven by commercial exploitation.

    PubMed

    Frank, Kenneth T; Petrie, Brian; Leggett, William C; Boyce, Daniel G

    2016-07-19

    Synchronous variations in the abundance of geographically distinct marine fish populations are known to occur across spatial scales on the order of 1,000 km and greater. The prevailing assumption is that this large-scale coherent variability is a response to coupled atmosphere-ocean dynamics, commonly represented by climate indexes, such as the Atlantic Multidecadal Oscillation and North Atlantic Oscillation. On the other hand, it has been suggested that exploitation might contribute to this coherent variability. This possibility has been generally ignored or dismissed on the grounds that exploitation is unlikely to operate synchronously at such large spatial scales. Our analysis of adult fishing mortality and spawning stock biomass of 22 North Atlantic cod (Gadus morhua) stocks revealed that both the temporal and spatial scales in fishing mortality and spawning stock biomass were equivalent to those of the climate drivers. From these results, we conclude that greater consideration must be given to the potential of exploitation as a driving force behind broad, coherent variability of heavily exploited fish species.
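
    A sketch of the kind of synchrony screening summarized above, under the assumption that pairwise correlations of stock time series are compared across separation distances; the series and distances below are synthetic stand-ins, not the cod stock data:

      # Pairwise correlation of stock time series vs. distance between stocks.
      import numpy as np

      rng = np.random.default_rng(2)
      years, n_stocks = 40, 22
      common_driver = rng.standard_normal(years)        # shared forcing
      biomass = common_driver + 0.8 * rng.standard_normal((n_stocks, years))
      distance_km = rng.uniform(100, 4000, (n_stocks, n_stocks))

      pairs = [(np.corrcoef(biomass[i], biomass[j])[0, 1], distance_km[i, j])
               for i in range(n_stocks) for j in range(i + 1, n_stocks)]
      corr, dist = map(np.array, zip(*pairs))
      # Crude synchrony scale: compare mean correlation near vs. far.
      print(corr[dist < 1000].mean(), corr[dist >= 1000].mean())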

  15. Entanglement renormalization, quantum error correction, and bulk causality

    NASA Astrophysics Data System (ADS)

    Kim, Isaac H.; Kastoryano, Michael J.

    2017-04-01

    Entanglement renormalization can be viewed as an encoding circuit for a family of approximate quantum error correcting codes. The logical information becomes progressively more well-protected against erasure errors at larger length scales. In particular, an approximate variant of a holographic quantum error correcting code emerges at low energy for critical systems. This implies that two operators that are largely separated in scales behave as if they are spatially separated operators, in the sense that they obey a Lieb-Robinson type locality bound under a time evolution generated by a local Hamiltonian.
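
    For reference, a generic Lieb-Robinson-type bound of the kind invoked above; this is the standard textbook form, with constants c, v, ξ assumed for illustration, while the paper states its bound in terms of separation in scale rather than in space:

      % Generic Lieb-Robinson bound for local operators A, B evolving under
      % a local Hamiltonian; d(A,B) is their separation (here, in scale).
      \big\| [A(t),\, B] \big\|
        \;\le\; c\,\|A\|\,\|B\|\;
        e^{-\left(d(A,B)\,-\,v\,|t|\right)/\xi} .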

  16. Large increase in fracture resistance of stishovite with crack extension less than one micrometer

    PubMed Central

    Yoshida, Kimiko; Wakai, Fumihiro; Nishiyama, Norimasa; Sekine, Risako; Shinoda, Yutaka; Akatsu, Takashi; Nagoshi, Takashi; Sone, Masato

    2015-01-01

    The development of strong, tough, and damage-tolerant ceramics requires nano/microstructure design to utilize toughening mechanisms operating at different length scales. The toughening mechanisms known so far are effective at the micro-scale and thus require a crack extension of more than a few micrometers to increase the fracture resistance. Here, we developed a micro-mechanical test method using micro-cantilever beam specimens to determine the very early part of the resistance curve of nanocrystalline SiO2 stishovite, which exhibited fracture-induced amorphization. We revealed that this novel toughening mechanism was effective even at the nanometer length scale, owing to a narrow transformation zone width of a few tens of nanometers and a large dilatational strain (from 60 to 95%) associated with the transition from the crystalline to the amorphous state. This testing method will be a powerful tool in the search for toughening mechanisms that may operate at the nanoscale for attaining both reliability and strength of structural materials. PMID:26051871

  17. Nanomanufacturing : nano-structured materials made layer-by-layer.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cox, James V.; Cheng, Shengfeng; Grest, Gary Stephen

    Large-scale, high-throughput production of nano-structured materials (i.e., nanomanufacturing) is a strategic area in manufacturing, with markets projected to exceed $1T by 2015. Nanomanufacturing is still in its infancy; process/product developments are costly and only touch on the potential opportunities enabled by growing nanoscience discoveries. The greatest promise for high-volume manufacturing lies in age-old coating and imprinting operations. For materials with tailored nm-scale structure, imprinting/embossing must be achieved at high speeds (roll-to-roll) and/or over large areas (batch operation) with feature sizes less than 100 nm. Dispersion coatings with nanoparticles can also tailor structure through self- or directed-assembly. Layered films structured with these processes have tremendous potential for efficient manufacturing of microelectronics, photovoltaics and other topical nano-structured devices. This project is designed to perform the requisite R&D to bring Sandia's technology base in computational mechanics to bear on this scale-up problem. Project focus is enforced by addressing a promising imprinting process currently being commercialized.

  18. Quantum information processing with long-wavelength radiation

    NASA Astrophysics Data System (ADS)

    Murgia, David; Weidt, Sebastian; Randall, Joseph; Lekitsch, Bjoern; Webster, Simon; Navickas, Tomas; Grounds, Anton; Rodriguez, Andrea; Webb, Anna; Standing, Eamon; Pearce, Stuart; Sari, Ibrahim; Kiang, Kian; Rattanasonti, Hwanjit; Kraft, Michael; Hensinger, Winfried

    To this point, the entanglement of ions has predominantly been performed using lasers. Using long wavelength radiation with static magnetic field gradients provides an architecture to simplify construction of a large scale quantum computer. The use of microwave-dressed states protects against decoherence from fluctuating magnetic fields, with radio-frequency fields used for qubit manipulation. I will report the realisation of spin-motion entanglement using long-wavelength radiation, and a new method to efficiently prepare dressed-state qubits and qutrits, reducing experimental complexity of gate operations. I will also report demonstration of ground state cooling using long wavelength radiation, which may increase two-qubit entanglement fidelity. I will then report demonstration of a high-fidelity long-wavelength two-ion quantum gate using dressed states. Combining these results with microfabricated ion traps allows for scaling towards a large scale ion trap quantum computer, and provides a platform for quantum simulations of fundamental physics. I will report progress towards the operation of microchip ion traps with extremely high magnetic field gradients for multi-ion quantum gates.

  19. Energy: the microfluidic frontier.

    PubMed

    Sinton, David

    2014-09-07

    Global energy is largely a fluids problem. It is also large-scale, in stark contrast to microchannels. Microfluidic energy technologies must offer either massive scalability or direct relevance to energy processes already operating at scale. We have to pick our fights. Highlighted here are the exceptional opportunities I see, including some recent successes and areas where much more attention is needed. The most promising directions are those that leverage high surface-to-volume ratios, rapid diffusive transport, capacity for high temperature and high pressure experiments, and length scales characteristic of microbes and fluids (hydrocarbons, CO2) underground. The most immediate areas of application are where information is the product; either fluid sample analysis (e.g. oil analysis); or informing operations (e.g. CO2 transport in microporous media). I'll close with aspects that differentiate energy from traditional microfluidics applications, the uniquely important role of engineering in energy, and some thoughts for the research community forming at the nexus of lab-on-a-chip and energy--a microfluidic frontier.

  20. Coastal ocean forecasting with an unstructured grid model in the southern Adriatic and northern Ionian seas

    NASA Astrophysics Data System (ADS)

    Federico, Ivan; Pinardi, Nadia; Coppini, Giovanni; Oddo, Paolo; Lecci, Rita; Mossa, Michele

    2017-01-01

    SANIFS (Southern Adriatic Northern Ionian coastal Forecasting System) is a coastal-ocean operational system based on the unstructured-grid finite-element three-dimensional hydrodynamic SHYFEM model, providing short-term forecasts. The operational chain is based on a downscaling approach starting from the large-scale system for the entire Mediterranean Basin (MFS, Mediterranean Forecasting System), which provides initial and boundary condition fields to the nested system. The model is configured to provide hydrodynamics and active tracer forecasts both in the open ocean and in the coastal waters of southeastern Italy, using a variable horizontal resolution from the open sea (3-4 km) to coastal areas (50-500 m). Given that the coastal fields are driven by a combination of both local (also known as coastal) and deep-ocean forcings propagating along the shelf, the performance of SANIFS was verified both in forecast and simulation mode, first (i) on the large and shelf-coastal scales by comparison with a large-scale CTD (conductivity-temperature-depth) survey in the Gulf of Taranto and then (ii) on the coastal-harbour scale (Mar Grande of Taranto) by comparison with CTD, ADCP (acoustic Doppler current profiler) and tide gauge data. Sensitivity tests were performed on initialization conditions (mainly focused on spin-up procedures) and on surface boundary conditions by assessing the reliability of two alternative datasets at different horizontal resolution (12.5 and 6.5 km). The SANIFS forecasts at a lead time of 1 day were compared with the MFS forecasts, highlighting that SANIFS is able to retain the large-scale dynamics of MFS. The large-scale dynamics of MFS are correctly propagated to the shelf-coastal scale, improving the forecast accuracy (+17 % for temperature and +6 % for salinity compared to MFS). Moreover, the added value of SANIFS was assessed on the coastal-harbour scale, which is not covered by the coarse resolution of MFS, where the fields forecasted by SANIFS reproduced the observations well (temperature RMSE equal to 0.11 °C). Furthermore, SANIFS simulations were compared with hourly time series of temperature, sea level and velocity measured on the coastal-harbour scale, showing a good agreement. Simulations in the Gulf of Taranto described a circulation mainly characterized by an anticyclonic gyre with the presence of cyclonic vortexes in shelf-coastal areas. A surface water inflow from the open sea to Mar Grande characterizes the coastal-harbour scale.

  1. Technoeconomic analysis of large scale production of pre-emergent Pseudomonas fluorescens microbial bioherbicide in Canada.

    PubMed

    Mupondwa, Edmund; Li, Xue; Boyetchko, Susan; Hynes, Russell; Geissler, Jon

    2015-01-01

    The study presents an ex ante technoeconomic analysis of commercial production of Pseudomonas fluorescens BRG100 bioherbicide in Canada. An engineering economic model is designed in SuperPro Designer® to investigate capital investment scaling and profitability. Total capital investment for a stand-alone BRG100 fermentation plant at baseline capacity (two 33,000 L fermenters; 3602 tonnes annum^-1) is $17.55 million. Total annual operating cost is $14.76 million. Raw materials account for 50% of operating cost. The fermentation plant is profitable over a wide operating scale, evaluated over a range of BRG100 prices and costs of capital. Smaller plants require higher NPV breakeven prices. However, larger plants are more sensitive to changes in the cost of capital. Unit production costs decrease as plant capacity increases, indicating scale economies. A plant operating for less than one year approaches positive NPV for operating periods as low as 2 months. These findings can support bioherbicide R&D investment and commercialization strategies.
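
    A back-of-envelope version of the profitability screening described above, assuming a flat annual cash flow and bisection on price for the NPV breakeven; all figures are invented placeholders, not outputs of the study's SuperPro Designer® model:

      # NPV and breakeven-price screening across plant sizes and discount rates.
      def npv(price, tonnes_per_yr, opex, capex, rate, years=15):
          cash = price * tonnes_per_yr - opex       # annual net cash flow
          return sum(cash / (1 + rate) ** t for t in range(1, years + 1)) - capex

      def breakeven_price(tonnes_per_yr, opex, capex, rate, lo=0.0, hi=1e5):
          for _ in range(60):                       # bisection on NPV(price) = 0
              mid = 0.5 * (lo + hi)
              if npv(mid, tonnes_per_yr, opex, capex, rate) < 0:
                  lo = mid
              else:
                  hi = mid
          return 0.5 * (lo + hi)

      # Baseline-like capacity vs. a smaller plant, at two costs of capital:
      for capacity, capex, opex in ((3602, 17.55e6, 14.76e6), (1200, 9e6, 6e6)):
          for rate in (0.07, 0.12):
              print(capacity, rate,
                    round(breakeven_price(capacity, opex, capex, rate), 2))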

  2. An Assessment of Water Demand and Availability to meet Construction and Operational Needs for Large Utility-Scale Solar Projects in the Southwestern United States

    NASA Astrophysics Data System (ADS)

    Klise, G. T.; Tidwell, V. C.; Macknick, J.; Reno, M. D.; Moreland, B. D.; Zemlick, K. M.

    2013-12-01

    In the Southwestern United States, there are many large utility-scale solar photovoltaic (PV) and concentrating solar power (CSP) facilities currently in operation, with even more under construction and planned for future development. These are locations with high solar insolation and access to large metropolitan areas and existing grid infrastructure. The Bureau of Land Management, under a reasonably foreseeable development scenario, projects a total of almost 32 GW of installed utility-scale solar project capacity in the Southwest by 2030. To determine the potential impacts to water resources and the potential limitations water resources may have on development, we utilized methods outlined by the Bureau of Land Management (BLM) to determine potential water use in designated solar energy zones (SEZs) for construction and operations & maintenance (O&M), which is then evaluated according to water availability in six Southwestern states. Our results indicate that PV facilities overall use less water, however water for construction is high compared to lifetime operational water needs. There is a transition underway from wet cooled to dry cooled CSP facilities and larger PV facilities due to water use concerns, though some water is still necessary for construction, operations, and maintenance. Overall, ten watersheds, 9 in California, and one in New Mexico were identified as being of particular concern because of limited water availability. Understanding the location of potentially available water sources can help the solar industry determine locations that minimize impacts to existing water resources, and help understand potential costs when utilizing non-potable water sources or purchasing existing appropriated water. Sandia National Laboratories is a multi-program laboratory managed and operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. Department of Energy's National Nuclear Security Administration under contract DE-AC04-94AL85000.

  3. Development and analysis of prognostic equations for mesoscale kinetic energy and mesoscale (subgrid scale) fluxes for large-scale atmospheric models

    NASA Technical Reports Server (NTRS)

    Avissar, Roni; Chen, Fei

    1993-01-01

    Mesoscale circulation processes generated by landscape discontinuities (e.g., sea breezes) are not represented in large-scale atmospheric models (e.g., general circulation models), which have an inappropriate grid-scale resolution. With the assumption that atmospheric variables can be separated into large-scale, mesoscale, and turbulent-scale components, a set of prognostic equations applicable in large-scale atmospheric models for momentum, temperature, moisture, and any other gaseous or aerosol material, which includes both mesoscale and turbulent fluxes, is developed. Prognostic equations are also developed for these mesoscale fluxes, which indicate a closure problem and, therefore, require a parameterization. For this purpose, the mean mesoscale kinetic energy (MKE) per unit mass is used, defined as Ẽ = 0.5⟨u′_i²⟩, where u′_i represents the three Cartesian components of a mesoscale circulation, the angle brackets denote the grid-scale horizontal averaging operator in the large-scale model, and a tilde indicates a corresponding large-scale mean value. A prognostic equation is developed for Ẽ, and an analysis of the different terms of this equation indicates that the mesoscale vertical heat flux, the mesoscale pressure correlation, and the interaction between turbulence and mesoscale perturbations are the major terms that affect the time tendency of Ẽ. A state-of-the-art mesoscale atmospheric model is used to investigate the relationship between MKE, landscape discontinuities (as characterized by the spatial distribution of heat fluxes at the earth's surface), and mesoscale sensible and latent heat fluxes in the atmosphere. MKE is compared with turbulence kinetic energy to illustrate the importance of mesoscale processes as compared to turbulent processes. This analysis emphasizes the potential use of MKE to bridge between landscape discontinuities and mesoscale fluxes and, therefore, to parameterize mesoscale fluxes generated by such subgrid-scale landscape discontinuities in large-scale atmospheric models.
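
    A compact restatement of the decomposition and the MKE definition above, in LaTeX form; the three-way splitting into large-scale, mesoscale, and turbulent parts is written here with primes and double primes as an assumed notation, since the abstract only names the scales:

      % Scale separation assumed in the abstract (notation is illustrative):
      %   tilde = large-scale mean, prime = mesoscale, double prime = turbulence.
      \phi \;=\; \tilde{\phi} \;+\; \phi' \;+\; \phi'' ,
      \qquad
      \tilde{E} \;=\; \tfrac{1}{2}\,\big\langle u_i'\,u_i' \big\rangle ,
      \quad i = 1,2,3 \ \text{(summation implied)} .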

  4. Large Scale Landslide Database System Established for the Reservoirs in Southern Taiwan

    NASA Astrophysics Data System (ADS)

    Tsai, Tsai-Tsung; Tsai, Kuang-Jung; Shieh, Chjeng-Lun

    2017-04-01

    Typhoon Morakot's severe attack on southern Taiwan awakened public awareness of large-scale landslide disasters. Large-scale landslide disasters produce large quantities of sediment, which negatively affect the operating functions of reservoirs. In order to reduce the risk of these disasters within the study area, the establishment of a database for hazard mitigation and disaster prevention is necessary. Real-time data and numerous archives of engineering data, environmental information, photos, and videos will not only help people make appropriate decisions, but also raise the major challenge of processing and adding value to them. The study tried to define some basic data formats/standards from the various types of collected data about these reservoirs and then provide a management platform based on these formats/standards. Meanwhile, in order to satisfy practicality and convenience, the large-scale landslide disaster database system is built with both information-providing and information-receiving abilities, so that users can access it on different types of devices. IT technology progresses extremely quickly, and the most modern system might be out of date at any time. In order to provide long-term service, the system reserves the possibility of user-defined data formats/standards and user-defined system structure. The system established by this study was based on the HTML5 standard language and uses responsive web design technology. This makes it easy for users to handle and develop this large-scale landslide disaster database system.

  5. Validation of self-reported figural drawing scales against anthropometric measurements in adults.

    PubMed

    Dratva, Julia; Bertelsen, Randi; Janson, Christer; Johannessen, Ane; Benediktsdóttir, Bryndis; Bråbäck, Lennart; Dharmage, Shyamali C; Forsberg, Bertil; Gislason, Thorarinn; Jarvis, Debbie; Jogi, Rain; Lindberg, Eva; Norback, Dan; Omenaas, Ernst; Skorge, Trude D; Sigsgaard, Torben; Toren, Kjell; Waatevik, Marie; Wieslander, Gundula; Schlünssen, Vivi; Svanes, Cecilie; Real, Francisco Gomez

    2016-08-01

    The aim of the present study was to validate figural drawing scales depicting extremely lean to extremely obese subjects to obtain proxies for BMI and waist circumference in postal surveys. Reported figural scales and anthropometric data from a large population-based postal survey were validated against measured anthropometric data from the same individuals by means of receiver operating characteristic (ROC) curves and a BMI prediction model. Participants were adults in a Scandinavian cohort study first recruited in 1990 and followed up twice since: individuals aged 38-66 years with complete data for BMI (n = 1580) and waist circumference (n = 1017). Median BMI and waist circumference increased exponentially with increasing figural scales. ROC curve analyses showed a high predictive ability to identify individuals with BMI > 25.0 kg/m² in both sexes. The optimal figural scales for identifying overweight or obese individuals with a correct detection rate were 4 and 5 in women, and 5 and 6 in men, respectively. The prediction model explained 74% of the variance among women and 62% among men. Predicted BMI differed only marginally from objectively measured BMI. Figural drawing scales explained a large part of the anthropometric variance in this population and showed a high predictive ability for identifying overweight/obese subjects. These figural scales can be used with confidence as proxies of BMI and waist circumference in settings where objective measures are not feasible.
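
    A sketch of the ROC-based cutoff selection reported above, using Youden's J to pick the figural scale that best separates measured overweight (BMI > 25.0 kg/m²) from non-overweight; the data are synthetic stand-ins for the cohort measurements:

      # Pick the figural-scale cutoff maximizing sensitivity + specificity - 1.
      import numpy as np
      from sklearn.metrics import roc_curve

      rng = np.random.default_rng(3)
      bmi = rng.normal(26, 4, 1580)
      overweight = (bmi > 25.0).astype(int)
      # Figural scale 1-9, loosely increasing with BMI plus reporting noise:
      figural = np.clip(np.round((bmi - 14) / 3 + rng.normal(0, 1, bmi.size)),
                        1, 9)

      fpr, tpr, thresholds = roc_curve(overweight, figural)
      optimal = thresholds[np.argmax(tpr - fpr)]    # Youden's J statistic
      print("optimal figural cutoff:", optimal)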

  6. Site Environmental Report for Calendar Year 1999. DOE Operations at The Boeing Company, Rocketdyne

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    2000-09-01

    This Annual Site Environmental Report (ASER) for 1999 describes the environmental conditions related to work performed for the Department of Energy (DOE) at Area IV of the Rocketdyne Santa Susana Field Laboratory (SSFL). In the past, these operations included development, fabrication, and disassembly of nuclear reactors, reactor fuel, and other radioactive materials under the former Atomics International Division. Other activities included the operation of large-scale liquid metal facilities for testing of liquid metal fast breeder components at the Energy Technology Engineering Center (ETEC), a government-owned, company-operated test facility within Area IV. All nuclear work was terminated in 1988, and subsequently, all radiological work has been directed toward decontamination and decommissioning (D&D) of the previously used nuclear facilities and associated site areas. Large-scale D&D activities of the sodium test facilities began in 1996. This report provides information showing that there are no indications of any potential impact on public health and safety due to the operations conducted at the SSFL. All measures and calculations of off-site conditions demonstrate compliance with applicable regulations, which provide for protection of human health and the environment.

  7. The operations manual: a mechanism for improving the research process.

    PubMed

    Bowman, Ann; Wyman, Jean F; Peters, Jennifer

    2002-01-01

    The development and use of an operations manual has the potential to improve the capacity of nurse scientists to address the complex, multifaceted issues associated with conducting research in today's healthcare environment. An operations manual facilitates communication, standardizes training and evaluation, and enhances the development and standard implementation of clear policies, processes, and protocols. A 10-year review of methodology articles in relevant nursing journals revealed no attention to this topic. This article will discuss how an operations manual can improve the conduct of research methods and outcomes for both small-scale and large-scale research studies. It also describes the purpose and components of a prototype operations manual for use in quantitative research. The operations manual increases reliability and reproducibility of the research while improving the management of study processes. It can prevent costly and untimely delays or errors in the conduct of research.

  8. In-situ device integration of large-area patterned organic nanowire arrays for high-performance optical sensors

    PubMed Central

    Wu, Yiming; Zhang, Xiujuan; Pan, Huanhuan; Deng, Wei; Zhang, Xiaohong; Zhang, Xiwei; Jie, Jiansheng

    2013-01-01

    Single-crystalline organic nanowires (NWs) are important building blocks for future low-cost and efficient nano-optoelectronic devices due to their extraordinary properties. However, it remains a critical challenge to achieve large-scale organic NW array assembly and device integration. Herein, we demonstrate a feasible one-step method for large-area patterned growth of cross-aligned single-crystalline organic NW arrays and their in-situ device integration for optical image sensors. The integrated image sensor circuitry contained a 10 × 10 pixel array in an area of 1.3 × 1.3 mm2, showing high spatial resolution, excellent stability and reproducibility. More importantly, 100% of the pixels successfully operated at a high response speed and with relatively small pixel-to-pixel variation. The high yield and high spatial resolution of the operational pixels, along with the high integration level of the device, clearly demonstrate the great potential of the one-step organic NW array growth and device construction approach for large-scale optoelectronic device integration. PMID:24287887

  9. Are large-scale manipulations of streamflow for ecological outcomes effective either as experiments or management actions? (Invited)

    NASA Astrophysics Data System (ADS)

    Konrad, C. P.; Olden, J.

    2013-12-01

    Dams impose a host of impacts on freshwater and estuary ecosystems. In recent decades, dam releases for ecological outcomes have been increasingly implemented to mitigate these impacts and are gaining global scope. Many are designed and conducted using an experimental framework. A recent review of large-scale flow experiments (FEs) evaluates their effectiveness and identifies ways to enhance their scientific and management value. At least 113 large-scale flow experiments affecting 98 river systems globally have been documented over the last 50 years. These experiments span a range of flow manipulations, from single pulse events to comprehensive changes in flow regime across all seasons and different water year types. Clear articulation of experimental objectives, while not universally practiced, was crucial for achieving management outcomes and changing dam operating policies. We found a strong disparity between the recognized ecological importance of multifaceted flow regimes and the discrete flow events that characterized 80% of FEs. Over three quarters of FEs documented both abiotic and biotic outcomes, but only one third examined multiple trophic groups, thus limiting how this information informs future dam management. Large-scale flow experiments represent a unique opportunity for integrated biophysical investigations for advancing ecosystem science. Nonetheless, they must remain responsive to site-specific issues regarding water management, evolving societal values, and changing environmental conditions and, in particular, can characterize the incremental benefits from, and necessary conditions for, changing dam operations to improve ecological outcomes. This type of information is essential for understanding the full context of value-based trade-offs in benefits and costs from different dam operations, which can serve as an empirical basis for societal decisions regarding water and ecosystem management. FEs may be the best approach available to managers for resolving critical uncertainties that impede decision making in adaptive settings, for example, when we lack sufficient understanding to model biophysical responses to alternative operations. Integrated long-term monitoring of biotic and abiotic responses and clearly defined management-based objectives highlight ways of improving the efficiency and value of FEs.

  10. Power grid operation risk management: V2G deployment for sustainable development

    NASA Astrophysics Data System (ADS)

    Haddadian, Ghazale J.

    The production, transmission, and delivery of cost-efficient energy to supply ever-increasing peak loads, along with a quest for developing a low-carbon economy, require significant evolutions in power grid operations. Lower prices of vast natural gas resources in the United States, the Fukushima nuclear disaster, higher and more intense energy consumption in China and India, issues related to energy security, and recent Middle East conflicts have urged decision makers throughout the world to look into other means of generating electricity locally. As the world looks to combat climate change, a shift from carbon-based fuels to non-carbon-based fuels is inevitable. However, the variability of distributed generation assets in the electricity grid has introduced major reliability challenges for power grid operators. While spearheading sustainable and reliable power grid operations, this dissertation develops a multi-stakeholder approach to power grid operation design, aiming to address the economic, security, and environmental challenges of constrained electricity generation. It investigates the role of electric vehicle (EV) fleet integration, as distributed and mobile storage assets, in supporting high penetrations of renewable energy sources in the power grid. The vehicle-to-grid (V2G) concept is considered to demonstrate the bidirectional role of EV fleets, both as providers and consumers of energy, in securing a sustainable power grid operation. The proposed optimization modeling is the application of Mixed-Integer Linear Programming (MILP) to large-scale systems to solve the hourly security-constrained unit commitment (SCUC), an optimal scheduling concept in the economic operation of electric power systems. A Monte Carlo scenario-based approach is utilized to evaluate different scenarios concerning the uncertainties in the operation of the power grid system. Further, in order to expedite the real-time solution of the proposed approach for large-scale power systems, a two-stage model using Benders Decomposition (BD) is considered. The numerical simulations demonstrate that the utilization of smart EV fleets in power grid systems would ensure a sustainable grid operation with lower carbon footprints, smoother integration of renewable sources, higher security, and lower power grid operation costs. The results additionally illustrate the effectiveness of the proposed MILP approach and its potential as an optimization tool for sustainable operation of large-scale electric power systems.
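
    A toy MILP in the spirit of the hourly SCUC formulation described above, using PuLP for illustration; the dissertation's model is far larger and adds V2G storage, Monte Carlo scenarios, and Benders decomposition, none of which appear in this sketch:

      # Minimal unit commitment: binary on/off plus dispatch over 4 hours.
      import pulp

      T = 4                                           # hours
      load = [300, 450, 500, 350]                     # MW
      units = {"coal": (100, 400, 22), "gas": (50, 300, 35)}  # (pmin, pmax, $/MWh)

      prob = pulp.LpProblem("toy_uc", pulp.LpMinimize)
      u = pulp.LpVariable.dicts("on", (units, range(T)), cat="Binary")
      p = pulp.LpVariable.dicts("p", (units, range(T)), lowBound=0)

      # Objective: total generation cost.
      prob += pulp.lpSum(units[g][2] * p[g][t] for g in units for t in range(T))
      for t in range(T):
          prob += pulp.lpSum(p[g][t] for g in units) == load[t]     # balance
          for g, (pmin, pmax, _) in units.items():
              prob += p[g][t] <= pmax * u[g][t]                     # capacity
              prob += p[g][t] >= pmin * u[g][t]                     # min output

      prob.solve(pulp.PULP_CBC_CMD(msg=False))
      print({(g, t): p[g][t].value() for g in units for t in range(T)})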

  11. A GLOBAL GALACTIC DYNAMO WITH A CORONA CONSTRAINED BY RELATIVE HELICITY

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Prasad, A.; Mangalam, A., E-mail: avijeet@iiap.res.in, E-mail: mangalam@iiap.res.in

    We present a model for a global axisymmetric turbulent dynamo operating in a galaxy with a corona that treats the parameters of turbulence driven by supernovae and by magneto-rotational instability under a common formalism. The nonlinear quenching of the dynamo is alleviated by the inclusion of small-scale advective and diffusive magnetic helicity fluxes, which allow the gauge-invariant magnetic helicity to be transferred outside the disk and consequently to build up a corona during the course of dynamo action. The time-dependent dynamo equations are expressed in a separable form and solved through an eigenvector expansion constructed using the steady-state solutions of the dynamo equation. The parametric evolution of the dynamo solution allows us to estimate the final structure of the global magnetic field and the saturated value of the turbulence parameter α_m, even before solving the dynamical equations for evolution of magnetic fields in the disk and the corona, along with α-quenching. We then solve these equations simultaneously to study the saturation of the large-scale magnetic field, its dependence on the small-scale magnetic helicity fluxes, and the corresponding evolution of the force-free field in the corona. The quadrupolar large-scale magnetic field in the disk is found to reach equipartition strength within a timescale of 1 Gyr. The large-scale magnetic field in the corona obtained is much weaker than the field inside the disk and has only a weak impact on the dynamo operation.
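
    A schematic of the separable eigenfunction expansion mentioned above, in generic mean-field dynamo notation; the operator D and growth rates γ_n are illustrative, and the paper's exact operators and gauge-invariant helicity terms are not reproduced:

      % Mean-field dynamo equation and its eigenfunction expansion (schematic):
      \frac{\partial \bar{B}}{\partial t} \;=\; \mathcal{D}\,\bar{B},
      \qquad
      \bar{B}(\mathbf{r},t) \;=\; \sum_n c_n(t)\, \mathbf{B}_n(\mathbf{r}),
      \qquad
      \mathcal{D}\,\mathbf{B}_n \;=\; \gamma_n\,\mathbf{B}_n .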

  12. The role of ocean climate data in operational Naval oceanography

    NASA Technical Reports Server (NTRS)

    Chesbrough, Radm G.

    1992-01-01

    Local application of global-scale models describes the U.S. Navy's basic philosophy for operational oceanography in support of fleet operations. Real-time data, climatologies, coupled air/ocean models, and large-scale computers are the essential components of the Navy's system for providing the war fighters with the performance predictions and tactical decision aids they need to operate safely and efficiently. In peacetime, these oceanographic predictions are important for safety of navigation and flight. The paucity and uneven distribution of real-time data mean we have to fall back on climatology to provide the basic data to operate our models. The Navy is both a producer and user of climatologies; it provides observations to the national archives and in turn employs data from these archives to establish databases. Suggestions for future improvements to ocean climate data are offered.

  13. Mechanisms Affecting the Sustainability and Scale-up of a System-Wide Numeracy Reform

    ERIC Educational Resources Information Center

    Bobis, Janette

    2011-01-01

    With deliberate system-level reform now being acted upon around the world, both successful and unsuccessful cases provide a rich source of knowledge from which we can learn to improve large-scale reform. Research surrounding the effectiveness of a theory-based system-wide numeracy reform operating in primary schools across Australia is examined to…

  14. Desalination: Status and Federal Issues

    DTIC Science & Technology

    2009-12-30

    on one side and lets purified water through. Reverse osmosis plants have fewer problems with corrosion and usually have lower energy requirements...Texas) and cities are actively researching and investigating the feasibility of large-scale desalination plants for municipal water supplies...desalination research and development, and in construction and operational costs of desalination demonstration projects and full-scale plants

  15. Optimal Wind Energy Integration in Large-Scale Electric Grids

    NASA Astrophysics Data System (ADS)

    Albaijat, Mohammad H.

    The major concern in electric grid operation is operating in the most economical and reliable fashion to ensure affordability and continuity of electricity supply. This dissertation investigates challenges that affect electric grid reliability and economic operation: 1. congestion of transmission lines, 2. transmission line expansion, 3. large-scale wind energy integration, and 4. optimal placement of Phasor Measurement Units (PMUs) for the highest electric grid observability. Performing congestion analysis aids in evaluating the required increase of transmission line capacity in electric grids. However, methods are needed to evaluate transmission capacity expansion so as to ensure optimal electric grid operation; the expansion of transmission line capacity must enable grid operators to provide low-cost electricity while maintaining reliable operation of the electric grid. Because congestion affects the reliability of delivering power and increases its cost, congestion analysis in electric grid networks is an important subject, and next-generation electric grids require novel methodologies for studying and managing congestion. We suggest a novel method for long-term congestion management in large-scale electric grids. Owing to the complexity and size of transmission systems and the competitive nature of current grid operation, it is important for electric grid operators to determine how much transmission capacity to add. The traditional questions requiring answers are "where" to add capacity, "how much" to add, and "at which voltage level." Because of electric grid deregulation, transmission expansion is further complicated by being open to investors, whose main interest in building new transmission lines is to generate revenue. Adding new transmission capacity helps relieve congestion, creates profit for investors who rent out their transmission capacity, and yields cheaper electricity for end users. We propose a hybrid method, combining heuristic and deterministic techniques, to determine new transmission line additions and increase transmission capacity. Renewable energy resources (RES) have zero operating cost, which makes them very attractive for generation companies and market participants. In addition, RES have zero carbon emissions, which helps relieve concerns about the environmental impacts of electricity generation. RES include wind, solar, hydro, biomass, and geothermal. By 2030, more than 30% of electricity in the U.S. is expected to come from RES, and one major contributor will be wind energy resources (WES). Furthermore, WES will be an important component of the future generation portfolio. However, WES by nature experience high intermittency and volatility. Because of the expectation of high WES penetration and the nature of such resources, researchers have focused on studying the effects of these resources on electric grid operation and adequacy from different aspects. Additionally, current market operations of electric grids add another complication to integrating RES, specifically WES. Mandates by market rules and long-term analysis of renewable penetration in large-scale electric grids have also been the focus of researchers in recent years.
We advocate a method for studying high wind-resource penetration in large-scale electric grid operations. A PMU is a global positioning system (GPS) based device that provides immediate and precise measurements of the voltage angle in a high-voltage transmission system. PMUs can update the status of a transmission line and related measurements (e.g., voltage magnitude and voltage phase angle) frequently: every second, a PMU can provide 30 samples of measurements, compared to traditional systems (e.g., the supervisory control and data acquisition [SCADA] system), which provide one measurement every 2 to 5 seconds. Because PMUs provide more measurement samples, they can improve electric grid reliability and observability. (Abstract shortened by UMI.)
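
    As a concrete illustration of the congestion screening discussed above, the following is a minimal DC power-flow sketch in Python; the 3-bus network, reactances, injections, and line limits are hypothetical values, not data from the dissertation.

import numpy as np

# Lines as (from_bus, to_bus, reactance, limit_MW); bus 0 is the slack bus.
lines = [(0, 1, 0.10, 120.0), (1, 2, 0.20, 80.0), (0, 2, 0.25, 100.0)]
P = np.array([0.0, -150.0, -50.0])    # net injections (MW); slack balances the system
P[0] = -P[1:].sum()

n = 3
B = np.zeros((n, n))                  # DC power-flow susceptance (B') matrix
for i, j, x, _ in lines:
    b = 1.0 / x
    B[i, i] += b; B[j, j] += b
    B[i, j] -= b; B[j, i] -= b

theta = np.zeros(n)                   # bus angles, slack fixed at 0
theta[1:] = np.linalg.solve(B[1:, 1:], P[1:])   # solve reduced system B'theta = P

for i, j, x, limit in lines:
    flow = (theta[i] - theta[j]) / x  # line flow in the same MW-scaled units
    status = "CONGESTED" if abs(flow) > limit else "ok"
    print(f"line {i}-{j}: {flow:7.1f} MW (limit {limit}) {status}")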

  16. Ergonomics in industrialized dairy operations.

    PubMed

    Douphrate, David I; Nonnenmann, Matthew W; Rosecrance, John C

    2009-01-01

    This paper presents a summary of a panel presentation by agriculture health and safety scientists on ergonomics of industrialized dairy parlor operations in the United States. Dairy industry trends in the United States were discussed in the panel presentation, which took place during the New Paths: Health and Safety in Western Agriculture conference, November 11-13, 2008. Dairy production is steadily moving to large-herd operations because of associated economies of scale and other economic and social conditions. Large-herd operations utilize a parlor milking system, as compared to a stanchion system used primarily in smaller operations. Each milking system presents different risks for worker injury. Low back, knee, and shoulder musculoskeletal symptoms were most frequently reported among workers in smaller dairy operations. Another study analyzing workers' compensation (WC) data from large-herd milking operations found nearly 50% of livestock-handling injury claims involved parlor milking activities. Nearly 27% of injuries were to the wrist, hand, and fingers, nearly 13% to the head or face, and 11% to the chest. Results indicated the vulnerability of these body parts to injury due to the worker-livestock interface during milking. More focused research should investigate milking practices and parlor designs as they relate to worker safety and health. Additional dairy-related injury research is vital given the trend towards large industrial milking operations.

  17. Nonlinear Control of Large Disturbances in Magnetic Bearing Systems

    NASA Technical Reports Server (NTRS)

    Jiang, Yuhong; Zmood, R. B.

    1996-01-01

    In this paper, the nonlinear operation of magnetic bearing control methods is reviewed. For large disturbances, the effects of displacement constraints and power amplifier current and di/dt limits on bearing control system performance are analyzed. The operation of magnetic bearings exhibiting self-excited large-scale oscillations has been studied both experimentally and by simulation. The simulation of the bearing system has been extended to include the effects of eddy currents in the actuators, so as to improve the accuracy of the simulation results. The results of these experiments and simulations are compared, and some useful conclusions are drawn for improving bearing system robustness.

  18. Research and Development of Large Capacity CFB Boilers in TPRI

    NASA Astrophysics Data System (ADS)

    Xianbin, Sun; Minhua, Jiang

    This paper presents an overview of advancements in circulating fluidized bed (CFB) technology at the Thermal Power Research Institute (TPRI), including technologies, configurations, and progress in scaling up. For developing large CFB boilers, CFB combustion test facilities have been established, the key technologies of large-capacity CFB boilers have been researched systematically, and 100 MW to 330 MW CFB boilers have been developed and manufactured. The first domestically designed 100 MW and 210 MW CFB boilers have been put into commercial operation and show good operating performance. A domestic 330 MW CFB boiler demonstration project, an H-type CFB boiler with a compact heat exchanger, has also been put into commercial operation; it is China's largest CFB boiler. The technical plan for a domestic 600 MW supercritical CFB boiler is also briefly introduced.

  19. An assessment of potential weather effects due to operation of the Space Orbiting Light Augmentation Reflector Energy System (SOLARES)

    NASA Technical Reports Server (NTRS)

    Allen, N. C.

    1978-01-01

    Implementation of SOLARES will input large quantities of heat continuously into a stationary location on the Earth's surface. The quantity of heat released by each of the SOLARES ground receivers, for a reflector orbit height of 6378 km, exceeds by 30 times that released by the large power parks which were previously studied in detail. Using atmospheric models, estimates are presented for the local weather effects, the synoptic-scale effects, and the global-scale effects of such intense thermal radiation.

  20. An integrated assessment of location-dependent scaling for microalgae biofuel production facilities

    DOE PAGES

    Coleman, André M.; Abodeely, Jared M.; Skaggs, Richard L.; ...

    2014-06-19

    Successful development of a large-scale microalgae-based biofuels industry requires comprehensive analysis and understanding of the feedstock supply chain—from facility siting and design through processing and upgrading of the feedstock to a fuel product. The evolution from pilot-scale production facilities to energy-scale operations presents many multi-disciplinary challenges, including a sustainable supply of water and nutrients, operational and infrastructure logistics, and economic competitiveness with petroleum-based fuels. These challenges are partially addressed by applying the Integrated Assessment Framework (IAF) – an integrated multi-scale modeling, analysis, and data management suite – to address key issues in developing and operating an open-pond microalgae production facility. This is done by analyzing how variability and uncertainty over space and through time affect feedstock production rates, and determining the site-specific "optimum" facility scale to minimize capital and operational expenses. This approach explicitly and systematically assesses the interdependence of biofuel production potential, associated resource requirements, and production system design trade-offs. To provide a baseline analysis, the IAF was applied in this paper to a set of sites in the southeastern U.S. with the potential to cumulatively produce 5 billion gallons per year. Finally, the results indicate costs can be reduced by scaling downstream processing capabilities to fit site-specific growing conditions, available and economically viable resources, and specific microalgal strains.

  1. The latest developments and outlook for hydrogen liquefaction technology

    NASA Astrophysics Data System (ADS)

    Ohlig, K.; Decker, L.

    2014-01-01

    Liquefied hydrogen is presently used mainly for space applications and the semiconductor industry. While clean energy applications, e.g., in the automotive sector, currently contribute only a small share of this demand, their demand may see a significant boost in the coming years, with the need for large-scale liquefaction plants exceeding current plant sizes by far. Hydrogen liquefaction in small-scale plants with a maximum capacity of 3 tons per day (tpd) is accomplished with a Brayton refrigeration cycle using helium as refrigerant. This technology is characterized by low investment costs but lower process efficiency and hence higher operating costs. For larger plants, a hydrogen Claude cycle is used, characterized by higher investment but lower operating costs. However, liquefaction plants meeting the potentially high demand in the clean energy sector will need further optimization with regard to energy efficiency and hence operating costs. The present paper gives an overview of the currently applied technologies, including their thermodynamic and technical background. Areas of improvement are identified to derive process concepts for future large-scale hydrogen liquefaction plants meeting the needs of clean energy applications with optimized energy efficiency and hence minimized operating costs. Compared to other studies in this field, this paper focuses on the application of new technology and innovative concepts which are either readily available or will require only short qualification procedures, and will hence allow implementation in plants in the near future.

  2. ATLAS and LHC computing on CRAY

    NASA Astrophysics Data System (ADS)

    Sciacca, F. G.; Haug, S.; ATLAS Collaboration

    2017-10-01

    Access and exploitation of large-scale computing resources, such as those offered by general-purpose HPC centres, is an important measure for ATLAS and the other Large Hadron Collider experiments as they confront the challenge of fully exploiting future data within the constraints of flat budgets. We report on the effort of moving the Swiss WLCG T2 computing, serving ATLAS, CMS, and LHCb, from a dedicated cluster to the large Cray systems at the Swiss National Supercomputing Centre CSCS. These systems offer not only very efficient hardware, cooling, and highly competent operators, but also large backfill potential due to their size and multidisciplinary usage, and potential gains from economies of scale. Technical solutions, performance, expected return, and future plans are discussed.

  3. A 17 GHz molecular rectifier

    PubMed Central

    Trasobares, J.; Vuillaume, D.; Théron, D.; Clément, N.

    2016-01-01

    Molecular electronics originally proposed that small molecules sandwiched between electrodes would accomplish electronic functions and enable ultimate scaling to be reached. However, so far, functional molecular devices have only been demonstrated at low frequency. Here, we demonstrate molecular diodes operating up to 17.8 GHz. Direct current and radio frequency (RF) properties were simultaneously measured on a large array of molecular junctions composed of gold nanocrystal electrodes, ferrocenyl undecanethiol molecules and the tip of an interferometric scanning microwave microscope. The present nanometre-scale molecular diodes offer a current density increase by several orders of magnitude compared with that of micrometre-scale molecular diodes, allowing RF operation. The measured S11 parameters show a diode rectification ratio of 12 dB which is linked to the rectification behaviour of the direct current conductance. From the RF measurements, we extrapolate a cut-off frequency of 520 GHz. A comparison with the silicon RF Schottky diode architecture suggests that RF molecular diodes are extremely attractive for scaling and high-frequency operation. PMID:27694833

  4. Examination of Daily Weather in the NCAR CCM

    NASA Astrophysics Data System (ADS)

    Cocke, S. D.

    2006-05-01

    The NCAR CCM is one of the most extensively studied climate models in the scientific community. However, most studies focus primarily on the long-term mean behavior, typically at monthly or longer time scales. In this study we examine the daily weather in the GCM by performing a series of daily or weekly 10-day forecasts for one year at moderate (T63) and high (T126) resolution. The model is initialized with operational "AVN" and ECMWF analyses, and model performance is compared to that of major operational centers using the conventional skill scores employed by those centers. Such a detailed look at the CCM at shorter time scales may lead to improvements in physical parameterizations, which may in turn lead to improved climate simulations. One finding from this study is that the CCM has a significant drying tendency in the lower troposphere compared to the operational analyses. Another is that the large-scale predictability of the GCM is competitive with most of the operational models, particularly in the southern hemisphere.
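
    For readers unfamiliar with the conventional skill scores mentioned above, the sketch below computes one of them, the anomaly correlation coefficient (ACC), against a climatology; the fields are random placeholders standing in for forecast and verifying-analysis grids.

import numpy as np

rng = np.random.default_rng(0)
climatology = rng.normal(280.0, 5.0, size=(32, 64))          # e.g., a mean 500-hPa field
analysis = climatology + rng.normal(0.0, 2.0, size=climatology.shape)
forecast = analysis + rng.normal(0.0, 1.0, size=climatology.shape)

fa = (forecast - climatology).ravel()    # forecast anomaly
aa = (analysis - climatology).ravel()    # verifying-analysis anomaly
acc = (fa @ aa) / np.sqrt((fa @ fa) * (aa @ aa))
print(f"ACC = {acc:.3f}")                # 1.0 is perfect; ~0.6 is a commonly cited skill floor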

  5. An operational global-scale ocean thermal analysis system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Clancy, R. M.; Pollak, K.D.; Phoebus, P.A.

    1990-04-01

    The Optimum Thermal Interpolation System (OTIS) is an ocean thermal analysis system designed for operational use at FNOC. It is based on the optimum interpolation data assimilation technique and functions in an analysis-prediction-analysis data assimilation cycle with the TOPS mixed-layer model. OTIS provides a rigorous framework for combining real-time data, climatology, and predictions from numerical ocean prediction models to produce a large-scale synoptic representation of ocean thermal structure. The techniques and assumptions used in OTIS are documented, and results of operational tests of global-scale OTIS at FNOC are presented. The tests involved comparisons of OTIS against an existing operational ocean thermal structure model and were conducted during February, March, and April 1988. Qualitative comparison of the two products suggests that OTIS gives a more realistic representation of subsurface anomalies and horizontal gradients and that it also gives a more accurate analysis of the thermal structure, with improvements largest below the mixed layer. 37 refs.
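
    The optimum interpolation update at the heart of a system like OTIS can be written compactly as xa = xb + B H^T (H B H^T + R)^-1 (y - H xb). The sketch below applies it on a 1-D grid; the grid size, error variances, and correlation length are hypothetical illustration values, not OTIS parameters.

import numpy as np

ng = 50                                   # analysis grid points
xb = np.full(ng, 15.0)                    # background (e.g., climatology), deg C
obs_idx = np.array([10, 30])              # grid indices that have observations
y = np.array([17.0, 14.0])                # observed temperatures

# Background-error covariance with a Gaussian spatial correlation.
L, sig_b, sig_o = 5.0, 1.0, 0.5
d = np.arange(ng)
Bmat = sig_b**2 * np.exp(-0.5 * ((d[:, None] - d[None, :]) / L) ** 2)

H = np.zeros((len(obs_idx), ng))          # observation operator (point sampling)
H[np.arange(len(obs_idx)), obs_idx] = 1.0
R = sig_o**2 * np.eye(len(obs_idx))       # observation-error covariance

# OI gain and analysis update.
K = Bmat @ H.T @ np.linalg.inv(H @ Bmat @ H.T + R)
xa = xb + K @ (y - H @ xb)
print(xa[obs_idx])                        # analysis pulled toward the observations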

  6. Solar-Power System Produces High-Pressure Steam

    NASA Technical Reports Server (NTRS)

    Lansing, F. L.

    1985-01-01

    Combination of three multistaged solar collectors produces high-pressure steam for large-scale, continuously operating turbines for generating mechanical or electrical energy. Superheated water vapor drives the turbines, attaining an overall system efficiency of about 22 percent.

  7. Riverbed Hydrologic Exchange Dynamics in a Large Regulated River Reach

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhou, Tian; Bao, Jie; Huang, Maoyi

    Hydrologic exchange flux (HEF) is an important hydrologic component of river corridors that includes both bidirectional (hyporheic) and unidirectional (gaining/losing) surface water – groundwater exchanges. Quantifying HEF rates in a large regulated river is difficult due to the large spatial domain, the complexity of geomorphologic features and subsurface properties, and the great stage variations created by dam operations at multiple time scales. In this study, we developed a method combining numerical modeling and field measurements to estimate HEF rates across the river bed in a 7-km-long reach of the highly regulated Columbia River. A high-resolution computational fluid dynamics (CFD) modeling framework was developed and validated against field measurements and other modeling results to characterize the HEF dynamics across the river bed. We found that about 85% of the time from 2008-2014 the river was losing water, with an annual average net HEF rate across the river bed (Qz) of -2.3 m3 s-1 (negative indicating downwelling). June was the only month in which the river gained water, with a monthly averaged Qz of 0.8 m3 s-1. We also found that daily dam operations increased the hourly gross gaining and losing rates over an average year by 8% and 2%, respectively. By investigating the HEF feedbacks at various time scales, we suggest that dam operations could reduce the HEF at the seasonal time scale by decreasing seasonal flow variations, while enhancing the HEF at the sub-daily time scale by generating high-frequency discharge variations. These changes could generate significant impacts on biogeochemical processes in the hyporheic zone.

  8. Scalable 96-well Plate Based iPSC Culture and Production Using a Robotic Liquid Handling System.

    PubMed

    Conway, Michael K; Gerger, Michael J; Balay, Erin E; O'Connell, Rachel; Hanson, Seth; Daily, Neil J; Wakatsuki, Tetsuro

    2015-05-14

    Continued advancement in pluripotent stem cell culture is closing the gap between bench and bedside for using these cells in regenerative medicine, drug discovery and safety testing. In order to produce stem cell derived biopharmaceutics and cells for tissue engineering and transplantation, a cost-effective cell-manufacturing technology is essential. Maintenance of pluripotency and stable performance of cells in downstream applications (e.g., cell differentiation) over time is paramount to large-scale cell production. Yet that can be difficult to achieve, especially if cells are cultured manually, where the operator can introduce significant variability and scale-up can be prohibitively expensive. To enable high-throughput, large-scale stem cell production and remove operator influence, novel stem cell culture protocols using a bench-top multi-channel liquid handling robot were developed that require minimal technician involvement or experience. With these protocols human induced pluripotent stem cells (iPSCs) were cultured in feeder-free conditions directly from a frozen stock and maintained in 96-well plates. Depending on cell line and desired scale-up rate, the operator can easily determine when to passage based on a series of images showing the optimal colony densities for splitting. Then the necessary reagents are prepared to perform a colony split to new plates without a centrifugation step. After 20 passages (~3 months), two iPSC lines maintained stable karyotypes, expressed stem cell markers, and differentiated into cardiomyocytes with high efficiency. The system can perform subsequent high-throughput screening of new differentiation protocols or genetic manipulation designed for 96-well plates. This technology will reduce the labor and technical burden to produce large numbers of identical stem cells for a myriad of applications.

  9. Validation of large-scale, monochromatic UV disinfection systems for drinking water using dyed microspheres.

    PubMed

    Blatchley, E R; Shen, C; Scheible, O K; Robinson, J P; Ragheb, K; Bergstrom, D E; Rokjer, D

    2008-02-01

    Dyed microspheres have been developed as a new method for validation of ultraviolet (UV) reactor systems. When properly applied, dyed microspheres allow measurement of the UV dose distribution delivered by a photochemical reactor for a given operating condition. Prior to this research, dyed microspheres had only been applied to a bench-scale UV reactor. The goal of this research was to extend the application of dyed microspheres to large-scale reactors. Dyed microsphere tests were conducted on two prototype large-scale UV reactors at the UV Validation and Research Center of New York (UV Center) in Johnstown, NY. All microsphere tests were conducted under conditions that had been used previously in biodosimetry experiments involving two challenge bacteriophage: MS2 and Qbeta. Numerical simulations based on computational fluid dynamics and irradiance field modeling were also performed for the same set of operating conditions used in the microspheres assays. Microsphere tests on the first reactor illustrated difficulties in sample collection and discrimination of microspheres against ambient particles. Changes in sample collection and work-up were implemented in tests conducted on the second reactor that allowed for improvements in microsphere capture and discrimination against the background. Under these conditions, estimates of the UV dose distribution from the microspheres assay were consistent with numerical simulations and the results of biodosimetry, using both challenge organisms. The combined application of dyed microspheres, biodosimetry, and numerical simulation offers the potential to provide a more in-depth description of reactor performance than any of these methods individually, or in combination. This approach also has the potential to substantially reduce uncertainties in reactor validation, thereby leading to better understanding of reactor performance, improvements in reactor design, and decreases in reactor capital and operating costs.

  10. Power Supply for Variable Frequency Induction Heating Using MERS Soft-Switching High Frequency Inverter

    NASA Astrophysics Data System (ADS)

    Isobe, Takanori; Kitahara, Tadayuki; Fukutani, Kazuhiko; Shimada, Ryuichi

    Variable frequency induction heating has great potential for industrial heating applications due to the possibility of achieving heating distribution control; however, large-scale induction heating with variable frequency has not yet been introduced for practical use. This paper proposes a high-frequency soft-switching inverter for induction heating that can achieve variable frequency operation. One challenge of variable frequency induction heating is the increased power electronics ratings it requires. This paper indicates that the current-source-type dc-link configuration and soft-switching characteristics of the proposed converter make it possible to build a large-scale system with variable frequency capability. A 90-kVA, 150-1000 Hz variable frequency experimental power supply for steel strip induction heating was developed. Experiments confirmed the feasibility of variable frequency induction heating with the proposed converter and the advantages of variable frequency operation.

  11. Liquid Oxygen Propellant Densification Production and Performance Test Results With a Large-Scale Flight-Weight Propellant Tank for the X33 RLV

    NASA Technical Reports Server (NTRS)

    Tomsik, Thomas M.; Meyer, Michael L.

    2010-01-01

    This paper describes in detail a test program initiated at the Glenn Research Center (GRC) involving the cryogenic densification of liquid oxygen (LO2). A large-scale LO2 propellant densification system, rated for 200 gpm and sized for the X-33 LO2 propellant tank, was designed, fabricated, and tested at the GRC. Multiple objectives of the test program included validation of LO2 production unit hardware and characterization of densifier performance at design and transient conditions. First, performance data is presented for an initial series of LO2 densifier screening and check-out tests using densified liquid nitrogen. The second series of tests shows performance data collected during LO2 densifier test operations with liquid oxygen as the densified product fluid. An overview of LO2 X-33 tanking operations and load tests with the 20,000-gallon Structural Test Article (STA) is described. Tank loading testing and the thermal stratification that occurs inside of a flight-weight launch vehicle propellant tank were investigated. These operations involved a closed-loop recirculation process of LO2 flow through the densifier and then back into the STA. Finally, in excess of 200,000 gallons of densified LO2 at 120 °R was produced with the propellant densification unit during the demonstration program, an achievement that had never before been accomplished in the realm of large-scale cryogenic tests.

  12. A Comparison of Hybrid Reynolds Averaged Navier Stokes/Large Eddy Simulation (RANS/LES) and Unsteady RANS Predictions of Separated Flow for a Variable Speed Power Turbine Blade Operating with Low Inlet Turbulence Levels

    DTIC Science & Technology

    2017-10-01

    Facility is a large-scale cascade that allows detailed flow field surveys and blade surface measurements.10–12 The facility has a continuous run ...structured grids at 2 flow conditions, cruise and takeoff, of the VSPT blade . Computations were run in parallel on a Department of Defense...RANS/LES) and Unsteady RANS Predictions of Separated Flow for a Variable-Speed Power- Turbine Blade Operating with Low Inlet Turbulence Levels

  13. Low speed tests of a fixed geometry inlet for a tilt nacelle V/STOL airplane

    NASA Technical Reports Server (NTRS)

    Syberg, J.; Koncsek, J. L.

    1977-01-01

    Test data were obtained with a 1/4 scale cold flow model of the inlet at freestream velocities from 0 to 77 m/s (150 knots) and angles of attack from 45 deg to 120 deg. A large scale model was tested with a high bypass ratio turbofan in the NASA/ARC wind tunnel. A fixed geometry inlet is a viable concept for a tilt nacelle V/STOL application. Comparison of data obtained with the two models indicates that flow separation at high angles of attack and low airflow rates is strongly sensitive to Reynolds number and that the large scale model has a significantly improved range of separation-free operation.

  14. UTM Safely Enabling UAS Operations in Low-Altitude Airspace

    NASA Technical Reports Server (NTRS)

    Kopardekar, Parimal

    2017-01-01

    Conduct research, development, and testing to identify airspace operations requirements that enable large-scale visual and beyond-visual-line-of-sight UAS operations in low-altitude airspace. Use a build-a-little, test-a-little strategy, from remote areas to urban areas. Low density: no traffic management required, but understanding of airspace constraints. Cooperative traffic management: understanding of airspace constraints and other operations. Manned and unmanned traffic management: scalable and heterogeneous operations. The UTM construct is consistent with the FAA's risk-based strategy. The UTM research platform is used for simulations and tests. UTM offers a path towards scalability.

  15. UTM Safely Enabling UAS Operations in Low-Altitude Airspace

    NASA Technical Reports Server (NTRS)

    Kopardekar, Parimal H.

    2016-01-01

    Conduct research, development, and testing to identify airspace operations requirements that enable large-scale visual and beyond-visual-line-of-sight UAS operations in low-altitude airspace. Use a build-a-little, test-a-little strategy, from remote areas to urban areas. Low density: no traffic management required, but understanding of airspace constraints. Cooperative traffic management: understanding of airspace constraints and other operations. Manned and unmanned traffic management: scalable and heterogeneous operations. The UTM construct is consistent with the FAA's risk-based strategy. The UTM research platform is used for simulations and tests. UTM offers a path towards scalability.

  16. DOUBLE DYNAMO SIGNATURES IN A GLOBAL MHD SIMULATION AND MEAN-FIELD DYNAMOS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Beaudoin, Patrice; Simard, Corinne; Cossette, Jean-François

    The 11 year solar activity cycle is the most prominent periodic manifestation of the magnetohydrodynamical (MHD) large-scale dynamo operating in the solar interior, yet longer and shorter (quasi-) periodicities are also present. The so-called "quasi-biennial" signal appearing in many proxies of solar activity has been gaining increasing attention since its detection in p-mode frequency shifts, which suggests a subphotospheric origin. A number of candidate mechanisms have been proposed, including beating between co-existing global dynamo modes, dual dynamos operating in spatially separated regions of the solar interior, and Rossby waves driving short-period oscillations in the large-scale solar magnetic field produced by the 11 year activity cycle. In this article, we analyze a global MHD simulation of solar convection producing regular large-scale magnetic cycles, and detect and characterize shorter periodicities developing therein. By constructing kinematic mean-field α²Ω dynamo models incorporating the turbulent electromotive force (emf) extracted from that same simulation, we find that dual-dynamo behavior materializes in fairly wide regions of the model's parameter space. This suggests that the origin of the similar behavior detected in the MHD simulation lies with the joint complexity of the turbulent emf and differential rotation profile, rather than with dynamical interactions such as those mediated by Rossby waves. Analysis of the simulation also reveals that the dual dynamo operating therein leaves a double-period signature in the temperature field, consistent with a dual-period helioseismic signature. Order-of-magnitude estimates for the magnitude of the expected frequency shifts are commensurate with helioseismic measurements. Taken together, our results support the hypothesis that the solar quasi-biennial oscillations are associated with a secondary dynamo process operating in the outer reaches of the solar convection zone.

  17. A multidisciplinary approach to the development of low-cost high-performance lightwave networks

    NASA Technical Reports Server (NTRS)

    Maitan, Jacek; Harwit, Alex

    1991-01-01

    Our research focuses on high-speed distributed systems. We anticipate that our results will allow the fabrication of low-cost networks employing multi-gigabit-per-second data links for space and military applications. The recent development of high-speed low-cost photonic components and new generations of microprocessors creates an opportunity to develop advanced large-scale distributed information systems. These systems currently involve hundreds of thousands of nodes and are made up of components and communications links that may fail during operation. In order to realize these systems, research is needed into technologies that foster adaptability and scalability. Self-organizing mechanisms are needed to integrate a working fabric of large-scale distributed systems. The challenge is to fuse theory, technology, and development methodologies to construct a cost-effective, efficient, large-scale system.

  18. US National Large-scale City Orthoimage Standard Initiative

    USGS Publications Warehouse

    Zhou, G.; Song, C.; Benjamin, S.; Schickler, W.

    2003-01-01

    The early procedures and algorithms for national digital orthophoto generation in the National Digital Orthophoto Program (NDOP) were based on earlier USGS mapping operations, such as field control, aerotriangulation (derived in the early 1920's), the quarter-quadrangle-centered format (3.75 minutes of longitude and latitude in geographic extent), 1:40,000 aerial photographs, and 2.5-D digital elevation models. However, large-scale city orthophotos produced with these early procedures have disclosed many shortcomings, e.g., ghost images, occlusion, and shadow. Thus, providing the technical base (algorithms, procedures) and experience needed for large-scale city digital orthophoto creation is essential for the forthcoming national large-scale digital orthophoto deployment and the revision of the Standards for National Large-scale City Digital Orthophoto in the NDOP. This paper will report our initial research results as follows: (1) High-precision 3D city DSM generation through LIDAR data processing, (2) Spatial objects/features extraction through surface material information and high-accuracy 3D DSM data, (3) 3D city model development, (4) Algorithm development for generation of DTM-based orthophoto, and DBM-based orthophoto, (5) True orthophoto generation by merging DBM-based orthophoto and DTM-based orthophoto, and (6) Automatic mosaic by optimizing and combining imagery from many perspectives.

  19. Comparing centralised and decentralised anaerobic digestion of stillage from a large-scale bioethanol plant to animal feed production.

    PubMed

    Drosg, B; Wirthensohn, T; Konrad, G; Hornbachner, D; Resch, C; Wäger, F; Loderer, C; Waltenberger, R; Kirchmayr, R; Braun, R

    2008-01-01

    A comparison of stillage treatment options for large-scale bioethanol plants was based on the data of an existing plant producing approximately 200,000 t/yr of bioethanol and 1,400,000 t/yr of stillage. Animal feed production--the state-of-the-art technology at the plant--was compared to anaerobic digestion. The latter was simulated in two different scenarios: digestion in small-scale biogas plants in the surrounding area versus digestion in a large-scale biogas plant at the bioethanol production site. Emphasis was placed on a holistic simulation balancing chemical parameters and calculating logistic algorithms to compare the efficiency of the stillage treatment solutions. For central anaerobic digestion, different digestate handling solutions were considered because of the large amount of digestate. For land application, a minimum of 36,000 ha of available agricultural area and 600,000 m3 of storage volume would be needed. Secondly, membrane purification of the digestate, consisting of a decanter, microfiltration, and reverse osmosis, was investigated. As a third option, aerobic wastewater treatment of the digestate was discussed. The final outcome was an economic evaluation of the three stillage treatment options mentioned, as a guide to stillage management for operators of large-scale bioethanol plants. Copyright IWA Publishing 2008.

  20. Unmanned Aircraft System (UAS) Traffic Management (UTM): Enabling Civilian Low-Altitude Airspace and Unmanned Aerial System Operations

    NASA Technical Reports Server (NTRS)

    Kopardekar, Parimal Hemchandra

    2016-01-01

    Just a year ago we laid out the UTM challenges and NASA's proposed solutions. During the past year NASA's goal continues to be to conduct research, development and testing to identify airspace operations requirements to enable large-scale visual and beyond visual line-of-sight UAS operations in the low-altitude airspace. Significant progress has been made, and NASA is continuing to move forward.

  1. HELICITY CONSERVATION IN NONLINEAR MEAN-FIELD SOLAR DYNAMO

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pipin, V. V.; Sokoloff, D. D.; Zhang, H.

    It is believed that magnetic helicity conservation is an important constraint on large-scale astrophysical dynamos. In this paper, we study a mean-field solar dynamo model that employs two different formulations of the magnetic helicity conservation. In the first approach, the evolution of the averaged small-scale magnetic helicity is largely determined by the local induction effects due to the large-scale magnetic field, turbulent motions, and the turbulent diffusive loss of helicity. In this case, the dynamo model shows that the typical strength of the large-scale magnetic field generated by the dynamo is much smaller than the equipartition value for the magnetic Reynolds number 10^6. This is the so-called catastrophic quenching (CQ) phenomenon. In the literature, this is considered to be typical for various kinds of solar dynamo models, including the distributed-type and the Babcock-Leighton-type dynamos. The problem can be resolved by the second formulation, which is derived from the integral conservation of the total magnetic helicity. In this case, the dynamo model shows that magnetic helicity propagates with the dynamo wave from the bottom of the convection zone to the surface. This prevents CQ because of the local balance between the large-scale and small-scale magnetic helicities. Thus, the solar dynamo can operate in a wide range of magnetic Reynolds numbers up to 10^6.

  2. Developing Renewable Energy Projects Larger Than 10 MWs at Federal Facilities (Book)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    2013-03-01

    To accomplish Federal goals for renewable energy, sustainability, and energy security, large-scale renewable energy projects must be developed and constructed on Federal sites with significant private investment. The U.S. Department of Energy's Federal Energy Management Program (FEMP) helps Federal agencies meet these goals and assists agency personnel in navigating the complexities of developing such projects and attracting the necessary private capital to complete them. This guide is intended to provide a general resource that will begin to develop the Federal employee's awareness and understanding of the project developer's operating environment and the private sector's awareness and understanding of the Federal environment. Because the vast majority of the investment that is required to meet the goals for large-scale renewable energy projects will come from the private sector, this guide has been organized to match Federal processes with typical phases of commercial project development. The main purpose of this guide is to provide a project development framework to allow the Federal Government, private developers, and investors to work in a coordinated fashion on large-scale renewable energy projects. The framework includes key elements that describe a successful, financially attractive large-scale renewable energy project.

  3. Breakdowns in coordinated decision making at and above the incident management team level: an analysis of three large scale Australian wildfires.

    PubMed

    Bearman, Chris; Grunwald, Jared A; Brooks, Benjamin P; Owen, Christine

    2015-03-01

    Emergency situations are by their nature difficult to manage, and success in such situations is often highly dependent on effective team coordination. Breakdowns in team coordination can lead to significant disruption to an operational response. Breakdowns in coordination were explored in three large-scale bushfires in Australia: the Kilmore East fire, the Wangary fire, and the Canberra Firestorm. Data from these fires were analysed using a top-down and bottom-up qualitative analysis technique. Forty-four breakdowns in coordinated decision making were identified, which yielded 83 disconnects grouped into three main categories: operational, informational, and evaluative. Disconnects were specific instances where differences in understanding existed between team members. The reasons why disconnects occurred were largely consistent across the three sets of data. In some cases multiple disconnects occurred in sequence, suggesting that disconnects can create states conducive to the occurrence of further disconnects. In terms of resolution, evaluative disconnects were nearly always resolved; however, operational and informational disconnects were rarely resolved effectively. The exploratory data analysis and discussion presented here represent the first systematic research to provide information about the reasons why breakdowns occur in emergency management, and present an account of how team processes can act to disrupt coordination and the operational response. Copyright © 2014 Elsevier Ltd and The Ergonomics Society. All rights reserved.

  4. Large Scale Application of Vibration Sensors for Fan Monitoring at Commercial Layer Hen Houses

    PubMed Central

    Chen, Yan; Ni, Ji-Qin; Diehl, Claude A.; Heber, Albert J.; Bogan, Bill W.; Chai, Li-Long

    2010-01-01

    Continuously monitoring the operation of each individual fan can significantly improve the measurement quality of aerial pollutant emissions from animal buildings that have a large number of fans. Monitoring fan operation by detecting fan vibration is a relatively new technique. A low-cost electronic vibration sensor was developed and commercialized. However, its large-scale application had not yet been evaluated. This paper presents long-term performance results of this vibration sensor at two large commercial layer houses. Vibration sensors were installed on 164 fans of 130 cm diameter to continuously monitor the fan on/off status for two years. The performance of the vibration sensors was compared with fan rotational speed (FRS) sensors. The vibration sensors exhibited quick response and high sensitivity to fan operations and therefore satisfied the general requirements of air quality research. The study proved that detecting fan vibration is an effective method to monitor the on/off status of a large number of single-speed fans. The vibration sensor itself was $2 more expensive than a magnetic proximity FRS sensor, but the overall cost including installation and data acquisition hardware was $77 less than that of the FRS sensor. A total of nine vibration sensors failed during the study, and the failure rate was related to the product batch. A few sensors also exhibited unsteady sensitivity. As a new product, the quality of the sensor should be improved to make it more reliable and acceptable. PMID:22163544

  5. Automated Scheduling of Science Activities for Titan Encounters by Cassini

    NASA Technical Reports Server (NTRS)

    Ray, Trina L.; Knight, Russel L.; Mohr, Dave

    2014-01-01

    In an effort to demonstrate the efficacy of automated planning and scheduling techniques for large missions, we have adapted ASPEN (Activity Scheduling and Planning Environment) [1] and CLASP (Compressed Large-scale Activity Scheduling and Planning) [2] to the domain of scheduling high-level science goals into conflict-free operations plans for Titan encounters by the Cassini spacecraft.
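
    As a much-simplified stand-in for what an automated planner such as ASPEN or CLASP does, the sketch below greedily fits prioritized science requests into conflict-free, non-overlapping time windows; the request names and times are hypothetical, not actual Cassini activities.

from dataclasses import dataclass

@dataclass
class Request:
    name: str
    start: float   # earliest usable time (hours from encounter epoch)
    end: float     # latest usable time
    priority: int  # higher wins

requests = [Request("RADAR_swath", 0.0, 2.0, 3),
            Request("VIMS_map", 1.5, 3.5, 2),
            Request("ISS_mosaic", 3.0, 5.0, 3),
            Request("CIRS_scan", 1.0, 4.0, 1)]

schedule = []
for req in sorted(requests, key=lambda r: -r.priority):
    # Accept the request only if its window overlaps no already-accepted activity.
    if all(req.end <= s.start or req.start >= s.end for s in schedule):
        schedule.append(req)

for s in sorted(schedule, key=lambda r: r.start):
    print(f"{s.start:4.1f}-{s.end:4.1f} h  {s.name}")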

  6. A Combined Eulerian-Lagrangian Data Representation for Large-Scale Applications.

    PubMed

    Sauer, Franz; Xie, Jinrong; Ma, Kwan-Liu

    2017-10-01

    The Eulerian and Lagrangian reference frames each provide a unique perspective when studying and visualizing results from scientific systems. As a result, many large-scale simulations produce data in both formats, and analysis tasks that simultaneously utilize information from both representations are becoming increasingly popular. However, due to their fundamentally different nature, drawing correlations between these data formats is a computationally difficult task, especially in a large-scale setting. In this work, we present a new data representation which combines both reference frames into a joint Eulerian-Lagrangian format. By reorganizing Lagrangian information according to the Eulerian simulation grid into a "unit cell" based approach, we can provide an efficient out-of-core means of sampling, querying, and operating with both representations simultaneously. We also extend this design to generate multi-resolution subsets of the full data to suit the viewer's needs and provide a fast flow-aware trajectory construction scheme. We demonstrate the effectiveness of our method using three large-scale real world scientific datasets and provide insight into the types of performance gains that can be achieved.
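
    A minimal sketch of the "unit cell" reorganization described above: Lagrangian particles are grouped by the Eulerian grid cell that contains them, so that both representations can be queried together. The grid shape and particle data are hypothetical placeholders, not the paper's datasets.

import numpy as np
from collections import defaultdict

rng = np.random.default_rng(1)
nx, ny, dx = 8, 8, 1.0                        # Eulerian grid: 8x8 cells of size 1.0
pos = rng.uniform(0.0, 8.0, size=(1000, 2))   # Lagrangian particle positions

cells = defaultdict(list)                     # (i, j) cell index -> particle indices
ij = np.floor(pos / dx).astype(int)
for k, (i, j) in enumerate(ij):
    cells[(int(i), int(j))].append(k)

# Query: all particles inside Eulerian cell (3, 4), e.g. to correlate particle
# attributes with the field value stored at that cell.
print(len(cells[(3, 4)]), "particles in cell (3, 4)")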

  7. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tang, Shuaiqi; Zhang, Minghua; Xie, Shaocheng

    Large-scale forcing data, such as vertical velocity and advective tendencies, are required to drive single-column models (SCMs), cloud-resolving models, and large-eddy simulations. Previous studies suggest that some errors of these model simulations could be attributed to the lack of spatial variability in the specified domain-mean large-scale forcing. This study investigates the spatial variability of the forcing and explores its impact on SCM simulated precipitation and clouds. A gridded large-scale forcing data during the March 2000 Cloud Intensive Operational Period at the Atmospheric Radiation Measurement program's Southern Great Plains site is used for analysis and to drive the single-column version of the Community Atmospheric Model Version 5 (SCAM5). When the gridded forcing data show large spatial variability, such as during a frontal passage, SCAM5 with the domain-mean forcing is not able to capture the convective systems that are partly located in the domain or that only occupy part of the domain. This problem has been largely reduced by using the gridded forcing data, which allows running SCAM5 in each subcolumn and then averaging the results within the domain. This is because the subcolumns have a better chance to capture the timing of the frontal propagation and the small-scale systems. As a result, other potential uses of the gridded forcing data, such as understanding and testing scale-aware parameterizations, are also discussed.

  8. Linux OS Jitter Measurements at Large Node Counts using a BlueGene/L

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jones, Terry R; Tauferner, Mr. Andrew; Inglett, Mr. Todd

    2010-01-01

    We present experimental results for a coordinated scheduling implementation of the Linux operating system. Results were collected on an IBM Blue Gene/L machine at scales up to 16K nodes. Our results indicate coordinated scheduling was able to provide a dramatic improvement in scaling performance for two applications characterized as bulk synchronous parallel programs.
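
    OS jitter of the kind addressed by coordinated scheduling is commonly exposed with a fixed-work-quantum microbenchmark: time many repetitions of identical work and examine the spread of the slowest iterations. The sketch below is a generic illustration, not the benchmark used in the paper.

import time
import statistics

def work():
    s = 0
    for i in range(20000):    # fixed compute quantum
        s += i * i
    return s

samples = []
for _ in range(2000):
    t0 = time.perf_counter()
    work()
    samples.append(time.perf_counter() - t0)

med = statistics.median(samples)
p999 = sorted(samples)[int(0.999 * len(samples))]
print(f"median {med*1e6:.1f} us, 99.9th pct {p999*1e6:.1f} us, "
      f"jitter ratio {p999/med:.2f}")   # large ratios indicate noise from daemons/interrupts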

  9. Large space telescope engineering scale model optical design

    NASA Technical Reports Server (NTRS)

    Facey, T. A.

    1973-01-01

    The objective is to develop the detailed design and tolerance data for the LST engineering scale model optical system. This will enable MSFC to move forward to the optical element procurement phase and also to evaluate tolerances, manufacturing requirements, assembly/checkout procedures, reliability, operational complexity, stability requirements of the structure and thermal system, and the flexibility to change and grow.

  10. Intervention for First Graders with Limited Number Knowledge: Large-Scale Replication of a Randomized Controlled Trial

    ERIC Educational Resources Information Center

    Gersten, Russell; Rolfhus, Eric; Clarke, Ben; Decker, Lauren E.; Wilkins, Chuck; Dimino, Joseph

    2015-01-01

    Replication studies are extremely rare in education. This randomized controlled trial (RCT) is a scale-up replication of Fuchs et al., which in a sample of 139 found a statistically significant positive impact for Number Rockets, a small-group intervention for at-risk first graders that focused on building understanding of number operations. The…

  11. Chapter 3 - Large-scale patterns of forest fire occurrence in the conterminous United States, Alaska and Hawaii, 2016

    Treesearch

    Kevin M. Potter

    2018-01-01

    As a pervasive disturbance agent operating at many spatial and temporal scales, wildland fire is a key abiotic factor affecting forest health both positively and negatively. In some ecosystems, for example, wildland fires have been essential for regulating processes that maintain forest health (Lundquist and others 2011). Wildland fire is an important ecological...

  12. Generation of a Large-scale Magnetic Field in a Convective Full-sphere Cross-helicity Dynamo

    NASA Astrophysics Data System (ADS)

    Pipin, V. V.; Yokoi, N.

    2018-05-01

    We study the effects of the cross-helicity in full-sphere large-scale mean-field dynamo models of a 0.3 M ⊙ star rotating with a period of 10 days. In exploring several dynamo scenarios that stem from magnetic field generation by the cross-helicity effect, we found that the cross-helicity provides a natural generation mechanism for the large-scale axisymmetric and nonaxisymmetric magnetic field. Therefore, rotating stars with convective envelopes can produce a large-scale magnetic field generated solely by the turbulent cross-helicity effect (we call it a γ² dynamo). Using mean-field models we compare the properties of the large-scale magnetic field organization that stems from dynamo mechanisms based on the kinetic helicity (associated with the α² dynamos) and cross-helicity. For fully convective stars, both generation mechanisms can maintain large-scale dynamos even for a solid body rotation law inside the star. The nonaxisymmetric magnetic configurations become preferable when the cross-helicity and the α-effect operate independently of each other. This corresponds to situations with purely γ² or α² dynamos. The combination of these scenarios, i.e., the γ²α² dynamo, can generate preferentially axisymmetric, dipole-like magnetic fields at strengths of several kG. Thus, we found a new dynamo scenario that is able to generate an axisymmetric magnetic field even in the case of a solid body rotation of the star. We discuss the possible applications of our findings to stellar observations.

  13. Torsional Oscillations in a Global Solar Dynamo

    NASA Astrophysics Data System (ADS)

    Beaudoin, P.; Charbonneau, P.; Racine, E.; Smolarkiewicz, P. K.

    2013-02-01

    We characterize and analyze rotational torsional oscillations developing in a large-eddy magnetohydrodynamical simulation of solar convection (Ghizaru, Charbonneau, and Smolarkiewicz, Astrophys. J. Lett. 715, L133, 2010; Racine et al., Astrophys. J. 735, 46, 2011) producing an axisymmetric, large-scale, magnetic field undergoing periodic polarity reversals. Motivated by the many solar-like features exhibited by these oscillations, we carry out an analysis of the large-scale zonal dynamics. We demonstrate that simulated torsional oscillations are not driven primarily by the periodically varying large-scale magnetic torque, as one might have expected, but rather via the magnetic modulation of angular-momentum transport by the large-scale meridional flow. This result is confirmed by a straightforward energy analysis. We also detect a fairly sharp transition in rotational dynamics taking place as one moves from the base of the convecting layers to the base of the thin tachocline-like shear layer formed in the stably stratified fluid layers immediately below. We conclude by discussing the implications of our analyses with regard to the mechanism of amplitude saturation in the global dynamo operating in the simulation, and speculate on the possible precursor value of torsional oscillations for the forecast of solar-cycle characteristics.

  14. CFD Study of Full-Scale Aerobic Bioreactors: Evaluation of Dynamic O2 Distribution, Gas-Liquid Mass Transfer and Reaction

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Humbird, David; Sitaraman, Hariswaran; Stickel, Jonathan

    If advanced biofuels are to measurably displace fossil fuels in the near term, they will have to operate at levels of scale, efficiency, and margin unprecedented in the current biotech industry. For aerobically-grown products in particular, scale-up is complex, and the practical size, cost, and operability of extremely large reactors are not well understood. Put simply, the problem of how to attain fuel-class production scales comes down to cost-effective delivery of oxygen at high mass-transfer rates and low capital and operating costs. To that end, very large reactor vessels (>500 m3) are proposed in order to achieve favorable economies of scale. Additionally, techno-economic evaluation indicates that bubble-column reactors are more cost-effective than stirred-tank reactors in many low-viscosity cultures. In order to advance the design of extremely large aerobic bioreactors, we have performed computational fluid dynamics (CFD) simulations of bubble-column reactors. A multiphase Euler-Euler model is used to explicitly account for the spatial distribution of air (i.e., gas bubbles) in the reactor. Expanding on the existing bioreactor CFD literature (typically focused on the hydrodynamics of bubbly flows), our simulations include interphase mass transfer of oxygen and a simple phenomenological reaction representing the uptake and consumption of dissolved oxygen by submerged cells. The simulations reproduce the expected flow profiles, with net upward flow in the center of the column and downward flow near the wall. At high simulated oxygen uptake rates (OUR), oxygen-depleted regions can be observed in the reactor. By increasing the gas flow to enhance mixing and eliminate depleted areas, a maximum oxygen transfer rate (OTR) is obtained as a function of superficial velocity. These insights regarding minimum superficial velocity and maximum reactor size are incorporated into NREL's larger techno-economic models to supplement standard reactor design equations.
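
    The oxygen bookkeeping that the CFD model resolves spatially reduces, in a well-mixed approximation, to dC/dt = kLa*(C_sat - C) - OUR. The sketch below integrates this balance; all parameter values are hypothetical illustration numbers, not NREL's simulation inputs.

import numpy as np

kla = 150.0 / 3600.0      # volumetric mass-transfer coefficient, 1/s (from 150 1/h)
c_sat = 0.008             # O2 saturation concentration, kg/m^3
our = 0.25 / 3600.0       # oxygen uptake rate, kg/(m^3 s) (from 0.25 kg/m^3/h)

dt, t_end = 1.0, 600.0    # Euler time step and horizon, s
c = 0.006                 # initial dissolved O2, kg/m^3
for _ in np.arange(0.0, t_end, dt):
    c += dt * (kla * (c_sat - c) - our)
    c = max(c, 0.0)       # concentration cannot go negative

# Steady state is C = C_sat - OUR/kLa; the culture is O2-limited if that is <= 0.
print(f"final C = {c*1000:.3f} g/m^3, steady state = {(c_sat - our/kla)*1000:.3f} g/m^3")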

  15. The OLI Radiometric Scale Realization Round Robin Measurement Campaign

    NASA Technical Reports Server (NTRS)

    Cutlip, Hansford; Cole, Jerold; Johnson, B. Carol; Maxwell, Stephen; Markham, Brian; Ong, Lawrence; Hom, Milton; Biggar, Stuart

    2011-01-01

    A round robin radiometric scale realization was performed at the Ball Aerospace Radiometric Calibration Laboratory in January/February 2011 in support of the Operational Land Imager (OLI) Program. Participants included Ball Aerospace, NIST, NASA Goddard Space Flight Center, and the University of Arizona. The eight-day campaign included multiple observations of three integrating sphere sources by nine radiometers. The objective of the campaign was to validate the radiance calibration uncertainty ascribed to the integrating sphere used to calibrate the OLI instrument. The instrument-level calibration source uncertainty was validated by quantifying: (1) the long-term stability of the NIST-calibrated radiance artifact, (2) the responsivity scale of the Ball Aerospace transfer radiometer, and (3) the operational characteristics of the large integrating sphere.

  16. Nanoliter-Scale Protein Crystallization and Screening with a Microfluidic Droplet Robot

    PubMed Central

    Zhu, Ying; Zhu, Li-Na; Guo, Rui; Cui, Heng-Jun; Ye, Sheng; Fang, Qun

    2014-01-01

    Large-scale screening of hundreds or even thousands of crystallization conditions with low sample consumption is urgently needed in current structural biology research. Here we describe a fully automated droplet robot for nanoliter-scale crystallization screening that combines the advantages of the automated robotics technique for protein crystallization screening with those of the droplet-based microfluidic technique. A semi-contact dispensing method was developed to achieve flexible, programmable, and reliable liquid-handling operations for nanoliter-scale protein crystallization experiments. We applied the droplet robot to large-scale screening of crystallization conditions of five soluble proteins and one membrane protein under 35–96 different crystallization conditions, to the study of volume effects on protein crystallization, and to the determination of phase diagrams of two proteins. The volume of each droplet reactor is only ca. 4–8 nL. Protein consumption is reduced 50–500 fold compared with current crystallization stations. PMID:24854085

  17. Nanoliter-scale protein crystallization and screening with a microfluidic droplet robot.

    PubMed

    Zhu, Ying; Zhu, Li-Na; Guo, Rui; Cui, Heng-Jun; Ye, Sheng; Fang, Qun

    2014-05-23

    Large-scale screening of hundreds or even thousands of crystallization conditions with low sample consumption is urgently needed in current structural biology research. Here we describe a fully automated droplet robot for nanoliter-scale crystallization screening that combines the advantages of the automated robotics technique for protein crystallization screening with those of the droplet-based microfluidic technique. A semi-contact dispensing method was developed to achieve flexible, programmable, and reliable liquid-handling operations for nanoliter-scale protein crystallization experiments. We applied the droplet robot to large-scale screening of crystallization conditions of five soluble proteins and one membrane protein under 35-96 different crystallization conditions, to the study of volume effects on protein crystallization, and to the determination of phase diagrams of two proteins. The volume of each droplet reactor is only ca. 4-8 nL. Protein consumption is reduced 50-500 fold compared with current crystallization stations.

  18. Variable density management in riparian reserves: lessons learned from an operational study in managed forests of western Oregon, USA.

    Treesearch

    Samuel Chan; Paul Anderson; John Cissel; Larry Lateen; Charley Thompson

    2004-01-01

    A large-scale operational study has been undertaken to investigate variable density management in conjunction with riparian buffers as a means to accelerate development of late-seral habitat, facilitate rare species management, and maintain riparian functions in 40-70 year-old headwater forests in western Oregon, USA. Upland variable retention treatments include...

  19. Gene activity test determines cold tolerance in Douglas-fir seedlings

    Treesearch

    Peter A. Balk; Diane L. Haase; Monique F. van Wordragen

    2008-01-01

    Forest tree nurseries rely on a tight scheduling of operations to be able to deliver vigorous seedlings to the planting site. Cooler or freezer storage is often used to maintain planting stock in an inactive condition and to ensure a plant supply for geographically diverse planting sites, which is a requirement for large-scale or internationally operating nurseries....

  20. Support of Helicopter 'Free Flight' Operations in the 1996 Olympics

    NASA Technical Reports Server (NTRS)

    Branstetter, James R.; Cooper, Eric G.

    1996-01-01

    The microcosm of activity surrounding the 1996 Olympic Games provided researchers an opportunity to demonstrate state-of-the-art technology in the first large-scale deployment of a prototype digital communication/navigation/surveillance system in a confined environment. At the same time, it provided an ideal opportunity for transportation officials to showcase the merits of an integrated transportation system in meeting the operational needs of transporting time-sensitive goods and providing public safety services under real-world conditions. Five aeronautical CNS functions using a digital datalink system were chosen for operational flight testing onboard 91 aircraft, most of them helicopters, participating in the Atlanta Short-Haul Transportation System. These included: GPS-based Automatic Dependent Surveillance, Cockpit Display of Traffic Information, Controller-Pilot Communications, Graphical Weather Information (uplink), and Automated Electronic Pilot Reporting (downlink). Atlanta provided the first opportunity to demonstrate, in an actual operating environment, key datalink functions which would enhance flight safety and situational awareness for the pilot and supplement conventional air traffic control. The knowledge gained from such a large-scale deployment will help system designers in the development of a national infrastructure in which aircraft would have the ability to navigate autonomously.

  1. The application of DEA (Data Envelopment Analysis) window analysis in the assessment of influence on operational efficiencies after the establishment of branched hospitals.

    PubMed

    Jia, Tongying; Yuan, Huiyun

    2017-04-12

    Many large-scale public hospitals in China have established branched hospitals. This study provides evidence for strategy making on the management and development of multi-branched hospitals by evaluating and comparing the operational efficiencies of different hospitals before and after their establishment of branched hospitals. DEA (Data Envelopment Analysis) window analysis was performed on a 7-year data pool from five public hospitals provided by health authorities and institutional surveys. The operational efficiencies of the sample hospitals measured in this study (including technical efficiency, pure technical efficiency, and scale efficiency) showed an overall upward trend over the 7-year period, although a temporary downturn occurred shortly after the establishment of branched hospitals; pure technical efficiency contributed more to the improvement of technical efficiency than scale efficiency did. The establishment of branched hospitals did not lead to a long-term negative effect on hospital operational efficiencies. Our data indicate the importance of improving scale efficiency via the optimization of organizational management, as well as the advantage of a different form of branch establishment, namely merging and reorganization. This study brings insight into the practical application of DEA window analysis to the assessment of hospital operational efficiencies.
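
    For illustration, a minimal sketch of the input-oriented CCR DEA score that window analysis repeatedly computes, written as a linear program with scipy.optimize.linprog; the window helper treats each hospital-year as its own decision-making unit. Array shapes, the window width, and any data are assumptions, not the study's.

        # Minimal input-oriented CCR DEA efficiency score via linear
        # programming; window analysis re-runs this on overlapping
        # multi-year windows, pooling each hospital-year as a DMU.

        import numpy as np
        from scipy.optimize import linprog

        def ccr_efficiency(X, Y, j0):
            """X: (m inputs, n DMUs); Y: (s outputs, n DMUs); score of DMU j0."""
            m, n = X.shape
            s = Y.shape[0]
            c = np.zeros(1 + n)
            c[0] = 1.0                                  # minimize theta
            A_in = np.hstack([-X[:, [j0]], X])          # X lam <= theta x0
            A_out = np.hstack([np.zeros((s, 1)), -Y])   # Y lam >= y0
            A_ub = np.vstack([A_in, A_out])
            b_ub = np.concatenate([np.zeros(m), -Y[:, j0]])
            res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * (1 + n))
            return res.x[0]

        def dea_window(X_years, Y_years, width=3):
            """Yield (window start index, scores) for each overlapping window."""
            T = len(X_years)
            for t in range(T - width + 1):
                X = np.hstack(X_years[t:t + width])
                Y = np.hstack(Y_years[t:t + width])
                yield t, [ccr_efficiency(X, Y, j) for j in range(X.shape[1])]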

  2. a Stochastic Approach to Multiobjective Optimization of Large-Scale Water Reservoir Networks

    NASA Astrophysics Data System (ADS)

    Bottacin-Busolin, A.; Worman, A. L.

    2013-12-01

    A main challenge for the planning and management of water resources is the development of multiobjective strategies for operation of large-scale water reservoir networks. The optimal sequence of water releases from multiple reservoirs depends on the stochastic variability of correlated hydrologic inflows and on various processes that affect water demand and energy prices. Although several methods have been suggested, large-scale optimization problems arising in water resources management are still plagued by the high-dimensional state space and by the stochastic nature of the hydrologic inflows. In this work, the optimization of reservoir operation is approached using approximate dynamic programming (ADP) with policy iteration and function approximators. The method is based on an off-line learning process in which operating policies are evaluated for a number of stochastic inflow scenarios, and the resulting value functions are used to design new, improved policies until convergence is attained. A case study is presented of a multi-reservoir system in the Dalälven River, Sweden, which includes 13 interconnected reservoirs and 36 power stations. Depending on the late spring and summer peak discharges, the lowlands adjacent to the Dalälven can often be flooded during the summer period, and the presence of stagnating floodwater during the hottest months of the year causes a large proliferation of mosquitoes, which is a major problem for the people living in the surroundings. Chemical pesticides are currently used as a preventive countermeasure, but they do not provide an effective solution to the problem and have adverse environmental impacts. In this study, ADP was used to analyze the feasibility of alternative operating policies for reducing the flood risk at a reasonable economic cost for the hydropower companies. To this end, mid-term operating policies were derived by combining flood risk reduction with hydropower production objectives. The performance of the resulting policies was evaluated by simulating the online operating process for historical inflow scenarios and synthetic inflow forecasts. The simulations are based on a combined mid- and short-term planning model in which the value function derived in the mid-term planning phase provides the value of the policy at the end of the short-term operating horizon. While a purely deterministic linear analysis provided rather optimistic results, the stochastic model allowed for a more accurate evaluation of the trade-offs and limitations of alternative operating strategies for the Dalälven reservoir network.
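
    For intuition only, a toy stochastic dynamic program for a single reservoir; the study's ADP with policy iteration and function approximators tackles the same Bellman recursion but over 13 reservoirs and continuous inflow scenarios. Grids, probabilities, and the reward shape below are invented.

        # Toy stochastic DP for one reservoir: pick a release trading off
        # hydropower revenue against flood damage. A stand-in for the
        # paper's ADP; all numbers below are assumptions.

        import numpy as np

        S = np.linspace(0.0, 100.0, 51)           # storage grid (Mm^3)
        RELEASES = np.linspace(0.0, 20.0, 21)     # feasible releases per step
        INFLOWS = np.array([2.0, 8.0, 20.0])      # inflow scenarios
        P_INFLOW = np.array([0.3, 0.5, 0.2])      # scenario probabilities
        FLOOD_LEVEL, GAMMA = 90.0, 0.95

        def reward(storage_next, release):
            power = 1.0 * release                               # hydropower proxy
            flood = 50.0 * max(0.0, storage_next - FLOOD_LEVEL)  # flood penalty
            return power - flood

        V = np.zeros(len(S))
        for _ in range(500):                      # value iteration to convergence
            V_new = np.empty_like(V)
            for i, s in enumerate(S):
                best = -np.inf
                for r in RELEASES[RELEASES <= s + INFLOWS.min()]:
                    q = 0.0
                    for w, p in zip(INFLOWS, P_INFLOW):
                        s_next = np.clip(s - r + w, S[0], S[-1])
                        q += p * (reward(s_next, r) + GAMMA * np.interp(s_next, S, V))
                    best = max(best, q)
                V_new[i] = best
            if np.max(np.abs(V_new - V)) < 1e-6:
                break
            V = V_new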

  3. Universal quantum computation using all-optical hybrid encoding

    NASA Astrophysics Data System (ADS)

    Guo, Qi; Cheng, Liu-Yong; Wang, Hong-Fu; Zhang, Shou

    2015-04-01

    By employing displacement operations, single-photon subtractions, and a weak cross-Kerr nonlinearity, we propose an alternative way of implementing several universal quantum logic gates for all-optical hybrid qubits encoded in both single-photon polarization states and coherent states. Because these schemes can be implemented directly using only local operations, without a teleportation procedure, they require fewer physical resources and simpler operations than existing schemes. With the help of displacement operations, a large phase shift of the coherent state can be obtained via the tiny cross-Kerr nonlinearities currently available. Thus, all of these schemes are nearly deterministic and feasible under current technology, which makes them suitable for large-scale quantum computing. Project supported by the National Natural Science Foundation of China (Grant Nos. 61465013, 11465020, and 11264042).

  4. The use of Merging and Aggregation Operators for MRDB Data Feeding

    NASA Astrophysics Data System (ADS)

    Kozioł, Krystian; Lupa, Michał

    2013-12-01

    This paper presents the application of two generalization operators, merging and displacement, in the process of automatically feeding a multiresolution database of topographic objects from large-scale databases (1:500-1:5000). An ordered collection of objects forms a development layer that, in the process of generalization, is subjected to merging and displacement in order to maintain recognizability at the reduced scale of the map. The solution to this problem is the algorithms described in this work, which use standard drawing recognition (Chrobak 2010) and are independent of the user. A digital cartographic generalization process is a set of consecutive operators in which merging and aggregation play a key role; their proper operation has a significant impact on the qualitative assessment of data generalization.

  5. Controllability of multiplex, multi-time-scale networks

    NASA Astrophysics Data System (ADS)

    Pósfai, Márton; Gao, Jianxi; Cornelius, Sean P.; Barabási, Albert-László; D'Souza, Raissa M.

    2016-09-01

    The paradigm of layered networks is used to describe many real-world systems, from biological networks to social organizations and transportation systems. While recently there has been much progress in understanding the general properties of multilayer networks, our understanding of how to control such systems remains limited. One fundamental aspect that makes this endeavor challenging is that each layer can operate at a different time scale; thus, we cannot directly apply standard ideas from structural control theory of individual networks. Here we address the problem of controlling multilayer and multi-time-scale networks, focusing on two-layer multiplex networks with one-to-one interlayer coupling. We investigate the practically relevant case when the control signal is applied to the nodes of one layer. We develop a theory based on disjoint path covers to determine the minimum number of inputs (N_i) necessary for full control. We show that if both layers operate on the same time scale, then the network structure of both layers equally affects controllability. In the presence of time-scale separation, controllability is enhanced if the controller interacts with the faster layer: N_i decreases as the time-scale difference increases, up to a critical time-scale difference, above which N_i remains constant and is completely determined by the faster layer. We show that the critical time-scale difference is large if layer I is easy and layer II is hard to control in isolation. In contrast, control becomes increasingly difficult if the controller interacts with the layer operating on the slower time scale, and increasing time-scale separation leads to increased N_i, again up to a critical value, above which N_i still depends on the structure of both layers. This critical value is largely determined by the longest path in the faster layer that does not involve cycles. By identifying the underlying mechanisms that connect time-scale difference and controllability for a simplified model, we provide crucial insight into disentangling how our ability to control real interacting complex systems is affected by a variety of sources of complexity.
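
    The single-layer baseline behind the minimum-input computation above is structural controllability via maximum matching (N_D = N - |matching|, Liu et al.); the paper's disjoint-path-cover theory generalizes this to two coupled layers with time-scale separation. A sketch of the single-layer case, assuming networkx is available:

        # Minimum driver-node count for one directed network via maximum
        # bipartite matching (N_D = N - |matching|). The study above
        # extends this kind of argument to two-layer multiplex networks.

        import networkx as nx

        def minimum_driver_nodes(G):
            """G: directed graph. Returns the minimum number of control inputs."""
            B = nx.Graph()
            outs = [("out", u) for u in G]
            B.add_nodes_from(outs, bipartite=0)
            B.add_nodes_from((("in", v) for v in G), bipartite=1)
            B.add_edges_from((("out", u), ("in", v)) for u, v in G.edges())
            matching = nx.bipartite.hopcroft_karp_matching(B, top_nodes=outs)
            matched_edges = len(matching) // 2   # dict maps both directions
            return max(1, G.number_of_nodes() - matched_edges)

        # A directed chain is controllable from a single input node.
        chain = nx.DiGraph([(0, 1), (1, 2), (2, 3)])
        print(minimum_driver_nodes(chain))       # -> 1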

  6. Speed-Up Techniques for Complementary Metal Oxide Semiconductor Very Large Scale Integration.

    DTIC Science & Technology

    1984-12-14

    The input voltage at which the two transistors are in the constant current region at the same time marks the active operating region of the inverter...decoder precharge configurations. One circuit displayed a marked enhancement in operation while the other precharged circuit displayed degraded operation due..." IEEE Journal of Solid State Circuits, SC-18: 457-462 (October 1983). 19. Cobbold, R. Theory and Applications of Field Effect Transistors, New York: John

  7. Staff Development, Deception Operations, and Force Projection: Lessons from the Normandy Invasion

    DTIC Science & Technology

    2015-12-21

    1940.8 The Allies needed to breach Germany’s Atlantic Wall and establish a lodgment and logistic center before large-scale ground operations could...invasion of Italy in 1943 — was the third allied amphibious operation in the North African, Mediterranean, and European Theatre . Between 1942 and 1943, the...Schuster 1948), 122-125. 18 Pogue, The Supreme Command, 45-48; Douglas Porch, The Mediterranean Theatre in World War II: The Path to Victory (New

  8. How large is large enough for insects? Forest fragmentation effects at three spatial scales

    NASA Astrophysics Data System (ADS)

    Ribas, C. R.; Sobrinho, T. G.; Schoereder, J. H.; Sperber, C. F.; Lopes-Andrade, C.; Soares, S. M.

    2005-02-01

    Several mechanisms may lead to species loss in fragmented habitats, such as edge and shape effects, loss of habitat, and loss of heterogeneity. Ants and crickets were sampled in 18 forest remnants in south-eastern Brazil to test whether a group of small remnants maintains the same insect species richness as similar-sized large remnants, at three spatial scales. We tested hypotheses about alpha and gamma diversity to explain the results. Groups of remnants conserve as many species of ants as a single one. Crickets, however, showed a scale-dependent pattern: at small scales there was no significant or important difference between groups of remnants and a single one, while at the larger scale the group of remnants maintained more species. Alpha diversity (local species richness) was similar in a group of remnants and in a single one at the three spatial scales, both for ants and crickets. Gamma diversity, however, varied both with taxon (ants versus crickets) and spatial scale, which may be linked to insect mobility, remnant isolation, and habitat heterogeneity. Biological characteristics of the organisms involved have to be considered when studying fragmentation effects, as well as the spatial scale at which fragmentation operates. The mobility of the organisms influences fragmentation effects and, consequently, conservation strategies.
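
    Once species presence lists are in hand, the alpha/gamma comparison described above reduces to a simple computation; a sketch with invented lists:

        # Alpha vs. gamma diversity for a group of small remnants versus a
        # single large remnant, using hypothetical species presence sets.

        def alpha_gamma(remnants):
            """remnants: list of species sets. Returns (mean alpha, gamma)."""
            alpha = sum(len(r) for r in remnants) / len(remnants)
            gamma = len(set().union(*remnants))
            return alpha, gamma

        small_group = [{"a", "b", "c"}, {"b", "c", "d"}, {"c", "e"}]
        single_large = [{"a", "b", "c", "d"}]

        print(alpha_gamma(small_group))    # alpha ~2.7, gamma = 5
        print(alpha_gamma(single_large))   # alpha = 4,   gamma = 4

    With these invented lists the group of remnants has lower alpha diversity but higher gamma diversity than the single remnant, the same distinction the study uses to interpret the cricket results.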

  9. Multi-Scale Three-Dimensional Variational Data Assimilation System for Coastal Ocean Prediction

    NASA Technical Reports Server (NTRS)

    Li, Zhijin; Chao, Yi; Li, P. Peggy

    2012-01-01

    A multi-scale three-dimensional variational data assimilation system (MS-3DVAR) has been formulated, and the associated software system has been developed, for improving high-resolution coastal ocean prediction. This system helps improve coastal ocean prediction skill and has been used in support of operational coastal ocean forecasting systems and field experiments. The system was developed to improve the capability of data assimilation for assimilating, simultaneously and effectively, sparse vertical profiles and high-resolution remote sensing surface measurements into coastal ocean models, as well as for constraining model biases. In this system, the cost function is decomposed into two separate units for the large- and small-scale components, respectively. As such, data assimilation is implemented sequentially from large to small scales, the background error covariance is constructed to be scale-dependent, and a scale-dependent dynamic balance is incorporated. This scheme allows the large scales and model bias to be effectively constrained by assimilating sparse vertical profiles, and the small scales by assimilating high-resolution surface measurements. MS-3DVAR enhances the capability of the traditional 3DVAR for assimilating highly heterogeneously distributed observations, such as along-track satellite altimetry data, and in particular maximizes the extraction of information from limited numbers of vertical profile observations.
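
    A schematic of the sequential large-to-small-scale idea in one dimension: minimize the usual 3DVAR cost with a broad background covariance first, then re-analyze the residual with a narrow one. This toy is not the MS-3DVAR software; the covariances, observation operator, and noise levels are all invented.

        # Two-step 3DVAR toy: J(x) = (x-xb)' B^-1 (x-xb) + (Hx-y)' R^-1 (Hx-y),
        # minimized first with a broad (large-scale) B, then the residual
        # is re-analyzed with a narrow (small-scale) B.

        import numpy as np
        from scipy.optimize import minimize

        def gaussian_cov(n, length, sigma):
            i = np.arange(n)
            c = sigma**2 * np.exp(-0.5 * ((i[:, None] - i[None, :]) / length) ** 2)
            return c + 1e-6 * np.eye(n)   # jitter keeps B invertible

        def threedvar(xb, y, H, B, R):
            Bi, Ri = np.linalg.inv(B), np.linalg.inv(R)
            def J(x):
                dx, dy = x - xb, H @ x - y
                return dx @ Bi @ dx + dy @ Ri @ dy
            return minimize(J, xb, method="L-BFGS-B").x

        n = 40
        H = np.eye(n)[::4]                          # sparse profile-like obs
        truth = np.sin(np.linspace(0.0, 2.0 * np.pi, n))
        y = H @ truth + 0.05 * np.random.randn(H.shape[0])
        R = 0.05**2 * np.eye(H.shape[0])

        xa_large = threedvar(np.zeros(n), y, H, gaussian_cov(n, 10.0, 1.0), R)
        xa = xa_large + threedvar(np.zeros(n), y - H @ xa_large, H,
                                  gaussian_cov(n, 2.0, 0.3), R)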

  10. A comparative study of all-vanadium and iron-chromium redox flow batteries for large-scale energy storage

    NASA Astrophysics Data System (ADS)

    Zeng, Y. K.; Zhao, T. S.; An, L.; Zhou, X. L.; Wei, L.

    2015-12-01

    The promise of redox flow batteries (RFBs) utilizing soluble redox couples, such as all vanadium ions as well as iron and chromium ions, is becoming increasingly recognized for large-scale energy storage of renewables such as wind and solar, owing to their unique advantages including scalability, intrinsic safety, and long cycle life. An ongoing question associated with these two RFBs is determining whether the vanadium redox flow battery (VRFB) or iron-chromium redox flow battery (ICRFB) is more suitable and competitive for large-scale energy storage. To address this concern, a comparative study has been conducted for the two types of battery based on their charge-discharge performance, cycle performance, and capital cost. It is found that: i) the two batteries have similar energy efficiencies at high current densities; ii) the ICRFB exhibits a higher capacity decay rate than does the VRFB; and iii) the ICRFB is much less expensive in capital costs when operated at high power densities or at large capacities.

  11. Measurement of Unsteady Blade Surface Pressure on a Single Rotation Large Scale Advanced Prop-fan with Angular and Wake Inflow at Mach Numbers from 0.02 to 0.70

    NASA Technical Reports Server (NTRS)

    Bushnell, P.; Gruber, M.; Parzych, D.

    1988-01-01

    Unsteady blade surface pressure data for the Large-Scale Advanced Prop-Fan (LAP) blade operating with angular inflow, wake inflow, and uniform flow over a range of inflow Mach numbers from 0.02 to 0.70 are provided. The data are presented as Fourier coefficients for the first 35 harmonics of shaft rotational frequency. Also presented is a brief discussion of the unsteady blade response observed at takeoff and cruise conditions with angular and wake inflow.

  12. Friction-Stir Welding of Large Scale Cryogenic Fuel Tanks for Aerospace Applications

    NASA Technical Reports Server (NTRS)

    Jones, Clyde S., III; Venable, Richard A.

    1998-01-01

    The Marshall Space Flight Center has established a facility for the joining of large-scale aluminum-lithium alloy 2195 cryogenic fuel tanks using the friction-stir welding process. Longitudinal welds, approximately five meters in length, were made possible by retrofitting an existing vertical fusion weld system, designed to fabricate tank barrel sections ranging from two to ten meters in diameter. The structural design requirements of the tooling, clamping and the spindle travel system will be described in this paper. Process controls and real-time data acquisition will also be described, and were critical elements contributing to successful weld operation.

  13. Cost-Efficient Storage of Cryogens

    NASA Technical Reports Server (NTRS)

    Fesmire, J. E.; Sass, J. P.; Nagy, Z.; Sojoumer, S. J.; Morris, D. L.; Augustynowicz, S. D.

    2007-01-01

    NASA's cryogenic infrastructure that supports launch vehicle operations and propulsion testing is reaching an age where major refurbishment will soon be required. Key elements of this infrastructure are the large double-walled cryogenic storage tanks used for both space vehicle launch operations and rocket propulsion testing at the various NASA field centers. Perlite powder has historically been the insulation material of choice for these large storage tank applications. New bulk-fill insulation materials, including glass bubbles and aerogel beads, have been shown to provide improved thermal and mechanical performance. A research testing program was conducted to investigate the thermal performance benefits and to identify the operational considerations and risks associated with the application of these new materials in large cryogenic storage tanks. The program was divided into three main areas: material testing (thermal conductivity and physical characterization), tank demonstration testing (liquid nitrogen and liquid hydrogen), and system studies (thermal modeling, economic analysis, and insulation changeout). The results of this research show that more energy-efficient insulation solutions are possible for large-scale cryogenic storage tanks worldwide, and they summarize the operational requirements that should be considered for these applications.

  14. Application of a fluidized bed reactor charged with aragonite for control of alkalinity, pH and carbon dioxide in marine recirculating aquaculture systems

    USGS Publications Warehouse

    Paul S Wills, PhD; Pfeiffer, Timothy; Baptiste, Richard; Watten, Barnaby J.

    2016-01-01

    Control of alkalinity, dissolved carbon dioxide (dCO2), and pH is critical in marine recirculating aquaculture systems (RAS) in order to maintain health and maximize growth. A small-scale prototype fluidized bed reactor filled with aragonite sand was tested under varying conditions of alkalinity and dCO2 to develop and model the response of dCO2 across the reactor. A large-scale reactor was then incorporated into an operating marine recirculating aquaculture system to observe the reactor as the system moved toward equilibrium. The relationships among alkalinity, dCO2, and pH across the reactor are described by multiple regression equations. The change in dCO2 across the small-scale reactor indicated a strong likelihood that an equilibrium alkalinity would be maintained by using a fluidized bed aragonite reactor. The large-scale reactor verified this observation and established equilibrium, at an alkalinity of approximately 135 mg/L as CaCO3, a dCO2 of 9 mg/L, and a pH of 7.0, within 4 days; this equilibrium remained stable during a 14-day test period. The fluidized bed aragonite reactor has the potential to simplify alkalinity and pH control and to aid in dCO2 control in RAS design and operation. Aragonite sand, purchased in bulk, is less expensive than sodium bicarbonate and could reduce overall production costs.
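
    The regression step named above is ordinary least squares on reactor inlet/outlet measurements; a sketch with hypothetical numbers (not the study's data):

        # Fit the dCO2 change across the reactor as a linear function of
        # inlet alkalinity and inlet dCO2. All data below are hypothetical
        # placeholders standing in for the study's measurements.

        import numpy as np

        alk = np.array([80.0, 100.0, 120.0, 140.0, 160.0])   # mg/L as CaCO3
        dco2 = np.array([18.0, 14.0, 13.0, 9.0, 5.0])        # mg/L, inlet
        delta = np.array([-6.1, -4.8, -3.2, -1.9, -0.4])     # dCO2 change

        A = np.column_stack([np.ones_like(alk), alk, dco2])  # design matrix
        (b0, b1, b2), *_ = np.linalg.lstsq(A, delta, rcond=None)
        print(f"delta_dCO2 ~ {b0:.2f} + {b1:.3f}*alk + {b2:.3f}*dCO2")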

  15. Recommendations for ground effects research for V/STOL and STOL aircraft and associated equipment for large scale testing

    NASA Technical Reports Server (NTRS)

    Kuhn, Richard E.

    1986-01-01

    The current understanding of the effects of ground proximity on V/STOL and STOL aircraft is reviewed. Areas covered include (1) single jet suckdown in hover, (2) fountain effects on multijet configurations, (3) STOL ground effects including the effect of the ground vortex flow field, (4) downwash at the tail, and (5) hot gas ingestion in both hover and STOL operation. The equipment needed for large scale testing to extend the state of the art is reviewed, and developments in three areas are recommended as follows: (1) improve methods for simulating the engine exhaust and inlet flows; (2) develop a model support system that can simulate realistic rates of climb and descent as well as steady-height operation; and (3) develop a blowing BLC ground board as an alternative to a moving-belt ground board to properly simulate the flow on the ground.

  16. Are large-scale flow experiments informing the science and management of freshwater ecosystems?

    USGS Publications Warehouse

    Olden, Julian D.; Konrad, Christopher P.; Melis, Theodore S.; Kennard, Mark J.; Freeman, Mary C.; Mims, Meryl C.; Bray, Erin N.; Gido, Keith B.; Hemphill, Nina P.; Lytle, David A.; McMullen, Laura E.; Pyron, Mark; Robinson, Christopher T.; Schmidt, John C.; Williams, John G.

    2013-01-01

    Greater scientific knowledge, changing societal values, and legislative mandates have emphasized the importance of implementing large-scale flow experiments (FEs) downstream of dams. We provide the first global assessment of FEs to evaluate their success in advancing science and informing management decisions. Systematic review of 113 FEs across 20 countries revealed that clear articulation of experimental objectives, while not universally practiced, was crucial for achieving management outcomes and changing dam-operating policies. Furthermore, changes to dam operations were three times less likely when FEs were conducted primarily for scientific purposes. Despite the recognized importance of riverine flow regimes, four-fifths of FEs involved only discrete flow events. Over three-quarters of FEs documented both abiotic and biotic outcomes, but only one-third examined multiple taxonomic responses, thus limiting how FE results can inform holistic dam management. Future FEs will present new opportunities to advance scientifically credible water policies.

  17. Experimental Investigation of a Large-Scale Low-Boom Inlet Concept

    NASA Technical Reports Server (NTRS)

    Hirt, Stefanie M.; Chima, Rodrick V.; Vyas, Manan A.; Wayman, Thomas R.; Conners, Timothy R.; Reger, Robert W.

    2011-01-01

    A large-scale low-boom inlet concept was tested in the NASA Glenn Research Center 8- by 6-Foot Supersonic Wind Tunnel. The purpose of this test was to assess inlet performance, stability, and operability at various Mach numbers and angles of attack. During this effort, two models were tested: a dual-stream inlet designed to mimic potential aircraft flight hardware integrating a high-flow bypass stream, and a single-stream inlet designed to study a configuration with a zero-degree external cowl angle and to permit surface visualization of the vortex generator flow on the internal centerbody surface. During the course of the test, the low-boom inlet concept was demonstrated to have high recovery, excellent buzz margin, and high operability. This paper provides an overview of the setup, shows a brief comparison of the dual-stream and single-stream inlet results, and examines the dual-stream inlet characteristics.

  18. Fuel savings and emissions reductions from light duty fuel cell vehicles

    NASA Astrophysics Data System (ADS)

    Mark, J.; Ohi, J. M.; Hudson, D. V., Jr.

    1994-04-01

    Fuel cell vehicles (FCV's) operate efficiently, emit few pollutants, and run on nonpetroleum fuels. Because of these characteristics, the large-scale deployment of FCV's has the potential to lessen U.S. dependence on foreign oil and improve air quality. This study characterizes the benefits of large-scale FCV deployment in the light duty vehicle market. Specifically, the study assesses the potential fuel savings and emissions reductions resulting from large-scale use of these FCV's and identifies the key parameters that affect the scope of the benefits from FCV use. The analysis scenario assumes that FCV's will compete with gasoline-powered light trucks and cars in the new vehicle market for replacement of retired vehicles and will compete for growth in the total market. Analysts concluded that the potential benefits from FCV's, measured in terms of consumer outlays for motor fuel and the value of reduced air emissions, are substantial.

  19. Proceedings of the Joint IAEA/CSNI Specialists' Meeting on Fracture Mechanics Verification by Large-Scale Testing held at Pollard Auditorium, Oak Ridge, Tennessee

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pugh, C.E.; Bass, B.R.; Keeney, J.A.

    This report contains 40 papers that were presented at the Joint IAEA/CSNI Specialists' Meeting on Fracture Mechanics Verification by Large-Scale Testing held at the Pollard Auditorium, Oak Ridge, Tennessee, during the week of October 26-29, 1992. The papers are printed in the order of their presentation in each session and describe recent large-scale fracture (brittle and/or ductile) experiments, analyses of these experiments, and comparisons between predictions and experimental results. The goal of the meeting was to allow international experts to examine the fracture behavior of various materials and structures under conditions relevant to nuclear reactor components and operating environments. The emphasis was on the ability of various fracture models and analysis methods to predict the wide range of experimental data now available. The individual papers have been cataloged separately.

  20. 1 million-Q optomechanical microdisk resonators for sensing with very large scale integration

    NASA Astrophysics Data System (ADS)

    Hermouet, M.; Sansa, M.; Banniard, L.; Fafin, A.; Gely, M.; Allain, P. E.; Santos, E. Gil; Favero, I.; Alava, T.; Jourdan, G.; Hentz, S.

    2018-02-01

    Cavity optomechanics has become a promising route towards the development of ultrasensitive sensors for a wide range of applications including mass, chemical, and biological sensing. In this study, we demonstrate the potential of Very Large Scale Integration (VLSI) with state-of-the-art low-loss silicon optomechanical microdisks for sensing applications. We report microdisks exhibiting optical Whispering Gallery Modes (WGM) with quality factors of 1 million, yielding high displacement sensitivity and strong coupling between optical WGMs and in-plane mechanical Radial Breathing Modes (RBM). Such high-Q microdisks, with mechanical resonance frequencies in the 10^2 MHz range, were fabricated on 200 mm wafers with Variable Shape Electron Beam lithography. Benefiting from ultrasensitive readout, their Brownian motion could be resolved with good signal-to-noise ratio at ambient pressure, as well as in liquid, despite high-frequency operation and large fluidic damping: the mechanical quality factor decreased from a few 10^3 in air to a few tens in liquid, and the mechanical resonance frequency shifted down by a few percent. Proceeding one step further, we performed all-optical operation of the resonators in air using a pump-probe scheme. Our results show that our VLSI process is a viable approach for the next generation of sensors operating in vacuum, gas, or liquid phase.
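
    An order-of-magnitude check on the Brownian-motion readout described above, using the standard single-sided thermomechanical displacement noise at resonance, S_x = 4 k_B T Q / (m_eff w0^3); the effective mass and other values are assumed for illustration, not taken from the paper.

        # Thermomechanical noise floor for a microdisk breathing mode.
        # All parameter values are assumptions, not the paper's numbers.

        import math

        kB = 1.380649e-23        # J/K
        T = 300.0                # K, ambient
        m_eff = 20e-15           # kg, assumed effective mass (~20 pg)
        f0 = 100e6               # Hz, ~10^2 MHz breathing mode
        w0 = 2.0 * math.pi * f0

        for Q in (3000, 30):     # air vs. liquid, per the trend above
            Sx = 4.0 * kB * T * Q / (m_eff * w0**3)   # m^2/Hz at resonance
            print(f"Q={Q:5d}: sqrt(Sx) ~ {math.sqrt(Sx) * 1e15:.2f} fm/sqrt(Hz)")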

  1. 18/20 T high magnetic field scanning tunneling microscope with fully low voltage operability, high current resolution, and large scale searching ability.

    PubMed

    Li, Quanfeng; Wang, Qi; Hou, Yubin; Lu, Qingyou

    2012-04-01

    We present a home-built 18/20 T high magnetic field scanning tunneling microscope (STM) featuring fully low voltage (lower than ±15 V) operability at low temperatures, large-scale searching ability, and 20 fA high current resolution (measured by using a 100 GOhm dummy resistor to replace the tip-sample junction) with a bandwidth of 3.03 kHz. To accomplish low voltage operation, which is important for achieving high precision, low noise, and low interference with the strong magnetic field, the coarse approach is implemented with an inertial slider driven by the lateral bending of a piezoelectric scanner tube (PST) whose inner electrode is axially split into two for enhanced bending per volt. The PST can also drive the same sliding piece to slide inertially in the other bending direction (along the sample surface) of the PST, which realizes the large-area searching ability. The STM head is housed in a three-segment tubular chamber, which is detachable near the STM head for the convenience of sample and tip changes. Atomic resolution images of a graphite sample taken under 17.6 T and 18.0001 T are presented to show its performance. © 2012 American Institute of Physics

  2. The latest developments and outlook for hydrogen liquefaction technology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ohlig, K.; Decker, L.

    2014-01-29

    Liquefied hydrogen is presently used mainly for space applications and the semiconductor industry. While clean energy applications, e.g. in the automotive sector, currently contribute only a small share of this demand, that demand may see a significant boost in the coming years, with the need for large scale liquefaction plants far exceeding current plant sizes. Hydrogen liquefaction for small scale plants with a maximum capacity of 3 tons per day (tpd) is accomplished with a Brayton refrigeration cycle using helium as refrigerant. This technology is characterized by low investment costs but lower process efficiency and hence higher operating costs. For larger plants, a hydrogen Claude cycle is used, characterized by higher investment but lower operating costs. However, liquefaction plants meeting the potentially high demand in the clean energy sector will need further optimization with regard to energy efficiency and hence operating costs. The present paper gives an overview of the currently applied technologies, including their thermodynamic and technical background. Areas of improvement are identified to derive process concepts for future large scale hydrogen liquefaction plants meeting the needs of clean energy applications with optimized energy efficiency and hence minimized operating costs. Compared to other studies in this field, this paper focuses on the application of new technology and innovative concepts which are either readily available or will require only short qualification procedures, and which will hence allow implementation in plants in the near future.

  3. Blueprint for a microwave trapped ion quantum computer.

    PubMed

    Lekitsch, Bjoern; Weidt, Sebastian; Fowler, Austin G; Mølmer, Klaus; Devitt, Simon J; Wunderlich, Christof; Hensinger, Winfried K

    2017-02-01

    The availability of a universal quantum computer may have a fundamental impact on a vast number of research fields and on society as a whole. An increasingly large scientific and industrial community is working toward the realization of such a device. An arbitrarily large quantum computer may best be constructed using a modular approach. We present a blueprint for a trapped ion-based scalable quantum computer module, making it possible to create a scalable quantum computer architecture based on long-wavelength radiation quantum gates. The modules control all operations as stand-alone units, are constructed using silicon microfabrication techniques, and are within reach of current technology. To perform the required quantum computations, the modules make use of long-wavelength radiation-based quantum gate technology. To scale this microwave quantum computer architecture to a large size, we present a fully scalable design that makes use of ion transport between different modules, thereby allowing arbitrarily many modules to be connected to construct a large-scale device. A high error-threshold surface error correction code can be implemented in the proposed architecture to execute fault-tolerant operations. With appropriate adjustments, the proposed modules are also suitable for alternative trapped ion quantum computer architectures, such as schemes using photonic interconnects.

  4. Scaling up digital circuit computation with DNA strand displacement cascades.

    PubMed

    Qian, Lulu; Winfree, Erik

    2011-06-03

    To construct sophisticated biochemical circuits from scratch, one needs to understand how simple the building blocks can be and how robustly such circuits can scale up. Using a simple DNA reaction mechanism based on a reversible strand displacement process, we experimentally demonstrated several digital logic circuits, culminating in a four-bit square-root circuit that comprises 130 DNA strands. These multilayer circuits include thresholding and catalysis within every logical operation to perform digital signal restoration, which enables fast and reliable function in large circuits with roughly constant switching time and linear signal propagation delays. The design naturally incorporates other crucial elements for large-scale circuitry, such as general debugging tools, parallel circuit preparation, and an abstraction hierarchy supported by an automated circuit compiler.
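
    The target function of the four-bit square-root circuit is easy to check in software; the sketch below verifies one Boolean realization of y = floor(sqrt(x)) against math.isqrt. This validates the logic table only, not the DNA-level strand-displacement implementation itself.

        # The four-bit square-root circuit computes y = floor(sqrt(x)) for
        # x in 0..15. Check a two-level logic realization of that function;
        # this verifies the Boolean logic, not the DNA implementation.

        import math

        def sqrt4_logic(x3, x2, x1, x0):
            y1 = x3 or x2
            y0 = (x3 and (x2 or x1 or x0)) or \
                 ((not x3) and (not x2) and (x1 or x0))
            return int(y1), int(y0)

        for x in range(16):
            bits = [(x >> k) & 1 for k in (3, 2, 1, 0)]
            y1, y0 = sqrt4_logic(*map(bool, bits))
            assert 2 * y1 + y0 == math.isqrt(x), x
        print("logic matches floor(sqrt(x)) for all 4-bit inputs")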

  5. Investigation of a laser Doppler velocimeter system to measure the flow field around a large scale V/STOL aircraft in ground effect

    NASA Technical Reports Server (NTRS)

    Zalay, A. D.; Brashears, M. R.; Jordan, A. J.; Shrider, K. R.; Vought, C. D.

    1979-01-01

    The flow field measured around a hovering 70 percent scale vertical takeoff and landing (V/STOL) aircraft model is described. The velocity measurements were conducted with a ground-based laser Doppler velocimeter. The remote sensing instrumentation and experimental tests of the velocity surveys are discussed. The distribution of vertical velocity in the fan jet and fountain, the radial velocity in the wall jet, and the horizontal velocity along the aircraft underside are presented for different engine rpm and aircraft heights above ground. Results show that it is feasible to use a mobile laser Doppler velocimeter to measure the flow field generated by a large scale V/STOL aircraft operating in ground effect.

  6. Reflections on conformal spectra

    DOE PAGES

    Kim, Hyungrok; Kravchuk, Petr; Ooguri, Hirosi

    2016-04-29

    Here, we use modular invariance and crossing symmetry of conformal field theory to reveal approximate reflection symmetries in the spectral decompositions of the partition function in two dimensions in the limit of large central charge and of the four-point function in any dimension in the limit of large scaling dimensions Δ0 of external operators. We use these symmetries to motivate universal upper bounds on the spectrum and the operator product expansion coefficients, which we then derive by independent techniques. Some of the bounds for four-point functions are valid for finite Δ0 as well as for large Δ0. We discuss a similar symmetry in a large spacetime dimension limit. Finally, we comment on the analogue of the Cardy formula and sparse light spectrum condition for the four-point function.

  7. Multiple multicontrol unitary operations: Implementation and applications

    NASA Astrophysics Data System (ADS)

    Lin, Qing

    2018-04-01

    The efficient implementation of computational tasks is critical to quantum computations. In quantum circuits, multicontrol unitary operations are important components. Here, we present an extremely efficient and direct approach to multiple multicontrol unitary operations without decomposition into CNOT and single-photon gates. With the proposed approach, the necessary two-photon operations can be reduced from O(n^3) with the traditional decomposition approach to O(n), which will greatly relax the requirements and make large-scale quantum computation feasible. Moreover, we propose a potential application to the (n-k)-uniform hypergraph state.

  8. Analysis of the Thermal Loads on the KSTAR Cryogenic System

    NASA Astrophysics Data System (ADS)

    Kim, Y. S.; Oh, Y. K.; Kim, W. C.; Park, Y. M.; Lee, Y. J.; Jin, S. B.; Sa, J. W.; Choi, C. H.; Cho, K. W.; Bak, J. S.; Lee, G. S.

    2004-06-01

    A large-scale helium refrigeration system is one of the key components of the KSTAR (Korea Superconducting Tokamak Advanced Research) device. In the design of the refrigeration system, estimation of the thermal loads on the cold mass under each operation scenario is an important issue. The cold mass of the KSTAR device is about 250 tons, including 30 superconducting (SC) coils and the magnet structure. In addition to the static thermal loads, pulsed thermal loads to the refrigeration system have been considered for the operation stage. The main pulsed thermal loads on the magnet system are AC losses in the SC coils and eddy current losses in the magnet structure, which depend on the magnetic field variation rate. The nuclear radiation load due to plasma pulse operation is also considered. The designed cooling capacity of the refrigeration system is estimated to be about 9 kW at 4.5 K isothermal. In this paper, the calculation of the various thermal loads on the KSTAR cryogenic system and the design of the large-scale helium refrigeration system are presented.

  9. Large-scale HTS bulks for magnetic application

    NASA Astrophysics Data System (ADS)

    Werfel, Frank N.; Floegel-Delor, Uta; Riedel, Thomas; Goebel, Bernd; Rothfeld, Rolf; Schirrmeister, Peter; Wippich, Dieter

    2013-01-01

    ATZ Company has constructed about 130 HTS magnet systems using high-Tc bulk magnets. A key feature in scaling up is the fabrication of melt-textured, multi-seeded large YBCO bulks with three to eight seeds. Besides levitation, magnetization, trapped field, and hysteresis, we review system engineering parameters of HTS magnetic linear and rotational bearings, such as compactness, cryogenics, power density, efficiency, and robustness of construction. We examine mobile compact YBCO bulk magnet platforms cooled with LN2 and Stirling cryo-coolers for demonstrator use. Compact cryostats for Maglev train operation contain 24 three-seed bulks and can levitate 2500-3000 N at 10 mm above a permanent magnet (PM) track. The effective magnetic distance of the thermally insulated bulks is only 2 mm; the stored 2.5 l of LN2 allows more than 24 h of operation without refilling. Thirty-four HTS Maglev vacuum cryostats have been manufactured, tested, and operated in Germany, China, and Brazil. The magnetic levitation load-to-weight ratio is more than 15, and by assembling groups of HTS cryostats under vehicles, total levitated loads of up to 5 t above a magnetic track have been achieved.

  10. Large-Scale NASA Science Applications on the Columbia Supercluster

    NASA Technical Reports Server (NTRS)

    Brooks, Walter

    2005-01-01

    Columbia, NASA's newest 61-teraflops supercomputer that became operational late last year, is a highly integrated Altix cluster of 10,240 processors, and was named to honor the crew of the Space Shuttle Columbia lost in early 2003. Constructed in just four months, Columbia increased NASA's computing capability ten-fold and revitalized the Agency's high-end computing efforts. Significant cutting-edge science and engineering simulations in the areas of space and Earth sciences, as well as aeronautics and space operations, are already running on this largest operational Linux supercomputer, demonstrating its capacity and capability to accelerate NASA's space exploration vision. The presentation will describe how an integrated environment consisting not only of next-generation systems, but also of modeling and simulation, high-speed networking, parallel performance optimization, and advanced data analysis and visualization, is being used to reduce design cycle time, accelerate scientific discovery, conduct parametric analysis of multiple scenarios, and enhance safety during the life cycle of NASA missions. The talk will conclude by discussing how NAS partnered with various NASA centers, other government agencies, the computer industry, and academia to create a national resource in large-scale modeling and simulation.

  11. Unmanned Aircraft Systems Traffic Management (UTM) Safely Enabling UAS Operations in Low-Altitude Airspace

    NASA Technical Reports Server (NTRS)

    Kopardekar, Parimal H.

    2017-01-01

    Conduct research, development, and testing to identify airspace operations requirements that enable large-scale visual and beyond-visual-line-of-sight UAS operations in low-altitude airspace, using a build-a-little-test-a-little strategy from remote to urban areas. Low density: no traffic management required, but an understanding of airspace constraints. Cooperative traffic management: understanding of airspace constraints and other operations. Manned and unmanned traffic management: scalable and heterogeneous operations. The UTM construct is consistent with the FAA's risk-based strategy. The UTM research platform is used for simulations and tests. UTM offers a path towards scalability.

  12. Large storage operations under climate change: expanding uncertainties and evolving tradeoffs

    NASA Astrophysics Data System (ADS)

    Giuliani, Matteo; Anghileri, Daniela; Castelletti, Andrea; Vu, Phuong Nam; Soncini-Sessa, Rodolfo

    2016-03-01

    In a changing climate and society, large storage systems can play a key role in securing water, energy, and food, and in rebalancing their cross-dependencies. In this letter, we study the role of large storage operations as flexible means of adaptation to climate change. In particular, we explore the impacts of different climate projections over different future time horizons on the multi-purpose operations of the existing system of large dams in the Red River basin (China-Laos-Vietnam). We identify the main vulnerabilities of current system operations, understand the risk of failure across sectors by exploring the evolution of the system tradeoffs, quantify how the uncertainty associated with the climate scenarios is expanded by the storage operations, and assess the expected costs if no adaptation is implemented. Results show that, depending on the climate scenario and the time horizon considered, the existing operations are predicted to change on average by -7 to +5% in hydropower production, +35 to +520% in flood damages, and +15 to +160% in water supply deficit. These negative impacts can be partially mitigated by adapting the existing operations to the future climate, reducing the loss of hydropower to 5% and potentially saving around US$34.4 million per year at the national scale. Since the Red River is paradigmatic of many river basins across southeast Asia, where new large dams are under construction or planned to support fast-growing economies, our results can support policy makers in prioritizing responses and adaptation strategies to the changing climate.

  13. The Role of Free Stream Turbulence on the Aerodynamic Performance of a Wind Turbine Blade

    NASA Astrophysics Data System (ADS)

    Maldonado, Victor; Thormann, Adrien; Meneveau, Charles; Castillo, Luciano

    2014-11-01

    Effects of free stream turbulence with a large integral scale on the aerodynamic performance of an S809 airfoil-based wind turbine blade at low Reynolds number are studied using wind tunnel experiments. A constant-chord (2-D) S809 airfoil wind turbine blade model with an operating Reynolds number of 208,000 based on chord length was tested over a range of angles of attack representative of the fully attached and stalled flow encountered in typical wind turbine operation. The smooth-surface blade was subjected to a quasi-laminar free stream with very low free-stream turbulence as well as to elevated free-stream turbulence generated by an active grid. This turbulence contained large-scale eddies with free-stream turbulence intensities of up to 6.14% and an integral length scale of about 60% of the chord length. The pressure distribution was acquired using static pressure taps, and the lift was subsequently computed by numerical integration. The wake velocity deficit was measured using hot-wire anemometry to compute the drag coefficient, also via integration. In addition, the mean flow was quantified using 2-D particle image velocimetry (PIV) over the suction surface of the blade. Results indicate that turbulence, even with very large-scale eddies comparable in size to the chord length, significantly improves the aerodynamic performance of the blade by increasing the lift coefficient and the overall lift-to-drag ratio, L/D, for all angles tested except zero degrees.
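
    The two post-processing steps named above, lift from the tap pressure distribution and drag from the wake momentum deficit, reduce to numerical integration; a sketch with hypothetical distributions (the small-angle normal-force approximation and all numbers are assumptions):

        # Lift from integrating the surface-pressure (Cp) distribution and
        # drag from the wake momentum deficit. All profiles are invented.

        import numpy as np

        def trap(f, x):
            """Trapezoidal integration (portable across numpy versions)."""
            return float(np.sum((f[1:] + f[:-1]) * np.diff(x)) / 2.0)

        xc = np.linspace(0.0, 1.0, 15)              # tap positions, x/c
        cp_lower = 0.4 * (1.0 - xc)                 # assumed Cp, lower side
        cp_upper = -1.2 * (1.0 - xc) ** 2           # assumed Cp, upper side
        cl = trap(cp_lower - cp_upper, xc)          # normal force ~ lift

        y_c = np.linspace(-0.5, 0.5, 101)           # hot-wire traverse, y/c
        u_U = 1.0 - 0.15 * np.exp(-((y_c / 0.08) ** 2))  # Gaussian deficit
        cd = 2.0 * trap(u_U * (1.0 - u_U), y_c)     # momentum-deficit drag

        print(f"Cl ~ {cl:.3f}, Cd ~ {cd:.4f}, L/D ~ {cl / cd:.1f}")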

  14. Large-scale production of diesel-like biofuels - process design as an inherent part of microorganism development.

    PubMed

    Cuellar, Maria C; Heijnen, Joseph J; van der Wielen, Luuk A M

    2013-06-01

    Industrial biotechnology is playing an important role in the transition to a bio-based economy. Currently, however, industrial implementation is still modest, despite the advances made in microorganism development. Given that the fuels and commodity chemicals sectors are characterized by tight economic margins, we propose to address overall process design and efficiency at the start of bioprocess development. While current microorganism development is targeted at product formation and product yield, addressing process design at the start of bioprocess development means that microorganism selection can also be extended to other targets critical for process technology and implementation at process scale, such as enhancing cell separation or increasing cell robustness at operating conditions that favor the overall process. In this paper we follow this approach for the microbial production of diesel-like biofuels. We review current microbial routes with both oleaginous and engineered microorganisms. For the routes leading to extracellular production, we identify the process conditions for large scale operation. The identified process conditions are then translated into microorganism development targets. We show that microorganism development should be directed at anaerobic production, increasing robustness at extreme process conditions, and tailoring cell surface properties. At the same time, novel process configurations integrating fermentation and product recovery, cell reuse, and low-cost technologies for product separation are mandatory. This review provides a state-of-the-art summary of the latest challenges in large-scale production of diesel-like biofuels. Copyright © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  15. Dislocation Multiplication by Single Cross Slip for FCC at Submicron Scales

    NASA Astrophysics Data System (ADS)

    Cui, Yi-Nan; Liu, Zhan-Li; Zhuang, Zhuo

    2013-04-01

    The operating mechanism of single cross slip multiplication (SCSM) is investigated by studying the response of one dislocation loop expanding in a face-centered-cubic (FCC) single crystal using three-dimensional discrete dislocation dynamics (3D-DDD) simulation. The results show that SCSM can trigger highly correlated dislocation generation in a short time, which may shed some light on understanding the large strain bursts observed experimentally. Furthermore, we find that there are a critical stress and a critical material size for the operation of SCSM, which agree with those required to trigger large strain bursts in the compression tests of FCC micropillars.

  16. Collaboration and nested environmental governance: Scale dependency, scale framing, and cross-scale interactions in collaborative conservation.

    PubMed

    Wyborn, Carina; Bixler, R Patrick

    2013-07-15

    The problem of fit between social institutions and ecological systems is an enduring challenge in natural resource management and conservation. Developments in the science of conservation biology encourage the management of landscapes at increasingly larger scales. In contrast, sociological approaches to conservation emphasize the importance of ownership, collaboration and stewardship at scales relevant to the individual or local community. Despite the proliferation of initiatives seeking to work with local communities to undertake conservation across large landscapes, there is an inherent tension between these scales of operation. Consequently, questions about the changing nature of effective conservation across scales abound. Through an analysis of three nested cases working in a semiautonomous fashion in the Northern Rocky Mountains in North America, this paper makes an empirical contribution to the literature on nested governance, collaboration and communication across scales. Despite different scales of operation, constituencies and scale frames, we demonstrate a surprising similarity in organizational structure and an implicit dependency between these initiatives. This paper examines the different capacities and capabilities of collaborative conservation from the local to regional to supra regional. We draw on the underexplored concept of 'scale-dependent comparative advantage' (Cash and Moser, 2000), to gain insight into what activities take place at which scale and what those activities contribute to nested governance and collaborative conservation. The comparison of these semiautonomous cases provides fruitful territory to draw lessons for understanding the roles and relationships of organizations operating at different scales in more connected networks of nested governance. Copyright © 2013 Elsevier Ltd. All rights reserved.

  17. Low-cost production of solar-cell panels

    NASA Technical Reports Server (NTRS)

    Bickler, D. B.; Gallagher, B. D.; Sanchez, L. E.

    1980-01-01

    Large-scale production model combines the most modern manufacturing techniques to produce silicon-solar-cell panels at low cost by 1982. Model proposes a facility capable of operating around the clock with an annual production capacity of 20 W of solar-cell panels.

  18. EVALUATION PLAN FOR TWO LARGE-SCALE LANDFILL BIOREACTOR TECHNOLOGIES

    EPA Science Inventory

    Abstract - Waste Management, Inc., is operating two long-term bioreactor studies at the Outer Loop Landfill in Louisville, KY, including facultative landfill bioreactor and staged aerobic-anaerobic landfill bioreactor demonstrations. A Quality Assurance Project Plan (QAPP) was p...

  19. Seattle wide-area information for travelers (SWIFT) : architecture study

    DOT National Transportation Integrated Search

    1998-10-19

    The SWIFT (Seattle Wide-area Information For Travelers) Field Operational Test was intended to evaluate the performance of a large-scale urban Advanced Traveler Information System (ATIS) deployment in the Seattle area. The unique features of the SWIF...

  20. Lessons learned from twenty-year operation of the Large Helical Device poloidal coils made from cable-in-conduit conductors

    NASA Astrophysics Data System (ADS)

    Takahata, Kazuya; Moriuchi, Sadatomo; Ooba, Kouki; Takami, Shigeyuki; Iwamoto, Akifumi; Mito, Toshiyuki; Imagawa, Shinsaku

    2018-04-01

    The Large Helical Device (LHD) superconducting magnet system consists of two pairs of helical coils and three pairs of poloidal coils. The poloidal coils use cable-in-conduit (CIC) conductors, which have now been adopted in many fusion devices, with forced cooling by supercritical helium. The poloidal coils were first energized with the helical coils on March 27, 1998. Since that time, the coils have experienced 54,600 h of steady cooling, 10,600 h of excitation operation, and nineteen thermal cycles over twenty years. During this period, no superconducting-to-normal transition of the conductors has been observed. The stable operation of the poloidal coils demonstrates that a CIC conductor is well suited to large-scale superconducting magnets. The AC loss has remained constant, even though a slight decrease was observed in the early phase of operation. The hydraulic characteristics have been maintained without obstruction over the entire period of steady cooling. The experience gained from twenty years of operation has also provided lessons regarding malfunctions of peripheral equipment.

  1. Varying the forcing scale in low Prandtl number dynamos

    NASA Astrophysics Data System (ADS)

    Brandenburg, A.; Haugen, N. E. L.; Li, Xiang-Yu; Subramanian, K.

    2018-06-01

    Small-scale dynamos are expected to operate in all astrophysical fluids that are turbulent and electrically conducting, for example the interstellar medium, stellar interiors, and accretion disks, where they may also be affected by or competing with large-scale dynamos. However, the possibility of small-scale dynamos being excited at small and intermediate ratios of viscosity to magnetic diffusivity (the magnetic Prandtl number) has been debated, and the possibility of them depending on the large-scale forcing wavenumber has been raised. Here we show, using four values of the forcing wavenumber, that the small-scale dynamo does not depend on the scale separation between the size of the simulation domain and the integral scale of the turbulence, i.e., the forcing scale. Moreover, the spectral bottleneck in turbulence, which has been implicated in raising the excitation conditions of small-scale dynamos, is found to be invariant under changing the forcing wavenumber. However, when forcing at the lowest few wavenumbers, the effective forcing wavenumber that enters in the definition of the magnetic Reynolds number is found to be about twice the minimum wavenumber of the domain. Our work is relevant to future studies of small-scale dynamos, several applications of which are discussed.
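
    The practical effect of the effective forcing wavenumber can be made concrete with a short calculation. The sketch below is illustrative only: the parameter values are invented, and the factor-of-two correction to the minimum wavenumber follows the abstract's statement.

    ```python
    # Magnetic Reynolds number Rm = u_rms / (eta * k_f). When forcing at the
    # lowest few wavenumbers of the domain, the abstract reports that the
    # effective k_f entering this definition is about twice the minimum
    # wavenumber of the domain.

    def magnetic_reynolds(u_rms, eta, k_f):
        """Rm for a given rms velocity, magnetic diffusivity, and forcing wavenumber."""
        return u_rms / (eta * k_f)

    u_rms = 0.1   # rms velocity (code units; assumed for illustration)
    eta = 5e-4    # magnetic diffusivity (assumed)
    k_min = 1.0   # minimum wavenumber of the simulation domain

    print(magnetic_reynolds(u_rms, eta, k_min))        # naive estimate: Rm = 200
    print(magnetic_reynolds(u_rms, eta, 2.0 * k_min))  # effective: Rm = 100
    ```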

  2. Implementation of large-scale routine diagnostics using whole slide imaging in Sweden: Digital pathology experiences 2006-2013

    PubMed Central

    Thorstenson, Sten; Molin, Jesper; Lundström, Claes

    2014-01-01

    Recent technological advances have improved the whole slide imaging (WSI) scanner quality and reduced the cost of storage, thereby enabling the deployment of digital pathology for routine diagnostics. In this paper we present the experiences from two Swedish sites having deployed routine large-scale WSI for primary review. At Kalmar County Hospital, the digitization process started in 2006 to reduce the time spent at the microscope in order to improve the ergonomics. Since 2008, more than 500,000 glass slides have been scanned in the routine operations of Kalmar and the neighboring Linköping University Hospital. All glass slides are digitally scanned yet they are also physically delivered to the consulting pathologist who can choose to review the slides on screen, in the microscope, or both. The digital operations include regular remote case reporting by a few hospital pathologists, as well as around 150 cases per week where primary review is outsourced to a private clinic. To investigate how the pathologists choose to use the digital slides, a web-based questionnaire was designed and sent out to the pathologists in Kalmar and Linköping. The responses showed that almost all pathologists think that ergonomics have improved and that image quality was sufficient for most histopathologic diagnostic work. 38 ± 28% of the cases were diagnosed digitally, but the survey also revealed that the pathologists commonly switch back and forth between digital and conventional microscopy within the same case. The fact that two full-scale digital systems have been implemented and that a large portion of the primary reporting is voluntarily performed digitally shows that large-scale digitization is possible today. PMID:24843825

  3. Base Station Placement Algorithm for Large-Scale LTE Heterogeneous Networks.

    PubMed

    Lee, Seungseob; Lee, SuKyoung; Kim, Kyungsoo; Kim, Yoon Hyuk

    2015-01-01

    Data traffic demands in cellular networks today are increasing at an exponential rate, giving rise to the development of heterogeneous networks (HetNets), in which small cells complement traditional macro cells by extending coverage to indoor areas. However, the deployment of small cells as parts of HetNets creates a key network-planning challenge for operators. In particular, massive and unplanned deployment of base stations can cause high interference, severely degrading network performance. Although different mathematical modeling and optimization methods have been used to approach various problems related to this issue, most traditional network planning models are ill-equipped to deal with HetNet-specific characteristics due to their focus on classical cellular network designs. Furthermore, increased wireless data demands have driven mobile operators to roll out large-scale networks of small long term evolution (LTE) cells. Therefore, in this paper, we aim to derive an optimum network planning algorithm for large-scale LTE HetNets. Recently, attempts have been made to apply evolutionary algorithms (EAs) to the field of radio network planning, since they are characterized as global optimization methods. Yet, EA performance often deteriorates rapidly with the growth of search space dimensionality. To overcome this limitation when designing optimum network deployments for large-scale LTE HetNets, we decompose the problem and tackle its subcomponents individually. Noting that some HetNet cells have strong correlations due to inter-cell interference, we propose a correlation grouping approach in which cells are grouped together according to their mutual interference. Both the simulation and analytical results indicate that the proposed solution outperforms the random-grouping-based EA, as well as an EA that detects interacting variables by monitoring changes in the objective function, in terms of system throughput.
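
    The abstract does not give the grouping rule in detail, so the following is a hypothetical sketch of the correlation-grouping idea: cells whose mutual interference exceeds a threshold end up in the same group (connected components of the interference graph), and the EA can then optimize each group separately. All names and values are ours.

    ```python
    # Group cells into connected components of a thresholded interference graph,
    # so each subcomponent can be optimized separately by the EA.

    import numpy as np

    def correlation_groups(interference, threshold):
        """Return groups of mutually (transitively) interfering cells."""
        n = len(interference)
        groups, seen = [], set()
        for start in range(n):
            if start in seen:
                continue
            stack, comp = [start], []
            while stack:
                c = stack.pop()
                if c in seen:
                    continue
                seen.add(c)
                comp.append(c)
                stack.extend(j for j in range(n)
                             if j not in seen and interference[c][j] > threshold)
            groups.append(sorted(comp))
        return groups

    rng = np.random.default_rng(0)
    I = rng.random((6, 6))
    I = (I + I.T) / 2.0                       # symmetric interference matrix
    print(correlation_groups(I, threshold=0.6))
    ```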

  4. High-Resiliency and Auto-Scaling of Large-Scale Cloud Computing for OCO-2 L2 Full Physics Processing

    NASA Astrophysics Data System (ADS)

    Hua, H.; Manipon, G.; Starch, M.; Dang, L. B.; Southam, P.; Wilson, B. D.; Avis, C.; Chang, A.; Cheng, C.; Smyth, M.; McDuffie, J. L.; Ramirez, P.

    2015-12-01

    Next-generation science data systems are needed to address the incoming flood of data from new missions such as SWOT and NISAR, whose data volumes and throughput rates are orders of magnitude larger than those of present-day missions. Additionally, traditional on-premise hardware procurement is already limited by facilities capacity constraints for these new missions. Existing missions, such as OCO-2, may also require rapid turnaround when processing different science scenarios, where on-premise and even traditional HPC computing environments may not meet the processing needs. We present our experiences deploying a hybrid-cloud computing science data system (HySDS) for the OCO-2 Science Computing Facility to support large-scale processing of its Level-2 full physics data products. We explore optimization approaches for getting the best performance out of hybrid-cloud computing, as well as common issues that arise when dealing with large-scale computing. Novel approaches were utilized to do processing on Amazon's spot market, which can potentially offer ~10X cost savings but with an unpredictable computing environment driven by market forces. We present how we enabled failure-tolerant computing in order to achieve large-scale processing as well as operational cost savings.
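
    As a flavor of what failure-tolerant computing on the spot market can involve, here is a minimal retry sketch: work items interrupted by an instance reclaim are re-queued until they finish. The SpotReclaimed exception and process_granule callable are hypothetical stand-ins, not part of HySDS.

    ```python
    # Retry work items that are interrupted when a spot instance is reclaimed.

    import time

    class SpotReclaimed(Exception):
        """Raised when the underlying spot instance is terminated mid-job."""

    def run_with_retries(work_items, process_granule, max_attempts=5):
        results = {}
        for item in work_items:
            for attempt in range(1, max_attempts + 1):
                try:
                    results[item] = process_granule(item)
                    break
                except SpotReclaimed:
                    # Re-queue the granule on a fresh instance after a backoff.
                    time.sleep(2 ** attempt)
            else:
                raise RuntimeError(f"{item} failed after {max_attempts} attempts")
        return results

    jobs = run_with_retries(["granule-001", "granule-002"], lambda g: f"{g}: done")
    print(jobs)
    ```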

  5. GIS applications for military operations in coastal zones

    USGS Publications Warehouse

    Fleming, S.; Jordan, T.; Madden, M.; Usery, E.L.; Welch, R.

    2009-01-01

    In order to successfully support current and future US military operations in coastal zones, geospatial information must be rapidly integrated and analyzed to meet ongoing force structure evolution and new mission directives. Coastal zones in a military-operational environment are complex regions that include sea, land and air features that demand high-volume databases of extreme detail within relatively narrow geographic corridors. Static products in the form of analog maps at varying scales traditionally have been used by military commanders and their operational planners. The rapidly changing battlefield of 21st Century warfare, however, demands dynamic mapping solutions. Commercial geographic information system (GIS) software for military-specific applications is now being developed and employed with digital databases to provide customized digital maps of variable scale, content and symbolization tailored to the unique demands of military units. Research conducted by the Center for Remote Sensing and Mapping Science at the University of Georgia demonstrated the utility of GIS-based analysis and digital map creation when developing large-scale (1:10,000) products from littoral warfare databases. The methodology employed, namely the selection of data sources (including high-resolution commercial images and Lidar), establishment of analysis/modeling parameters, conduct of vehicle mobility analysis, development of models and generation of products (such as a continuous sea-land DEM and geo-visualization of changing shorelines with tidal levels), is discussed. Based on observations and identified needs from the National Geospatial-Intelligence Agency, formerly the National Imagery and Mapping Agency, and the Department of Defense, prototype GIS models for military operations in sea, land and air environments were created from multiple data sets of a study area at US Marine Corps Base Camp Lejeune, North Carolina. Results of these models, along with methodologies for developing large-scale littoral warfare databases, aid the National Geospatial-Intelligence Agency in meeting littoral warfare analysis, modeling and map generation requirements for US military organizations. © 2008 International Society for Photogrammetry and Remote Sensing, Inc. (ISPRS).

  7. How Big is Too Big for Hubs: Marginal Profitability in Hub-and-Spoke Networks

    NASA Technical Reports Server (NTRS)

    Ross, Leola B.; Schmidt, Stephen J.

    1997-01-01

    Increasing the scale of hub operations at major airports has led to concerns about congestion at excessively large hubs. In this paper, we estimate the marginal cost of adding spokes to an existing hub network. We observe entry/non-entry decisions on potential spokes from existing hubs, and estimate both a variable profit function for service in the markets using each spoke and the fixed costs of providing service to the spoke. We let the fixed costs depend upon the scale of operations at the hub, and find the hub size at which spoke service costs are minimized.
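
    The decision logic described here can be summarized in a few lines: a carrier serves a spoke when the variable profit from the markets it opens exceeds the spoke's fixed cost, and that fixed cost varies with hub scale. The quadratic cost curve and all numbers below are assumptions for illustration, not the paper's estimated specification.

    ```python
    # Entry decision and cost-minimizing hub size under an assumed U-shaped
    # per-spoke fixed-cost curve.

    import numpy as np

    def fixed_cost(hub_size, a=5.0, b=-0.08, c=0.0006):
        """Per-spoke fixed cost as a function of hub size (spokes served)."""
        return a + b * hub_size + c * hub_size**2

    def enter_spoke(variable_profit, hub_size):
        """Serve the spoke only if variable profit covers its fixed cost."""
        return variable_profit > fixed_cost(hub_size)

    sizes = np.arange(10, 120)
    print("cost-minimizing hub size:", sizes[np.argmin(fixed_cost(sizes))])
    print("enter 40-spoke hub, profit 3.1:", enter_spoke(3.1, 40))
    ```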

  8. Small-scale dynamo at low magnetic Prandtl numbers

    NASA Astrophysics Data System (ADS)

    Schober, Jennifer; Schleicher, Dominik; Bovino, Stefano; Klessen, Ralf S.

    2012-12-01

    The present-day Universe is highly magnetized, even though the first magnetic seed fields were most probably extremely weak. To explain the growth of the magnetic field strength over many orders of magnitude, fast amplification processes need to operate. The most efficient mechanism known today is the small-scale dynamo, which converts turbulent kinetic energy into magnetic energy, leading to an exponential growth of the magnetic field. The efficiency of the dynamo depends on the type of turbulence indicated by the slope of the turbulence spectrum v(ℓ) ∝ ℓ^ϑ, where v(ℓ) is the eddy velocity at a scale ℓ. We explore turbulent spectra ranging from incompressible Kolmogorov turbulence with ϑ=1/3 to highly compressible Burgers turbulence with ϑ=1/2. In this work, we analyze the properties of the small-scale dynamo for low magnetic Prandtl numbers Pm, which denotes the ratio of the magnetic Reynolds number, Rm, to the hydrodynamical one, Re. We solve the Kazantsev equation, which describes the evolution of the small-scale magnetic field, using the WKB approximation. In the limit of low magnetic Prandtl numbers, the growth rate is proportional to Rm^((1-ϑ)/(1+ϑ)). We furthermore discuss the critical magnetic Reynolds number Rm_crit, which is required for small-scale dynamo action. The value of Rm_crit is roughly 100 for Kolmogorov turbulence and 2700 for Burgers turbulence. Furthermore, we discuss that Rm_crit provides a stronger constraint in the limit of low Pm than it does for large Pm. We conclude that the small-scale dynamo can operate in the regime of low magnetic Prandtl numbers if the magnetic Reynolds number is large enough. Thus, magnetic field amplification on small scales can take place in a broad range of physical environments and amplify weak magnetic seed fields on short time scales.
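
    The quoted growth-rate scaling is easy to evaluate for the two limiting spectra, which shows how much more slowly a low-Pm dynamo grows in compressible turbulence:

    ```python
    # Low-Pm growth rate scales as Rm**((1 - theta)/(1 + theta)) per the abstract.

    def growth_exponent(theta):
        return (1.0 - theta) / (1.0 + theta)

    for name, theta in [("Kolmogorov", 1.0 / 3.0), ("Burgers", 1.0 / 2.0)]:
        print(f"{name}: exponent = {growth_exponent(theta):.3f}")
    # Kolmogorov: 0.500, Burgers: 0.333 -- slower growth for Burgers turbulence.
    ```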

  10. Successful scaling-up of self-sustained pyrolysis of oil palm biomass under pool-type reactor.

    PubMed

    Idris, Juferi; Shirai, Yoshihito; Andou, Yoshito; Mohd Ali, Ahmad Amiruddin; Othman, Mohd Ridzuan; Ibrahim, Izzudin; Yamamoto, Akio; Yasuda, Nobuhiko; Hassan, Mohd Ali

    2016-02-01

    An appropriate technology for waste utilisation, especially for the abundant pressed-shredded oil palm empty fruit bunch (OPEFB), is important for the oil palm industry. Self-sustained pyrolysis, whereby oil palm biomass is combusted by itself to provide the heat for pyrolysis without an electrical heater, is preferable owing to its simplicity, ease of operation and low energy requirement. In this study, biochar production under self-sustained pyrolysis of oil palm biomass in the form of oil palm empty fruit bunch was tested in a 3-t large-scale pool-type reactor. During the pyrolysis process, the biomass was loaded layer by layer when smoke appeared on the top, to minimise the entrance of oxygen. This method significantly increased the yield of biochar. In our previous report, we tested a 30-kg pilot-scale unit under self-sustained pyrolysis and found that the higher heating value (HHV) obtained was 22.6-24.7 MJ kg(-1) with a 23.5%-25.0% yield. In this scaled-up study, the 3-t large-scale procedure produced an HHV of 22.0-24.3 MJ kg(-1) with a 30%-34% yield on a wet-weight basis. The maximum self-sustained pyrolysis temperature for the large-scale procedure can reach between 600 °C and 700 °C. We conclude that large-scale biochar production under self-sustained pyrolysis was successful, because the biochar produced was comparable to that from the medium-scale process and from other studies using an electrical heating element, making it an appropriate technology for waste utilisation, particularly for the oil palm industry. © The Author(s) 2015.

  11. An ensemble constrained variational analysis of atmospheric forcing data and its application to evaluate clouds in CAM5: Ensemble 3DCVA and Its Application

    DOE PAGES

    Tang, Shuaiqi; Zhang, Minghua; Xie, Shaocheng

    2016-01-05

    Large-scale atmospheric forcing data can greatly impact the simulations of atmospheric process models including Large Eddy Simulations (LES), Cloud Resolving Models (CRMs) and Single-Column Models (SCMs), and impact the development of physical parameterizations in global climate models. This study describes the development of an ensemble variationally constrained objective analysis of atmospheric large-scale forcing data and its application to evaluate the cloud biases in the Community Atmospheric Model (CAM5). Sensitivities of the variational objective analysis to background data, error covariance matrix and constraint variables are described and used to quantify the uncertainties in the large-scale forcing data. Application of the ensemble forcing in the CAM5 SCM during the March 2000 intensive operational period (IOP) at the Southern Great Plains (SGP) of the Atmospheric Radiation Measurement (ARM) program shows systematic biases in the model simulations that cannot be explained by the uncertainty of large-scale forcing data, which points to the deficiencies of physical parameterizations. The SCM is shown to overestimate high clouds and underestimate low clouds. These biases are found to also exist in the global simulation of CAM5 when it is compared with satellite data.

  13. Development of an advanced anaerobic digester design and a kinetic model for biogasification of water hyacinth/sludge blends

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Srivastava, V.; Fannin, K.F.; Biljetina, R.

    1986-07-01

    The Institute of Gas Technology (IGT) conducted a comprehensive laboratory-scale research program to develop and optimize the anaerobic digestion process for producing methane from water hyacinth and sludge blends. This study focused on digester design and operating techniques, which gave improved methane yields and production rates over those observed using conventional digesters. The final digester concept and the operating experience were utilized to design and operate a large-scale experimental test unit (ETU) at Walt Disney World, Florida. This paper describes the novel digester design, operating techniques, and the results obtained in the laboratory. The paper also discusses a kinetic model that predicts methane yield, methane production rate, and digester effluent solids as a function of retention time. This model was successfully utilized to predict the performance of the ETU. 15 refs., 6 figs., 6 tabs.
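
    The abstract does not reproduce the kinetic model itself, so the sketch below uses the well-known Chen-Hashimoto form as a stand-in to show how methane yield and volumetric production rate can be predicted as functions of retention time. All parameter values are illustrative assumptions, not IGT's fitted constants.

    ```python
    # Chen-Hashimoto-style kinetics: yield approaches the ultimate yield B0 as
    # retention time grows, while the volumetric rate peaks at shorter times.

    def methane_yield(hrt_days, B0=0.35, mu_m=0.30, K=0.8):
        """Yield (m3 CH4 per kg VS fed) vs hydraulic retention time (days)."""
        return B0 * (1.0 - K / (hrt_days * mu_m - 1.0 + K))

    def production_rate(hrt_days, S0=50.0, **kw):
        """Volumetric rate (m3 CH4 per m3 reactor per day), feed VS S0 in kg/m3."""
        return methane_yield(hrt_days, **kw) * S0 / hrt_days

    for hrt in (10, 15, 20, 30):
        print(hrt, round(methane_yield(hrt), 3), round(production_rate(hrt), 2))
    ```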

  14. Pairing induced superconductivity in holography

    NASA Astrophysics Data System (ADS)

    Bagrov, Andrey; Meszena, Balazs; Schalm, Koenraad

    2014-09-01

    We study pairing induced superconductivity in large N strongly coupled systems at finite density using holography. In the weakly coupled dual gravitational theory the mechanism is conventional BCS theory. An IR hard wall cut-off is included to ensure that we can controllably address the dynamics of a single confined Fermi surface. We address in detail the interplay between the scalar order parameter field and fermion pairing. Adding an explicitly dynamical scalar operator with the same quantum numbers as the fermion-pair, the theory experiences a BCS/BEC crossover controlled by the relative scaling dimensions. We find the novel result that this BCS/BEC crossover exposes resonances in the canonical expectation value of the scalar operator. This occurs not only when the scaling dimension is degenerate with the Cooper pair, but also with that of higher derivative paired operators. We speculate that a proper definition of the order parameter which takes mixing with these operators into account stays finite nevertheless.

  15. Implicitly restarted Arnoldi/Lanczos methods for large scale eigenvalue calculations

    NASA Technical Reports Server (NTRS)

    Sorensen, Danny C.

    1996-01-01

    Eigenvalues and eigenfunctions of linear operators are important to many areas of applied mathematics. The ability to approximate these quantities numerically is becoming increasingly important in a wide variety of applications. This increasing demand has fueled interest in the development of new methods and software for the numerical solution of large-scale algebraic eigenvalue problems. In turn, the existence of these new methods and software, along with the dramatically increased computational capabilities now available, has enabled the solution of problems that would not even have been posed five or ten years ago. Until very recently, software for large-scale nonsymmetric problems was virtually non-existent. Fortunately, the situation is improving rapidly. The purpose of this article is to provide an overview of the numerical solution of large-scale algebraic eigenvalue problems. The focus will be on a class of methods called Krylov subspace projection methods. The well-known Lanczos method is the premier member of this class. The Arnoldi method generalizes the Lanczos method to the nonsymmetric case. A recently developed variant of the Arnoldi/Lanczos scheme called the Implicitly Restarted Arnoldi Method is presented here in some depth. This method is highlighted because of its suitability as a basis for software development.
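
    The Implicitly Restarted Arnoldi Method described here was subsequently implemented in the ARPACK library, which SciPy exposes as scipy.sparse.linalg.eigs; a minimal usage example for a large sparse nonsymmetric matrix:

    ```python
    # Compute a few eigenvalues of a large sparse nonsymmetric matrix with the
    # Implicitly Restarted Arnoldi Method (ARPACK via SciPy).

    import numpy as np
    import scipy.sparse as sp
    from scipy.sparse.linalg import eigs

    n = 10_000
    A = sp.random(n, n, density=1e-4, format="csr", random_state=1)
    A = A + sp.diags(np.arange(1.0, n + 1.0))  # shift spectrum for a clear edge

    vals, vecs = eigs(A, k=6, which="LM")      # six largest-magnitude eigenvalues
    print(np.sort_complex(vals))
    ```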

  16. Advances in the Application of Surface Drifters.

    PubMed

    Lumpkin, Rick; Özgökmen, Tamay; Centurioni, Luca

    2017-01-03

    Surface drifting buoys, or drifters, are used in oceanographic and climate research, oil spill tracking, weather forecasting, search and rescue operations, calibration and validation of velocities from high-frequency radar and from altimeters, iceberg tracking, and support of offshore drilling operations. In this review, we present a brief history of drifters, from the message in a bottle to the latest satellite-tracked, multisensor drifters. We discuss the different types of drifters currently used for research and operations as well as drifter designs in development. We conclude with a discussion of the various properties that can be observed with drifters, with heavy emphasis on a critical process that cannot adequately be observed by any other instrument: dispersion in the upper ocean, driven by turbulence at scales from waves through the submesoscale to the large-scale geostrophic eddies.

  17. The iMoD display: considerations and challenges in fabricating MOEMS on large area glass substrates

    NASA Astrophysics Data System (ADS)

    Chui, Clarence; Floyd, Philip D.; Heald, David; Arbuckle, Brian; Lewis, Alan; Kothari, Manish; Cummings, Bill; Palmateer, Lauren; Bos, Jan; Chang, Daniel; Chiang, Jedi; Wang, Li-Ming; Pao, Edmon; Su, Fritz; Huang, Vincent; Lin, Wen-Jian; Tang, Wen-Chung; Yeh, Jia-Jiun; Chan, Chen-Chun; Shu, Fang-Ann; Ju, Yuh-Diing

    2007-01-01

    QUALCOMM has developed and transferred to manufacturing iMoD displays, a MEMS-based reflective display technology. The iMoD array architecture allows for development at wafer scale, yet easily scales up to enable fabrication on flat-panel display (FPD) lines. In this paper, we will describe the device operation, process flow and fabrication, technology transfer issues, and display performance.

  18. Testing the modeled effectiveness of an operational fuel reduction treatment in a small Western Montana interface landscape using two spatial scales

    Treesearch

    Michael G. Harrington; Erin Noonan-Wright; Mitchell Doherty

    2007-01-01

    Much of the coniferous zones in the Western United States where fires were historically frequent have seen large increases in stand densities and associated forest fuels due to 20th century anthropogenic influences. This condition is partially responsible for contemporary large, uncharacteristically severe wildfires. Therefore, considerable effort is under way to...

  19. Path changing methods applied to the 4-D guidance of STOL aircraft.

    DOT National Transportation Integrated Search

    1971-11-01

    Prior to the advent of large-scale commercial STOL service, some challenging navigation and guidance problems must be solved. Proposed terminal area operations may require that these aircraft be capable of accurately flying complex flight paths, and ...

  20. Crowdsourcing for large-scale mosquito (Diptera: Culicidae) sampling

    USDA-ARS?s Scientific Manuscript database

    Sampling a cosmopolitan mosquito (Diptera: Culicidae) species throughout its range is logistically challenging and extremely resource intensive. Mosquito control programmes and regional networks operate at the local level and often conduct sampling activities across much of North America. A method f...

  1. Experimental painting of the I-64 Riverside Parkway in Louisville, KY.

    DOT National Transportation Integrated Search

    2009-02-01

    The Kentucky Transportation Cabinet conducted a large-scale zone maintenance painting operation on 13 elevated steel bridges along the I-64 Riverside Parkway in Louisville, KY in 2007. That work included abrasive blast-cleaning and painting of steel ...

  2. Novel Directional Protection Scheme for the FREEDM Smart Grid System

    NASA Astrophysics Data System (ADS)

    Sharma, Nitish

    This research deals primarily with the design and validation of the protection system for a large-scale meshed distribution system. The large scale system simulation (LSSS) is a system-level PSCAD model used to validate component models for different time-scale platforms and to provide a virtual testing platform for the Future Renewable Electric Energy Delivery and Management (FREEDM) system. It is also used to validate cases of power system protection, renewable energy integration and storage, and load profiles. Protecting the FREEDM system against any abnormal condition is one of the important tasks. The addition of distributed generation and of power-electronic-based solid state transformers adds to the complexity of the protection. The FREEDM loop system has a fault current limiter, and in addition the Solid State Transformer (SST) limits the fault current to 2.0 per unit. Former students at ASU developed a protection scheme using fiber-optic cable; however, during the NSF-FREEDM site visit, the National Science Foundation (NSF) team regarded that system as unsuitable for long distances. Hence, a new protection scheme based on wireless communication is presented in this thesis. The wireless communication is used to protect the large-scale meshed distribution system, with its distributed generation, from any fault. The trip signal generated by the pilot protection system triggers the FID (fault isolation device), which operates as an electronic circuit breaker (opening the FIDs); the trip signal must also be received and accepted by the SST, which must block its operation immediately. A comprehensive protection system for the large-scale meshed distribution system has been developed in PSCAD with the ability to quickly detect faults. The protection system is validated with a hardware model built using commercial relays at the ASU power laboratory.
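
    A hypothetical sketch of the pilot (differential) protection logic for one protected segment: compare the current entering and leaving the zone and trip when the mismatch exceeds a pickup threshold; the trip both opens the FIDs and blocks the SST. The function names and the 0.2 p.u. pickup are ours; only the FID/SST roles and the 2.0 p.u. SST current limit come from the abstract.

    ```python
    # Differential (pilot) protection for one loop segment, in per-unit terms.

    def pilot_protection(i_in_pu, i_out_pu, pickup_pu=0.2):
        """Return True (trip) when the differential current exceeds pickup."""
        return abs(i_in_pu - i_out_pu) > pickup_pu

    def on_trip(open_fid, block_sst):
        open_fid()    # electronic circuit breaker isolates the faulted segment
        block_sst()   # SST must block immediately on receipt of the trip signal

    # Example: SST-limited 2.0 p.u. current enters, only 0.4 p.u. leaves -> fault.
    if pilot_protection(i_in_pu=2.0, i_out_pu=0.4):
        on_trip(lambda: print("FID opened"), lambda: print("SST blocked"))
    ```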

  3. Anisotropy of the Cosmic Microwave Background Radiation on Large and Medium Angular Scales

    NASA Technical Reports Server (NTRS)

    Houghton, Anthony; Timbie, Peter

    1998-01-01

    This grant has supported work at Brown University on measurements of the 2.7 K Cosmic Microwave Background Radiation (CMB). The goal has been to characterize the spatial variations in the temperature of the CMB in order to understand the formation of large-scale structure in the universe. We have concurrently pursued two measurements using millimeter-wave telescopes carried aloft by scientific balloons. Both systems operate over a range of wavelengths, chosen to allow spectral removal of foreground sources such as the atmosphere, Galaxy, etc. The angular resolution of approx. 25 arcminutes is near the angular scale at which the most structure is predicted by current models to be visible in the CMB angular power spectrum. The main goal is to determine the angular scale of this structure; in turn we can infer the density parameter, Omega, for the universe as well as other cosmological parameters, such as the Hubble constant.

  4. The animal-human interface and infectious disease in industrial food animal production: rethinking biosecurity and biocontainment.

    PubMed

    Graham, Jay P; Leibler, Jessica H; Price, Lance B; Otte, Joachim M; Pfeiffer, Dirk U; Tiensin, T; Silbergeld, Ellen K

    2008-01-01

    Understanding interactions between animals and humans is critical in preventing outbreaks of zoonotic disease. This is particularly important for avian influenza. Food animal production has been transformed since the 1918 influenza pandemic. Poultry and swine production have changed from small-scale methods to industrial-scale operations. There is substantial evidence of pathogen movement between and among these industrial facilities, release to the external environment, and exposure to farm workers, which challenges the assumption that modern poultry production is more biosecure and biocontained as compared with backyard or small holder operations in preventing introduction and release of pathogens. An analysis of data from the Thai government investigation in 2004 indicates that the odds of H5N1 outbreaks and infections were significantly higher in large-scale commercial poultry operations as compared with backyard flocks. These data suggest that successful strategies to prevent or mitigate the emergence of pandemic avian influenza must consider risk factors specific to modern industrialized food animal production.

  5. The cost of a large-scale hollow fibre MBR.

    PubMed

    Verrecht, Bart; Maere, Thomas; Nopens, Ingmar; Brepols, Christoph; Judd, Simon

    2010-10-01

    A cost sensitivity analysis was carried out for a full-scale hollow fibre membrane bioreactor to quantify the effect of design choices and operational parameters on cost. Different options were subjected to a long-term dynamic influent profile and evaluated using ASM1 for effluent quality, aeration requirements and sludge production. The results were used to calculate a net present value (NPV), incorporating both capital expenditure (capex), based on costs obtained from equipment manufacturers and full-scale plants, and operating expenditure (opex), accounting for energy demand, sludge production and chemical cleaning costs. Results show that the amount of contingency built in to cope with changes in feedwater flow has a large impact on NPV. Deviation from a constant daily flow increases NPV as mean plant utilisation decreases. Conversely, adding a buffer tank reduces NPV, since less membrane surface is required when average plant utilisation increases. Membrane cost and lifetime are decisive in determining NPV: increasing the membrane replacement interval from 5 to 10 years reduces NPV by 19%. Operation at higher SRT increases the NPV, since the reduced costs for sludge treatment are offset by correspondingly higher aeration costs at higher MLSS levels, though the analysis is very sensitive to sludge treatment costs. A higher sustainable flux demands greater membrane aeration, but the subsequent opex increase is offset by the reduced membrane area and the corresponding lower capex. Copyright © 2010 Elsevier Ltd. All rights reserved.
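
    The capex/opex trade-off at the heart of the analysis can be sketched as a simple NPV calculation: longer membrane life reduces periodic replacement capex, while operating choices shift annual opex. All cost figures below are placeholders, not the paper's data, so the printed reduction will not reproduce the reported 19%.

    ```python
    # Discounted-cash-flow NPV with periodic membrane replacement.

    def npv(capex, annual_opex, membrane_cost, membrane_life_yr,
            horizon_yr=30, discount=0.05):
        total = capex
        for year in range(1, horizon_yr + 1):
            cash = annual_opex
            if year % membrane_life_yr == 0 and year < horizon_yr:
                cash += membrane_cost          # periodic membrane replacement
            total += cash / (1 + discount) ** year
        return total

    base = npv(capex=10e6, annual_opex=0.8e6, membrane_cost=2e6, membrane_life_yr=5)
    long_life = npv(capex=10e6, annual_opex=0.8e6, membrane_cost=2e6, membrane_life_yr=10)
    print(f"NPV reduction from 5- to 10-year membranes: {1 - long_life / base:.1%}")
    ```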

  6. Low-speed wind-tunnel investigation of a large scale advanced arrow-wing supersonic transport configuration with engines mounted above wing for upper-surface blowing

    NASA Technical Reports Server (NTRS)

    Shivers, J. P.; Mclemore, H. C.; Coe, P. L., Jr.

    1976-01-01

    Tests have been conducted in a full scale tunnel to determine the low speed aerodynamic characteristics of a large scale advanced arrow wing supersonic transport configuration with engines mounted above the wing for upper surface blowing. Tests were made over an angle of attack range of -10 deg to 32 deg, sideslip angles of + or - 5 deg, and a Reynolds number range of 3,530,000 to 7,330,000. Configuration variables included trailing edge flap deflection, engine jet nozzle angle, engine thrust coefficient, engine out operation, and asymmetrical trailing edge boundary layer control for providing roll trim. Downwash measurements at the tail were obtained for different thrust coefficients, tail heights, and at two fuselage stations.

  7. Large-Scale Cubic-Scaling Random Phase Approximation Correlation Energy Calculations Using a Gaussian Basis.

    PubMed

    Wilhelm, Jan; Seewald, Patrick; Del Ben, Mauro; Hutter, Jürg

    2016-12-13

    We present an algorithm for computing the correlation energy in the random phase approximation (RPA) in a Gaussian basis requiring O(N³) operations and O(N²) memory. The method is based on the resolution of the identity (RI) with the overlap metric, a reformulation of RI-RPA in the Gaussian basis, imaginary time and imaginary frequency integration techniques, and the use of sparse linear algebra. Additional memory reduction without extra computations can be achieved by an iterative scheme that overcomes the memory bottleneck of canonical RPA implementations. We report a massively parallel implementation that is the key for the application to large systems. Finally, cubic-scaling RPA is applied to a thousand water molecules using a correlation-consistent triple-ζ quality basis.
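
    The imaginary-frequency formulation mentioned here evaluates E_c = (1/2π) ∫₀^∞ dω Tr[ln(1 − χ₀(iω)v) + χ₀(iω)v] by quadrature. The toy sketch below shows that structure on a one-level model; it does not reproduce the paper's RI, imaginary-time, or sparsity machinery, and all parameters are invented.

    ```python
    # RPA correlation energy by imaginary-frequency quadrature on a toy model.

    import numpy as np

    def chi0(omega, eps_occ=(-0.5,), eps_vir=(0.3,), dip=((1.0,),)):
        """Toy non-interacting response matrix at imaginary frequency i*omega."""
        x = np.zeros((len(dip[0]), len(dip[0])))
        for i, ei in enumerate(eps_occ):
            for ea in eps_vir:
                de = ea - ei
                d = np.asarray(dip[i])
                x -= 2.0 * de / (de**2 + omega**2) * np.outer(d, d)
        return x

    def rpa_correlation(V, omegas, weights):
        """E_c = (1/2pi) * sum_k w_k * Tr[ln(1 - chi0 V) + chi0 V]."""
        e_c, eye = 0.0, np.eye(len(V))
        for w, wt in zip(omegas, weights):
            m = chi0(w) @ V
            sign, logdet = np.linalg.slogdet(eye - m)   # det > 0 for this toy
            e_c += wt * (sign * logdet + np.trace(m))
        return e_c / (2.0 * np.pi)

    x, wts = np.polynomial.legendre.leggauss(32)
    omegas = (1.0 + x) / (1.0 - x)          # map [-1, 1) onto [0, inf)
    weights = 2.0 * wts / (1.0 - x) ** 2    # Jacobian of the mapping
    print(rpa_correlation(np.array([[0.2]]), omegas, weights))  # negative E_c
    ```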

  8. Uncorrelated Encounter Model of the National Airspace System, Version 2.0

    DTIC Science & Technology

    2013-08-19

    can exist to certify avoidance systems for operational use. Evaluations typically include flight tests, operational impact studies, and simulation of...appropriate for large-scale air traffic impact studies— for example, examination of sector loading or conflict rates. The focus here includes two types of...between two IFR aircraft in oceanic airspace. The reason for this is that one cannot observe encounters of sufficient fidelity in the available data

  9. Digitally programmable microfluidic automaton for multiscale combinatorial mixing and sample processing†

    PubMed Central

    Jensen, Erik C.; Stockton, Amanda M.; Chiesl, Thomas N.; Kim, Jungkyu; Bera, Abhisek; Mathies, Richard A.

    2013-01-01

    A digitally programmable microfluidic Automaton consisting of a 2-dimensional array of pneumatically actuated microvalves is programmed to perform new multiscale mixing and sample processing operations. Large (µL-scale) volume processing operations are enabled by precise metering of multiple reagents within individual nL-scale valves followed by serial repetitive transfer to programmed locations in the array. A novel process exploiting new combining valve concepts is developed for continuous rapid and complete mixing of reagents in less than 800 ms. Mixing, transfer, storage, and rinsing operations are implemented combinatorially to achieve complex assay automation protocols. The practical utility of this technology is demonstrated by performing automated serial dilution for quantitative analysis as well as the first demonstration of on-chip fluorescent derivatization of biomarker targets (carboxylic acids) for microchip capillary electrophoresis on the Mars Organic Analyzer. A language is developed to describe how unit operations are combined to form a microfluidic program. Finally, this technology is used to develop a novel microfluidic 6-sample processor for combinatorial mixing of large sets (>26 unique combinations) of reagents. The digitally programmable microfluidic Automaton is a versatile programmable sample processor for a wide range of process volumes, for multiple samples, and for different types of analyses. PMID:23172232
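
    The abstract mentions a language for composing unit operations into a microfluidic program. The snippet below is a hypothetical illustration of that programming model, with a program as a list of operations dispatched to the valve array; the operation names and interpreter are ours, not the paper's syntax.

    ```python
    # A program is an ordered list of (operation, arguments) pairs executed
    # against the pneumatic valve array.

    PROGRAM = [
        ("meter",    {"reagent": "A", "valves": 4}),   # nL-scale metering
        ("meter",    {"reagent": "B", "valves": 4}),
        ("mix",      {"cycles": 3}),                   # combining-valve mixing
        ("transfer", {"dest": (2, 5)}),                # move to an array location
        ("rinse",    {"cycles": 2}),
    ]

    def run(program, execute):
        """Dispatch each unit operation to the hardware driver `execute`."""
        for op, args in program:
            execute(op, **args)

    run(PROGRAM, lambda op, **a: print(f"{op:8s} {a}"))
    ```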

  10. Experimental design, operation, and results of a 4 kW high temperature steam electrolysis experiment

    DOE PAGES

    Zhang, Xiaoyu; O'Brien, James E.; Tao, Greg; ...

    2015-08-06

    High temperature steam electrolysis (HTSE) is a promising technology for large-scale hydrogen production. However, research on HTSE performance above the kW level is limited. This paper presents the results of a 4 kW HTSE long-term test completed in a multi-kW test facility recently developed at the Idaho National Laboratory (INL). The 4 kW HTSE unit included two solid oxide electrolysis stacks operating in parallel, each of which included 40 electrode-supported planar cells. A current density of 0.41 A/cm2 was used for the long-term operation, resulting in a hydrogen production rate of about 25 slpm. A demonstration of 920 hours of stable operation was achieved. The paper also includes detailed descriptions of the piping layout, steam generation and delivery system, test fixture, heat recuperation system, hot zone, instrumentation, and operating conditions. This successful demonstration of a multi-kW-scale HTSE unit will help advance the technology toward near-term commercialization.
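
    The reported production rate can be sanity-checked with Faraday's law. The per-cell active area is not given in the abstract, so the 100 cm² used below is an assumed value; with it, the estimate lands near the reported ~25 slpm.

    ```python
    # Faraday's law check: n_H2 = I / (2F) per cell, summed over all cells.

    F = 96485.0                     # Faraday constant, C/mol
    j = 0.41                        # reported current density, A/cm2
    area_cm2 = 100.0                # assumed active area per cell (not reported)
    cells = 2 * 40                  # two parallel stacks of 40 cells

    current = j * area_cm2          # A per cell
    mol_per_s = cells * current / (2.0 * F)
    slpm = mol_per_s * 22.414 * 60.0   # 22.414 L/mol at STP, per minute
    print(f"{slpm:.1f} slpm")          # ~23 slpm, consistent with the reported ~25
    ```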

  11. Derivation of Optimal Operating Rules for Large-scale Reservoir Systems Considering Multiple Trade-off

    NASA Astrophysics Data System (ADS)

    Zhang, J.; Lei, X.; Liu, P.; Wang, H.; Li, Z.

    2017-12-01

    Flood control operation of multi-reservoir systems such as parallel reservoirs and hybrid reservoirs often suffers from complex interactions and trade-offs among tributaries and the mainstream. The optimization of such systems is computationally intensive due to nonlinear storage curves, numerous constraints and complex hydraulic connections. This paper aims to derive optimal flood control operating rules based on the trade-off among tributaries and the mainstream using a new algorithm known as the weighted non-dominated sorting genetic algorithm II (WNSGA II). WNSGA II can locate the Pareto frontier in the non-dominated region efficiently owing to directed searching with a weighted crowding distance, and the results are compared with those of conventional operating rules (COR) and a single-objective genetic algorithm (GA). The Xijiang River basin in China is selected as a case study, with eight reservoirs and five flood control sections within four tributaries and the mainstream. Furthermore, the effects of inflow uncertainty have been assessed. Results indicate that: (1) WNSGA II locates the non-dominated solutions faster and provides a better Pareto frontier than the traditional non-dominated sorting genetic algorithm II (NSGA II) due to the weighted crowding distance; (2) WNSGA II outperforms COR and GA on flood control in the whole basin; (3) the multi-objective operating rules from WNSGA II deal with inflow uncertainties better than COR. Therefore, WNSGA II can be used to derive stable operating rules for large-scale reservoir systems effectively and efficiently.
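
    The abstract attributes WNSGA II's directed search to a weighted crowding distance but does not define the weighting, so this sketch simply applies per-objective weights to the standard NSGA-II crowding distance; treat it as an assumption, not the paper's formula.

    ```python
    # Weighted variant of the NSGA-II crowding distance.

    import numpy as np

    def weighted_crowding_distance(F, w):
        """F: (n_points, n_objectives) objective values; w: objective weights."""
        n, m = F.shape
        d = np.zeros(n)
        for j in range(m):
            order = np.argsort(F[:, j])
            f = F[order, j]
            span = f[-1] - f[0] or 1.0                # avoid division by zero
            d[order[0]] = d[order[-1]] = np.inf       # always keep boundary points
            d[order[1:-1]] += w[j] * (f[2:] - f[:-2]) / span
        return d

    F = np.array([[1.0, 9.0], [2.0, 7.0], [4.0, 4.0], [7.0, 2.0], [9.0, 1.0]])
    print(weighted_crowding_distance(F, w=np.array([0.7, 0.3])))
    ```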

  12. Compounded effects of heat waves and droughts over the Western Electricity Grid: spatio-temporal scales of impacts and predictability toward mitigation and adaptation.

    NASA Astrophysics Data System (ADS)

    Voisin, N.; Kintner-Meyer, M.; Skaggs, R.; Xie, Y.; Wu, D.; Nguyen, T. B.; Fu, T.; Zhou, T.

    2016-12-01

    Heat waves and droughts are projected to be more frequent and intense. We have seen in the past the effects of each of those extreme climate events on electricity demand and constrained electricity generation, challenging power system operations. Our aim here is to understand the compounding effects under historical conditions. We present a benchmark of Western US grid performance under 55 years of historical climate, and including droughts, using 2010-level of water demand and water management infrastructure, and 2010-level of electricity grid infrastructure and operations. We leverage CMIP5 historical hydrology simulations and force a large scale river routing- reservoir model with 2010-level sectoral water demands. The regulated flow at each water-dependent generating plants is processed to adjust water-dependent electricity generation parameterization in a production cost model, that represents 2010-level power system operations with hourly energy demand of 2010. The resulting benchmark includes a risk distribution of several grid performance metrics (unserved energy, production cost, carbon emission) as a function of inter-annual variability in regional water availability and predictability using large scale climate oscillations. In the second part of the presentation, we describe an approach to map historical heat waves onto this benchmark grid performance using a building energy demand model. The impact of the heat waves, combined with the impact of droughts, is explored at multiple scales to understand the compounding effects. Vulnerabilities of the power generation and transmission systems are highlighted to guide future adaptation.

  13. Driving terrestrial ecosystem models from space

    NASA Technical Reports Server (NTRS)

    Waring, R. H.

    1993-01-01

    Regional air pollution, land-use conversion, and projected climate change all affect ecosystem processes at large scales. Changes in vegetation cover and growth dynamics can impact the functioning of ecosystems, carbon fluxes, and climate. As a result, there is a need to assess and monitor vegetation structure and function comprehensively at regional to global scales. To provide a test of our present understanding of how ecosystems operate at large scales we can compare model predictions of CO2, O2, and methane exchange with the atmosphere against regional measurements of interannual variation in the atmospheric concentration of these gases. Recent advances in remote sensing of the Earth's surface are beginning to provide methods for estimating important ecosystem variables at large scales. Ecologists attempting to generalize across landscapes have made extensive use of models and remote sensing technology. The success of such ventures is dependent on merging insights and expertise from two distinct fields. Ecologists must provide the understanding of how well models emulate important biological variables and their interactions; experts in remote sensing must provide the biophysical interpretation of complex optical reflectance and radar backscatter data.

  14. Development of fine-resolution analyses and expanded large-scale forcing properties. Part I: Methodology and evaluation

    DOE PAGES

    Li, Zhijin; Vogelmann, Andrew M.; Feng, Sha; ...

    2015-01-20

    We produce fine-resolution, three-dimensional fields of meteorological and other variables for the U.S. Department of Energy’s Atmospheric Radiation Measurement (ARM) Southern Great Plains site. The Community Gridpoint Statistical Interpolation system is implemented in a multiscale data assimilation (MS-DA) framework that is used within the Weather Research and Forecasting model at a cloud-resolving resolution of 2 km. The MS-DA algorithm uses existing reanalysis products and constrains fine-scale atmospheric properties by assimilating high-resolution observations. A set of experiments show that the data assimilation analysis realistically reproduces the intensity, structure, and time evolution of clouds and precipitation associated with a mesoscale convective system. Evaluations also show that the large-scale forcing derived from the fine-resolution analysis has an overall accuracy comparable to the existing ARM operational product. For enhanced applications, the fine-resolution fields are used to characterize the contribution of subgrid variability to the large-scale forcing and to derive hydrometeor forcing, which are presented in companion papers.

  15. Experimental investigation of an ejector-powered free-jet facility

    NASA Technical Reports Server (NTRS)

    Long, Mary JO

    1992-01-01

    NASA Lewis Research Center's (LeRC) newly developed Nozzle Acoustic Test Rig (NATR) is a large free-jet test facility powered by an ejector system. In order to assess the pumping performance of this ejector concept and determine its sensitivity to various design parameters, a 1/5-scale model of the NATR was built and tested prior to the operation of the actual facility. This paper discusses the results of the 1/5-scale model tests and compares them with the findings from the full-scale tests.

  16. The Cosmology Large Angular Scale Surveyor (CLASS)

    NASA Astrophysics Data System (ADS)

    Cleary, Joseph

    2018-01-01

    The Cosmology Large Angular Scale Surveyor (CLASS) is an array of four telescopes designed to measure the polarization of the Cosmic Microwave Background. CLASS aims to detect the B-mode polarization from primordial gravitational waves predicted by cosmic inflation theory, as well as the imprint left by reionization upon the CMB E-mode polarization. This will be achieved through a combination of observing strategy and state-of-the-art instrumentation. CLASS is observing 70% of the sky to characterize the CMB at large angular scales, which will measure the entire CMB power spectrum from the reionization peak to the recombination peak. The four telescopes operate at frequencies of 38, 93, 145, and 217 GHz, in order to estimate Galactic synchrotron and dust foregrounds while avoiding atmospheric absorption. CLASS employs rapid polarization modulation to overcome atmospheric and instrumental noise. Polarization-sensitive cryogenic detectors with low noise levels provide CLASS the sensitivity required to constrain the tensor-to-scalar ratio down to levels of r ~ 0.01 while also measuring the optical depth to reionization at sample-variance levels. These improved constraints on the optical depth to reionization are required to pin down the mass of neutrinos from complementary cosmological data. CLASS has completed a year of observations at 38 GHz and is in the process of deploying the rest of the telescope array. This poster provides an overview and update on the CLASS science, hardware and survey operations.

  17. [Effect of pilot UASB-SFSBR-MAP process for the large scale swine wastewater treatment].

    PubMed

    Wang, Liang; Chen, Chong-Jun; Chen, Ying-Xu; Wu, Wei-Xiang

    2013-03-01

    In this paper, a treatment process consisting of an upflow anaerobic sludge blanket (UASB), a step-fed sequencing batch reactor (SFSBR) and a magnesium ammonium phosphate (MAP) precipitation reactor was built to treat large-scale swine wastewater, aiming to overcome drawbacks of the conventional anaerobic-aerobic treatment process and the SBR treatment process, such as low denitrification efficiency, high operating costs and high nutrient losses. Based on this treatment process, a pilot engineering project was constructed. The experimental results showed that the removal efficiencies of COD, NH4(+)-N and TP reached 95.1%, 92.7% and 88.8%, respectively, and the recovery rates of NH4(+)-N and TP by the MAP process reached 23.9% and 83.8%. The effluent quality was superior to the discharge standard of pollutants for livestock and poultry breeding (GB 18596-2001): the mass concentrations of COD, TN, NH4(+)-N, TP and SS were not higher than 135, 116, 43, 7.3 and 50 mg·L(-1), respectively. The process developed was reliable, maintained a self-balance of carbon source and alkalinity, and achieved high nutrient recovery efficiency, while its operating cost was comparable to that of the traditional anaerobic-aerobic treatment process. The treatment process therefore has high application and dissemination value and is well suited to the treatment of large-scale swine wastewater in China.

  18. Assembly, characterization, and operation of large-scale TES detector arrays for ACTPol

    NASA Astrophysics Data System (ADS)

    Pappas, Christine Goodwin

    2016-01-01

    The Polarization-sensitive Receiver for the Atacama Cosmology Telescope (ACTPol) is designed to measure the Cosmic Microwave Background (CMB) temperature and polarization anisotropies on small angular scales. Measurements of the CMB temperature and polarization anisotropies have produced arguably the most important cosmological data to date, establishing the LambdaCDM model and providing the best constraints on most of its parameters. To detect the very small fluctuations in the CMB signal across the sky, ACTPol uses feedhorn-coupled Transition-Edge Sensor (TES) detectors. A TES is a superconducting thin film operated in the transition region between the superconducting and normal states, where it functions as a highly sensitive resistive thermometer. In this thesis, aspects of the assembly, characterization, and in-field operation of the ACTPol TES detector arrays are discussed. First, a novel microfabrication process for producing high-density superconducting aluminum/polyimide flexible circuitry (flex) designed to connect large-scale detector arrays to the first stage of readout is presented. The flex is used in parts of the third ACTPol array and is currently being produced for use in the AdvACT detector arrays, which will begin to replace the ACTPol arrays in 2016. Next, we describe methods and results for the in-lab and on-telescope characterization of the detectors in the third ACTPol array. Finally, we describe the ACTPol TES R(T,I) transition shapes and how they affect the detector calibration and operation. Methods for measuring the exact detector calibration and re-biasing functions, taking into account the R(T,I) transition shape, are presented.

  19. Three-Year Evaluation of a Large Scale Early Grade French Immersion Program: The Ottawa Study

    ERIC Educational Resources Information Center

    Barik, Henri; Swain, Marrill

    1975-01-01

    The school performance of pupils in grades K-2 of the French immersion program in operation in Ottawa public schools is evaluated in comparison with that of pupils in the regular English program. (Author/RM)

  20. Intelligent switching between different noise propagation algorithms: analysis and sensitivity

    DOT National Transportation Integrated Search

    2012-08-10

    When modeling aircraft noise on a large scale (such as an analysis of annual aircraft : operations at an airport), it is important that the noise propagation model used for the : analysis be both efficient and accurate. In this analysis, three differ...

  1. Multi-platform operational validation of the Western Mediterranean SOCIB forecasting system

    NASA Astrophysics Data System (ADS)

    Juza, Mélanie; Mourre, Baptiste; Renault, Lionel; Tintoré, Joaquin

    2014-05-01

    The development of science-based ocean forecasting systems at global, regional, and local scales can support better management of the marine environment (maritime security, environmental and resource protection, maritime and commercial operations, tourism, ...). In this context, SOCIB (the Balearic Islands Coastal Observing and Forecasting System, www.socib.es) has developed an operational ocean forecasting system for the Western Mediterranean Sea (WMOP). WMOP uses a regional configuration of the Regional Ocean Modelling System (ROMS, Shchepetkin and McWilliams, 2005) nested in the larger-scale Mediterranean Forecasting System (MFS), with a spatial resolution of 1.5-2 km. WMOP aims to reproduce both the basin-scale ocean circulation and the mesoscale variability, which is known to play a crucial role in this region due to its strong interaction with the large-scale circulation. An operational validation system has been developed to systematically assess the model outputs at daily, monthly and seasonal time scales. Multi-platform observations are used for this validation, including satellite products (Sea Surface Temperature, Sea Level Anomaly), in situ measurements (from gliders, Argo floats, drifters and fixed moorings) and High-Frequency radar data. The validation procedures allow the general realism of the daily production of the ocean forecasting system to be monitored and certified before its distribution to users. Additionally, different indicators (Sea Surface Temperature and Salinity, Eddy Kinetic Energy, Mixed Layer Depth, Heat Content, and transports in key sections) are computed every day, both at the basin scale and in several sub-regions (Alboran Sea, Balearic Sea, Gulf of Lion). The daily forecasts, validation diagnostics and indicators from the operational model over the last months are available at www.socib.es.

  2. Observation of the ballooning mode that limits the operation space of the high-density super-dense-core plasma in the LHD

    NASA Astrophysics Data System (ADS)

    Ohdachi, S.; Watanabe, K. Y.; Tanaka, K.; Suzuki, Y.; Takemura, Y.; Sakakibara, S.; Du, X. D.; Bando, T.; Narushima, Y.; Sakamoto, R.; Miyazawa, J.; Motojima, G.; Morisaki, T.; LHD Experiment Group

    2017-06-01

    The central beta of the super-dense-core (SDC) plasma in the large helical device (LHD) is limited by a large-scale MHD event called ‘core density collapse’ (CDC). Detailed measurements reveal that a new type of ballooning mode, quite localized in space and destabilized by the 3D nature of Heliotron devices, is the cause of the CDC. This is the first observation of an unstable mode in a region with global negative magnetic shear. Avoiding the excitation of this mode is key to expanding the operational limit of the LHD.

  3. A spatial picture of the synthetic large-scale motion from dynamic roughness

    NASA Astrophysics Data System (ADS)

    Huynh, David; McKeon, Beverley

    2017-11-01

    Jacobi and McKeon (2011) set up a dynamic roughness apparatus to excite a synthetic, travelling wave-like disturbance in a wind-tunnel boundary layer study. In the present work, this dynamic roughness has been adapted for a flat-plate turbulent boundary layer experiment in a water tunnel. A key advantage of operating in water as opposed to air is the longer flow timescales. These make higher non-dimensional actuation frequencies, and correspondingly shorter synthetic length scales, accessible, and render the flow more amenable to particle image velocimetry. As a result, this experiment provides a novel spatial picture of the synthetic mode, the coupled small scales, and their streamwise development. It is demonstrated that varying the roughness actuation frequency allows for significant tuning of the streamwise wavelength of the synthetic mode, with a range of 3δ to 13δ being achieved. Employing a phase-locked decomposition, spatial snapshots are constructed of the synthetic large scale and used to analyze its streamwise behavior. Direct spatial filtering is used to separate the synthetic large scale and the related small scales, and the results are compared to those obtained by temporal filtering that invokes Taylor's hypothesis. The support of AFOSR (Grant # FA9550-16-1-0361) is gratefully acknowledged.

  4. Scaling of mode shapes from operational modal analysis using harmonic forces

    NASA Astrophysics Data System (ADS)

    Brandt, A.; Berardengo, M.; Manzoni, S.; Cigada, A.

    2017-10-01

    This paper presents a new method for scaling mode shapes obtained by means of operational modal analysis. The method is capable of scaling mode shapes on any structure, including structures with closely coupled modes, and it can be used in the presence of ambient vibration from traffic or wind loads, etc. Harmonic excitation can be accomplished relatively easily using general-purpose actuators, even at the force levels necessary for driving large structures such as bridges and high-rise buildings. The signal processing necessary for mode shape scaling by the proposed method is simple, and the method can easily be implemented in most measurement systems capable of generating a sine wave output. The tests necessary to scale the modes are short compared to a typical operational modal analysis test. The proposed method is thus easy to apply and inexpensive relative to other methods for scaling mode shapes available in the literature. Although it is not necessary per se, we propose to excite the structure at, or close to, the eigenfrequencies of the modes to be scaled, since this provides a better signal-to-noise ratio in the response sensors, thus permitting the use of smaller actuators. An extensive experimental campaign on a real structure was carried out, and the reported results demonstrate the feasibility and accuracy of the proposed method. Since the method utilizes harmonic excitation for the mode shape scaling, we propose to call it OMAH.
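
    The underlying idea can be sketched with the standard single-mode approximation (a hedged illustration, not necessarily the paper's exact estimator): driving mode r at its eigenfrequency w_r with a harmonic force of amplitude F at DOF q yields a steady-state response amplitude |X_p| = F*phi_p*phi_q/(2*zeta_r*w_r**2) at DOF p, where phi denotes mass-normalized shapes. With unscaled OMA shapes psi and phi = alpha*psi, the scaling factor alpha follows directly; all numbers below are made up:

        # Mode shape scaling from a harmonic test (single-mode approximation).
        import numpy as np

        w_r = 2 * np.pi * 1.8    # eigenfrequency of mode r, rad/s (illustrative)
        zeta_r = 0.01            # damping ratio from OMA (illustrative)
        psi_p, psi_q = 0.9, 0.6  # unscaled shape at response/excitation DOFs
        F = 50.0                 # harmonic force amplitude, N
        X_p = 2.1e-4             # measured steady-state response amplitude, m

        alpha = np.sqrt(2 * zeta_r * w_r**2 * X_p / (F * psi_p * psi_q))
        print(f"scaling factor alpha = {alpha:.4f}; scaled shape at p = {alpha * psi_p:.4f}")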

  5. Panoptes: web-based exploration of large scale genome variation data.

    PubMed

    Vauterin, Paul; Jeffery, Ben; Miles, Alistair; Amato, Roberto; Hart, Lee; Wright, Ian; Kwiatkowski, Dominic

    2017-10-15

    The size and complexity of modern large-scale genome variation studies demand novel approaches for exploring and sharing the data. In order to unlock the potential of these data for a broad audience of scientists with various areas of expertise, a unified exploration framework is required that is accessible, coherent and user-friendly. Panoptes is an open-source software framework for collaborative visual exploration of large-scale genome variation data and associated metadata in a web browser. It relies on technology choices that allow it to operate in near real-time on very large datasets. It can be used to browse rich, hybrid content in a coherent way, and offers interactive visual analytics approaches to assist the exploration. We illustrate its application using genome variation data of Anopheles gambiae, Plasmodium falciparum and Plasmodium vivax. Freely available at https://github.com/cggh/panoptes, under the GNU Affero General Public License. paul.vauterin@gmail.com. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com

  6. Large-Scale Test of Dynamic Correlation Processors: Implications for Correlation-Based Seismic Pipelines

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dodge, D. A.; Harris, D. B.

    Correlation detectors are of considerable interest to the seismic monitoring communities because they offer reduced detection thresholds and combine detection, location and identification functions into a single operation. They appear to be ideal for applications requiring screening of frequent repeating events. However, questions remain about how broadly empirical correlation methods are applicable. We describe the effectiveness of banks of correlation detectors in a system that combines traditional power detectors with correlation detectors in terms of efficiency, which we define to be the fraction of events detected by the correlators. This paper elaborates and extends the concept of a dynamic correlation detection framework, a system which autonomously creates correlation detectors from event waveforms detected by power detectors, and reports observed performance on a network of arrays in terms of efficiency. We performed a large-scale test of dynamic correlation processors on an 11 terabyte global dataset using 25 arrays in the single frequency band 1-3 Hz. The system found over 3.2 million unique signals and produced 459,747 screened detections. A very satisfying result is that, on average, efficiency grows with time and, after nearly 16 years of operation, exceeds 47% for events observed over all distance ranges, approaching 70% for near-regional and 90% for local events. This observation suggests that future pipeline architectures should make extensive use of correlation detectors, principally for decluttering observations of local and near-regional events. Our results also suggest that future operations based on correlation detection will require commodity large-scale computing infrastructure, since the number of correlators in an autonomous system can grow into the hundreds of thousands.
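
    The core operation of such a correlation detector can be sketched as follows (an illustrative implementation, not the authors' pipeline): slide a template over continuous data, compute the normalized cross-correlation, and declare a detection wherever it exceeds a threshold:

        # Normalized cross-correlation detector sketch (synthetic data).
        import numpy as np

        def correlation_detector(data, template, threshold=0.7):
            n = len(template)
            t = (template - template.mean()) / (template.std() * n)
            detections = []
            for i in range(len(data) - n + 1):
                w = data[i:i + n]
                sd = w.std()
                if sd == 0:
                    continue
                cc = float(np.sum(t * (w - w.mean())) / sd)  # Pearson correlation
                if cc >= threshold:
                    detections.append((i, cc))
            return detections

        rng = np.random.default_rng(1)
        template = rng.normal(size=200)
        data = rng.normal(size=5000)
        data[1000:1200] += template   # bury one repeat of the template in noise
        print(correlation_detector(data, template, threshold=0.5)[:3])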

  7. Large-Scale Test of Dynamic Correlation Processors: Implications for Correlation-Based Seismic Pipelines

    DOE PAGES

    Dodge, D. A.; Harris, D. B.

    2016-03-15

    Correlation detectors are of considerable interest to the seismic monitoring communities because they offer reduced detection thresholds and combine detection, location and identification functions into a single operation. They appear to be ideal for applications requiring screening of frequent repeating events. However, questions remain about how broadly empirical correlation methods are applicable. We describe the effectiveness of banks of correlation detectors in a system that combines traditional power detectors with correlation detectors in terms of efficiency, which we define to be the fraction of events detected by the correlators. This paper elaborates and extends the concept of a dynamic correlation detection framework, a system which autonomously creates correlation detectors from event waveforms detected by power detectors, and reports observed performance on a network of arrays in terms of efficiency. We performed a large-scale test of dynamic correlation processors on an 11 terabyte global dataset using 25 arrays in the single frequency band 1-3 Hz. The system found over 3.2 million unique signals and produced 459,747 screened detections. A very satisfying result is that, on average, efficiency grows with time and, after nearly 16 years of operation, exceeds 47% for events observed over all distance ranges, approaching 70% for near-regional and 90% for local events. This observation suggests that future pipeline architectures should make extensive use of correlation detectors, principally for decluttering observations of local and near-regional events. Our results also suggest that future operations based on correlation detection will require commodity large-scale computing infrastructure, since the number of correlators in an autonomous system can grow into the hundreds of thousands.

  8. Utilization of lunar materials and expertise for large scale operations in space: Abstracts. [lunar bases and space industrialization

    NASA Technical Reports Server (NTRS)

    Criswell, D. R. (Editor)

    1976-01-01

    The practicality of exploiting the moon, not only as a source of materials for large habitable structures at Lagrangian points, but also as a base for colonization is discussed in abstracts of papers presented at a special session on lunar utilization. Questions and answers which followed each presentation are included after the appropriate abstract. Author and subject indexes are provided.

  9. Blueprint for a microwave trapped ion quantum computer

    PubMed Central

    Lekitsch, Bjoern; Weidt, Sebastian; Fowler, Austin G.; Mølmer, Klaus; Devitt, Simon J.; Wunderlich, Christof; Hensinger, Winfried K.

    2017-01-01

    The availability of a universal quantum computer may have a fundamental impact on a vast number of research fields and on society as a whole. An increasingly large scientific and industrial community is working toward the realization of such a device. An arbitrarily large quantum computer may best be constructed using a modular approach. We present a blueprint for a trapped ion–based scalable quantum computer module, making it possible to create a scalable quantum computer architecture based on long-wavelength radiation quantum gates. The modules control all operations as stand-alone units, are constructed using silicon microfabrication techniques, and are within reach of current technology. To perform the required quantum computations, the modules make use of long-wavelength radiation–based quantum gate technology. To scale this microwave quantum computer architecture to a large size, we present a fully scalable design that makes use of ion transport between different modules, thereby allowing arbitrarily many modules to be connected to construct a large-scale device. A high error–threshold surface error correction code can be implemented in the proposed architecture to execute fault-tolerant operations. With appropriate adjustments, the proposed modules are also suitable for alternative trapped ion quantum computer architectures, such as schemes using photonic interconnects. PMID:28164154

  10. Launch processing system transition from development to operation

    NASA Technical Reports Server (NTRS)

    Paul, H. C.

    1977-01-01

    The Launch Processing System has been under development at Kennedy Space Center since 1973. A prototype system was developed and delivered to Marshall Space Flight Center for Solid Rocket Booster checkout in July 1976. The first production hardware arrived in late 1976. The system uses a distributed computer network for command and monitoring and is supported by a dual large-scale computer system for 'off-line' processing. A high level of automation is anticipated for Shuttle and Payload testing and launch operations to gain the advantages of short turnaround capability, repeatability of operations, and minimization of operations and maintenance (O&M) manpower. Learning how to apply the system efficiently is our current problem. We are searching for more effective ways to convey LPS performance characteristics from the designer to a large number of users. Once we have done this, we can realize the advantages of the LPS design.

  11. Optimisation Of a Magnetostrictive Wave Energy Converter

    NASA Astrophysics Data System (ADS)

    Mundon, T. R.; Nair, B.

    2014-12-01

    Oscilla Power, Inc. (OPI) is developing a patented magnetostrictive wave energy converter aimed at reducing the cost of grid-scale electricity from ocean waves. Designed to operate cost-effectively across a wide range of wave conditions, it will be the first use of reverse magnetostriction for large-scale energy production. The device architecture is a straightforward two-body, point-absorbing system that has been studied at length by various researchers. A large surface float is anchored to a submerged heave (reaction) plate by multiple taut tethers that are largely made up of discrete, robust power takeoff modules housing the magnetostrictive generators. The unique generators developed by OPI utilize the phenomenon of reverse magnetostriction, which, through the application of load to a specific low-cost alloy, can generate significant magnetic flux changes and thus create power through electromagnetic induction. Unlike traditional generators, the mode of operation is low-displacement, high-force, and high-damping, which in combination with the specific multi-tether configuration creates some unique effects and interesting optimization challenges. Using an empirical approach with a combination of numerical tools, such as ORCAFLEX, and physical models, we investigated the properties and sensitivities of this system arrangement, including various heave plate geometries, with the overall goal of identifying the mass and hydrodynamic parameters required for optimum performance. Furthermore, through a detailed physical model test program at the University of New Hampshire, we were able to study in more detail how the heave plate geometry affects the drag and added mass coefficients. In presenting this work we will discuss how alternate geometries could be used to optimize the hydrodynamic parameters of the heave plate, allowing maximum inertial forces in operational conditions while simultaneously minimizing the forces generated in extreme waves. This presentation will cover the significant findings from this research, including physical model results and identified sensitivity parameters. In addition, we will discuss some preliminary results from our large-scale ocean trial conducted in August and September of this year.

  12. Design of distributed PID-type dynamic matrix controller for fractional-order systems

    NASA Astrophysics Data System (ADS)

    Wang, Dawei; Zhang, Ridong

    2018-01-01

    With ever-stricter requirements on product quality and safe operation in industrial production, it is often difficult to describe complex large-scale processes with integer-order differential equations; fractional differential equations may represent the intrinsic characteristics of such systems more precisely. In this paper, a distributed PID-type dynamic matrix control method for fractional-order systems is proposed. First, a high-order integer-order approximate model is obtained by utilising the Oustaloup method. Then, the step response model vectors of the plant are obtained on the basis of the high-order model, and the online optimisation for multivariable processes is transformed into the optimisation of each small-scale subsystem, regarded as a sub-plant controlled in the distributed framework. Furthermore, the PID operator is introduced into the performance index of each subsystem, and the fractional-order PID-type dynamic matrix controller is designed based on a Nash optimisation strategy. The information exchange among the subsystems is realised through the distributed control structure so as to complete the optimisation task of the whole large-scale system. Finally, the control performance of the designed controller is verified by an example.
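
    For reference, one common form of the Oustaloup recursive approximation (the paper may use a different variant or order) replaces s**alpha by a band-limited integer-order zero-pole filter:

        # Oustaloup approximation of s**alpha on [wb, wh] rad/s (one common form).
        import numpy as np

        def oustaloup_zpk(alpha, wb=1e-2, wh=1e2, N=4):
            """Zeros, poles and gain of an integer-order approximation of s**alpha."""
            k = np.arange(-N, N + 1)
            zeros = -wb * (wh / wb) ** ((k + N + 0.5 * (1 - alpha)) / (2 * N + 1))
            poles = -wb * (wh / wb) ** ((k + N + 0.5 * (1 + alpha)) / (2 * N + 1))
            gain = wh ** alpha
            return zeros, poles, gain

        z, p, kg = oustaloup_zpk(alpha=0.5)
        s = 1j * 1.0  # check the mid-band magnitude: ideal |(j*1)**0.5| = 1
        H = kg * np.prod(s - z) / np.prod(s - p)
        print(f"|H(j1)| = {abs(H):.3f} (ideal 1.000)")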

  13. Bottom-up production of meta-atoms for optical magnetism in visible and NIR light

    NASA Astrophysics Data System (ADS)

    Barois, Philippe; Ponsinet, Virginie; Baron, Alexandre; Richetti, Philippe

    2018-02-01

    Many unusual optical properties of metamaterials arise from the magnetic response of engineered structures of sub-wavelength size (meta-atoms) exposed to light. The top-down approach, whereby engineered nanostructures of well-defined morphology are engraved on a surface, has proved successful for the generation of strong optical magnetism. It faces, however, the limitations of high cost and small active area in visible light, where nanometre resolution is needed. The bottom-up approach, whereby metamaterials of large volume or large area are fabricated by combining nanochemistry and self-assembly techniques, may constitute a cost-effective alternative. This approach nevertheless requires the large-scale production of functional building blocks (meta-atoms) bearing a strong magnetic optical response. We propose in this paper a few tracks that lead to the large-scale synthesis of magnetic metamaterials operating in visible or near-IR light.

  14. Template Interfaces for Agile Parallel Data-Intensive Science

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ramakrishnan, Lavanya; Gunter, Daniel; Pastorello, Gilberto Z.

    Tigres provides a programming library to compose and execute large-scale data-intensive scientific workflows from desktops to supercomputers. DOE User Facilities and large science collaborations are increasingly generating data sets large enough that it is no longer practical to download them to a desktop to operate on them. They are instead stored at centralized compute and storage resources such as high performance computing (HPC) centers. Analysis of this data requires an ability to run on these facilities, but with current technologies, scaling an analysis to an HPC center and to a large data set is difficult even for experts. Tigres is addressing the challenge of enabling collaborative analysis of DOE Science data through a new concept of reusable "templates" that enable scientists to easily compose, run and manage collaborative computational tasks. These templates define common computation patterns used in analyzing a data set.

  15. Size and structure of Chlorella zofingiensis /FeCl 3 flocs in a shear flow: Algae Floc Structure

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wyatt, Nicholas B.; O'Hern, Timothy J.; Shelden, Bion

    Flocculation is a promising method to overcome the economic hurdle to separation of algae from its growth medium in large-scale operations. However, understanding the floc structure and the effects of shear on it is crucial to the large-scale implementation of this technique. The floc structure is important because it determines, in large part, the density and settling behavior of the algae. Freshwater algae floc size distributions and fractal dimensions are presented as a function of applied shear rate in a Couette cell using ferric chloride as a flocculant. Comparisons are made with measurements made for a polystyrene microparticle model system taken here as well as reported literature results. The algae floc size distributions are found to be self-preserving with respect to shear rate, consistent with literature data for polystyrene. Moreover, three fractal dimensions are calculated which quantitatively characterize the complexity of the floc structure. Low shear rates result in large, relatively densely packed flocs which elongate and fracture as the shear rate is increased. The results presented here provide crucial information for economically implementing flocculation as a large-scale algae harvesting strategy.

  16. Towards a Quantitative Use of Satellite Remote Sensing in Crop Growth Models for Large Scale Agricultural Production Estimate (Invited)

    NASA Astrophysics Data System (ADS)

    Defourny, P.

    2013-12-01

    The development of better agricultural monitoring capabilities is considered a critical step for strengthening food production information and market transparency, thanks to timely information about crop status, crop area and yield forecasts. The documentation of global production will contribute to tackling price volatility by allowing local, national and international operators to make decisions and anticipate market trends with reduced uncertainty. Several operational agricultural monitoring systems are currently operating at national and international scales. Most are based on methods derived from the pioneering experiences completed some decades ago, and use remote sensing to qualitatively compare one year to the others to estimate the risk of deviation from a normal year. The GEO Agricultural Monitoring Community of Practice described the current monitoring capabilities at the national and global levels. An overall diagram summarized the diverse relationships between satellite EO and agriculture information. There is now a large gap between the current operational large-scale systems and the scientific state of the art in crop remote sensing, probably because the latter has mainly focused on local studies. The poor availability of suitable in-situ and satellite data over extended areas hampers large-scale demonstrations, preventing the much-needed upscaling research effort. For the cropland extent, this paper reports a recent research achievement using the full ENVISAT MERIS 300 m archive in the context of the ESA Climate Change Initiative. A flexible combination of classification methods, depending on the region of the world, allows mapping the land cover as well as the global croplands at 300 m for the period 2008-2012. This wall-to-wall product is then compared with the FP7 Geoland-2 results obtained using a Landsat-based sampling strategy over the IGADD countries. On the other hand, vegetation indices and biophysical variables such as the Green Area Index (GAI), fAPAR and fcover, usually retrieved from MODIS, MERIS and SPOT-Vegetation, describe the quality of green vegetation development. The GLOBAM (Belgium) and EU FP7 MOCCCASIN (Russia) projects improved the standard products and were demonstrated at large scale. The GAI retrieved from MODIS time series using a purity index criterion successfully depicted the inter-annual variability. Furthermore, the quantitative assimilation of these GAI time series into a crop growth model improved the yield estimates over the years. These results showed that GAI assimilation works best at the district or provincial level. In the context of GEO agricultural monitoring, the Joint Experiment of Crop Assessment and Monitoring (JECAM) was designed to enable the global agricultural monitoring community to compare such methods and results over a variety of regional cropping systems. For a network of test sites around the world, satellite and field measurements are currently being collected and will be made available for collaborative effort. This experiment should facilitate international standards for data products and reporting, eventually supporting the development of a global system of systems for agricultural crop assessment and monitoring.

  17. Plane down in the city: Operation Crash and Surge.

    PubMed

    Kann, Duane F; Draper, Thomas W

    2014-01-01

    This article is about the experiences gained from the largest full-scale exercise ever conducted in the State of Florida, specifically regarding the Orlando International Airport (MCO) venues. The exercise was centred on an airplane crashing into a hotel just outside of MCO property. The scenario clarified details regarding Incident Command and the unique jurisdictional responsibilities associated with a large-scale mass casualty incident. Additional challenges with airline operations, walking wounded, and information sharing provided valuable experience toward enhancing emergency operations. This article also outlines information gained by the MCO "go team" that traveled to San Francisco following the crash of Asiana flight 214. This real-life incident shone a light on many of the strengths and opportunities found throughout the MCO exercise, and this article shows the interrelationship of these two invaluable experiences.

  18. Application of multivariate analysis and mass transfer principles for refinement of a 3-L bioreactor scale-down model--when shake flasks mimic 15,000-L bioreactors better.

    PubMed

    Ahuja, Sanjeev; Jain, Shilpa; Ram, Kripa

    2015-01-01

    Characterization of manufacturing processes is key to understanding the effects of process parameters on process performance and product quality. These studies are generally conducted using small-scale model systems. Because of the importance of the results derived from these studies, the small-scale model should be predictive of large scale. Typically, small-scale bioreactors, which are considered superior to shake flasks in simulating large-scale bioreactors, are used as the scale-down models for characterizing mammalian cell culture processes. In this article, we describe a case study in which a cell culture unit operation in bioreactors using one-sided pH control, and its satellites (small-scale runs conducted using the same post-inoculation cultures and nutrient feeds) in 3-L bioreactors and shake flasks, indicated that shake flasks mimicked the large-scale performance better than 3-L bioreactors did. We detail here how multivariate analysis was used to make the pertinent assessment and to generate the hypothesis for refining the existing 3-L scale-down model. Relevant statistical techniques such as principal component analysis, partial least squares, orthogonal partial least squares, and discriminant analysis were used to identify the outliers and to determine the discriminatory variables responsible for performance differences at different scales. The resulting analysis, in combination with mass transfer principles, led to the hypothesis that the observed similarities between 15,000-L and shake flask runs, and the differences between 15,000-L and 3-L runs, were due to pCO2 and pH values. This hypothesis was confirmed by changing the aeration strategy at the 3-L scale. By reducing the initial sparge rate in the 3-L bioreactor, process performance and product quality data moved closer to those of the large scale. © 2015 American Institute of Chemical Engineers.
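
    A minimal sketch of this style of multivariate screening on synthetic data (placeholder variable names, not the study's dataset or exact workflow): PCA summarizes scale-to-scale separation, and PLS against a scale label (a simple PLS-DA) ranks which process variables discriminate the scales:

        # PCA + PLS-DA screening of scale-discriminating variables (synthetic data).
        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.cross_decomposition import PLSRegression

        rng = np.random.default_rng(0)
        # columns: [pCO2, pH, titer, VCD] -- placeholder process variables
        X_small = rng.normal([90, 6.9, 3.0, 12.0], [5, 0.05, 0.3, 1.0], size=(10, 4))
        X_large = rng.normal([60, 7.0, 3.1, 12.2], [5, 0.05, 0.3, 1.0], size=(10, 4))
        X = np.vstack([X_small, X_large])
        Xs = (X - X.mean(0)) / X.std(0)
        y = np.array([0] * 10 + [1] * 10)   # 0 = 3-L runs, 1 = 15,000-L runs

        pca = PCA(n_components=2).fit(Xs)
        print("PCA explained variance:", np.round(pca.explained_variance_ratio_, 2))

        pls = PLSRegression(n_components=1).fit(Xs, y)
        for name, w in zip(["pCO2", "pH", "titer", "VCD"], np.abs(pls.x_weights_[:, 0])):
            print(f"{name:6s} discriminant weight = {w:.2f}")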

  19. Instrumentation for localized superconducting cavity diagnostics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Conway, Z. A.; Ge, M.; Iwashita, Y.

    2017-01-12

    Superconducting accelerator cavities are now routinely operated at levels approaching the theoretical limit of niobium. To achieve these operating levels more information than is available from the RF excitation signal is required to characterize and determine fixes for the sources of performance limitations. This information is obtained using diagnostic techniques which complement the analysis of the RF signal. In this paper we describe the operation and select results from three of these diagnostic techniques: the use of large scale thermometer arrays, second sound wave defect location and high precision cavity imaging with the Kyoto camera.

  20. A low-frequency chip-scale optomechanical oscillator with 58 kHz mechanical stiffening and more than 100th-order stable harmonics.

    PubMed

    Huang, Yongjun; Flores, Jaime Gonzalo Flor; Cai, Ziqiang; Yu, Mingbin; Kwong, Dim-Lee; Wen, Guangjun; Churchill, Layne; Wong, Chee Wei

    2017-06-29

    For sensitive high-resolution force- and field-sensing applications, large-mass microelectromechanical systems (MEMS) and optomechanical cavities have been proposed to realize sub-aN/Hz^(1/2) resolution levels. In view of optomechanical cavity-based force and field sensors, the optomechanical coupling is the key parameter for achieving high sensitivity and resolution. Here we demonstrate a chip-scale optomechanical cavity with large mass which operates at a ≈77.7 kHz fundamental mode and intrinsically exhibits large optomechanical coupling of 44 GHz/nm or more for both optical resonance modes. A mechanical stiffening range of ≈58 kHz and more than 100th-order harmonics are obtained, with which the free-running frequency instability is lower than 10^-6 at 100 ms integration time. Such results can be applied to further improve the sensing performance of optomechanically inspired chip-scale sensors.

  1. Modeling relief demands in an emergency supply chain system under large-scale disasters based on a queuing network.

    PubMed

    He, Xinhua; Hu, Wenfa

    2014-01-01

    This paper presents a multiple-rescue model for an emergency supply chain system under uncertainties in the large-scale affected area of a disaster. The proposed methodology takes into consideration that the rescue demands caused by a large-scale disaster are scattered across several locations; the servers are arranged in multiple echelons (resource depots, distribution centers, and rescue center sites) located in different places but coordinated within one emergency supply chain system; depending on the types of rescue demands, one or more distinct servers dispatch emergency resources along different vehicle routes, and emergency rescue services queue in multiple rescue-demand locations. This emergency system is modeled as a minimal queuing response time model of location and allocation. A solution to this complex mathematical problem is developed based on a genetic algorithm. Finally, a case study of an emergency supply chain system operating in Shanghai is discussed. The results demonstrate the robustness and applicability of the proposed model.
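
    A toy version of the optimization idea (illustrative only, not the paper's model or data): allocate demand locations to depots to minimize travel time plus M/M/1 queuing response time at each depot, searched by a very small genetic algorithm:

        # Location-allocation with M/M/1 queuing cost via a tiny genetic algorithm.
        import numpy as np

        rng = np.random.default_rng(0)
        n_loc, n_depot = 8, 3
        travel = rng.uniform(0.5, 3.0, size=(n_loc, n_depot))  # travel times (made up)
        lam = rng.uniform(0.2, 0.6, size=n_loc)                # demand arrival rates
        mu = np.full(n_depot, 2.5)                             # depot service rates

        def cost(assign):
            load = np.array([lam[assign == d].sum() for d in range(n_depot)])
            if np.any(load >= mu):          # unstable queue -> infeasible
                return np.inf
            wait = 1.0 / (mu - load)        # M/M/1 mean time in system per depot
            return float(np.sum(travel[np.arange(n_loc), assign] + wait[assign]))

        pop = [rng.integers(0, n_depot, n_loc) for _ in range(30)]
        for _ in range(200):                # keep the 10 best, refill with mutants
            pop.sort(key=cost)
            pop = pop[:10] + [np.where(rng.random(n_loc) < 0.2,
                                       rng.integers(0, n_depot, n_loc), p)
                              for p in pop[:10] for _ in range(2)]
        print("best total response time:", round(cost(min(pop, key=cost)), 3))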

  2. Multimode resource-constrained multiple project scheduling problem under fuzzy random environment and its application to a large scale hydropower construction project.

    PubMed

    Xu, Jiuping; Feng, Cuiying

    2014-01-01

    This paper presents an extension of the multimode resource-constrained project scheduling problem for a large scale construction project where multiple parallel projects and a fuzzy random environment are considered. By taking into account the most typical goals in project management, a cost/weighted makespan/quality trade-off optimization model is constructed. To deal with the uncertainties, a hybrid crisp approach is used to transform the fuzzy random parameters into fuzzy variables that are subsequently defuzzified using an expected value operator with an optimistic-pessimistic index. Then a combinatorial-priority-based hybrid particle swarm optimization algorithm is developed to solve the proposed model, where the combinatorial particle swarm optimization and priority-based particle swarm optimization are designed to assign modes to activities and to schedule activities, respectively. Finally, the results and analysis of a practical example at a large scale hydropower construction project are presented to demonstrate the practicality and efficiency of the proposed model and optimization method.

  3. Large-Scale Low-Boom Inlet Test Overview

    NASA Technical Reports Server (NTRS)

    Hirt, Stefanie

    2011-01-01

    This presentation provides a high-level overview of the Large-Scale Low-Boom Inlet Test and was presented at the Fundamental Aeronautics 2011 Technical Conference. In October 2010 a low-boom supersonic inlet concept with flow control was tested in the 8'x6' supersonic wind tunnel at NASA Glenn Research Center (GRC). The primary objectives of the test were to evaluate the stability and operability of a large-scale low-boom supersonic inlet concept by acquiring performance and flowfield validation data, and to evaluate simple, passive, bleedless inlet boundary layer control options. During this effort two models were tested: a dual-stream inlet intended to model potential flight hardware and a single-stream design to study a zero-degree external cowl angle and to permit surface flow visualization of the vortex generator flow control on the internal centerbody surface. The tests were conducted by a team of researchers from NASA GRC, Gulfstream Aerospace Corporation, the University of Illinois at Urbana-Champaign, and the University of Virginia.

  4. A large-scale investigation of microplastic contamination: Abundance and characteristics of microplastics in European beach sediment.

    PubMed

    Lots, Froukje A E; Behrens, Paul; Vijver, Martina G; Horton, Alice A; Bosker, Thijs

    2017-10-15

    Here we present the large-scale distribution of microplastic contamination in beach sediment across Europe. Sediment samples were collected from 23 locations across 13 countries by citizen scientists, and analysed using a standard operating procedure. We found significant variability in the concentrations of microplastics, ranging from 72±24 to 1512±187 microplastics per kg of dry sediment, with high variability within sampling locations. Three hotspots of microplastic accumulation (>700 microplastics per kg of dry sediment) were found. There was limited variability in the physico-chemical characteristics of the plastics across sampling locations. The majority of the microplastics were fibrous, <1 mm in size, and blue/black in colour. In addition, using Raman spectrometry we identified particles as polyester, polyethylene, and polypropylene. Our research is the first large spatial-scale analysis of microplastics on European beaches, giving insights into the nature and extent of the microplastic challenge. Copyright © 2017 Elsevier Ltd. All rights reserved.

  5. Design Sketches For Optical Crossbar Switches Intended For Large-Scale Parallel Processing Applications

    NASA Astrophysics Data System (ADS)

    Hartmann, Alfred; Redfield, Steve

    1989-04-01

    This paper discusses the design of large-scale (1000×1000) optical crossbar switching networks for use in parallel processing supercomputers. Alternative design sketches for an optical crossbar switching network are presented using free-space optical transmission with either a beam spreading/masking model or a beam steering model for internodal communications. The performances of alternative multiple access channel communications protocols (unslotted and slotted ALOHA and carrier sense multiple access (CSMA)) are compared with the performance of the classic arbitrated bus crossbar of conventional electronic parallel computing. These comparisons indicate an almost inverse relationship between ease of implementation and speed of operation. Practical issues of optical system design are addressed, and an optically addressed, composite spatial light modulator design is presented for fabrication to arbitrarily large scale. The wide range of switch architecture, communications protocol, optical systems design, device fabrication, and system performance problems presented by these design sketches poses a serious challenge to practical exploitation of highly parallel optical interconnects in advanced computer designs.
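
    The protocol comparison rests on the textbook ALOHA throughput curves (standard results, not numbers from the paper): for offered load G, pure ALOHA achieves S = G*exp(-2G) and slotted ALOHA S = G*exp(-G):

        # Textbook ALOHA throughput versus offered load G.
        import numpy as np

        for g in np.linspace(0.1, 2.0, 5):
            print(f"G = {g:.2f}: pure ALOHA S = {g * np.exp(-2 * g):.3f}, "
                  f"slotted ALOHA S = {g * np.exp(-g):.3f}")
        # Peaks: pure ALOHA ~0.184 at G = 0.5; slotted ALOHA ~0.368 at G = 1.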

  6. Leaky Integrate and Fire Neuron by Charge-Discharge Dynamics in Floating-Body MOSFET.

    PubMed

    Dutta, Sangya; Kumar, Vinay; Shukla, Aditya; Mohapatra, Nihar R; Ganguly, Udayan

    2017-08-15

    Neurobiology-inspired Spiking Neural Networks (SNN) enable efficient learning and recognition tasks. To achieve a large-scale network akin to biology, a power- and area-efficient electronic neuron is essential. Earlier, we had demonstrated an LIF neuron based on a novel 4-terminal impact-ionization-based n+/p/n+ device with an extended gate (gated-INPN) by physics simulation. Excellent improvement in area and power compared to conventional analog circuit implementations was observed. In this paper, we propose and experimentally demonstrate a compact conventional 3-terminal partially depleted (PD) SOI-MOSFET (100 nm gate length) to replace the 4-terminal gated-INPN device. The impact ionization (II) induced floating-body effect in the SOI-MOSFET is used to capture LIF neuron behavior and to demonstrate the dependence of spiking frequency on input. MHz operation enables attractive hardware acceleration compared to biology. Overall, conventional PD-SOI-CMOS technology enables very-large-scale integration (VLSI), which is essential for biology-scale (~10^11 neuron) large neural networks.
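
    The behavior the device captures, spiking frequency rising with input, is that of the textbook LIF model, sketched below with arbitrary constants (this is the generic model, not a model of the SOI-MOSFET physics):

        # Generic leaky integrate-and-fire neuron: firing rate versus input current.
        def lif_rate(I_in, tau=20e-3, R=1e6, v_th=0.02, v_reset=0.0, dt=1e-4, T=1.0):
            v, spikes = v_reset, 0
            for _ in range(int(T / dt)):
                v += dt / tau * (-v + R * I_in)  # leaky integration of the input
                if v >= v_th:                    # threshold crossing: spike and reset
                    spikes += 1
                    v = v_reset
            return spikes / T                    # mean firing rate, Hz

        for I in [15e-9, 25e-9, 50e-9, 100e-9]:
            print(f"I = {I * 1e9:5.1f} nA -> rate = {lif_rate(I):6.1f} Hz")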

  7. Modeling Relief Demands in an Emergency Supply Chain System under Large-Scale Disasters Based on a Queuing Network

    PubMed Central

    He, Xinhua

    2014-01-01

    This paper presents a multiple-rescue model for an emergency supply chain system under uncertainties in the large-scale affected area of a disaster. The proposed methodology takes into consideration that the rescue demands caused by a large-scale disaster are scattered across several locations; the servers are arranged in multiple echelons (resource depots, distribution centers, and rescue center sites) located in different places but coordinated within one emergency supply chain system; depending on the types of rescue demands, one or more distinct servers dispatch emergency resources along different vehicle routes, and emergency rescue services queue in multiple rescue-demand locations. This emergency system is modeled as a minimal queuing response time model of location and allocation. A solution to this complex mathematical problem is developed based on a genetic algorithm. Finally, a case study of an emergency supply chain system operating in Shanghai is discussed. The results demonstrate the robustness and applicability of the proposed model. PMID:24688367

  8. Multimode Resource-Constrained Multiple Project Scheduling Problem under Fuzzy Random Environment and Its Application to a Large Scale Hydropower Construction Project

    PubMed Central

    Xu, Jiuping

    2014-01-01

    This paper presents an extension of the multimode resource-constrained project scheduling problem for a large scale construction project where multiple parallel projects and a fuzzy random environment are considered. By taking into account the most typical goals in project management, a cost/weighted makespan/quality trade-off optimization model is constructed. To deal with the uncertainties, a hybrid crisp approach is used to transform the fuzzy random parameters into fuzzy variables that are subsequently defuzzified using an expected value operator with an optimistic-pessimistic index. Then a combinatorial-priority-based hybrid particle swarm optimization algorithm is developed to solve the proposed model, where the combinatorial particle swarm optimization and priority-based particle swarm optimization are designed to assign modes to activities and to schedule activities, respectively. Finally, the results and analysis of a practical example at a large scale hydropower construction project are presented to demonstrate the practicality and efficiency of the proposed model and optimization method. PMID:24550708

  9. A large-scale, long-term study of scale drift: The micro view and the macro view

    NASA Astrophysics Data System (ADS)

    He, W.; Li, S.; Kingsbury, G. G.

    2016-11-01

    The development of measurement scales for use across years and grades in educational settings provides unique challenges, as instructional approaches, instructional materials, and content standards all change periodically. This study examined the measurement stability of a set of Rasch measurement scales that have been in place for almost 40 years. In order to investigate the stability of these scales, item responses were collected from a large set of students who took operational adaptive tests using items calibrated to the measurement scales. For the four scales that were examined, item samples ranged from 2183 to 7923 items. Each item was administered to at least 500 students in each grade level, resulting in approximately 3000 responses per item. Stability was examined at the micro level, by analysing changes in item parameter estimates since the items were first calibrated. It was also examined at the macro level, involving groups of items and overall test scores for students. Results indicated that individual items had changes in their parameter estimates, which require further analysis and possible recalibration. At the same time, the results at the total score level indicate substantial stability in the measurement scales over the span of their use.

  10. Large-scale optimal control of interconnected natural gas and electrical transmission systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chiang, Nai-Yuan; Zavala, Victor M.

    2016-04-01

    We present a detailed optimal control model that captures spatiotemporal interactions between gas and electric transmission networks. We use the model to study flexibility and economic opportunities provided by coordination. A large-scale case study of the Illinois system reveals that coordination can enable the delivery of significantly larger amounts of natural gas to the power grid. In particular, under a coordinated setting, gas-fired generators act as distributed demand response resources that can be controlled by the gas pipeline operator. This enables more efficient control of pressures and flows in space and time and overcomes delivery bottlenecks. We demonstrate that the additional flexibility not only can benefit the gas operator but can also lead to more efficient power grid operations and result in increased revenue for gas-fired power plants. We also use the optimal control model to analyze computational issues arising in these complex models. We demonstrate that the interconnected Illinois system with full physical resolution gives rise to a highly nonlinear optimal control problem with 4400 differential and algebraic equations and 1040 controls that can be solved with a state-of-the-art sparse optimization solver. (C) 2016 Elsevier Ltd. All rights reserved.

  11. A technique for estimating the probability of radiation-stimulated failures of integrated microcircuits in low-intensity radiation fields: Application to the Spektr-R spacecraft

    NASA Astrophysics Data System (ADS)

    Popov, V. D.; Khamidullina, N. M.

    2006-10-01

    In developing radio-electronic devices (RED) for spacecraft operating in the fields of ionizing radiation in space, one of the most important problems is the correct estimation of their radiation tolerance. The “weakest link” in the element base of onboard microelectronic devices under radiation exposure is the integrated microcircuit (IMC), especially those of large-scale (LSI) and very-large-scale (VLSI) integration. The main characteristic of an IMC, which is taken into account when deciding whether to use a particular type of IMC in onboard RED, is the probability of non-failure operation (NFO) at the end of the spacecraft's lifetime. It should be noted that, until now, the NFO has been calculated only from the reliability characteristics, disregarding the radiation effect. This paper presents the so-called “reliability” approach to determination of the radiation tolerance of IMC, which allows one to estimate the probability of non-failure operation of various types of IMC with due account of radiation-stimulated dose failures. The described technique is applied to RED onboard the Spektr-R spacecraft to be launched in 2007.
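
    The flavor of the “reliability” approach can be illustrated by combining a random-failure term with a dose-failure term (the lognormal dose-to-failure model and all numbers below are assumptions for illustration, not the paper's model):

        # Combined NFO probability: random failures times radiation-dose failures.
        import math
        from statistics import NormalDist

        lam = 2e-6        # random failure rate, 1/h (hypothetical)
        t = 5 * 8760.0    # 5-year mission, hours
        D_mission = 10.0  # accumulated mission dose, krad (hypothetical)
        D_median = 30.0   # median dose-to-failure of the IMC, krad (hypothetical)
        sigma = 0.5       # lognormal spread of dose-to-failure (hypothetical)

        p_random = math.exp(-lam * t)
        p_dose = 1.0 - NormalDist().cdf(math.log(D_mission / D_median) / sigma)
        print(f"P(NFO) = {p_random * p_dose:.4f} "
              f"(random: {p_random:.4f}, dose: {p_dose:.4f})")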

  12. Occupational health and safety of workers in agriculture and horticulture.

    PubMed

    Lundqvist, P

    2000-01-01

    Working in agriculture and horticulture gives considerable job satisfaction. The tasks are often interesting; you can see the result of your own work, watch your crop grow and mature; you have an affinity with nature and can follow the changes in the seasons. However, today it is a dangerous work environment fraught with occupational injuries and diseases due to hazardous situations and to physiological, physical, biological, chemical, psychological, and sociological factors. The ongoing rapid development may, on the other hand, bring about many changes during the next decades with more farmers and growers switching to organic production. Moreover, increased awareness of animal welfare also may lead to improved working conditions. Large-scale operations with fewer family-operated agricultural businesses might mean fewer injuries among children and older farmers. A consequence of large-scale operations may also be better regulation of working conditions. The greater use of automation technology eliminates many harmful working postures and movements when milking cows and carrying out other tasks. Information technology offers people the opportunity to gain more knowledge about their work. Labeling food produced in a worker-friendly work environment may give the consumers a chance to be involved in the process.

  13. Total-variation based velocity inversion with Bregmanized operator splitting algorithm

    NASA Astrophysics Data System (ADS)

    Zand, Toktam; Gholami, Ali

    2018-04-01

    Many problems in applied geophysics can be formulated as linear inverse problems. The associated problems, however, are large-scale and ill-conditioned, so regularization techniques need to be employed to generate a stable and acceptable solution. We consider numerical methods for solving such problems in this paper. In order to tackle the ill-conditioning of the problem, we use blockiness as prior information on the subsurface parameters and formulate the problem as a constrained total variation (TV) regularization. The Bregmanized operator splitting (BOS) algorithm, a combination of the Bregman iteration and the proximal forward-backward operator splitting method, is developed to solve the resulting problem. Two main advantages of this algorithm are that no matrix inversion is required and that a discrepancy stopping criterion is used to stop the iterations, which allows efficient solution of large-scale problems. The high performance of the proposed TV regularization method is demonstrated using two different experiments: 1) velocity inversion from (synthetic) seismic data based on the Born approximation, and 2) computing interval velocities from RMS velocities via the Dix formula. Numerical examples are presented to verify the feasibility of the proposed method for high-resolution velocity inversion.
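
    The second experiment rests on the textbook Dix conversion, v_int**2 = (t2*v2**2 - t1*v1**2)/(t2 - t1) (the standard formula, not the paper's regularized variant), sketched below; applied directly it amplifies noise in the RMS velocities, which is what motivates the TV-regularized inversion:

        # Dix conversion of RMS velocities to interval velocities (illustrative values).
        import math

        t = [0.5, 1.0, 1.5]               # two-way times, s
        v_rms = [1800.0, 2100.0, 2300.0]  # RMS velocities, m/s

        for i in range(1, len(t)):
            num = t[i] * v_rms[i] ** 2 - t[i - 1] * v_rms[i - 1] ** 2
            v_int = math.sqrt(num / (t[i] - t[i - 1]))
            print(f"layer {i}: v_int = {v_int:7.1f} m/s")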

  14. Measurement of the steady surface pressure distribution on a single rotation large scale advanced prop-fan blade at Mach numbers from 0.03 to 0.78

    NASA Technical Reports Server (NTRS)

    Bushnell, Peter

    1988-01-01

    The aerodynamic pressure distribution was determined on a rotating Prop-Fan blade at the S1-MA wind tunnel facility operated by the Office National D'Etudes et de Recherches Aerospatiale (ONERA) in Modane, France. The pressure distributions were measured at thirteen radial stations on a single rotation Large Scale Advanced Prop-Fan (LAP/SR7) blade, for a sequence of operating conditions including inflow Mach numbers ranging from 0.03 to 0.78. Pressure distributions for more than one power coefficient and/or advance ratio setting were measured for most of the inflow Mach numbers investigated. Due to facility power limitations, the Prop-Fan test installation was a two-bladed version of the eight-bladed design configuration. The power coefficient range investigated was therefore selected to cover the typical power loading per blade conditions which occur within the Prop-Fan operating envelope. The experimental results provide an extensive source of information on the aerodynamic behavior of the swept Prop-Fan blade, including details which remain elusive to current computational models and do not appear in two-dimensional airfoil data.

  15. Vibration energy harvesting based monitoring of an operational bridge undergoing forced vibration and train passage

    NASA Astrophysics Data System (ADS)

    Cahill, Paul; Hazra, Budhaditya; Karoumi, Raid; Mathewson, Alan; Pakrashi, Vikram

    2018-06-01

    The application of energy harvesting technology for monitoring civil infrastructure is a burgeoning topic of interest. The ability of kinetic energy harvesters to scavenge ambient vibration energy can be useful for large civil infrastructure under operational conditions, particularly for bridge structures. The experimental integration of such harvesters with full-scale structures and the subsequent use of the harvested energy directly for the purposes of structural health monitoring shows promise. This paper presents the first experimental deployment of piezoelectric vibration energy harvesting devices for monitoring a full-scale bridge undergoing forced dynamic vibrations under operational conditions, using energy harvesting signatures over time. The calibration of the harvesters is presented, along with details of the host bridge structure and the dynamic assessment procedures. The measured responses of the harvesters from the tests are presented, and the use of the harvesters for the purposes of structural health monitoring (SHM) is investigated using empirical mode decomposition analysis, following a bespoke data cleaning approach. Finally, the use of sequential Karhunen-Loève transforms to detect train passages during the dynamic assessment is presented. This study is expected to further develop interest in energy-harvesting-based monitoring of large infrastructure for both research and commercial purposes.

  16. Parallel and serial computing tools for testing single-locus and epistatic SNP effects of quantitative traits in genome-wide association studies

    PubMed Central

    Ma, Li; Runesha, H Birali; Dvorkin, Daniel; Garbe, John R; Da, Yang

    2008-01-01

    Background: Genome-wide association studies (GWAS) using single nucleotide polymorphism (SNP) markers provide opportunities to detect epistatic SNPs associated with quantitative traits and to detect the exact mode of an epistasis effect. Computational difficulty is the main bottleneck for epistasis testing in large-scale GWAS. Results: The EPISNPmpi and EPISNP computer programs were developed for testing single-locus and epistatic SNP effects on quantitative traits in GWAS, including tests of three single-locus effects for each SNP (SNP genotypic effect, additive and dominance effects) and five epistasis effects for each pair of SNPs (two-locus interaction, additive × additive, additive × dominance, dominance × additive, and dominance × dominance) based on the extended Kempthorne model. EPISNPmpi is the parallel computing program for epistasis testing in large-scale GWAS, achieving excellent scalability for large-scale analysis and portability for various parallel computing platforms. EPISNP is the serial computing program based on the EPISNPmpi code for epistasis testing in small-scale GWAS using commonly available operating systems and computer hardware. Three serial computing utility programs were developed for graphical viewing of test results and epistasis networks, and for estimating CPU time and disk space requirements. Conclusion: The EPISNPmpi parallel computing program provides an effective computing tool for epistasis testing in large-scale GWAS, and the EPISNP serial computing program is a convenient tool for epistasis analysis in small-scale GWAS using commonly available computer hardware. PMID:18644146
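
    A conceptual sketch of a two-locus interaction test (illustrative; EPISNP itself uses the extended Kempthorne model, not this simple coding) compares a main-effects-only linear model against one with genotype-by-genotype interaction terms via an F-test:

        # Two-locus epistasis F-test on synthetic genotypes and phenotypes.
        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(0)
        n = 600
        g1 = rng.integers(0, 3, n)   # SNP1 genotypes coded 0/1/2
        g2 = rng.integers(0, 3, n)   # SNP2 genotypes coded 0/1/2
        y = 0.2 * g1 + 0.1 * g2 + 0.3 * ((g1 == 2) & (g2 == 2)) + rng.normal(size=n)

        def dummies(g):              # indicator columns for genotypes 1 and 2
            return np.column_stack([(g == 1).astype(float), (g == 2).astype(float)])

        D1, D2 = dummies(g1), dummies(g2)
        X_main = np.column_stack([np.ones(n), D1, D2])
        X_full = np.column_stack([X_main] + [D1[:, [i]] * D2[:, [j]]
                                             for i in range(2) for j in range(2)])

        def rss(X):                  # residual sum of squares of a least-squares fit
            beta = np.linalg.lstsq(X, y, rcond=None)[0]
            r = y - X @ beta
            return float(r @ r)

        df_num = X_full.shape[1] - X_main.shape[1]   # 4 interaction parameters
        df_den = n - X_full.shape[1]
        F = ((rss(X_main) - rss(X_full)) / df_num) / (rss(X_full) / df_den)
        print(f"interaction F = {F:.2f}, p = {stats.f.sf(F, df_num, df_den):.3e}")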

  17. OVERVIEW OF US NATIONAL LAND-COVER MAPPING PROGRAM

    EPA Science Inventory

    Because of escalating costs amid growing needs for large-scale, satellite-based landscape information, a group of US federal agencies agreed to pool resources and operate as a consortium to acquire the data necessary for land-cover mapping of the nation. The consortium was initiated...

  18. Managing Vocabulary Mapping Services

    PubMed Central

    Che, Chengjian; Monson, Kent; Poon, Kasey B.; Shakib, Shaun C.; Lau, Lee Min

    2005-01-01

    The efficient management and maintenance of large-scale and high-quality vocabulary mapping is an operational challenge. The 3M Health Information Systems (HIS) Healthcare Data Dictionary (HDD) group developed an information management system to provide controlled mapping services, resulting in improved efficiency and quality maintenance. PMID:16779203

  19. Open Education as a "Heterotopia of Desire"

    ERIC Educational Resources Information Center

    Gourlay, Lesley

    2015-01-01

    The movement towards "openness" in education has tended to position itself as inherently democratising, radical, egalitarian and critical of powerful gatekeepers to learning. While "openness" is often positioned as a critique, I will argue that its mainstream discourses--while appearing to oppose large-scale operations of…

  20. Emerging Cyber Infrastructure for NASA's Large-Scale Climate Data Analytics

    NASA Astrophysics Data System (ADS)

    Duffy, D.; Spear, C.; Bowen, M. K.; Thompson, J. H.; Hu, F.; Yang, C. P.; Pierce, D.

    2016-12-01

    The resolution of NASA climate and weather simulations has grown dramatically over the past few years, with the highest-fidelity models reaching down to 1.5 km global resolution. With each doubling of the resolution, the resulting data sets grow by a factor of eight in size. As the climate and weather models push the envelope even further, a new infrastructure to store data and provide large-scale data analytics is necessary. The NASA Center for Climate Simulation (NCCS) has deployed the Data Analytics Storage Service (DASS), which combines scalable storage with the ability to perform in-situ analytics. Within this system, large, commonly used data sets are stored in a POSIX file system (write once/read many); examples of stored data include Landsat, MERRA2, observing system simulation experiments, and high-resolution downscaled reanalysis. The total size of this repository is on the order of 15 petabytes. In addition to the POSIX file system, the NCCS has deployed file system connectors to enable emerging analytics built on top of the Hadoop File System (HDFS) to run on the same storage servers within the DASS. Coupled with a custom spatiotemporal indexing approach, users can now run emerging analytical operations built on MapReduce and Spark on the same data files stored within the POSIX file system without having to make additional copies. This presentation will discuss the architecture of this system and present benchmark performance measurements ranging from traditional TeraSort and Wordcount to large-scale climate analytical operations on NetCDF data.
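
    The analytical pattern described can be sketched generically as a MapReduce-style aggregation over chunked gridded data (an illustration of the pattern only, not NCCS/DASS code; the arrays stand in for NetCDF slabs):

        # Map partial sums over time chunks of a gridded variable, reduce to a mean.
        import numpy as np
        from functools import reduce

        chunks = [np.random.default_rng(i).normal(288.0, 5.0, size=(30, 90, 180))
                  for i in range(4)]              # four months of daily fields

        def map_chunk(a):                         # per-chunk partial aggregate
            return (a.sum(dtype=np.float64), a.size)

        def reduce_pair(x, y):                    # combine partial aggregates
            return (x[0] + y[0], x[1] + y[1])

        total, count = reduce(reduce_pair, map(map_chunk, chunks))
        print(f"global mean = {total / count:.3f} K over {count} samples")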

  1. Optimal estimation and scheduling in aquifer management using the rapid feedback control method

    NASA Astrophysics Data System (ADS)

    Ghorbanidehno, Hojat; Kokkinaki, Amalia; Kitanidis, Peter K.; Darve, Eric

    2017-12-01

    Management of water resources systems often involves a large number of parameters, as in the case of large, spatially heterogeneous aquifers, and a large number of "noisy" observations, as in the case of pressure observation in wells. Optimizing the operation of such systems requires both searching among many possible solutions and utilizing new information as it becomes available. However, the computational cost of this task increases rapidly with the size of the problem to the extent that textbook optimization methods are practically impossible to apply. In this paper, we present a new computationally efficient technique as a practical alternative for optimally operating large-scale dynamical systems. The proposed method, which we term Rapid Feedback Controller (RFC), provides a practical approach for combined monitoring, parameter estimation, uncertainty quantification, and optimal control for linear and nonlinear systems with a quadratic cost function. For illustration, we consider the case of a weakly nonlinear uncertain dynamical system with a quadratic objective function, specifically a two-dimensional heterogeneous aquifer management problem. To validate our method, we compare our results with the linear quadratic Gaussian (LQG) method, which is the basic approach for feedback control. We show that the computational cost of the RFC scales only linearly with the number of unknowns, a great improvement compared to the basic LQG control with a computational cost that scales quadratically. We demonstrate that the RFC method can obtain the optimal control values at a greatly reduced computational cost compared to the conventional LQG algorithm with small and controllable losses in the accuracy of the state and parameter estimation.

  2. Performance analysis on a large scale borehole ground source heat pump in Tianjin cultural centre

    NASA Astrophysics Data System (ADS)

    Yin, Baoquan; Wu, Xiaoting

    2018-02-01

    In this paper, the temperature distribution of the geothermal field for a vertical borehole ground-coupled heat pump was tested and analysed. Besides the borehole ground-coupled heat pump, the system comprised ice storage, a heat supply network, and a cooling tower. According to nearly three years of operating data, the constant-temperature zone lies at ground depths of 40 m to 120 m, with a temperature gradient of about 3.0°C/100 m. The soil temperature dropped significantly in the heating season, increased significantly in the cooling season, and recovered in the transitional season. Owing to the balanced design of the heating and cooling loads and the thermal inertia of the soil, the soil temperature stayed within a relatively stable range and the ground source heat pump system operated at relatively high efficiency. The geothermal source heat pump was shown to be applicable for large-scale utilization.

  3. A scalable and operationally simple radical trifluoromethylation

    PubMed Central

    Beatty, Joel W.; Douglas, James J.; Cole, Kevin P.; Stephenson, Corey R. J.

    2015-01-01

    The large number of reagents that have been developed for the synthesis of trifluoromethylated compounds is a testament to the importance of the CF3 group as well as the associated synthetic challenge. Current state-of-the-art reagents for appending the CF3 functionality directly are highly effective; however, their use on preparative scale has minimal precedent because they require multistep synthesis for their preparation and/or are prohibitively expensive for large-scale application. For a scalable trifluoromethylation methodology, trifluoroacetic acid and its anhydride represent an attractive solution in terms of cost and availability; however, because of the exceedingly high oxidation potential of trifluoroacetate, previous endeavours to use this material as a CF3 source have required highly forcing conditions. Here we report a strategy for the use of trifluoroacetic anhydride in a scalable and operationally simple trifluoromethylation reaction, using pyridine N-oxide and photoredox catalysis to effect a facile decarboxylation to the CF3 radical. PMID:26258541

  4. Rucio, the next-generation Data Management system in ATLAS

    NASA Astrophysics Data System (ADS)

    Serfon, C.; Barisits, M.; Beermann, T.; Garonne, V.; Goossens, L.; Lassnig, M.; Nairz, A.; Vigne, R.; ATLAS Collaboration

    2016-04-01

    Rucio is the next-generation Distributed Data Management (DDM) system, benefiting from recent advances in cloud and "Big Data" computing to address the scaling requirements of HEP experiments. Rucio is an evolution of the ATLAS DDM system Don Quixote 2 (DQ2), which has demonstrated very large scale data management capabilities, with more than 160 petabytes spread worldwide across 130 sites and accesses from 1,000 active users. However, DQ2 is reaching its limits in terms of scalability: it requires a large support staff to operate and is hard to extend with new technologies. Rucio addresses these issues by relying on new technologies to ensure system scalability, covering new user requirements, and employing a new automation framework to reduce operational overheads. This paper presents the key concepts of Rucio, details its design and the technology it employs, describes the tests that were conducted to validate it, and finally outlines the migration steps from DQ2 to Rucio.

  5. A Rapid Process for Fabricating Gas Sensors

    PubMed Central

    Hsiao, Chun-Ching; Luo, Li-Siang

    2014-01-01

    Zinc oxide (ZnO) is a low-toxicity and environmentally friendly material applied in devices, sensors, and actuators for "green" usage. A porous ZnO film deposited by the rapid process of aerosol deposition (AD) was employed as the gas-sensitive material in a CO gas sensor, both to reduce manufacturing cost and time and to extend the application of AD to large-scale production. The relative resistance change (ΔR/R) of the ZnO gas sensor was used for gas measurement. The fabricated ZnO gas sensors were tested at operating temperatures ranging from 110 °C to 180 °C and CO concentrations ranging from 100 ppm to 1000 ppm. The sensitivity and response time showed good performance at increasing operating temperatures and CO concentrations. AD was successfully applied to making ZnO gas sensors, with great potential for achieving high deposition rates at low deposition temperatures, large-scale production, and low cost. PMID:25010696

  6. Effects of Animal Feeding Operations on Water Resources and the Environment

    DTIC Science & Technology

    2000-01-01

    and others tested swine feed and feed ingredients (grain, soybean meal, milk/whey, fats/oils, and protein products). The most frequent serotype... Swine Hepatitis E Virus (sHEV) is a recently discovered virus endemic to Midwest hog herds. The proposed zoonotic nature of Asian strains of human HEV... ground and surface water proximal to large-scale swine operations. We identified chemical pollutants and zoonotic pathogens in the environment on

  7. The construction of power grid operation index system considering the risk of maintenance

    NASA Astrophysics Data System (ADS)

    Tang, Jihong; Wang, Canlin; Jiang, Xinfan; Ye, Jianhui; Pan, Feilai

    2018-02-01

    In recent years, large-scale blackouts at home and abroad have caused widespread concern about grid operation, and maintenance risk is an important indicator of grid safety. During power grid overhaul, circuit breakers are switched to create isolation barriers, and operating different barriers significantly changes the power flow, thus affecting the safe operation of the system. However, most existing index systems for evaluating grid operating status do not consider maintenance risk. To this end, this paper builds a power grid operation index system that accounts for maintenance risk from four angles: security, economy, quality, and cleanliness.

  8. Perturbation theory for cosmologies with nonlinear structure

    NASA Astrophysics Data System (ADS)

    Goldberg, Sophia R.; Gallagher, Christopher S.; Clifton, Timothy

    2017-11-01

    The next generation of cosmological surveys will operate over unprecedented scales, and will therefore provide exciting new opportunities for testing general relativity. The standard method for modelling the structures that these surveys will observe is to use cosmological perturbation theory for linear structures on horizon-sized scales, and Newtonian gravity for nonlinear structures on much smaller scales. We propose a two-parameter formalism that generalizes this approach, thereby allowing interactions between large and small scales to be studied in a self-consistent and well-defined way. This uses both post-Newtonian gravity and cosmological perturbation theory, and can be used to model realistic cosmological scenarios including matter, radiation and a cosmological constant. We find that the resulting field equations can be written as a hierarchical set of perturbation equations. At leading order, these equations allow us to recover a standard set of Friedmann equations, as well as a Newton-Poisson equation for the inhomogeneous part of the Newtonian energy density in an expanding background. For the perturbations in the large-scale cosmology, however, we find that the field equations are sourced by both nonlinear and mode-mixing terms, due to the existence of small-scale structures. These extra terms should be expected to give rise to new gravitational effects, through the mixing of gravitational modes on small and large scales—effects that are beyond the scope of standard linear cosmological perturbation theory. We expect our formalism to be useful for accurately modeling gravitational physics in universes that contain nonlinear structures, and for investigating the effects of nonlinear gravity in the era of ultra-large-scale surveys.

  9. Quantum implications of a scale invariant regularization

    NASA Astrophysics Data System (ADS)

    Ghilencea, D. M.

    2018-04-01

    We study scale invariance at the quantum level in a perturbative approach. For a scale-invariant classical theory, the scalar potential is computed at three-loop level while keeping this symmetry manifest. Spontaneous scale symmetry breaking is transmitted at the quantum level to the visible sector (of ϕ) by the associated Goldstone mode (dilaton σ), which enables a scale-invariant regularization and whose vacuum expectation value ⟨σ⟩ generates the subtraction scale (μ). While the hidden (σ) and visible (ϕ) sectors are classically decoupled in d = 4 due to an enhanced Poincaré symmetry, they interact through (a series of) evanescent couplings ∝ ε, dictated by the scale invariance of the action in d = 4 - 2ε. At the quantum level, these couplings generate new corrections to the potential, as scale-invariant nonpolynomial effective operators ϕ^{2n+4}/σ^{2n}. These are comparable in size to "standard" loop corrections and are important for values of ϕ close to ⟨σ⟩. For n = 1, 2, the beta functions of their coefficients are computed at three loops. In the IR limit, dilaton fluctuations decouple, the effective operators are suppressed by large ⟨σ⟩, and the effective potential becomes that of a renormalizable theory with explicit scale symmetry breaking by the DR scheme (of μ = constant).

  10. Inferring cortical function in the mouse visual system through large-scale systems neuroscience.

    PubMed

    Hawrylycz, Michael; Anastassiou, Costas; Arkhipov, Anton; Berg, Jim; Buice, Michael; Cain, Nicholas; Gouwens, Nathan W; Gratiy, Sergey; Iyer, Ramakrishnan; Lee, Jung Hoon; Mihalas, Stefan; Mitelut, Catalin; Olsen, Shawn; Reid, R Clay; Teeter, Corinne; de Vries, Saskia; Waters, Jack; Zeng, Hongkui; Koch, Christof

    2016-07-05

    The scientific mission of Project MindScope is to understand neocortex, the part of the mammalian brain that gives rise to perception, memory, intelligence, and consciousness. We seek to quantitatively evaluate the hypothesis that neocortex is a relatively homogeneous tissue, with smaller functional modules that perform a common computational function replicated across regions. We here focus on the mouse as a mammalian model organism with genetics, physiology, and behavior that can be readily studied and manipulated in the laboratory. We seek to describe the operation of cortical circuitry at the computational level by comprehensively cataloging and characterizing its cellular building blocks along with their dynamics and their cell type-specific connectivities. The project is also building large-scale experimental platforms (i.e., brain observatories) to record the activity of large populations of cortical neurons in behaving mice subject to visual stimuli. A primary goal is to understand the series of operations from visual input in the retina to behavior by observing and modeling the physical transformations of signals in the corticothalamic system. We here focus on the contribution that computer modeling and theory make to this long-term effort.

  11. The Explorer of Diffuse Galactic Emission (EDGE): Determining the Large-Scale Structure Evolution in the Universe

    NASA Technical Reports Server (NTRS)

    Silverberg, R. F.; Cheng, E. S.; Cottingham, D. A.; Fixsen, D. J.; Meyer, S. S.; Knox, L.; Timbie, P.; Wilson, G.

    2003-01-01

    Measurements of the large-scale anisotropy of the Cosmic Infrared Background (CIB) can be used to determine the characteristics of the distribution of galaxies at the largest spatial scales. With this information, important tests of galaxy evolution models and primordial structure growth are possible. In this paper, we describe the scientific goals, instrumentation, and operation of EDGE, a mission using an Antarctic Long Duration Balloon (LDB) platform. EDGE will observe the anisotropy in the CIB in 8 spectral bands from 270 GHz to 1.5 THz with 6 arcminute angular resolution over a region of ~400 square degrees. EDGE uses a one-meter-class off-axis telescope and an array of Frequency Selective Bolometers (FSBs) to provide the compact and efficient multi-color, high-sensitivity radiometer required to achieve its scientific objectives.

  12. A study of the viability of exploiting memory content similarity to improve resilience to memory errors

    DOE PAGES

    Levy, Scott; Ferreira, Kurt B.; Bridges, Patrick G.; ...

    2014-12-09

    Building the next generation of extreme-scale distributed systems will require overcoming several challenges related to system resilience. As the number of processors in these systems grows, the failure rate increases proportionally. One of the most common sources of failure in large-scale systems is memory. In this paper, we propose a novel runtime for transparently exploiting memory content similarity to improve system resilience by reducing the rate at which memory errors lead to node failure. We evaluate the viability of this approach by examining memory snapshots collected from eight high-performance computing (HPC) applications and two important HPC operating systems. Based on the characteristics of the similarity uncovered, we conclude that our proposed approach shows promise for addressing system resilience in large-scale systems.

  13. Linear-scaling density-functional simulations of charged point defects in Al2O3 using hierarchical sparse matrix algebra.

    PubMed

    Hine, N D M; Haynes, P D; Mostofi, A A; Payne, M C

    2010-09-21

    We present calculations of formation energies of defects in an ionic solid (Al2O3) extrapolated to the dilute limit, corresponding to a simulation cell of infinite size. The large-scale calculations required for this extrapolation are enabled by developments in the approach to parallel sparse matrix algebra operations, which are central to linear-scaling density-functional theory calculations. The computational cost of manipulating sparse matrices, whose sizes are determined by the large number of basis functions present, is greatly improved with this new approach. We present details of the sparse algebra scheme implemented in the ONETEP code using hierarchical sparsity patterns, and demonstrate its use in calculations on a wide range of systems, involving thousands of atoms on hundreds to thousands of parallel processes.
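
    As a loose illustration of block-sparse storage and algebra (not the ONETEP hierarchical scheme itself), the sketch below builds a block sparse row (BSR) matrix in SciPy and multiplies it with itself; only stored blocks participate in the product. Dimensions, density, and block size are arbitrary assumptions.

    ```python
    # Toy block-sparse algebra with SciPy's BSR format.
    import numpy as np
    from scipy.sparse import bsr_matrix
    from scipy.sparse import random as sparse_random

    a = sparse_random(1200, 1200, density=0.01, random_state=1, format="csr")
    a_bsr = bsr_matrix(a, blocksize=(20, 20))   # impose a coarse block pattern

    n_blocks_stored = a_bsr.indices.size
    n_blocks_total = (1200 // 20) ** 2
    print("stored blocks:", n_blocks_stored, "of", n_blocks_total)

    b = a_bsr @ a_bsr        # sparse-sparse product over stored blocks only
    print("nnz of product:", b.nnz)
    ```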

  14. Efficient parallelization of analytic bond-order potentials for large-scale atomistic simulations

    NASA Astrophysics Data System (ADS)

    Teijeiro, C.; Hammerschmidt, T.; Drautz, R.; Sutmann, G.

    2016-07-01

    Analytic bond-order potentials (BOPs) provide a way to compute atomistic properties with controllable accuracy. For large-scale computations of heterogeneous compounds at the atomistic level, both the computational efficiency and memory demand of BOP implementations have to be optimized. Since the evaluation of BOPs is a local operation within a finite environment, the parallelization concepts known from short-range interacting particle simulations can be applied to improve the performance of these simulations. In this work, several efficient parallelization methods for BOPs that use three-dimensional domain decomposition schemes are described. The schemes are implemented into the bond-order potential code BOPfox, and their performance is measured in a series of benchmarks. Systems of up to several millions of atoms are simulated on a high performance computing system, and parallel scaling is demonstrated for up to thousands of processors.

  15. [A large-scale accident in Alpine terrain].

    PubMed

    Wildner, M; Paal, P

    2015-02-01

    Due to the geographical conditions, large-scale accidents amounting to mass casualty incidents (MCI) in Alpine terrain regularly present rescue teams with huge challenges. Using an example incident, specific conditions and typical problems associated with such a situation are presented. The first rescue team members to arrive have the elementary tasks of qualified triage and communication to the control room, which is required to dispatch the necessary additional support. Only with a clear "concept", to which all have to adhere, can the subsequent chaos phase be limited. In this respect, a time factor confounded by adverse weather conditions or darkness creates enormous pressure. Additional hazards are frostbite and hypothermia. If priorities can be established in terms of urgency, then treatment and procedure algorithms have proven successful. For evacuation of casualties, helicopter transport should be sought. Due to the low density of hospitals in Alpine regions, it is often necessary to distribute the patients over a wide area. Rescue operations in Alpine terrain have to be performed according to the particular conditions and require rescue teams to have specific knowledge and expertise. The possibility of a large-scale accident should be considered when planning events. With respect to optimization of rescue measures, regular training and exercises are rational, as is the analysis of previous large-scale Alpine accidents.

  16. Large-scale magnetic topologies of early M dwarfs

    NASA Astrophysics Data System (ADS)

    Donati, J.-F.; Morin, J.; Petit, P.; Delfosse, X.; Forveille, T.; Aurière, M.; Cabanac, R.; Dintrans, B.; Fares, R.; Gastine, T.; Jardine, M. M.; Lignières, F.; Paletou, F.; Ramirez Velez, J. C.; Théado, S.

    2008-10-01

    We present here additional results of a spectropolarimetric survey of a small sample of stars ranging from spectral type M0 to M8 aimed at investigating observationally how dynamo processes operate in stars on both sides of the full convection threshold (spectral type M4). The present paper focuses on early M stars (M0-M3), that is above the full convection threshold. Applying tomographic imaging techniques to time series of rotationally modulated circularly polarized profiles collected with the NARVAL spectropolarimeter, we determine the rotation period and reconstruct the large-scale magnetic topologies of six early M dwarfs. We find that early-M stars preferentially host large-scale fields with dominantly toroidal and non-axisymmetric poloidal configurations, along with significant differential rotation (and long-term variability); only the lowest-mass star of our subsample is found to host an almost fully poloidal, mainly axisymmetric large-scale field resembling those found in mid-M dwarfs. This abrupt change in the large-scale magnetic topologies of M dwarfs (occurring at spectral type M3) has no related signature on X-ray luminosities (measuring the total amount of magnetic flux); it thus suggests that underlying dynamo processes become more efficient at producing large-scale fields (despite producing the same flux) at spectral types later than M3. We suspect that this change relates to the rapid decrease in the radiative cores of low-mass stars and to the simultaneous sharp increase of the convective turnover times (with decreasing stellar mass) that models predict to occur at M3; it may also be (at least partly) responsible for the reduced magnetic braking reported for fully convective stars. Based on observations obtained at the Télescope Bernard Lyot (TBL), operated by the Institut National des Sciences de l'Univers of the Centre National de la Recherche Scientifique of France.

  17. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Katti, Amogh; Di Fatta, Giuseppe; Naughton, Thomas

    Future extreme-scale high-performance computing systems will be required to work under frequent component failures. The MPI Forum's User Level Failure Mitigation proposal has introduced an operation, MPI_Comm_shrink, to synchronize the alive processes on the list of failed processes, so that applications can continue to execute even in the presence of failures by adopting algorithm-based fault tolerance techniques. This MPI_Comm_shrink operation requires a failure detection and consensus algorithm. This paper presents three novel failure detection and consensus algorithms using gossiping. The proposed algorithms were implemented and tested using the Extreme-scale Simulator. The results show that in all algorithms the number of gossip cycles to achieve global consensus scales logarithmically with system size. The second algorithm also shows better scalability in terms of memory and network bandwidth usage and a perfect synchronization in achieving global consensus. The third approach is a three-phase distributed failure detection and consensus algorithm and provides consistency guarantees even in very large and extreme-scale systems while at the same time being memory and bandwidth efficient.
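
    The logarithmic scaling of gossip dissemination is easy to see in simulation. Below is a minimal push-gossip sketch (illustrative only; it is not one of the paper's three algorithms): one node detects a failure, and in each cycle every node pushes its failed-process list to one random peer.

    ```python
    # Minimal push-gossip dissemination of a failed-process list.
    import math
    import random

    def gossip_cycles(n_alive, failed, seed=0):
        random.seed(seed)
        # knowledge[i] = set of failed ranks that node i currently knows about
        knowledge = [set() for _ in range(n_alive)]
        knowledge[0] = set(failed)          # one node detects the failures
        cycles = 0
        while any(k != set(failed) for k in knowledge):
            cycles += 1
            for i in range(n_alive):        # each node pushes to a random peer
                j = random.randrange(n_alive)
                knowledge[j] |= knowledge[i]
        return cycles

    for n in (64, 256, 1024):
        print(n, gossip_cycles(n, failed={7}), "cycles; log2(n) =",
              round(math.log2(n), 1))
    ```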

  18. Large-scale brain networks in affective and social neuroscience: Towards an integrative functional architecture of the brain

    PubMed Central

    Barrett, Lisa Feldman; Satpute, Ajay

    2013-01-01

    Understanding how a human brain creates a human mind ultimately depends on mapping psychological categories and concepts to physical measurements of neural response. Although it has long been assumed that emotional, social, and cognitive phenomena are realized in the operations of separate brain regions or brain networks, we demonstrate that it is possible to understand the body of neuroimaging evidence using a framework that relies on domain general, distributed structure-function mappings. We review current research in affective and social neuroscience and argue that the emerging science of large-scale intrinsic brain networks provides a coherent framework for a domain-general functional architecture of the human brain. PMID:23352202

  19. Tuneable diode laser gas analyser for methane measurements on a large scale solid oxide fuel cell

    NASA Astrophysics Data System (ADS)

    Lengden, Michael; Cunningham, Robert; Johnstone, Walter

    2011-10-01

    A new in-line, real time gas analyser is described that uses tuneable diode laser spectroscopy (TDLS) for the measurement of methane in solid oxide fuel cells. The sensor has been tested on an operating solid oxide fuel cell (SOFC) in order to prove the fast response and accuracy of the technology as compared to a gas chromatograph. The advantages of using a TDLS system for process control in a large-scale, distributed power SOFC unit are described. In future work, the addition of new laser sources and wavelength modulation will allow the simultaneous measurement of methane, water vapour, carbon-dioxide and carbon-monoxide concentrations.

  20. Local Fitting of the Kohn-Sham Density in a Gaussian and Plane Waves Scheme for Large-Scale Density Functional Theory Simulations.

    PubMed

    Golze, Dorothea; Iannuzzi, Marcella; Hutter, Jürg

    2017-05-09

    A local resolution-of-the-identity (LRI) approach is introduced in combination with the Gaussian and plane waves (GPW) scheme to enable large-scale Kohn-Sham density functional theory calculations. In GPW, the computational bottleneck is typically the description of the total charge density on real-space grids. Introducing the LRI approximation, the linear scaling of the GPW approach with respect to system size is retained, while the prefactor for the grid operations is reduced. The density fitting is an O(N) scaling process implemented by approximating the atomic pair densities by an expansion in one-center fit functions. The computational cost for the grid-based operations becomes negligible in LRIGPW. The self-consistent field iteration is up to 30 times faster for periodic systems, depending on the symmetry of the simulation cell and on the density of grid points. However, due to the overhead introduced by the local density fitting, single-point calculations and complete molecular dynamics steps, including the calculation of the forces, are effectively accelerated by up to a factor of ∼10. The accuracy of LRIGPW is assessed for different systems and properties, showing that total energies, reaction energies, and intramolecular and intermolecular structure parameters are well reproduced. LRIGPW also yields high-quality results for extended condensed phase systems such as liquid water, ice XV, and molecular crystals.

  1. Optimizing C-17 Pacific Basing

    DTIC Science & Technology

    2014-05-01

    Starlifter and C-5 Galaxy turbofan-powered aircraft (Owen, 2014). Pax Americana was characterized by a busy and steady operating environment for US global... center of the Pacific as well as the north and western Pacific rim (Owen, 2014). Turbofan airlifters began to be integrated into the large-scale

  2. Emergent Network Defense

    ERIC Educational Resources Information Center

    Crane, Earl Newell

    2013-01-01

    The research problem that inspired this effort is the challenge of managing the security of systems in large-scale heterogeneous networked environments. Human intervention is slow and limited: humans operate at much slower speeds than networked computer communications and there are few humans associated with each network. Enabling each node in the…

  3. Test and Evaluation of Architecture-Aware Compiler Environment

    DTIC Science & Technology

    2011-11-01

    biology, medicine, social sciences, and security applications. Challenges include extremely large graphs (the Facebook friend network has over... Operations with Temporal Binning... Memory behavior and Energy per... five challenge problems empirically, exploring their scaling properties, computation and datatype needs, memory behavior, and temporal behavior

  4. Modelling dispersal of radioactive contaminants in Arctic waters as a result of potential recovery operations on the dumped submarine K-27.

    PubMed

    Karcher, M; Hosseini, A; Schnur, R; Kauker, F; Brown, J E; Dowdall, M; Strand, P

    2017-03-15

    Of the wide variety of dumped objects containing radioactive materials in the Arctic seas, the submarine K-27 constitutes a major risk due to the large amount of highly enriched uranium onboard and its location in shallow waters. As potential operations to raise the submarine have entered the public arena, a priori assessment of the contamination of the Arctic marine environment that could result from a possible accident during such operations is a matter of some interest. The dispersion of contaminants within the Arctic has been assessed using a large-scale hydrodynamic model for a series of plausible accident scenarios and locations under different oceanographic regimes. Results indicate that, depending primarily on the nature of a release (i.e. instantaneous or continuous), large areas of the Arctic marine environment will exhibit contamination to varying degrees.

  5. Reducing Risk in CO2 Sequestration: A Framework for Integrated Monitoring of Basin Scale Injection

    NASA Astrophysics Data System (ADS)

    Seto, C. J.; Haidari, A. S.; McRae, G. J.

    2009-12-01

    Geological sequestration of CO2 is an option for stabilization of atmospheric CO2 concentrations. The technical ability to safely store CO2 in the subsurface has been demonstrated through pilot projects and a long history of enhanced oil recovery and acid gas disposal operations. To address climate change, current injection operations must be scaled up by a factor of 100, raising issues of safety and security. Monitoring and verification is an essential component in ensuring safe operations and managing risk. Monitoring provides assurance that CO2 is securely stored in the subsurface and that the mechanisms governing transport and storage are well understood. It also provides an early warning mechanism for identification of anomalies in performance, and a means for intervention and remediation through the ability to locate the CO2. Through theoretical studies, bench-scale experiments, and pilot tests, a number of technologies have demonstrated their ability to monitor CO2 in the surface and subsurface. Because the focus of these studies has been to demonstrate feasibility, individual techniques have not been integrated to provide a more robust method for monitoring. Considering the large injection volumes, the size of the potential footprint, the length of time a project must be monitored, and the attendant uncertainty, operational considerations of cost and risk must be balanced against safety and security. Integration of multiple monitoring techniques will reduce uncertainty in monitoring injected CO2, thereby reducing risk. We present a framework for risk management of large-scale injection through model-based monitoring network design. This framework is applied to monitoring CO2 in a synthetic reservoir where there is uncertainty in the underlying permeability field controlling fluid migration. Deformation and seismic data are used to track plume migration. A modified Ensemble Kalman filter approach is used to estimate flow properties by jointly assimilating flow and geomechanical observations. Issues of risk, cost, and uncertainty are considered.
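
    The joint assimilation step can be illustrated with a minimal ensemble Kalman filter (EnKF) analysis in NumPy. The state, the stacked flow/geomechanical observation operator H, and all dimensions below are toy assumptions, not the paper's modified EnKF.

    ```python
    # Minimal perturbed-observation EnKF analysis step (toy dimensions).
    import numpy as np

    rng = np.random.default_rng(3)
    n, m, N = 50, 5, 100                  # state dim, obs dim, ensemble size
    X = rng.standard_normal((n, N))       # prior ensemble (e.g., log-permeability)
    H = rng.standard_normal((m, n)) / n   # stacked flow + geomechanics operator
    R = 0.1 * np.eye(m)                   # observation-error covariance
    y = rng.standard_normal(m)            # stacked observation vector

    A = X - X.mean(axis=1, keepdims=True)                           # anomalies
    S = H @ A
    K = (A @ S.T / (N - 1)) @ np.linalg.inv(S @ S.T / (N - 1) + R)  # gain

    # Update every member against a perturbed copy of the observations.
    Y = y[:, None] + rng.multivariate_normal(np.zeros(m), R, N).T
    X_post = X + K @ (Y - H @ X)
    print(X_post.mean(axis=1)[:5])
    ```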

  6. Efficient methods and readily customizable libraries for managing complexity of large networks.

    PubMed

    Dogrusoz, Ugur; Karacelik, Alper; Safarli, Ilkin; Balci, Hasan; Dervishi, Leonard; Siper, Metin Can

    2018-01-01

    One common problem in visualizing real-life networks, including biological pathways, is the large size of these networks. Oftentimes, users find themselves facing slow, non-scaling operations due to network size, if not a "hairball" network, hindering effective analysis. One extremely useful method for reducing the complexity of large networks is the use of hierarchical clustering and nesting, applying expand-collapse operations on demand during analysis. Another such method is hiding currently unnecessary details, to be gradually revealed on demand. Major challenges when applying complexity reduction operations to large networks include efficiency and maintaining the user's mental map of the drawing. We developed specialized incremental layout methods for preserving a user's mental map while managing the complexity of large networks through expand-collapse and hide-show operations. We also developed open-source JavaScript libraries as plug-ins to the web-based graph visualization library Cytoscape.js to implement these methods as complexity management operations. Through efficient specialized algorithms provided by these extensions, one can collapse or hide desired parts of a network, yielding potentially much smaller networks and making them more suitable for interactive visual analysis. This work fills an important gap by making efficient implementations of some already known complexity management techniques freely available to tool developers through a couple of open-source, customizable software libraries, and by introducing some heuristics that can be applied on top of such complexity management techniques to preserve users' mental maps.
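
    The core of an expand-collapse operation is replacing a cluster with a meta-node while preserving its edges to the rest of the graph. The sketch below shows that idea in Python with networkx as a stand-in; it is not the Cytoscape.js extensions' implementation and omits layout and mental-map preservation.

    ```python
    # Collapse a node cluster into a single meta-node, keeping outside edges.
    import networkx as nx

    def collapse(g, cluster_nodes, meta_name):
        h = g.copy()
        h.add_node(meta_name, collapsed=sorted(cluster_nodes))
        for u in cluster_nodes:
            for v in g.neighbors(u):
                if v not in cluster_nodes:    # keep edges leaving the cluster
                    h.add_edge(meta_name, v)
            h.remove_node(u)
        return h

    g = nx.karate_club_graph()
    small = collapse(g, {0, 1, 2, 3, 7, 13}, "cluster-A")
    print(g.number_of_nodes(), "->", small.number_of_nodes())
    ```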

  7. Financing a large-scale picture archival and communication system.

    PubMed

    Goldszal, Alberto F; Bleshman, Michael H; Bryan, R Nick

    2004-01-01

    An attempt to finance a large-scale multi-hospital picture archival and communication system (PACS) solely from cost savings on current film operations is reported. A modified Request for Proposal described the technical requirements, PACS architecture, and performance targets. The Request for Proposal was complemented by a set of desired financial goals, the main one being the ability to use film savings to pay for the implementation and operation of the PACS. Financing of the enterprise-wide PACS was completed through an operating lease agreement covering all PACS equipment, implementation, service, and support for an 8-year term, much like a complete outsourcing. Equipment refreshes, both hardware and software, are included. Our agreement also linked the management of the digital imaging operation (PACS) with traditional film printing, shifting the operational risks of continued printing and the costs related to implementation delays to the PACS vendor. An additional optimization step eliminated the negative film budget variances at the beginning of the project, when PACS costs tend to be higher than film and film-related expenses. An enterprise-wide PACS has been adopted to achieve clinical workflow improvements and cost savings. PACS financing was based solely on film savings, which covered the entire digital solution (PACS) and any residual film printing. These goals were achieved while eliminating any over-budget scenarios, providing a non-negative cash flow in each year of the 8-year term.

  8. Statistical Characterization of School Bus Drive Cycles Collected via Onboard Logging Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Duran, A.; Walkowicz, K.

    In an effort to characterize the dynamics typical of school bus operation, National Renewable Energy Laboratory (NREL) researchers set out to gather in-use duty cycle data from school bus fleets operating across the country. Employing a combination of Isaac Instruments GPS/CAN data loggers in conjunction with existing onboard telemetric systems resulted in the capture of operating information for more than 200 individual vehicles in three geographically unique domestic locations. In total, over 1,500 individual operational route shifts from Washington, New York, and Colorado were collected. Upon completing the collection of in-use field data using either NREL-installed data acquisition devices or existing onboard telemetry systems, large-scale duty-cycle statistical analyses were performed to examine underlying vehicle dynamics trends within the data and to explore vehicle operation variations between fleet locations. Based on the results of these analyses, high, low, and average vehicle dynamics requirements were determined, resulting in the selection of representative standard chassis dynamometer test cycles for each condition. In this paper, the methodology and accompanying results of the large-scale duty-cycle statistical analysis are presented, including graphical and tabular representations of a number of relationships between key duty-cycle metrics observed within the larger data set. In addition to presenting the results of this analysis, conclusions are drawn and presented regarding potential applications of advanced vehicle technology as it relates specifically to school buses.

  9. Reindeer habitat use in relation to two small wind farms, during preconstruction, construction, and operation.

    PubMed

    Skarin, Anna; Alam, Moudud

    2017-06-01

    Worldwide there is a rush toward wind power development and its associated infrastructure. In Fennoscandia, large-scale wind farms comprising several hundred windmills are currently being built in important grazing ranges used for Sámi reindeer husbandry. In this study, reindeer habitat use was assessed using reindeer fecal pellet group counts in relation to two relatively small wind farms, with 8 and 10 turbines, respectively. In 2009, 1,315 15-m² plots were established, and pellet groups were counted and cleared from the plots. This was repeated once a year in May, during preconstruction, construction, and operation of the wind farms, covering 6 years (2009-2014) of reindeer habitat use in the area. We modeled the presence/absence of any pellets in a plot at both the local (wind farm site) and regional (reindeer calving to autumn range) scale with a hierarchical logistic regression, where spatial correlation was accounted for via random effects, using vegetation type and the interaction between distance to wind turbine and time period as predictor variables. Our results revealed an absolute reduction in pellet groups of 66% and 86% around the two wind farms, respectively, at the local scale, and of 61% at the regional scale, during the operation phase compared to the preconstruction phase. At the regional scale, habitat use declined close to the turbines in the same comparison. However, at the local scale, we observed increased habitat use close to the wind turbines at one of the wind farms during the operation phase. This may be explained by continued use of an important migration route close to the wind farm. The reduced use at the regional scale nevertheless suggests that there may be an overall avoidance of both wind farms during operation, but further studies of reindeer movement and behavior are needed to gain a better understanding of the mechanisms behind this suggested avoidance.
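
    A minimal stand-in for the habitat-use model is sketched below: a logistic regression of pellet presence/absence on vegetation type and a distance-by-period interaction, fitted on simulated data with statsmodels. The random spatial effects of the original hierarchical model are omitted, and all variable names and effect sizes are assumptions.

    ```python
    # Toy presence/absence model: present ~ veg + dist_km * period.
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(42)
    n = 1315
    df = pd.DataFrame({
        "dist_km": rng.uniform(0, 10, n),
        "period": rng.choice(["pre", "construction", "operation"], n),
        "veg": rng.choice(["heath", "forest", "mire"], n),
    })
    # Simulate reduced use near turbines during the operation phase.
    logit = -0.5 + 0.15 * df.dist_km * (df.period == "operation")
    df["present"] = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

    model = smf.logit("present ~ veg + dist_km * C(period)", data=df).fit(disp=0)
    print(model.params.round(2))
    ```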

  10. Trickle-bed root culture bioreactor design and scale-up: growth, fluid-dynamics, and oxygen mass transfer.

    PubMed

    Ramakrishnan, Divakar; Curtis, Wayne R

    2004-10-20

    Trickle-bed root culture reactors are shown to achieve tissue concentrations as high as 36 g DW/L (752 g FW/L) at a scale of 14 L. Root growth rate in a 1.6-L reactor configuration with improved operational conditions is shown to be indistinguishable from the laboratory-scale benchmark, the shaker flask (μ = 0.33 day⁻¹). These results demonstrate that trickle-bed reactor systems can sustain tissue concentrations, growth rates and volumetric biomass productivities substantially higher than other reported bioreactor configurations. Mass transfer and fluid dynamics are characterized in trickle-bed root reactors to identify appropriate operating conditions and scale-up criteria. Root tissue respiration goes through a minimum with increasing liquid flow, which is qualitatively consistent with traditional trickle-bed performance. However, liquid hold-up is much higher than in traditional trickle-beds, and alternative correlations based on liquid hold-up per unit tissue mass are required to account for large changes in biomass volume fraction. Bioreactor characterization is sufficient to carry out preliminary design calculations that indicate scale-up feasibility to at least 10,000 liters.

  11. Experimental Study on Scale-Up of Solid-Liquid Stirred Tank with an Intermig Impeller

    NASA Astrophysics Data System (ADS)

    Zhao, Hongliang; Zhao, Xing; Zhang, Lifeng; Yin, Pan

    2017-02-01

    The scale-up of a solid-liquid stirred tank with an Intermig impeller was characterized via experiments. Solid concentration, impeller just-off-bottom speed, and power consumption were measured in stirred tanks of different scales. The scale-up criteria for achieving the same effect of solid suspension in small-scale and large-scale vessels were evaluated. The solids distribution improves if the operating conditions are held constant as the tank is scaled up. The measurements of impeller just-off-bottom speed gave X = 0.868 in the scale-up relationship N·D^X = constant. Based on this criterion, the stirring power per unit volume clearly decreased at N = N_js, and the power number (N_P) was approximately 0.3 when the solids were uniformly distributed in the vessels.
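
    As a worked example of this criterion, the snippet below solves N·D^X = constant with X = 0.868 for the impeller speed of a geometrically similar larger tank; the speeds and diameters are illustrative assumptions, not values from the study.

    ```python
    # Scale-up of impeller speed under N * D**X = constant, with X = 0.868.
    X = 0.868
    N_small, D_small = 300.0, 0.15    # rpm, m (assumed small-scale conditions)
    D_large = 0.60                    # m, a 4x geometric scale-up

    N_large = N_small * (D_small / D_large) ** X
    print(f"N_large = {N_large:.0f} rpm")   # ~90 rpm
    ```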

  12. Vacuum Stability in Split SUSY and Little Higgs Models

    NASA Astrophysics Data System (ADS)

    Datta, Alakabha; Zhang, Xinmin

    We study the stability of the effective Higgs potential in split supersymmetry and Little Higgs models. In particular, we study the effects of higher-dimensional operators in the effective potential on the Higgs mass predictions. We find that the size and sign of the higher-dimensional operators can significantly change the Higgs mass required to maintain vacuum stability in Split SUSY models. In Little Higgs models the effects of higher-dimensional operators can be large because of the relatively low cutoff scale. Working with a specific model, we find that a contribution from a higher-dimensional operator with a coefficient of O(1) can destabilize the vacuum.

  13. Modelling and operation strategies of DLR's large scale thermocline test facility (TESIS)

    NASA Astrophysics Data System (ADS)

    Odenthal, Christian; Breidenbach, Nils; Bauer, Thomas

    2017-06-01

    In this work an overview of the TESIS:store thermocline test facility and its current construction status will be given. On this basis, the TESIS:store facility, which uses a sensible solid filler material, is modelled with a fully transient model implemented in MATLAB®. Results on the impact of filler size and of operation strategies will be presented. While low porosity and small particle diameters are beneficial for the filler material, the operation strategy is a key element with potential for optimization. It is shown that plant operators have to weigh utilization against exergetic efficiency. Different durations of the charging and discharging periods offer further potential for optimization.

  14. JELC-LITE: Unconventional Instructional Design for Special Operations Training

    NASA Technical Reports Server (NTRS)

    Friedman, Mark

    2012-01-01

    Current special operations staff training is based on the Joint Event Life Cycle (JELC). It addresses operational-level tasks in multi-week, live military exercises which are planned over a 12- to 18-month timeframe. As the military experiences changing global mission sets, shorter training events using distributed technologies will increasingly be needed to augment traditional training. JELC-Lite is a new approach for providing relevant training between large-scale exercises. This new streamlined, responsive training model uses distributed and virtualized training technologies to establish simulated scenarios. It keeps proficiency levels closer to optimal, thereby reducing the performance degradation inherent in periodic training. It can be delivered to military as well as under-reached interagency groups to facilitate agile, repetitive training events. JELC-Lite is described by four phases paralleling the JELC, differing mostly in scope and scale. It has been successfully used with a Theater Special Operations Command and fits well within the current environment of reduced personnel and financial resources.

  15. Use of Nanostructures in Fabrication of Large Scale Electrochemical Film

    NASA Astrophysics Data System (ADS)

    Chen, Chien Chon; Chen, Shih Hsun; Shyu, Sheang Wen; Hsieh, Sheng Jen

    Control of electrochemical parameters when preparing small-scale samples for academic research is not difficult. In mass-production environments, however, maintaining constant current density and temperature becomes a critical issue. This article describes the design of several molds for large work pieces. These molds were designed to maintain constant current density and to confine electrochemical reactions to designated areas. Large-area thin films with fine nanostructure were successfully prepared using the designed electrochemical molds and containers, while current density and temperature were well controlled. This electrochemical system has been verified in many experimental operations, including etching of Al surfaces; electro-polishing of Al, Ti, and stainless steel; and fabrication of anodic aluminum oxide (AAO), Ti-TiO2 interference membranes, TiO2 nanotubes, AAO-TiO2 nanotubes, Ni nanowires, and porous tungsten.

  16. Plot-scale field experiment of surface hydrologic processes with EOS implications

    NASA Technical Reports Server (NTRS)

    Laymon, Charles A.; Macari, Emir J.; Costes, Nicholas C.

    1992-01-01

    Plot-scale hydrologic field studies were initiated at NASA Marshall Space Flight Center to a) investigate the spatial and temporal variability of surface and subsurface hydrologic processes, particularly as affected by vegetation, and b) develop experimental techniques and associated instrumentation methodology to study hydrologic processes at increasingly large spatial scales. About 150 instruments, most of which are remotely operated, have been installed at the field site to monitor ground atmospheric conditions, precipitation, interception, soil-water status, and energy flux. This paper describes the nature of the field experiment, instrumentation and sampling rationale, and presents preliminary findings.

  17. Particle-In-Cell Simulations of a Thermionic Converter

    NASA Astrophysics Data System (ADS)

    Clark, S. E.

    2017-12-01

    Simulations of thermionic converters are presented in which cesium is used as a work function reducing agent in a nano-fabricated triode configuration. The cathode and anode are spaced on the order of 100 μm, and the grid structure has features on the micron scale near the anode. The hot side is operated near 1600 K, the cold side near 600 K, and the converter has the potential to convert heat to DC electrical current at upwards of 20% efficiency. Affordable and robust thermionic converters have the potential to displace century-old mechanical engines and turbines as a primary means of electrical power generation in the near future. High-efficiency converters that operate at a small scale could be used to generate power locally and alleviate the need for large-scale power transmission systems. Electron and negative cesium ion back emission from the anode are considered, as well as device longevity and fabrication feasibility.

  18. Material System Engineering for Advanced Electrocaloric Cooling Technology

    NASA Astrophysics Data System (ADS)

    Qian, Xiaoshi

    The electrocaloric effect (ECE) refers to the entropy and/or temperature change in dielectrics caused by an electric-field-induced polarization change. The recent discovery of a giant ECE provides an opportunity to realize highly efficient cooling devices for a broad range of applications, from household appliances to industrial applications, and from large-scale building thermal management to micro-scale cooling devices. Advances in electrocaloric (EC) cooling-device prototypes suggest that highly efficient, compact cooling devices are achievable, which could lead to a revolution in next-generation refrigeration technology. This dissertation focuses on both EC materials and cooling devices, with recent advances that address practical issues. Based on a better understanding of EC device design, several EC material systems are studied and improved to raise the performance of EC-based cooling devices. In principle, applying an electric field to a dielectric changes its dipolar ordering state and thus its dipolar entropy. The giant ECE observed in ferroelectrics near the ferroelectric-paraelectric (FE-PE) transition temperature arises from the large field-induced change in dipolar orientation between the randomly oriented dipolar states of the paraelectric phase and the spontaneously ordered dipolar states of the ferroelectric phase. Besides pursuing a large ECE, studies of EC cooling devices indicate that EC materials must possess a wide operational temperature window within which a large ECE is maintained for efficient operation. Although the giant ECE was first predicted in ferroelectric polymers, where the large effect appears near the FE-PE phase transition, the narrow operating temperature window prevents these normal ferroelectrics from performing conveniently across a wide range of applications. In this dissertation, we demonstrate that normal ferroelectric polymers can be converted into relaxor ferroelectric polymers possessing both a giant ECE (a 27 K temperature drop) and a much wider operating temperature window (over 50 K, covering room temperature) by proper defect modification that tailors the ferroelectric at the meso-, micro-, and molecular scales. In addition, to be practical, an EC device requires EC material that achieves a large ECE at low driving fields. It is demonstrated that by facilely modifying the material structure at the meso-, micro-, and molecular scales, the low-field ECE can be greatly improved. A large ECE, induced by low electric fields and present over a wide temperature window, is a major step toward practical EC materials. Besides EC polymers, this thesis also investigates EC ceramics, specifically Ba(Zr_x Ti_{1-x})O3 (BZT), because of several unique opportunities they offer. (i) This class of EC ceramics makes it possible to exploit the invariant critical point (ICP), which maximizes the number of coexisting phases and provides a nearly vanishing energy barrier for switching among them. As demonstrated in this thesis, BZT bulk ceramics at x ≈ 0.2 exhibit a large adiabatic temperature drop ΔT_c = 4.5 K, a large isothermal entropy change ΔS = 8 J kg⁻¹ K⁻¹, and a large EC coefficient (|ΔT_c/ΔE| = 0.52×10⁻⁶ K m V⁻¹ and ΔS/ΔE = 0.93×10⁻⁶ J m kg⁻¹ K⁻¹ V⁻¹) over a wide operating temperature range (T_span > 30 K). (ii) The thermal conductivity of EC ceramics is in general much higher than that of EC polymers, and consequently they allow EC cooling configurations that are not accessible to EC polymers. Moreover, in the same device configuration, the high thermal conductivity of EC ceramics (κ > 5 W/mK, compared with ~0.25 W/mK for EC polymers) permits a higher operating frequency and therefore a higher cooling power. (iii) Well-established fabrication processes for multilayer ceramic capacitors (MLCCs) provide a foundation for moving EC ceramics toward mass production. In this thesis, BZT thick-film double layers have been fabricated and a large ECE directly measured: an EC-induced temperature drop (ΔT) of about 6.3 °C and an entropy change (ΔS) of 11.0 J kg⁻¹ K⁻¹ were observed under a field change of ΔE = 14.6 MV/m at 40 °C. The result encourages further investigation of the ECE in MLCCs for practical applications.

  19. Evaluating Commercial and Private Cloud Services for Facility-Scale Geodetic Data Access, Analysis, and Services

    NASA Astrophysics Data System (ADS)

    Meertens, C. M.; Boler, F. M.; Ertz, D. J.; Mencin, D.; Phillips, D.; Baker, S.

    2017-12-01

    UNAVCO, in its role as a NSF facility for geodetic infrastructure and data, has succeeded for over two decades using on-premises infrastructure, and while the promise of cloud-based infrastructure is well-established, significant questions about the suitability of such infrastructure for facility-scale services remain. Primarily through the GeoSciCloud award from NSF EarthCube, UNAVCO is investigating the costs, advantages, and disadvantages of providing its geodetic data and services in the cloud versus using UNAVCO's on-premises infrastructure. (IRIS is a collaborator on the project and is performing its own suite of investigations.) In contrast to the 2-3 year time scale for the research cycle, the time scale of operation and planning for NSF facilities is a minimum of five years, and for some services it extends to a decade or more. Planning for on-premises infrastructure is deliberate, and migrations typically take months to years to fully implement. Migrations to a cloud environment can only go forward with similar deliberate planning and understanding of all costs and benefits. The EarthCube GeoSciCloud project is intended to address the uncertainties of facility-level operations in the cloud. Investigations are being performed in a commercial cloud environment (Amazon AWS) during the first year of the project and in a private cloud environment (NSF XSEDE resource at the Texas Advanced Computing Center) during the second year. These investigations are expected to illuminate the potential as well as the limitations of running facility-scale production services in the cloud. The work includes running cloud-based services in parallel with their on-premises equivalents: data serving via ftp from a large data store, operation of a metadata database, production-scale processing of multiple months of geodetic data, web services delivery of quality-checked data and products, large-scale compute services for event post-processing, and serving real-time data from a network of 700-plus GPS stations. The evaluation is based on a suite of metrics that we have developed to elucidate the effectiveness of cloud-based services in price, performance, and management. Services are currently running in AWS and evaluation is underway.

  1. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Hong -Yi; Leung, L. Ruby; Tesfa, Teklu

    A new large-scale stream temperature model has been developed within the Community Earth System Model (CESM) framework. The model is coupled with the Model for Scale Adaptive River Transport (MOSART), which represents river routing, and a water management model (WM), which represents the effects of reservoir operations and water withdrawals on flow regulation. The coupled models allow the impacts of reservoir operations and withdrawals on stream temperature to be explicitly represented in a physically based and consistent way. The models have been applied to the contiguous United States, driven by observed meteorological forcing. It is shown that the model is capable of reproducing the spatiotemporal variation of stream temperature satisfactorily, by comparison against observations from over 320 USGS stations. Including water management in the models improves the agreement between the simulated and observed streamflow at a large number of stream gauge stations. Both climate and water management are found to have an important influence on the spatiotemporal patterns of stream temperature. More interestingly, it is quantitatively estimated that reservoir operation could cool stream temperature in the summer low-flow season (August-October) by as much as 1-2 °C over many places, as water management generally mitigates low flow; this has important implications for aquatic ecosystems. Finally, the sensitivity of the simulated stream temperature to input data and to the reservoir operation rules used in the WM model motivates future work to address some limitations of the current modeling framework.

  2. Site Environmental Report for Calendar Year 2000. DOE Operations at The Boeing Company, Rocketdyne Propulsion & Power

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rutherford, Phil; Samuels, Sandy; Lee, Majelle

    2001-09-01

    This Annual Site Environmental Report (ASER) for 2000 describes the environmental conditions related to work performed for the Department of Energy (DOE) at Area IV of the Rocketdyne Santa Susana Field Laboratory (SSFL). In the past, these operations included development, fabrication, and disassembly of nuclear reactors, reactor fuel, and other radioactive materials under the former Atomics International (AI) Division. Other activities included the operation of large-scale liquid metal facilities for testing of liquid metal fast breeder components at the Energy Technology Engineering Center (ETEC), a government-owned, company-operated test facility within Area IV. All nuclear work was terminated in 1988, and subsequently all radiological work has been directed toward decontamination and decommissioning (D&D) of the previously used nuclear facilities and associated site areas. Large-scale D&D activities of the sodium test facilities began in 1996. Results of the radiological monitoring program for the calendar year 2000 continue to indicate no significant releases of radioactive material from Rocketdyne sites. All potential exposure pathways are sampled and/or monitored, including air, soil, surface water, groundwater, direct radiation, transfer of property (land, structures, waste), and recycling. All radioactive wastes are processed for disposal at DOE disposal sites and other sites approved by DOE and licensed for radioactive waste. Liquid radioactive wastes are not released into the environment and do not constitute an exposure pathway.

  3. Decision Support System for Reservoir Management and Operation in Africa

    NASA Astrophysics Data System (ADS)

    Navar, D. A.

    2016-12-01

    Africa is currently experiencing a surge in dam construction for flood control, water supply, and hydropower production, but ineffective reservoir management has caused problems in the region, such as water shortages, flooding, and loss of potential hydropower generation. Our research aims to remedy ineffective reservoir management by developing a novel Decision Support System (DSS) to equip water managers with a technical planning tool based on the state of the art in the hydrological sciences. The DSS incorporates a climate forecast model, a hydraulic model of the watershed, and an optimization model to effectively plan the operation of a cascade of large-scale reservoirs for hydropower production, while treating water supply and flood control as constraints. Our team will use the newly constructed hydropower plants in the Omo Gibe basin of Ethiopia as the test case. Building on the HIDROTERM software developed in Brazil, the optimization model, implemented in the General Algebraic Modeling System (GAMS), uses a combination of linear programming (LP) and non-linear programming (NLP) in conjunction with real-time hydrologic and energy demand data to optimize the monthly and daily operations of the reservoir system. We compare the DSS model results with the current reservoir operating policy used by the water managers of that region. We also hope the DSS will eliminate the dangers currently associated with the mismanagement of large-scale water resources projects in Africa.
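
    The LP side of such a reservoir-operation problem can be sketched in a few lines; the sketch below uses scipy rather than GAMS/HIDROTERM, and all inflows, limits, and the hydropower proxy are hypothetical.

        import numpy as np
        from scipy.optimize import linprog

        # Choose monthly releases r_t to maximize a hydropower proxy sum(r_t),
        # subject to storage continuity s_t = s0 + cum_inflow_t - cum_release_t
        # staying within bounds. Not the HIDROTERM/GAMS formulation.
        T = 6
        inflow = np.array([120.0, 90.0, 60.0, 40.0, 50.0, 110.0])  # hm3/month (hypothetical)
        s0, s_min, s_max = 300.0, 100.0, 500.0                     # storage limits (hm3)
        r_min, r_max = 30.0, 150.0                                 # release limits (hm3/month)

        L = np.tril(np.ones((T, T)))        # lower-triangular cumulative-sum operator
        cum_in = s0 + np.cumsum(inflow)
        # s_min <= cum_in - L r <= s_max  ->  two blocks of inequality rows
        A_ub = np.vstack([L, -L])
        b_ub = np.concatenate([cum_in - s_min, s_max - cum_in])
        res = linprog(c=-np.ones(T), A_ub=A_ub, b_ub=b_ub,
                      bounds=[(r_min, r_max)] * T, method="highs")
        print(res.x)                        # optimal monthly releases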

  4. Programming in a proposed 9X distributed Ada

    NASA Technical Reports Server (NTRS)

    Waldrop, Raymond S.; Volz, Richard A.; Goldsack, Stephen J.

    1990-01-01

    The proposed Ada 9X constructs for distribution were studied. The goal was to select suitable test cases to help in the evaluation of the proposed constructs. The examples were to be considered according to the following requirements: real time operation; fault tolerance at several different levels; demonstration of both distributed and massively parallel operation; reflection of realistic NASA programs; illustration of the issues of configuration, compilation, linking, and loading; indications of the consequences of using the proposed revisions for large scale programs; and coverage of the spectrum of communication patterns such as predictable, bursty, small and large messages. The first month was spent identifying possible examples and judging their suitability for the project.

  5. GraphReduce: Processing Large-Scale Graphs on Accelerator-Based Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sengupta, Dipanjan; Song, Shuaiwen; Agarwal, Kapil

    2015-11-15

    Recent work on real-world graph analytics has sought to leverage the massive amount of parallelism offered by GPU devices, but challenges remain due to the inherent irregularity of graph algorithms and limitations in GPU-resident memory for storing large graphs. We present GraphReduce, a highly efficient and scalable GPU-based framework that operates on graphs that exceed the device’s internal memory capacity. GraphReduce adopts a combination of edge- and vertex-centric implementations of the Gather-Apply-Scatter programming model and operates on multiple asynchronous GPU streams to fully exploit the high degrees of parallelism in GPUs with efficient graph data movement between the host and device.
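
    The Gather-Apply-Scatter abstraction can be illustrated on a CPU in a few lines; the loop below is a schematic PageRank-style example, not GraphReduce's GPU implementation.

        from collections import defaultdict

        # One Gather-Apply-Scatter style iteration loop (PageRank flavour).
        def gas_pagerank(edges, n, d=0.85, iters=20):
            in_nbrs, out_deg = defaultdict(list), defaultdict(int)
            for u, v in edges:
                in_nbrs[v].append(u)
                out_deg[u] += 1
            rank = [1.0 / n] * n
            for _ in range(iters):
                # Gather: each vertex sums contributions from its in-neighbors.
                gathered = [sum(rank[u] / out_deg[u] for u in in_nbrs[v]) for v in range(n)]
                # Apply: update vertex state; Scatter is implicit here, since the
                # updated ranks are re-read by neighbors on the next pass.
                rank = [(1 - d) / n + d * g for g in gathered]
            return rank

        print(gas_pagerank([(0, 1), (1, 2), (2, 0), (2, 1)], n=3))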

  6. Repository Planning, Design, and Engineering: Part II-Equipment and Costing.

    PubMed

    Baird, Phillip M; Gunter, Elaine W

    2016-08-01

    Part II of this article discusses and provides guidance on the equipment and systems necessary to operate a repository. The various types of storage equipment and monitoring and support systems are presented in detail. While the material focuses on the large repository, the requirements for a small-scale startup are also presented. Cost estimates and a cost model for establishing a repository are presented. The cost model presents an expected range of acquisition costs for the large capital items in developing a repository. A constructed area of 5,000-7,000 ft², with 50 frozen storage units, has been assumed to reflect a successful operation with growth potential. No design or engineering costs, permit or regulatory costs, or smaller items such as the computers, software, furniture, phones, and barcode readers required for operations have been included.

  7. Cosmology Large Angular Scale Surveyor (CLASS) Focal Plane Development

    NASA Technical Reports Server (NTRS)

    Chuss, D. T.; Ali, A.; Amiri, M.; Appel, J.; Bennett, C. L.; Colazo, F.; Denis, K. L.; Dunner, R.; Essinger-Hileman, T.; Eimer, J.

    2015-01-01

    The Cosmology Large Angular Scale Surveyor (CLASS) will measure the polarization of the Cosmic Microwave Background to search for and characterize the polarized signature of inflation. CLASS will operate from the Atacama Desert and observe approximately 70% of the sky. A variable-delay polarization modulator provides modulation of the polarization at approximately 10 Hz to suppress the 1/f noise of the atmosphere and enable the measurement of the large angular scale polarization modes. The measurement of the inflationary signal across angular scales that span both the recombination and reionization features allows a test of the predicted shape of the polarized angular power spectra in addition to a measurement of the energy scale of inflation. CLASS is an array of telescopes covering frequencies of 38, 93, 148, and 217 GHz. These frequencies straddle the foreground minimum and thus allow the extraction of foregrounds from the primordial signal. Each focal plane contains feedhorn-coupled transition-edge sensors that simultaneously detect two orthogonal linear polarizations. The use of single-crystal silicon as the dielectric for the on-chip transmission lines enables both high efficiency and uniformity in fabrication. Integrated band definition has been implemented that both controls the bandpass of the single-mode transmission on the chip and prevents stray light from coupling to the detectors.

  8. Cosmology Large Angular Scale Surveyor (CLASS) Focal Plane Development

    NASA Astrophysics Data System (ADS)

    Chuss, D. T.; Ali, A.; Amiri, M.; Appel, J.; Bennett, C. L.; Colazo, F.; Denis, K. L.; Dünner, R.; Essinger-Hileman, T.; Eimer, J.; Fluxa, P.; Gothe, D.; Halpern, M.; Harrington, K.; Hilton, G.; Hinshaw, G.; Hubmayr, J.; Iuliano, J.; Marriage, T. A.; Miller, N.; Moseley, S. H.; Mumby, G.; Petroff, M.; Reintsema, C.; Rostem, K.; U-Yen, K.; Watts, D.; Wagner, E.; Wollack, E. J.; Xu, Z.; Zeng, L.

    2016-08-01

    The Cosmology Large Angular Scale Surveyor (CLASS) will measure the polarization of the Cosmic Microwave Background to search for and characterize the polarized signature of inflation. CLASS will operate from the Atacama Desert and observe ~70% of the sky. A variable-delay polarization modulator provides modulation of the polarization at ~10 Hz to suppress the 1/f noise of the atmosphere and enable the measurement of the large angular scale polarization modes. The measurement of the inflationary signal across angular scales that span both the recombination and reionization features allows a test of the predicted shape of the polarized angular power spectra in addition to a measurement of the energy scale of inflation. CLASS is an array of telescopes covering frequencies of 38, 93, 148, and 217 GHz. These frequencies straddle the foreground minimum and thus allow the extraction of foregrounds from the primordial signal. Each focal plane contains feedhorn-coupled transition-edge sensors that simultaneously detect two orthogonal linear polarizations. The use of single-crystal silicon as the dielectric for the on-chip transmission lines enables both high efficiency and uniformity in fabrication. Integrated band definition has been implemented that both controls the bandpass of the single-mode transmission on the chip and prevents stray light from coupling to the detectors.

  9. Review of problems in the small-scale farm production of ethanol

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    White, H.M.

    1983-07-01

    This report reviews the current status of small, farmer-operated ethanol production facilities. The characteristics and operating problems associated with present plants are reviewed with respect to technical, economic, and institutional issues. Information was obtained from recent publications and numerous telephone calls to state and federal officials and the producers themselves. It is concluded that, in most parts of the country, small-scale alcohol production has been reduced to relatively few farm plants, due primarily to several unfavorable economic factors. While both large and small facilities have been squeezed by rising feedstock costs and lower alcohol selling prices, the farmer-producer is burdened by additional constraints because of the small scale of his operations. It is not usually profitable for him to recover all the valuable by-products from the feedstock, such as gluten, corn oil, and carbon dioxide from corn conversion. He may not be able to use or market the wet alcohol and stillage he produces. Other difficulties often include high fuel costs, lack of financial and technical assistance, and excessive labor requirements.

  10. Large-Scale Optimization for Bayesian Inference in Complex Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Willcox, Karen; Marzouk, Youssef

    2013-11-12

    The SAGUARO (Scalable Algorithms for Groundwater Uncertainty Analysis and Robust Optimization) Project focused on the development of scalable numerical algorithms for large-scale Bayesian inversion in complex systems that capitalize on advances in large-scale simulation-based optimization and inversion methods. The project was a collaborative effort among MIT, the University of Texas at Austin, Georgia Institute of Technology, and Sandia National Laboratories. The research was directed in three complementary areas: efficient approximations of the Hessian operator, reductions in complexity of forward simulations via stochastic spectral approximations and model reduction, and employing large-scale optimization concepts to accelerate sampling. The MIT-Sandia component of the SAGUARO Project addressed the intractability of conventional sampling methods for large-scale statistical inverse problems by devising reduced-order models that are faithful to the full-order model over a wide range of parameter values; sampling then employs the reduced model rather than the full model, resulting in very large computational savings. Results indicate little effect on the computed posterior distribution. On the other hand, in the Texas-Georgia Tech component of the project, we retain the full-order model, but exploit inverse problem structure (adjoint-based gradients and partial Hessian information of the parameter-to-observation map) to implicitly extract lower dimensional information on the posterior distribution; this greatly speeds up sampling methods, so that fewer sampling points are needed. We can think of these two approaches as "reduce then sample" and "sample then reduce." In fact, these two approaches are complementary, and can be used in conjunction with each other. Moreover, they both exploit deterministic inverse problem structure, in the form of adjoint-based gradient and Hessian information of the underlying parameter-to-observation map, to achieve their speedups.

  11. Final Report: Large-Scale Optimization for Bayesian Inference in Complex Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ghattas, Omar

    2013-10-15

    The SAGUARO (Scalable Algorithms for Groundwater Uncertainty Analysis and Robust Optimization) Project focuses on the development of scalable numerical algorithms for large-scale Bayesian inversion in complex systems that capitalize on advances in large-scale simulation-based optimization and inversion methods. Our research is directed in three complementary areas: efficient approximations of the Hessian operator, reductions in complexity of forward simulations via stochastic spectral approximations and model reduction, and employing large-scale optimization concepts to accelerate sampling. Our efforts are integrated in the context of a challenging testbed problem that considers subsurface reacting flow and transport. The MIT component of the SAGUARO Project addresses the intractability of conventional sampling methods for large-scale statistical inverse problems by devising reduced-order models that are faithful to the full-order model over a wide range of parameter values; sampling then employs the reduced model rather than the full model, resulting in very large computational savings. Results indicate little effect on the computed posterior distribution. On the other hand, in the Texas-Georgia Tech component of the project, we retain the full-order model, but exploit inverse problem structure (adjoint-based gradients and partial Hessian information of the parameter-to-observation map) to implicitly extract lower dimensional information on the posterior distribution; this greatly speeds up sampling methods, so that fewer sampling points are needed. We can think of these two approaches as "reduce then sample" and "sample then reduce." In fact, these two approaches are complementary, and can be used in conjunction with each other. Moreover, they both exploit deterministic inverse problem structure, in the form of adjoint-based gradient and Hessian information of the underlying parameter-to-observation map, to achieve their speedups.
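
    The "reduce then sample" idea can be pictured with a toy Metropolis sampler in which a cheap surrogate stands in for the expensive forward model; the models, data, and step size below are illustrative, not the SAGUARO codes.

        import numpy as np

        rng = np.random.default_rng(0)
        data, sigma = 2.0, 0.5

        def forward_full(theta):      # stand-in for an expensive PDE solve
            return theta ** 2

        def forward_reduced(theta):   # stand-in for a reduced-order approximation
            return theta ** 2 + 0.01 * np.sin(theta)

        def log_post(theta, forward):
            return -0.5 * ((data - forward(theta)) / sigma) ** 2

        theta, chain = 1.0, []
        for _ in range(5000):
            prop = theta + 0.3 * rng.standard_normal()
            # Sampling touches only the cheap reduced model; the full model is
            # never called inside the chain.
            if np.log(rng.random()) < log_post(prop, forward_reduced) - log_post(theta, forward_reduced):
                theta = prop
            chain.append(theta)
        print(np.mean(chain), np.std(chain))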

  12. Extending SME to Handle Large-Scale Cognitive Modeling.

    PubMed

    Forbus, Kenneth D; Ferguson, Ronald W; Lovett, Andrew; Gentner, Dedre

    2017-07-01

    Analogy and similarity are central phenomena in human cognition, involved in processes ranging from visual perception to conceptual change. To capture this centrality requires that a model of comparison must be able to integrate with other processes and handle the size and complexity of the representations required by the tasks being modeled. This paper describes extensions to the Structure-Mapping Engine (SME) since its inception in 1986 that have increased its scope of operation. We first review the basic SME algorithm, describe psychological evidence for SME as a process model, and summarize its role in simulating similarity-based retrieval and generalization. Then we describe five techniques now incorporated into SME that have enabled it to tackle large-scale modeling tasks: (a) Greedy merging rapidly constructs one or more best interpretations of a match in polynomial time: O(n² log n); (b) Incremental operation enables mappings to be extended as new information is retrieved or derived about the base or target, to model situations where information in a task is updated over time; (c) Ubiquitous predicates model the varying degrees to which items may suggest alignment; (d) Structural evaluation of analogical inferences models aspects of plausibility judgments; (e) Match filters enable large-scale task models to communicate constraints to SME to influence the mapping process. We illustrate via examples from published studies how these enable it to capture a broader range of psychological phenomena than before. Copyright © 2016 Cognitive Science Society, Inc.
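
    Greedy merging, item (a) above, can be shown schematically: sort candidate correspondences by score and accept each one that keeps the mapping one-to-one. This is a simplified illustration, not SME's actual kernel-merging procedure.

        # Greedy construction of one interpretation from scored candidate matches.
        def greedy_merge(candidates):
            """candidates: list of (score, base_item, target_item) tuples."""
            mapping, used_base, used_target = {}, set(), set()
            for score, b, t in sorted(candidates, reverse=True):  # O(n log n) sort
                if b not in used_base and t not in used_target:   # consistency check
                    mapping[b] = t
                    used_base.add(b)
                    used_target.add(t)
            return mapping

        cands = [(0.9, "sun", "nucleus"), (0.8, "planet", "electron"), (0.4, "sun", "electron")]
        print(greedy_merge(cands))  # {'sun': 'nucleus', 'planet': 'electron'}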

  13. Community and occupational health concerns in pork production: a review.

    PubMed

    Donham, K J

    2010-04-01

    Public concerns about the adverse consequences of large-scale livestock production have been increasingly voiced since the late 1960s. Numerous regional, national, and international conferences have been held on the subject since 1994. This paper reviews the literature on the community and occupational health concerns of large-scale livestock production, with a focus on pork production. The industry has recognized the concerns of the public, and the national and state pork producer groups are including these issues as an important component of their research and policy priorities. One reason large-scale livestock production has raised concern is that a significant component of the industry has separated from traditional family farming and has developed like other industries in management, structure, and concentration. Claims by environmental groups about the magnitude of the problem have often been criticized by the pork production industry as lacking science-based evidence to document environmental concerns. In addition to general environmental concerns, the occupational health of workers has become more relevant because many operations now employ more than 10 employees, which brings many operations in the United States under the scrutiny of the US Occupational Safety and Health Administration. In this paper, the scientific literature is reviewed relative to the scientific basis of occupational and environmental impacts on community and worker health. Further, recommendations are made to help promote sustainability of the livestock industry within the context of maintaining good stewardship of our environmental and human capital.

  14. Los Alamos NEP research in advanced plasma thrusters

    NASA Technical Reports Server (NTRS)

    Schoenberg, Kurt; Gerwin, Richard

    1991-01-01

    Research was initiated in advanced plasma thrusters that capitalizes on laboratory capabilities in plasma science and technology. The goal of the program was to examine the scaling issues of magnetoplasmadynamic (MPD) thruster performance in support of NASA's MPD thruster development program. The objective was to address multi-megawatt, large-scale, quasi-steady-state MPD thruster performance. Results to date include a new quasi-steady-state operating regime, obtained at Space Exploration Initiative-relevant power levels, that enables direct coaxial gun-MPD comparisons of thruster physics and performance. The radiative losses are negligible. Operation with an applied axial magnetic field shows the same operational stability and exhaust plume uniformity benefits seen in MPD thrusters. The observed gun impedance is in close agreement with the magnetic Bernoulli model predictions. Spatial and temporal measurements of magnetic field, electric field, plasma density, electron temperature, and ion/neutral energy distribution are underway. Model applications to advanced mission logistics are also underway.

  15. Detection and analysis of part load and full load instabilities in a real Francis turbine prototype

    NASA Astrophysics Data System (ADS)

    Presas, Alexandre; Valentin, David; Egusquiza, Eduard; Valero, Carme

    2017-04-01

    Francis turbines often operate away from their best efficiency point in order to regulate their output power according to the instantaneous energy demand of the grid. It is therefore of paramount importance to analyse and determine the unstable operating points of such units. In the framework of the HYPERBOLE project (FP7-ENERGY-2013-1; project number 608532), a large Francis unit was investigated numerically, experimentally in a reduced-scale model, and both experimentally and numerically in the real prototype. This paper presents the unstable operating points identified during the experimental tests on the real Francis unit and an analysis of the main characteristics of these instabilities. Finally, it is shown that similar phenomena have been identified in previous research at the LMH (Laboratory for Hydraulic Machines, Lausanne) with the reduced-scale model.

  16. From bioterrorism exercise to real-life public health crisis: lessons for emergency hotline operations.

    PubMed

    Uscher-Pines, Lori; Bookbinder, Sylvia H; Miro, Suzanne; Burke, Thomas

    2007-01-01

    Although public health agencies routinely operate hotlines to communicate key messages to the public, they are rarely evaluated to improve hotline management. Since its creation in 2003, the New Jersey Department of Health & Senior Services' Emergency Communications Center has confronted two large-scale incidents that have tested its capabilities in this area. The influenza vaccine shortage of 2004 and the April 2005 TOPOFF 3 full-scale bioterrorism exercise provided both real-life and simulated crisis situations from which to derive general insights into the strengths and weaknesses of hotline administration. This article identifies problems in the areas of staff and message management by analyzing call volume data and the qualitative observations of group feedback sessions and semistructured interviews with hotline staff. It also makes recommendations based on lessons learned to improve future hotline operations in public health emergencies.

  17. Effects of local and large-scale climate patterns on estuarine resident fishes: The example of Pomatoschistus microps and Pomatoschistus minutus

    NASA Astrophysics Data System (ADS)

    Nyitrai, Daniel; Martinho, Filipe; Dolbeth, Marina; Rito, João; Pardal, Miguel A.

    2013-12-01

    Large-scale and local climate patterns are known to influence several aspects of the life cycle of marine fish. In this paper, we used a 9-year database (2003-2011) to analyse the populations of two estuarine resident fishes, Pomatoschistus microps and Pomatoschistus minutus, in order to determine their relationships with varying environmental stressors operating over local and large scales. This study was performed in the Mondego estuary, Portugal. Firstly, the variations in abundance, growth, population structure and secondary production were evaluated. These species appeared in high densities at the beginning of the study period, with occasional high annual density peaks thereafter, while their secondary production was lower in dry years. The relationships between yearly fish abundance and the environmental variables were evaluated separately for both species using Spearman correlation analysis, considering the yearly abundance peaks for the whole population, juveniles and adults. Among the local climate patterns, precipitation, river runoff, salinity and temperature were used in the analyses, and the North Atlantic Oscillation (NAO) index and sea surface temperature (SST) were tested as large-scale factors. For P. microps, precipitation and NAO were the significant factors explaining the abundance of the whole population, the adults and the juveniles as well. For P. minutus, river runoff was the significant predictor for the whole population, juveniles and adults. The results for both species suggest a differential influence of climate patterns on the various life cycle stages, also confirming the importance of estuarine resident fishes as indicators of changes in local and large-scale climate patterns related to global climate change.
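
    The rank-correlation test used in such analyses is straightforward to reproduce; the sketch below uses scipy with made-up yearly abundance peaks rather than the Mondego data.

        import numpy as np
        from scipy.stats import spearmanr

        abundance = np.array([41, 35, 12, 18, 9, 22, 15, 27, 30])  # yearly peak density (hypothetical)
        runoff = np.array([78, 64, 30, 41, 25, 49, 33, 58, 66])    # yearly river runoff (hypothetical)

        rho, p = spearmanr(abundance, runoff)
        print(f"Spearman rho = {rho:.2f}, p = {p:.3f}")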

  18. Civil Military Cooperation for Counterterrorism Operations within the United States

    DTIC Science & Technology

    2013-03-01

    as well as the subsequent attempts to contaminate Americans with anthrax, dramatically exposed the nation’s vulnerabilities to domestic terrorism and... terrorism, natural disasters, large-scale cyber-attacks, and pandemics.” One of the primary concerns for the U.S. is the dangerous type of weapons that

  19. The Hybrid Mindset and Operationalizing Innovation: Toward a Theory of Hybrid

    DTIC Science & Technology

    2014-05-22

    relatively large-scale military clashes between Israel and Hezbollah, Operations Accountability (25-31 July 1993) and Grapes of Wrath (11-27 April 1996), provided Nasrallah and Hezbollah with intelligence on how the IDF would attack into southern Lebanon. In addition

  20. LARGE-SCALE BIOSLURPING OPERATIONS USED FOR FUEL RECOVERY

    EPA Science Inventory

    Since 1996, the US Air Force has been using bioslurping to recover JP-5 fuel from unsaturated soil at a facility on the island of Diego Garcia, in the Indian Ocean. To date, more than 100,000 gallons of fuel have been recovered by the bioslurping system. Bioslurping augmented b...

  1. Bottomland hardwood afforestation: State of the art

    Treesearch

    Emile S. Gardiner; D. Ramsey Russell; Mark Oliver; Lamar C. Dorris

    2000-01-01

    Over the past decade, land managers have implemented large-scale afforestation operations across the Southern United States to rehabilitate agricultural land historically converted from bottomland hardwood forest cover types. These afforestation efforts were initially concentrated on public land managed by State or Federal Government agencies, but have since shifted...

  2. An integrated approach to mapping forest conditions in the Southern Appalachians (North Carolina)

    Treesearch

    Weimin Xi; Lei Wang; Andrew G Birt; Maria D. Tchakerian; Robert N. Coulson; Kier D. Klepzig

    2009-01-01

    Accurate and continuous forest cover information is essential for forest management and restoration (SAMAB 1996, Xi et al. 2007). Ground-truthed, spatially explicit forest data, however, are often limited to federally managed land or large-scale commercial forestry operations where forest inventories are regularly collected. Moreover,...

  3. Limited Aspects of Reality: Frames of Reference in Language Assessment

    ERIC Educational Resources Information Center

    Fulcher, Glenn; Svalberg, Agneta

    2013-01-01

    Language testers operate within two frames of reference: norm-referenced (NRT) and criterion-referenced testing (CRT). The former underpins the world of large-scale standardized testing that prioritizes variability and comparison. The latter supports substantive score meaning in formative and domain specific assessment. Some claim that the…

  4. Data related uncertainty in near-surface vulnerability assessments for agrochemicals in the San Joaquin Valley

    USDA-ARS?s Scientific Manuscript database

    Precious groundwater resources across the USA have been contaminated due to decades-long nonpoint-source applications of agricultural chemicals. Assessing the impact of past, ongoing, and future chemical applications for large-scale agriculture operations is timely for designing best-management prac...

  5. Automated geographic registration and radiometric correction for UAV-based mosaics

    USDA-ARS?s Scientific Manuscript database

    Texas A&M University has been operating a large-scale, UAV-based, agricultural remote-sensing research project since 2015. To use UAV-based images in agricultural production, many high-resolution images must be mosaicked together to create an image of an agricultural field. Two key difficulties to s...

  6. Characterizing a shallow groundwater system beneath irrigated sugarcane with electrical resistivity and radon (Rn-222), Puunene, Hawaii

    USDA-ARS?s Scientific Manuscript database

    In this study, we use a combination of electrical resistivity profiling and radon (222Rn) measurements to characterize a shallow groundwater system beneath the last remaining, large-scale sugarcane plantation on Maui, Hawaii. Hawaiian Commercial & Sugar Company has continuously operated a sugarcane...

  7. A Case for Data Commons

    PubMed Central

    Grossman, Robert L.; Heath, Allison; Murphy, Mark; Patterson, Maria; Wells, Walt

    2017-01-01

    Data commons collocate data, storage, and computing infrastructure with core services and commonly used tools and applications for managing, analyzing, and sharing data to create an interoperable resource for the research community. An architecture for data commons is described, as well as some lessons learned from operating several large-scale data commons. PMID:29033693

  8. Calving distributions of individual bulls in multiple-sire pastures

    USDA-ARS?s Scientific Manuscript database

    The objective of this project was to quantify patterns in the calving rate of sires in multiple-sire pastures over seven years at a large-scale cow-calf operation. Data consisted of reproductive and genomic records from multiple-sire breeding pastures (n=33) at the United States Meat Animal Research...

  9. Keep New Mexico Beautiful, Recycling Project Successful

    ERIC Educational Resources Information Center

    Bickel, Victor R.

    1975-01-01

    Through the efforts of community groups, the support of local industries, and the state government, Keep New Mexico Beautiful, Inc. (KNMB) is now operating a large-scale recycling business. KNMB has been able to save tons of natural resources, provide local employment, and educate the public to this environmental concern. (MA)

  10. Algorithms for Mathematical Programming with Emphasis on Bi-level Models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Goldfarb, Donald; Iyengar, Garud

    2014-05-22

    The research supported by this grant was focused primarily on first-order methods for solving large scale and structured convex optimization problems and convex relaxations of nonconvex problems. These include optimal gradient methods, operator and variable splitting methods, alternating direction augmented Lagrangian methods, and block coordinate descent methods.

  11. Market scenarios and alternative administrative frameworks for US educational satellite systems

    NASA Technical Reports Server (NTRS)

    Walkmeyer, J. E., Jr.; Morgan, R. P.; Singh, J. P.

    1975-01-01

    Costs and benefits of developing an operational educational satellite system in the U.S. are analyzed. Scenarios are developed for each educational submarket and satellite channel and ground terminal requirements for a large-scale educational telecommunications system are estimated. Alternative organizational frameworks for such a system are described.

  12. Detecting Item Drift in Large-Scale Testing

    ERIC Educational Resources Information Center

    Guo, Hongwen; Robin, Frederic; Dorans, Neil

    2017-01-01

    The early detection of item drift is an important issue for frequently administered testing programs because items are reused over time. Unfortunately, operational data tend to be very sparse and do not lend themselves to frequent monitoring analyses, particularly for on-demand testing. Building on existing residual analyses, the authors propose…

  13. Large Scale IR Evaluation

    ERIC Educational Resources Information Center

    Pavlu, Virgil

    2008-01-01

    Today, search engines are embedded into all aspects of digital world: in addition to Internet search, all operating systems have integrated search engines that respond even as you type, even over the network, even on cell phones; therefore the importance of their efficacy and efficiency cannot be overstated. There are many open possibilities for…

  14. Large-Scale Biomonitoring of Remote and Threatened Ecosystems via High-Throughput Sequencing

    PubMed Central

    Gibson, Joel F.; Shokralla, Shadi; Curry, Colin; Baird, Donald J.; Monk, Wendy A.; King, Ian; Hajibabaei, Mehrdad

    2015-01-01

    Biodiversity metrics are critical for assessment and monitoring of ecosystems threatened by anthropogenic stressors. Existing sorting and identification methods are too expensive and labour-intensive to be scaled up to meet management needs. Alternatively, a high-throughput DNA sequencing approach could be used to determine biodiversity metrics from bulk environmental samples collected as part of a large-scale biomonitoring program. Here we show that both morphological and DNA sequence-based analyses are suitable for recovery of individual taxonomic richness, estimation of proportional abundance, and calculation of biodiversity metrics using a set of 24 benthic samples collected in the Peace-Athabasca Delta region of Canada. The high-throughput sequencing approach was able to recover all metrics with a higher degree of taxonomic resolution than morphological analysis. The reduced cost and increased capacity of DNA sequence-based approaches will finally allow environmental monitoring programs to operate at the geographical and temporal scale required by industrial and regulatory end-users. PMID:26488407

  15. Cruise noise of the 2/9th scale model of the Large-scale Advanced Propfan (LAP) propeller, SR-7A

    NASA Technical Reports Server (NTRS)

    Dittmar, James H.; Stang, David B.

    1987-01-01

    Noise data on the Large-scale Advanced Propfan (LAP) propeller model SR-7A were taken in the NASA Lewis Research Center 8 x 6 foot Wind Tunnel. The maximum blade passing tone noise first rises with increasing helical tip Mach number to a peak level, then remains the same or decreases from its peak level at higher helical tip Mach numbers. This trend was observed for operation at both constant advance ratio and approximately equal thrust. This noise reduction, or leveling out, at high helical tip Mach numbers points to the use of higher propeller tip speeds as a possible method of limiting airplane cabin noise while maintaining high flight speed and efficiency. Projections of the tunnel model data are made to the full-scale LAP propeller mounted on the test bed aircraft and compared with predictions. The prediction method is found to be somewhat conservative in that it slightly overpredicts the projected model data at the peak.

  16. Cruise noise of the 2/9 scale model of the Large-scale Advanced Propfan (LAP) propeller, SR-7A

    NASA Technical Reports Server (NTRS)

    Dittmar, James H.; Stang, David B.

    1987-01-01

    Noise data on the Large-scale Advanced Propfan (LAP) propeller model SR-7A were taken in the NASA Lewis Research Center 8 x 6 foot Wind Tunnel. The maximum blade passing tone noise first rises with increasing helical tip Mach number to a peak level, then remains the same or decreases from its peak level at higher helical tip Mach numbers. This trend was observed for operation at both constant advance ratio and approximately equal thrust. This noise reduction, or leveling out, at high helical tip Mach numbers points to the use of higher propeller tip speeds as a possible method of limiting airplane cabin noise while maintaining high flight speed and efficiency. Projections of the tunnel model data are made to the full-scale LAP propeller mounted on the test bed aircraft and compared with predictions. The prediction method is found to be somewhat conservative in that it slightly overpredicts the projected model data at the peak.

  17. Natural snowfall reveals large-scale flow structures in the wake of a 2.5-MW wind turbine.

    PubMed

    Hong, Jiarong; Toloui, Mostafa; Chamorro, Leonardo P; Guala, Michele; Howard, Kevin; Riley, Sean; Tucker, James; Sotiropoulos, Fotis

    2014-06-24

    To improve power production and structural reliability of wind turbines, there is a pressing need to understand how turbines interact with the atmospheric boundary layer. However, experimental techniques capable of quantifying or even qualitatively visualizing the large-scale turbulent flow structures around full-scale turbines do not exist today. Here we use snowflakes from a winter snowstorm as flow tracers to obtain velocity fields downwind of a 2.5-MW wind turbine in a sampling area of ~36 × 36 m². The spatial and temporal resolutions of the measurements are sufficiently high to quantify the evolution of blade-generated coherent motions, such as the tip and trailing sheet vortices, identify their instability mechanisms and correlate them with turbine operation, control and performance. Our experiment provides an unprecedented in situ characterization of flow structures around utility-scale turbines, and yields significant insights into the Reynolds number similarity issues presented in wind energy applications.

  18. Living with heterogeneities in bioreactors: understanding the effects of environmental gradients on cells.

    PubMed

    Lara, Alvaro R; Galindo, Enrique; Ramírez, Octavio T; Palomares, Laura A

    2006-11-01

    The presence of spatial gradients in fundamental culture parameters, such as dissolved gases, pH, concentration of substrates, and shear rate, among others, is an important problem that frequently occurs in large-scale bioreactors. This problem is caused by deficient mixing that results from limitations inherent in traditional scale-up methods and from practical constraints during large-scale bioreactor design and operation. When cultured in a heterogeneous environment, cells are continuously exposed to fluctuating conditions as they travel through the various zones of a bioreactor. Such fluctuations can affect cell metabolism, yields, and quality of the products of interest. In this review, the theoretical analyses that predict the existence of environmental gradients in bioreactors and their experimental confirmation are reviewed. The origins of gradients in common culture parameters and their effects on various organisms of biotechnological importance are discussed. In particular, studies based on the scale-down methodology, a convenient tool for assessing the effect of environmental heterogeneities, are surveyed.

  19. Graph processing platforms at scale: practices and experiences

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lim, Seung-Hwan; Lee, Sangkeun; Brown, Tyler C

    2015-01-01

    Graph analysis unveils hidden associations of data in many phenomena and artifacts, such as road networks, social networks, genomic information, and scientific collaboration. Unfortunately, the wide diversity in the characteristics of graphs and graph operations makes it challenging to find the right combination of tools and implementation of algorithms to discover the desired knowledge from a target data set. This study presents an extensive empirical study of three representative graph processing platforms: Pegasus, GraphX, and Urika. Each system represents a combination of options in data model, processing paradigm, and infrastructure. We benchmarked each platform using three popular graph operations, degree distribution, connected components, and PageRank, over a variety of real-world graphs. Our experiments show that each graph processing platform has different strengths, depending on the type of graph operation. While Urika performs best in non-iterative operations like degree distribution, GraphX outperforms the others in iterative operations like connected components and PageRank. In addition, we discuss challenges in optimizing the performance of each platform over large-scale real-world graphs.
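
    For reference, the three benchmark operations can be run on a small graph with NetworkX (the study itself ran Pegasus, GraphX, and Urika at much larger scale).

        import networkx as nx

        G = nx.gnm_random_graph(1000, 5000, seed=42)

        degree_dist = nx.degree_histogram(G)           # non-iterative: degree distribution
        components = list(nx.connected_components(G))  # iterative: connected components
        pagerank = nx.pagerank(G, alpha=0.85)          # iterative: PageRank

        print(len(degree_dist), len(components), max(pagerank.values()))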

  20. Requirements for a mobile communications satellite system. Volume 3: Large space structures measurements study

    NASA Technical Reports Server (NTRS)

    Akle, W.

    1983-01-01

    This study report defines a set of tests and measurements required to characterize the performance of a Large Space System (LSS) and to scale the resulting data to other LSS satellites. Requirements from the Mobile Communication Satellite (MSAT) configurations derived in the parent study were used. MSAT utilizes a large mesh deployable antenna and encompasses a significant range of LSS technology issues in the areas of structural dynamics, control, and performance predictability. In this study, performance requirements were developed for the antenna, with special emphasis on antenna surface accuracy and pointing stability. Instrumentation and measurement systems applicable to LSS were selected from existing or ongoing technology developments. Laser ranging and angulation systems, presently at breadboard status, form the backbone of the measurements. Following this, a set of ground, STS, and GEO-operational test options was investigated. A third-scale (15-meter) antenna system was selected for ground characterization followed by STS flight technology development. This selection ensures analytical scaling from ground to orbit as well as size scaling. Other benefits are cost and the ability to perform reasonable ground tests. Detailed costing of the various tests and measurement systems was derived and is included in the report.

  1. Low-Complexity Polynomial Channel Estimation in Large-Scale MIMO With Arbitrary Statistics

    NASA Astrophysics Data System (ADS)

    Shariati, Nafiseh; Bjornson, Emil; Bengtsson, Mats; Debbah, Merouane

    2014-10-01

    This paper considers pilot-based channel estimation in large-scale multiple-input multiple-output (MIMO) communication systems, also known as massive MIMO, where there are hundreds of antennas at one side of the link. Motivated by the fact that computational complexity is one of the main challenges in such systems, a set of low-complexity Bayesian channel estimators, coined Polynomial ExpAnsion CHannel (PEACH) estimators, are introduced for arbitrary channel and interference statistics. While the conventional minimum mean square error (MMSE) estimator has cubic complexity in the dimension of the covariance matrices, due to an inversion operation, our proposed estimators significantly reduce this to square complexity by approximating the inverse by an L-degree matrix polynomial. The coefficients of the polynomial are optimized to minimize the mean square error (MSE) of the estimate. We show numerically that near-optimal MSEs are achieved with low polynomial degrees. We also derive the exact computational complexity of the proposed estimators, in terms of floating-point operations (FLOPs), by which we prove that the proposed estimators outperform the conventional estimators in large-scale MIMO systems of practical dimensions while providing reasonable MSEs. Moreover, we show that L need not scale with the system dimensions to maintain a certain normalized MSE. By analyzing different interference scenarios, we observe that the relative MSE loss of using the low-complexity PEACH estimators is smaller in realistic scenarios with pilot contamination. On the other hand, PEACH estimators are not well suited for noise-limited scenarios with high pilot power; therefore, we also introduce the low-complexity diagonalized estimator that performs well in this regime. Finally, we ...
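
    The complexity-reduction idea, replacing an O(N^3) inversion with an L-degree matrix polynomial evaluated through O(N^2) matrix-vector products, can be sketched with a plain (unweighted) Neumann series; the actual PEACH estimators additionally optimize the polynomial coefficients for minimum MSE, so this is only the flavour of the approach.

        import numpy as np

        # Approximate inv(A) @ y by alpha * sum_{l=0}^{L} (I - alpha*A)^l y using
        # only matrix-vector products; A plays the role of the received-signal
        # covariance in the MMSE filter (synthetic, well-conditioned example).
        rng = np.random.default_rng(1)
        N, L = 200, 8
        B = np.eye(N) + 0.1 * rng.standard_normal((N, N))
        A = B @ B.T + N * np.eye(N)
        y = rng.standard_normal(N)

        alpha = 1.0 / np.linalg.norm(A, np.inf)  # cheap bound keeps the series convergent
        x, term = alpha * y, y.copy()
        for _ in range(L):
            term = term - alpha * (A @ term)     # one O(N^2) product per polynomial degree
            x += alpha * term

        exact = np.linalg.solve(A, y)
        print(np.linalg.norm(x - exact) / np.linalg.norm(exact))  # small relative error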

  2. ORIGAMI Automator Primer. Automated ORIGEN Source Terms and Spent Fuel Storage Pool Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wieselquist, William A.; Thompson, Adam B.; Bowman, Stephen M.

    2016-04-01

    Source terms and spent nuclear fuel (SNF) storage pool decay heat load analyses for operating nuclear power plants require a large number of Oak Ridge Isotope Generation and Depletion (ORIGEN) calculations. SNF source term calculations also require a significant amount of bookkeeping to track quantities such as core and assembly operating histories, spent fuel pool (SFP) residence times, heavy metal masses, and enrichments. The ORIGEN Assembly Isotopics (ORIGAMI) module in the SCALE code system provides a simple scheme for entering these data. However, given the large scope of the analysis, extensive scripting is necessary to convert formats and process data to create thousands of ORIGAMI input files (one per assembly) and to process the results into formats readily usable by follow-on analysis tools. This primer describes a project within the SCALE Fulcrum graphical user interface (GUI) called ORIGAMI Automator that was developed to automate the scripting and bookkeeping in large-scale source term analyses. The ORIGAMI Automator enables the analyst to (1) easily create, view, and edit the reactor site and assembly information, (2) automatically create and run ORIGAMI inputs, and (3) analyze the results from ORIGAMI. ORIGAMI Automator uses the standard ORIGEN binary concentrations files produced by ORIGAMI, with concentrations available at all time points in each assembly’s life. The GUI plots results such as mass, concentration, activity, and decay heat using a powerful new ORIGEN Post-Processing Utility for SCALE (OPUS) GUI component. This document includes a description and user guide for the GUI, a step-by-step tutorial for a simplified scenario, and appendices that document the file structures used.
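
    The bookkeeping being automated can be pictured as a templating loop that emits one input file per assembly; the record fields and output format below are hypothetical stand-ins for illustration, not actual ORIGAMI syntax.

        # One input file per assembly, generated from tabulated history data.
        TEMPLATE = ("assembly={name}\n"
                    "enrichment={enrich}\n"
                    "burnup={burnup}\n"
                    "discharge_date={discharged}\n")

        rows = [  # in practice, thousands of rows read from plant records
            {"name": "A01", "enrich": 4.2, "burnup": 45.0, "discharged": "2004-10-01"},
            {"name": "A02", "enrich": 3.8, "burnup": 39.5, "discharged": "2006-04-15"},
        ]
        for row in rows:
            with open(f"{row['name']}.inp", "w") as out:
                out.write(TEMPLATE.format(**row))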

  3. Development and Application of a Process-based River System Model at a Continental Scale

    NASA Astrophysics Data System (ADS)

    Kim, S. S. H.; Dutta, D.; Vaze, J.; Hughes, J. D.; Yang, A.; Teng, J.

    2014-12-01

    Existing global and continental-scale river models, mainly designed for integration with global climate models, have very coarse spatial resolutions and lack many important hydrological processes, such as overbank flow, irrigation diversion, and groundwater seepage/recharge, which operate at much finer resolutions. Thus, these models are not suitable for producing streamflow forecasts at fine spatial resolution or water accounts at sub-catchment level, which are important for water resources planning and management at regional and national scales. A large-scale river system model has been developed and implemented for water accounting in Australia as part of the Water Information Research and Development Alliance between Australia's Bureau of Meteorology (BoM) and CSIRO. The model, developed using a node-link architecture, includes all major hydrological processes, anthropogenic water utilisation, and storage routing that influence streamflow in both regulated and unregulated river systems. It includes an irrigation model to compute water diversion for irrigation use and the associated fluxes and stores, and a storage-based floodplain inundation model to compute overbank flow from river to floodplain and the associated floodplain fluxes and stores. An auto-calibration tool has been built into the modelling system to automatically calibrate the model in large river systems using the Shuffled Complex Evolution optimiser and user-defined objective functions. The auto-calibration tool makes the model computationally efficient and practical for large basin applications. The model has been implemented in several large basins in Australia, including the Murray-Darling Basin, covering more than 2 million km². The results of model calibration and validation show highly satisfactory performance. The model has been operationalised in the BoM for producing various fluxes and stores for national water accounting. This paper introduces the newly developed river system model, describing the conceptual hydrological framework and the methods used for representing different hydrological processes, and presents the results and an evaluation of the model's performance. The operational implementation of the model for water accounting is also discussed.
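
    The auto-calibration loop, an optimiser driving a model toward a user-defined objective, can be sketched as follows; scipy's differential evolution stands in for the Shuffled Complex Evolution optimiser, and the two-parameter routing model and data are toys, not the model described above.

        import numpy as np
        from scipy.optimize import differential_evolution

        inflow = np.array([5, 30, 80, 60, 40, 25, 15, 10, 8, 6], dtype=float)
        q_obs = np.array([4, 14, 42, 55, 47, 35, 25, 18, 13, 10], dtype=float)

        def route(k, x):                 # toy two-parameter storage-routing model
            q, s = np.zeros_like(inflow), 10.0
            for t, i in enumerate(inflow):
                s += i
                q[t] = x * i + (1 - x) * s / k
                s -= q[t]
            return q

        def neg_nse(params):             # negated Nash-Sutcliffe efficiency
            q_sim = route(*params)
            return -(1 - np.sum((q_obs - q_sim) ** 2) / np.sum((q_obs - q_obs.mean()) ** 2))

        best = differential_evolution(neg_nse, bounds=[(1.0, 10.0), (0.0, 0.9)], seed=0)
        print(best.x, -best.fun)         # calibrated (k, x) and its NSE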

  4. Lessons learned from post-wildfire monitoring and implications for land management and regional drinking water treatability in Southern Rockies of Alberta

    NASA Astrophysics Data System (ADS)

    Diiwu, J.; Silins, U.; Kevin, B.; Anderson, A.

    2008-12-01

    Like many areas of the Rocky Mountains, Alberta's forests on the eastern slopes of the Rockies have been shaped by decades of successful fire suppression. These forests are at high risk of fire and large-scale insect infestation, and climate change will continue to increase these risks. These headwater forests provide the vast majority of usable surface water supplies to a large region of the province, and large-scale natural disasters can have dramatic effects on water quality and water availability. The population in the region has steadily increased, and the area is now the main source of water for many Alberta municipalities, including the City of Calgary, which has a population of over one million. In 2003 a fire burned 21,000 ha in the southern foothills area. Government land managers were concerned about the downstream implications of the fire and salvage operations, but there was very limited scientific information to guide decision making. This led to the establishment of the Southern Rockies Watershed Project, a partnership between Alberta Sustainable Resource Development, the provincial government department responsible for land management, and the University of Alberta. After five years of data collection, the project has produced quantitative information that was not previously available about the effects of fire and management interventions, such as salvage logging, on headwater and regional water quality. This information can be used to make decisions on forest operations, fire suppression, and post-fire salvage operations. In the past few years the project has captured the interest of large municipalities and water treatment researchers who are keen to investigate the potential implications of large natural disturbances for large and small drinking water treatment facilities. Examples from this project will be used to highlight the challenges and successes encountered while bridging the gap between science and land management policy.

  5. Integration and initial operation of the multi-component large ring laser structure ROMY

    NASA Astrophysics Data System (ADS)

    Schreiber, Karl Ulrich; Igel, Heiner; Wassermann, Joachim; Gebauer, André; Simonelli, Andrea; Bernauer, Felix; Donner, Stefanie; Hadziioannou, Celine; Egdorf, Sven; Wells, Jon-Paul

    2017-04-01

    Rotation sensing for the geosciences requires a high sensor resolution, of the order of 10 picoradians per second or less. An optical Sagnac interferometer offers this sensitivity, provided that the scale factor can be made very large. We have designed and built a multi-component ring laser system consisting of 4 individual large ring lasers, each covering an area of more than 62 m². The rings are oriented in the shape of a tetrahedron, so that all 3 spatial directions are covered, allowing also for some redundancy. We report on the initial operation of the free-running gyroscopes in their underground facility in order to establish a performance estimate for the ROMY ring laser structure. Preliminary results suggest that the quantum noise limit is lower than that of the G ring laser.
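
    The scale-factor requirement follows from the standard ring laser gyroscope relation for the Sagnac beat frequency, df = (4A / (lambda * P)) * n . Omega. The sketch below assumes an equilateral ring with 12 m sides (area ~62.4 m², consistent with the quoted figure), a HeNe wavelength, and full projection onto Earth's rotation axis; the exact ROMY geometry and orientation factors are not given in the abstract.

        import math

        side = 12.0                         # m, assumed triangle side length
        A = math.sqrt(3) / 4 * side ** 2    # ring area, ~62.35 m^2
        P = 3 * side                        # perimeter, m
        lam = 632.8e-9                      # HeNe laser wavelength, m
        omega_earth = 7.292e-5              # Earth rotation rate, rad/s

        scale_factor = 4 * A / (lam * P)    # dimensionless gyroscope scale factor
        df = scale_factor * omega_earth     # beat frequency at full projection
        print(f"scale factor = {scale_factor:.3e}, Sagnac frequency ~ {df:.0f} Hz")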

  6. DOE High Performance Computing Operational Review (HPCOR): Enabling Data-Driven Scientific Discovery at HPC Facilities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gerber, Richard; Allcock, William; Beggio, Chris

    2014-10-17

    U.S. Department of Energy (DOE) High Performance Computing (HPC) facilities are on the verge of a paradigm shift in the way they deliver systems and services to science and engineering teams. Research projects are producing a wide variety of data at unprecedented scale and level of complexity, with community-specific services that are part of the data collection and analysis workflow. On June 18-19, 2014 representatives from six DOE HPC centers met in Oakland, CA at the DOE High Performance Operational Review (HPCOR) to discuss how they can best provide facilities and services to enable large-scale data-driven scientific discovery at the DOE national laboratories. The report contains findings from that review.

  7. AirSTAR: A UAV Platform for Flight Dynamics and Control System Testing

    NASA Technical Reports Server (NTRS)

    Jordan, Thomas L.; Foster, John V.; Bailey, Roger M.; Belcastro, Christine M.

    2006-01-01

    As part of the NASA Aviation Safety Program at Langley Research Center, a dynamically scaled unmanned aerial vehicle (UAV) and associated ground based control system are being developed to investigate dynamics modeling and control of large transport vehicles in upset conditions. The UAV is a 5.5% (seven foot wingspan), twin turbine, generic transport aircraft with a sophisticated instrumentation and telemetry package. A ground based, real-time control system is located inside an operations vehicle for the research pilot and associated support personnel. The telemetry system supports over 70 channels of data plus video for the downlink and 30 channels for the control uplink. Data rates are in excess of 200 Hz. Dynamic scaling of the UAV, which includes dimensional, weight, inertial, actuation, and control system scaling, is required so that the sub-scale vehicle will realistically simulate the flight characteristics of the full-scale aircraft. This testbed will be utilized to validate modeling methods, flight dynamics characteristics, and control system designs for large transport aircraft, with the end goal being the development of technologies to reduce the fatal accident rate due to loss-of-control.
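
    Dynamic (Froude-number) scaling relates model and full-scale quantities through the geometric scale factor; the abstract does not list the AirSTAR ratios, so the values below are the standard Froude-scaling relations evaluated for a 5.5% model, shown for illustration.

        n = 0.055                      # geometric (length) scale factor

        velocity_ratio = n ** 0.5      # Froude number matched: V scales with sqrt(n)
        time_ratio = n ** 0.5          # motions evolve faster on the model
        mass_ratio = n ** 3            # equal air density assumed
        freq_ratio = n ** -0.5         # dynamic modes are higher in frequency

        print(f"velocity: {velocity_ratio:.3f}, time: {time_ratio:.3f}, "
              f"mass: {mass_ratio:.2e}, frequency: x{freq_ratio:.1f}")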

  8. Windvan laser study

    NASA Technical Reports Server (NTRS)

    1985-01-01

    The goal of defining a CO2 laser transmitter approach suited to Shuttle Coherent Atmospheric Lidar Experiment (SCALE) requirements is discussed. The adaptation of the existing WINDVAN system to the shuttle environment is addressed. The size, weight, reliability, and efficiency of the existing WINDVAN system are largely compatible with SCALE requirements. Repackaging is needed for compatibility with the vacuum and thermal environments. Changes are required to ensure survival through launch and landing mechanical, vibration, and acoustic loads. Existing WINDVAN thermal management approaches that depend on convection need to be upgraded for zero-gravity operations.

  9. Black Hole Entropy Calculated via Wavefunction Approximations on a Schwarzschild Spacetime

    DTIC Science & Technology

    2015-05-18

    dimension of μA is kg²m²s⁻², which is the expected dimension. The μ²B has an extra unit of length in the numerator but is also divided by another factor... phenomena. The two ideas were General Relativity (GR) and Quantum Mechanics (QM). General relativity describes physics on large scales with masses the size... operator L̂ = r̂ × p̂. These operators can be written in three dimensions in a compact way by using the del operator ∇ = ∂x î + ∂y ĵ + ∂z k̂, p̂

  10. A distributed computing approach to mission operations support. [for spacecraft

    NASA Technical Reports Server (NTRS)

    Larsen, R. L.

    1975-01-01

    Computing support for mission operations includes orbit determination, attitude processing, maneuver computation, resource scheduling, etc. The large-scale third-generation distributed computer network discussed is capable of fulfilling these dynamic requirements. It is shown that distribution of resources and control leads to increased reliability and exhibits potential for incremental growth. Through functional specialization, a distributed system may be tuned to very specific operational requirements. Fundamental to the approach is the notion of process-to-process communication, which is effected through a high-bandwidth communications network. Both resource-sharing and load-sharing may be realized in the system.

  11. Transportation and operations aspects of space energy systems

    NASA Technical Reports Server (NTRS)

    Woodcock, Gordon R.

    1989-01-01

    A brief comparative analysis was made for three concepts of supplying large-scale electrical energy to Earth from space. The concepts were: (1) mining helium-3 on the Moon and returning it to Earth; (2) constructing solar power satellites in geosynchronous orbit from lunar materials (the energy is beamed by microwave to receivers on Earth); and (3) constructing power collection and beaming systems on the Moon itself and transmitting the energy to Earth by microwave. This analysis concerned mainly space transportation and operations, but each of the systems is briefly characterized to provide a basis for space transportation and operations analysis.

  12. The curious case of large-N expansions on a (pseudo)sphere

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Polyakov, Alexander M.; Saleem, Zain H.; Stokes, James

    We elucidate the large-N dynamics of one-dimensional sigma models with spherical and hyperbolic target spaces and find a duality between the Lagrange multiplier and the angular momentum. In the hyperbolic model we propose a new class of operators based on the irreducible representations of hyperbolic space. We also uncover unexpected zero modes which lead to the double scaling of the 1/N expansion and explore these modes using Gelfand-Dikiy equations.

  13. The curious case of large-N expansions on a (pseudo)sphere

    DOE PAGES

    Polyakov, Alexander M.; Saleem, Zain H.; Stokes, James

    2015-02-03

    We elucidate the large-N dynamics of one-dimensional sigma models with spherical and hyperbolic target spaces and find a duality between the Lagrange multiplier and the angular momentum. In the hyperbolic model we propose a new class of operators based on the irreducible representations of hyperbolic space. We also uncover unexpected zero modes which lead to the double scaling of the 1/N expansion and explore these modes using Gelfand-Dikiy equations.

  14. Pynamic: the Python Dynamic Benchmark

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, G L; Ahn, D H; de Supinksi, B R

    2007-07-10

    Python is widely used in scientific computing to facilitate application development and to support features such as computational steering. Making full use of some of Python's popular features, which improve programmer productivity, leads to applications that access extremely high numbers of dynamically linked libraries (DLLs). As a result, some important Python-based applications severely stress a system's dynamic linking and loading capabilities and also cause significant difficulties for most development environment tools, such as debuggers. Furthermore, using the Python paradigm for large-scale MPI-based applications can create significant file IO and further stress tools and operating systems. In this paper, we present Pynamic, the first benchmark program to support configurable emulation of a wide range of the DLL usage of Python-based applications for large-scale systems. Pynamic has already accurately reproduced system software and tool issues encountered by important large Python-based scientific applications on our supercomputers. Pynamic provided insight for our system software and tool vendors, and our application developers, into the impact of several design decisions. As we describe the Pynamic benchmark, we will highlight some of the issues discovered in our large-scale system software and tools using Pynamic.
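
    For a flavor of the loader stress Pynamic emulates, the pure-Python analogue below (our own sketch, not part of Pynamic, which builds genuine shared libraries) times the import machinery over many generated modules:

    ```python
    import importlib
    import os
    import sys
    import tempfile
    import time

    # Create N trivial modules on disk, then time how long the import machinery
    # takes to locate and load them -- a pure-Python analogue of the dynamic
    # library stress that Pynamic applies to a system's loader.
    N = 500
    tmpdir = tempfile.mkdtemp()
    for i in range(N):
        with open(os.path.join(tmpdir, f"mod_{i}.py"), "w") as f:
            f.write(f"VALUE = {i}\n")

    sys.path.insert(0, tmpdir)
    start = time.perf_counter()
    total = sum(importlib.import_module(f"mod_{i}").VALUE for i in range(N))
    elapsed = time.perf_counter() - start
    print(f"Imported {N} modules in {elapsed:.3f} s (checksum {total})")
    ```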

  15. HFSB-seeding for large-scale tomographic PIV in wind tunnels

    NASA Astrophysics Data System (ADS)

    Caridi, Giuseppe Carlo Alp; Ragni, Daniele; Sciacchitano, Andrea; Scarano, Fulvio

    2016-12-01

    A new system for large-scale tomographic particle image velocimetry in low-speed wind tunnels is presented. The system relies upon the use of sub-millimetre helium-filled soap bubbles as flow tracers, which scatter light with intensity several orders of magnitude higher than micron-sized droplets. With respect to a single bubble generator, the system increases the bubble emission rate by means of transient accumulation and rapid release. The governing parameters of the system are identified and discussed, namely the bubble production rate, the accumulation and release times, the size of the bubble injector and its location with respect to the wind tunnel contraction. The relations between the above parameters, the resulting spatial concentration of tracers, and the measurement dynamic spatial range are obtained and discussed. Large-scale experiments are carried out in a large low-speed wind tunnel with a 2.85 × 2.85 m2 test section, where a vertical-axis wind turbine of 1 m diameter is operated. Time-resolved tomographic PIV measurements are taken over a measurement volume of 40 × 20 × 15 cm3, allowing the quantitative analysis of the tip-vortex structure and dynamical evolution.
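
    The accumulate-and-release idea lends itself to a back-of-envelope estimate of seeding concentration; every number below is an assumption for illustration, not a value from the study:

    ```python
    # All values are illustrative assumptions, not from the study.
    production_rate = 3.0e4   # bubbles/s from one generator (assumed)
    t_accumulate = 10.0       # s of accumulation in a reservoir (assumed)
    t_release = 1.0           # s over which stored bubbles are released (assumed)

    # Transient accumulation boosts the effective emission rate during release:
    effective_rate = production_rate * t_accumulate / t_release   # bubbles/s

    # Concentration in the tracer-carrying stream (assumed duct area and speed):
    duct_area = 0.04          # m^2 cross-section at the injector (assumed)
    flow_speed = 5.0          # m/s (assumed)
    concentration = effective_rate / (duct_area * flow_speed)     # bubbles/m^3
    print(f"{concentration:,.0f} bubbles per m^3 during release")
    ```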

  16. Ecological impacts of large-scale disposal of mining waste in the deep sea

    PubMed Central

    Hughes, David J.; Shimmield, Tracy M.; Black, Kenneth D.; Howe, John A.

    2015-01-01

    Deep-Sea Tailings Placement (DSTP) from terrestrial mines is one of several large-scale industrial activities now taking place in the deep sea. The scale and persistence of its impacts on seabed biota are unknown. We sampled around the Lihir and Misima island mines in Papua New Guinea to measure the impacts of ongoing DSTP and assess the state of benthic infaunal communities after its conclusion. At Lihir, where DSTP has operated continuously since 1996, abundance of sediment infauna was substantially reduced across the sampled depth range (800–2020 m), accompanied by changes in higher-taxon community structure, in comparison with unimpacted reference stations. At Misima, where DSTP took place for 15 years, ending in 2004, effects on community composition persisted 3.5 years after its conclusion. Active tailings deposition has severe impacts on deep-sea infaunal communities and these impacts are detectable at a coarse level of taxonomic resolution. PMID:25939397

  17. The Full-Scale Prototype for the Fluorescence Detector Array of Single-Pixel Telescopes

    NASA Astrophysics Data System (ADS)

    Fujii, T.; Malacari, M.; Bellido, J. A.; Farmer, J.; Galimova, A.; Horvath, P.; Hrabovsky, M.; Mandat, D.; Matalon, A.; Matthews, J. N.; Merolle, M.; Ni, X.; Nozka, L.; Palatka, M.; Pech, M.; Privitera, P.; Schovanek, P.; Thomas, S. B.; Travnicek, P.

    The Fluorescence detector Array of Single-pixel Telescopes (FAST) is a design concept for the next generation of ultrahigh-energy cosmic ray (UHECR) observatories, addressing the requirements for a large-area, low-cost detector suitable for measuring the properties of the highest energy cosmic rays. In the FAST design, a large field of view is covered by a few pixels at the focal plane of a mirror or Fresnel lens. Motivated by the successful detection of UHECRs using a prototype consisting of a single 200 mm photomultiplier tube and a 1 m2 Fresnel lens system, we have developed a new "full-scale" prototype consisting of four 200 mm photomultiplier tubes at the focus of a segmented mirror 1.6 m in diameter. We report on the status of the full-scale prototype, including test measurements made during first light operation at the Telescope Array site in central Utah, U.S.A.

  18. Multi-scale Modeling of Radiation Damage: Large Scale Data Analysis

    NASA Astrophysics Data System (ADS)

    Warrier, M.; Bhardwaj, U.; Bukkuru, S.

    2016-10-01

    Modification of materials in nuclear reactors due to neutron irradiation is a multiscale problem. These neutrons pass through materials, creating several energetic primary knock-on atoms (PKAs) which cause localized collision cascades, producing damage tracks, defects (interstitials and vacancies) and defect clusters depending on the energy of the PKA. These defects diffuse and recombine throughout the whole duration of reactor operation, thereby changing the micro-structure of the material and its properties. It is therefore desirable to develop predictive computational tools to simulate the micro-structural changes of irradiated materials. In this paper we describe how statistical averages of the collision cascades from thousands of MD simulations are used to provide inputs to Kinetic Monte Carlo (KMC) simulations, which can handle larger sizes, more defects and longer time durations. The use of unsupervised learning and graph optimization in handling and analyzing large-scale MD data will be highlighted.
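
    A minimal sketch of this MD-to-KMC hand-off (synthetic numbers throughout, not the paper's data): defect yields from many cascade runs are averaged per PKA-energy bin and become source terms for the KMC stage:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    pka_energy = rng.uniform(1, 100, 5000)      # keV, one per MD run (synthetic)
    defects = rng.poisson(0.5 * pka_energy)     # surviving Frenkel pairs (synthetic)

    # Average defect yield per PKA-energy bin -> KMC defect-insertion rates
    bins = np.linspace(1, 100, 11)
    idx = np.digitize(pka_energy, bins) - 1
    mean_yield = [defects[idx == b].mean() for b in range(10)]

    for lo, hi, m in zip(bins[:-1], bins[1:], mean_yield):
        print(f"{lo:5.1f}-{hi:5.1f} keV: {m:6.1f} defects/cascade")
    ```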

  19. Ecological impacts of large-scale disposal of mining waste in the deep sea.

    PubMed

    Hughes, David J; Shimmield, Tracy M; Black, Kenneth D; Howe, John A

    2015-05-05

    Deep-Sea Tailings Placement (DSTP) from terrestrial mines is one of several large-scale industrial activities now taking place in the deep sea. The scale and persistence of its impacts on seabed biota are unknown. We sampled around the Lihir and Misima island mines in Papua New Guinea to measure the impacts of ongoing DSTP and assess the state of benthic infaunal communities after its conclusion. At Lihir, where DSTP has operated continuously since 1996, abundance of sediment infauna was substantially reduced across the sampled depth range (800-2020 m), accompanied by changes in higher-taxon community structure, in comparison with unimpacted reference stations. At Misima, where DSTP took place for 15 years, ending in 2004, effects on community composition persisted 3.5 years after its conclusion. Active tailings deposition has severe impacts on deep-sea infaunal communities and these impacts are detectable at a coarse level of taxonomic resolution.

  20. A real-time interferometer technique for compressible flow research

    NASA Technical Reports Server (NTRS)

    Bachalo, W. D.; Houser, M. J.

    1984-01-01

    Strengths and shortcomings in the application of interferometric techniques to transonic flow fields are examined and an improved method is elaborated. Such applications have demonstrated the value of interferometry in obtaining data for compressible flow research. With holographic techniques, interferometry may be applied in large-scale facilities without the use of expensive optics or elaborate vibration isolation equipment. Results obtained using holographic interferometry and other methods demonstrate that reliable qualitative and quantitative data can be acquired. Nevertheless, the conventional method can be difficult to set up and apply, and it cannot produce real-time data. A new interferometry technique is investigated that promises to be easier to apply and can provide real-time information. This single-beam technique has the necessary insensitivity to vibration for large-scale wind tunnel operations. Capabilities of the method and preliminary tests on some laboratory-scale flow fields are described.

  1. The global palm oil sector must change to save biodiversity and improve food security in the tropics.

    PubMed

    Azhar, Badrul; Saadun, Norzanalia; Prideaux, Margi; Lindenmayer, David B

    2017-12-01

    Most palm oil currently available in global markets is sourced from certified large-scale plantations. Comparatively little is sourced from (typically uncertified) smallholders. We argue that sourcing sustainable palm oil should not be determined by commercial certification alone and that the certification process should be revisited. There are so-far unrecognized benefits of sourcing palm oil from smallholders that should be considered if genuine biodiversity conservation is to be a foundation of 'environmentally sustainable' palm oil production. Despite a lack of certification, smallholder production is often more biodiversity-friendly than certified production from large-scale plantations. Sourcing palm oil from smallholders also alleviates poverty among rural farmers, promoting better conservation outcomes. Yet, certification schemes - the current measure of 'sustainability' - are financially accessible only for large-scale plantations that operate as profit-driven monocultures. Industrial palm oil is expanding rapidly in regions with weak environmental laws and enforcement. This warrants the development of an alternative certification scheme for smallholders. Greater attention should be directed to deforestation-free palm oil production in smallholdings, where production is less likely to cause large-scale biodiversity loss. These small-scale farmlands in which palm oil is mixed with other crops should be considered by retailers and consumers who are interested in promoting sustainable palm oil production. Simultaneously, plantation companies should be required to make their existing production landscapes more compatible with enhanced biodiversity conservation. Copyright © 2017 Elsevier Ltd. All rights reserved.

  2. Large-angle cosmic microwave background anisotropies in an open universe

    NASA Technical Reports Server (NTRS)

    Kamionkowski, Marc; Spergel, David N.

    1994-01-01

    If the universe is open, scales larger than the curvature scale may be probed by observation of large-angle fluctuations in the cosmic microwave background (CMB). We consider primordial adiabatic perturbations and discuss power spectra that are power laws in volume, wavelength, and eigenvalue of the Laplace operator. Such spectra may have arisen if, for example, the universe underwent a period of 'frustrated' inflation. The resulting large-angle anisotropies of the CMB are computed. The amplitude generally increases as Omega is decreased but decreases as h is increased. Interestingly enough, for all three Ansaetze, anisotropies on angular scales larger than the curvature scale are suppressed relative to the anisotropies on scales smaller than the curvature scale, but cosmic variance makes discrimination between various models difficult. Models with 0.2 approximately less than Omega h approximately less than 0.3 appear compatible with CMB fluctuations detected by the Cosmic Background Explorer Satellite (COBE) and the Tenerife experiment and with the amplitude and spectrum of fluctuations of galaxy counts in the APM, CfA, and 1.2 Jy IRAS surveys. COBE normalization for these models yields sigma(sub 8) approximately = 0.5 - 0.7. Models with smaller values of Omega h, when normalized to COBE, require bias factors in excess of 2 to be compatible with the observed galaxy counts on the 8/h Mpc scale. Requiring that the age of the universe exceed 10 Gyr implies that Omega is approximately greater than 0.25. From the last-scattering term in the Sachs-Wolfe formula, large-angle anisotropies come primarily from the decay of potential fluctuations at z approximately less than 1/Omega; thus, if the universe is open, COBE has been detecting temperature fluctuations produced at moderate redshift rather than at z approximately 1300.

  3. Novel approach for extinguishing large-scale coal fires using gas-liquid foams in open pit mines.

    PubMed

    Lu, Xinxiao; Wang, Deming; Qin, Botao; Tian, Fuchao; Shi, Guangyi; Dong, Shuaijun

    2015-12-01

    Coal fires are a serious threat to workers' safety and to production in open pit mines. Coal fire sources are hidden and numerous, and large cavities are prevalent in the coal seam after the coal has burned, making conventional extinguishment technology difficult to apply. Foams are considered an efficient means of fire extinguishment in these large-scale workplaces. A novel foam preparation method is introduced, and an original design of a cavitation jet device is proposed to add foaming agent stably. Jet cavitation occurs when the water flow rate and pressure ratio reach specified values. Through a self-built foaming system, high-performance foams are produced and then infused into the blast drilling holes at a large flow rate. Without complicated operation, this system is found to be very suitable for extinguishing large-scale coal fires. Field application shows that foam generation adopting the proposed key technology gives a good fire-extinguishing effect. The temperature reduction using foams is 6-7 times greater than with water, and CO concentration is reduced from 9.43 to 0.092‰ in the drilling hole. The coal fires are controlled successfully in open pit mines, ensuring normal production as well as the security of personnel and equipment.

  4. Neutrinoless double beta decay and QCD running at low energy scales

    NASA Astrophysics Data System (ADS)

    González, M.; Hirsch, M.; Kovalenko, S. G.

    2018-06-01

    There is a common belief that the main uncertainties in the theoretical analysis of neutrinoless double beta (0νββ) decay originate from the nuclear matrix elements. Here, we uncover another previously overlooked source of potentially large uncertainties stemming from nonperturbative QCD effects. Recently, perturbative QCD corrections have been calculated for all dimension-6 and dimension-9 effective operators describing 0νββ decay, and their importance for a reliable treatment of 0νββ decay has been demonstrated. However, these perturbative results are valid at energy scales above ~1 GeV, while the typical 0νββ scale is about ~100 MeV. In view of this fact we examine the possibility of extrapolating the perturbative results towards sub-GeV nonperturbative scales on the basis of the QCD coupling constant "freezing" behavior using background perturbation theory. Our analysis suggests that such an infrared extrapolation modifies the perturbative results for both short-range and long-range mechanisms of 0νββ decay in general only moderately. We also discuss that the tensor⊗tensor effective operator cannot appear alone in the low-energy limit of any renormalizable high-scale model, and then demonstrate that all five linearly independent combinations of the scalar and tensor operators, which can appear in renormalizable models, are infrared stable.

  5. Behavior of a high-temperature superconducting conductor on a round core cable at current ramp rates as high as 67.8 kA s-1 in background fields of up to 19 T

    NASA Astrophysics Data System (ADS)

    Michael, P. C.; Bromberg, L.; van der Laan, D. C.; Noyes, P.; Weijers, H. W.

    2016-04-01

    High temperature superconducting (HTS) conductor-on-round-core (CORC®) cables have been developed for use in power transmission systems and large high-field magnets. The use of high-current conductors for large-scale magnets reduces system inductance and limits the peak voltage needed for ramped field operation. A CORC® cable contains a large number of RE-Ba2Cu3O7-δ (RE = rare earth) (REBCO) coated conductors, helically wound in multiple layers on a thin, round former. Large-scale applications, such as fusion and accelerator magnets, require current ramp rates of several kilo-Amperes per second during pulsed operation. This paper presents results that demonstrate the electromagnetic stability of a CORC® cable during transient conditions. Measurements were performed at 4.2 K using a 1.55 m long CORC® cable in background fields of up to 19 T. Repeated current pulses in a background field of 19 T at current ramp rates of up to 67.8 kA s-1, to approximately 90% of the cable's quench current at that field, did not show any sign of degradation in cable performance due to excessive ac loss or electromagnetic instability. The very high current ramp rates applied during these tests were used to compensate, to the extent possible, for the limited cable length accommodated by the test facility, assuming that the measured results could be extrapolated to longer length cables operated at proportionally lower current ramp rates. No shift of the superconducting transition to lower current was measured when the current ramp rate was increased from 25 A s-1 to 67.8 kA s-1. These results demonstrate the viability of CORC® cables for use in low-inductance magnets that operate at moderate to high current ramp rates.
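
    The link between cable current and ramped-field voltage is the usual inductor relation V = L·dI/dt: for a fixed ramp rate, a high-current cable that permits a winding with fewer turns, and hence lower inductance, needs proportionally less terminal voltage. A toy calculation (inductance values assumed for illustration):

    ```python
    # Illustrative inductances, not from the paper; V = L * dI/dt.
    ramp_rate = 67.8e3                    # A/s, the highest ramp rate reported above
    for L_h in (1e-3, 10e-3, 100e-3):     # winding inductance in henries (assumed)
        print(f"L = {L_h*1e3:6.1f} mH -> V = {L_h*ramp_rate:8.1f} V")
    ```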

  6. Unstructured-grid coastal ocean modelling in Southern Adriatic and Northern Ionian Seas

    NASA Astrophysics Data System (ADS)

    Federico, Ivan; Pinardi, Nadia; Coppini, Giovanni; Oddo, Paolo

    2016-04-01

    The Southern Adriatic Northern Ionian coastal Forecasting System (SANIFS) is a short-term forecasting system based on an unstructured-grid approach. The model component is built on the SHYFEM finite-element three-dimensional hydrodynamic model. The operational chain exploits a downscaling approach starting from the Mediterranean oceanographic-scale model MFS (Mediterranean Forecasting System, operated by INGV). The implementation set-up has been designed to provide accurate hydrodynamics and active tracer processes in the coastal waters of Southern Eastern Italy (Apulia, Basilicata and Calabria regions), where the model is characterized by a variable resolution in the range of 50-500 m. The horizontal resolution is also high in open-sea areas, where the element size is approximately 3 km. The model is forced: (i) at the lateral open boundaries through a full nesting strategy directly with the MFS (temperature, salinity, non-tidal sea surface height and currents) and OTPS (tidal forcing) fields; (ii) at the surface through two alternative atmospheric forcing datasets (ECMWF and COSMOME) via MFS-bulk-formulae. Given that the coastal fields are driven by a combination of both local/coastal and deep ocean forcings propagating along the shelf, the performance of SANIFS was verified first (i) at the large and shelf-coastal scales by comparison with a large-scale CTD survey and then (ii) at the coastal-harbour scale by comparison with CTD, ADCP and tide gauge data. Sensitivity tests were performed on initialization conditions (mainly focused on spin-up procedures) and on surface boundary conditions by assessing the reliability of two alternative datasets at different horizontal resolutions (12.5 and 7 km). The present work highlights how downscaling could improve the simulation of the flow field going from typical open-ocean scales of the order of several km to the coastal (and harbour) scales of tens to hundreds of meters.

  7. Structure and Variability of Water Vapor in the Upper Troposphere and Lower Stratosphere

    NASA Technical Reports Server (NTRS)

    Salby, Murry L.

    2001-01-01

    Upper-tropospheric humidity (UTH) has been synoptically mapped via an algorithm that rejects small-scale undersampled variance, which is intrinsic to asynoptic measurements of water vapor, cloud, and other convective properties. Mapped distributions of UTH have been used, jointly with high-resolution Global Cloud Imagery (GCI), to study how the upper troposphere is humidified. The time-mean distribution of UTH is spatially correlated to the time-mean distribution of cold cloud fraction eta(sub c) (T < 230 K). Regions of large UTH coincide with regions of large eta(sub c), which mark deep convection. They also coincide with regions of reduced vertical stability, in which the vertical gradient of theta is weakened by convective mixing. Coldest cloud cover is attended by convective overshoots above the local tropopause, which is simultaneously coldest and highest. Together, these features reflect the upper troposphere being ventilated by convection, which mixes in moist air from lower levels. Histograms of UTH and eta(sub c) have been applied to construct the joint probability density function, which quantifies the relationship between these properties. The expected value of UTH in convective regions is strongly correlated to the expected value of eta(sub c). In ensembles of asynoptic samples, the correlation between E[UTH] and E[eta(sub c)] exceeds 0.80. As these expectations reflect the most likely values, the strong correlation between E[UTH] and E[eta(sub c)] indicates that the large-scale organization of UTH is strongly shaped by convective pumping of moisture from lower levels. The same relationship holds for unsteady fields, even though, instantaneously, those fields are composed almost entirely of small-scale convective structure. The spatial autocorrelation of UTH, constructed at high resolution from overpass data along ascending and descending tracks of the orbit, is limited to only a couple of degrees in the horizontal. This mirrors the spatial autocorrelation of eta(sub c), which likewise operates coherently on short scales. The short correlation scale of UTH, which reflects the scale of individual convective systems, is comparable to the spacing of retrievals from MLS. These scales are undersampled in the asynoptic measurements. Despite their prevalence, the mapping algorithm described above successfully recovers synoptic behavior operating coherently on large scales. It reveals eastward migration of anomalous UTH from the Indian Ocean to the central Pacific, in association with the modulation of convection by the Madden-Julian oscillation. Additional information is contained in the original extended abstract.
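
    The histogram-based construction of the joint PDF and of conditional expectations can be illustrated with synthetic data (the numbers below are invented for the sketch and encode an assumed moistening with cold cloud fraction; they are not the study's data):

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    eta_c = rng.uniform(0.0, 1.0, 10_000)                  # synthetic cold-cloud fraction
    uth = 20 + 60 * eta_c + rng.normal(0, 8, eta_c.size)   # synthetic UTH (%), moister where convective

    # Joint PDF from a 2-D histogram, as in the mapping methodology above
    pdf, uth_edges, eta_edges = np.histogram2d(uth, eta_c, bins=30, density=True)

    # Conditional expectation E[UTH | eta_c], one value per eta_c bin
    idx = np.digitize(eta_c, eta_edges[1:-1])
    e_uth = np.array([uth[idx == b].mean() for b in range(30)])
    print(np.corrcoef(eta_edges[:-1], e_uth)[0, 1])        # close to 1: strong correlation
    ```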

  8. Commercial use of remote sensing in agriculture: a case study

    NASA Astrophysics Data System (ADS)

    Gnauck, Gary E.

    1999-12-01

    Over 25 years of research have clearly shown that an analysis of remote sensing imagery can provide information on agricultural crops. Most of this research has been funded by and directed toward the needs of government agencies. Commercial use of agricultural remote sensing has been limited to very small-scale operations supplying remote sensing services to a few selected customers. Datron/Transco Inc. undertook an internally funded remote sensing program directed toward the California cash crop industry (strawberries, lettuce, tomatoes, other fresh vegetables and cotton). The objectives of this program were twofold: (1) to assess the need and readiness of agricultural land managers to adopt remote sensing as a management tool, and (2) to determine what technical barriers exist to large-scale implementation of this technology on a commercial basis. The program was divided into three phases: Planning, Engineering Test and Evaluation, and Commercial Operations. Findings: Remote sensing technology can deliver high-resolution multispectral imagery with rapid turnaround that can provide information on crop stress, insects, disease, and various soil parameters. The limiting factors to the use of remote sensing in agriculture are a lack of familiarization by the land managers, difficulty in translating 'information' into increased revenue or reduced cost for the land manager, and the large economies of scale needed to make the venture commercially viable.

  9. Superconductor bearings, flywheels and transportation

    NASA Astrophysics Data System (ADS)

    Werfel, F. N.; Floegel-Delor, U.; Rothfeld, R.; Riedel, T.; Goebel, B.; Wippich, D.; Schirrmeister, P.

    2012-01-01

    This paper describes the present status of high temperature superconductors (HTS) and of bulk superconducting magnet devices, their use in bearings, in flywheel energy storage systems (FESS) and linear transport magnetic levitation (Maglev) systems. We report and review the concepts of multi-seeded REBCO bulk superconductor fabrication. The multi-grain bulks increase the averaged trapped magnetic flux density up to 40% compared to single-grain assembly in large-scale applications. HTS magnetic bearings with permanent magnet (PM) excitation were studied and scaled up to maximum forces of 10 kN axially and 4.5 kN radially. We examine the technology of the high-gradient magnetic bearing concept and verify it experimentally. A large HTS bearing is tested for stabilizing a 600 kg rotor of a 5 kWh/250 kW flywheel system. The flywheel rotor tests show the requirement for additional damping. Our compact flywheel system is compared with similar HTS-FESS projects. A small-scale compact YBCO bearing with in situ Stirling cryocooler is constructed and investigated for mobile applications. Next we show a successfully developed modular linear Maglev system for magnetic train operation. Each module levitates 0.25t at 10 mm distance during one-day operation without refilling LN2. More than 30 vacuum cryostats containing multi-seeded YBCO blocks are fabricated and are tested now in Germany, China and Brazil.

  10. A Web Service Implementation for Large-Scale Automation, Visualization and Real-Time Program Awareness via Lexical Link Analysis

    DTIC Science & Technology

    2011-04-30

    ...internal constructs useful for management, through lexical link analysis (LLA)? The LLA methodology can help. ...categories of interest in various spreadsheets). This year, we started to develop LLA from a demonstration to an operational capability and facilitate a...

  11. High-level neutron coincidence counter maintenance manual

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Swansen, J.; Collinsworth, P.

    1983-05-01

    High-level neutron coincidence counter operational (field) calibration and usage is well known. This manual makes explicit basic (shop) check-out, calibration, and testing of new units and is a guide for repair of failed in-service units. Operational criteria for the major electronic functions are detailed, as are adjustments and calibration procedures, and recurrent mechanical/electromechanical problems are addressed. Some system tests are included for quality assurance. Data on nonstandard large-scale integrated (circuit) components and a schematic set are also included.

  12. New World Vistas: Air and Space Power for the 21st Century. Directed Energy Volume

    DTIC Science & Technology

    1995-01-01

    single-mode diode-pumped thulium-doped glass fiber laser. Full-scale 5-10 watt devices have operated in the laboratory at overall efficiencies of 10... operating in the 900-950 nm range together with the development of ytterbium (Yb) doped laser crystals. The Yb ion generates roughly one third as much... mirror in the high-power oscillator resonator. Since a potentially large amount of power is dissipated in the nonlinear medium, careful attention to...

  13. Design of an Airlift Bioreactor

    DOE Data Explorer

    Jiao, Yongqin; Park, Dan; Ho, Lewis

    2017-03-13

    An important consideration for the process design is cell immobilization-enabled flow-through operation. Large-scale biosorption relies on cells that are immobilized on a supporting substrate and used to 'attract' metal ions. Cell immobilization allows easy separation of the feed solution and the rare earth elements (REEs) that are attached to the cell surface. It also allows continuous operation without the need for energy-intensive centrifugation or filtration. Lightweight, high-surface-area, low-cost (~$200/m3) high-density polyethylene (HDPE) plastic disks are used as cell carriers for biofilm formation.

  14. The UCLA Design Diversity Experiment (DEDIX) system: A distributed testbed for multiple-version software

    NASA Technical Reports Server (NTRS)

    Avizienis, A.; Gunningberg, P.; Kelly, J. P. J.; Strigini, L.; Traverse, P. J.; Tso, K. S.; Voges, U.

    1986-01-01

    To establish a long-term research facility for experimental investigations of design diversity as a means of achieving fault-tolerant systems, a distributed testbed for multiple-version software was designed. It is part of a local network, which utilizes the Locus distributed operating system to operate a set of 20 VAX 11/750 computers. It is used in experiments to measure the efficacy of design diversity and to investigate reliability increases under large-scale, controlled experimental conditions.

  15. Space Station services and design features for users

    NASA Technical Reports Server (NTRS)

    Kurzhals, Peter R.; Mckinney, Royce L.

    1987-01-01

    The operational design features and services planned for the NASA Space Station will furnish, in addition to novel opportunities and facilities, lower costs through interface standardization and automation and faster access by means of computer-aided integration and control processes. By furnishing a basis for large-scale space exploitation, the Space Station will possess industrial production and operational services capabilities that may be used by the private sector for commercial ventures; it could also ultimately support lunar and planetary exploration spacecraft assembly and launch facilities.

  16. Extended-Range High-Resolution Dynamical Downscaling over a Continental-Scale Domain

    NASA Astrophysics Data System (ADS)

    Husain, S. Z.; Separovic, L.; Yu, W.; Fernig, D.

    2014-12-01

    High-resolution mesoscale simulations, when applied for downscaling meteorological fields over large spatial domains and for extended time periods, can provide valuable information for many practical application scenarios including the weather-dependent renewable energy industry. In the present study, a strategy has been proposed to dynamically downscale coarse-resolution meteorological fields from Environment Canada's regional analyses for a period of multiple years over the entire Canadian territory. The study demonstrates that a continuous mesoscale simulation over the entire domain is the most suitable approach in this regard. Large-scale deviations in the different meteorological fields pose the biggest challenge for extended-range simulations over continental scale domains, and the enforcement of the lateral boundary conditions is not sufficient to restrict such deviations. A scheme has therefore been developed to spectrally nudge the simulated high-resolution meteorological fields at the different model vertical levels towards those embedded in the coarse-resolution driving fields derived from the regional analyses. A series of experiments were carried out to determine the optimal nudging strategy including the appropriate nudging length scales, nudging vertical profile and temporal relaxation. A forcing strategy based on grid nudging of the different surface fields, including surface temperature, soil-moisture, and snow conditions, towards their expected values obtained from a high-resolution offline surface scheme was also devised to limit any considerable deviation in the evolving surface fields due to extended-range temporal integrations. The study shows that ensuring large-scale atmospheric similarities helps to deliver near-surface statistical scores for temperature, dew point temperature and horizontal wind speed that are better or comparable to the operational regional forecasts issued by Environment Canada. Furthermore, the meteorological fields resulting from the proposed downscaling strategy have significantly improved spatiotemporal variance compared to those from the operational forecasts, and any time series generated from the downscaled fields do not suffer from discontinuities due to switching between the consecutive forecasts.
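
    The spectral nudging described here can be sketched in a few lines: filter the difference between the driving analysis and the model state in spectral space, and relax only the retained large scales. A minimal 1-D illustration (the cutoff, time step, and relaxation time are assumed values, not Environment Canada's operational settings):

    ```python
    import numpy as np

    # 1-D sketch of spectral nudging (illustrative; not the operational code):
    # relax only scales longer than a cutoff toward the driving analysis.
    def spectral_nudge(model, driving, dx, cutoff_km, dt, tau):
        """One step: model + (dt/tau) * large-scale part of (driving - model)."""
        k = np.fft.rfftfreq(model.size, d=dx)      # spatial frequency, cycles/m
        diff = np.fft.rfft(driving - model)
        diff[k >= 1.0 / (cutoff_km * 1e3)] = 0.0   # leave small scales untouched
        return model + (dt / tau) * np.fft.irfft(diff, n=model.size)

    x = np.linspace(0.0, 2.5e6, 512)                           # 2500 km domain
    driving = np.sin(2 * np.pi * x / 1.0e6)                    # large-scale analysis field
    model = 0.8 * driving + 0.3 * np.sin(2 * np.pi * x / 5e4)  # drifted large scales + small scales
    nudged = spectral_nudge(model, driving, dx=x[1] - x[0],
                            cutoff_km=300.0, dt=600.0, tau=6 * 3600.0)
    # Small-scale detail survives; the large-scale drift is damped toward driving.
    ```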

  17. SCALES: SEVIRI and GERB CaL/VaL area for large-scale field experiments

    NASA Astrophysics Data System (ADS)

    Lopez-Baeza, Ernesto; Belda, Fernando; Bodas, Alejandro; Crommelynck, Dominique; Dewitte, Steven; Domenech, Carlos; Gimeno, Jaume F.; Harries, John E.; Jorge Sanchez, Joan; Pineda, Nicolau; Pino, David; Rius, Antonio; Saleh, Kauzar; Tarruella, Ramon; Velazquez, Almudena

    2004-02-01

    The main objective of the SCALES Project is to exploit the unique opportunity offered by the recent launch of the first European METEOSAT Second Generation geostationary satellite (MSG-1) to generate and validate new radiation budget and cloud products provided by the GERB (Geostationary Earth Radiation Budget) instrument. SCALES' specific objectives are: (i) definition and characterization of a large, reasonably homogeneous area compatible with the GERB pixel size (around 50 x 50 km2), (ii) validation of GERB TOA radiances and fluxes derived by means of angular distribution models, (iii) development of algorithms to estimate surface net radiation from GERB TOA measurements, and (iv) development of accurate methodologies to measure radiation flux divergence and analyze its influence on the thermal regime and dynamics of the atmosphere, also using GERB data. SCALES is highly innovative: it focuses on a new and unique space instrument and develops a new specific validation methodology for low-resolution sensors that is based on the use of a robust reference meteorological station (Valencia Anchor Station) around which 3D high-resolution meteorological fields are obtained from the MM5 Meteorological Model. During the 1st GERB Ground Validation Campaign (18th-24th June, 2003), CERES instruments on Aqua and Terra provided additional radiance measurements to support validation efforts. The CERES instruments operated in the PAPS mode (Programmable Azimuth Plane Scanning), pointed at the station. Ground measurements were taken by lidar, sun photometer, GPS precipitable water content, radiosounding ascents, Anchor Station operational meteorological measurements at 2 m and 15 m, 4 radiation components at 2 m, and mobile stations to characterize a large area. In addition, measurements during LANDSAT overpasses on June 14th and 30th were also performed. These activities were carried out within the GIST (GERB International Science Team) framework, during the GERB Commissioning Period.

  18. Impacts and Viability of Open Source Software on Earth Science Metadata Clearing House and Service Registry Applications

    NASA Astrophysics Data System (ADS)

    Pilone, D.; Cechini, M. F.; Mitchell, A.

    2011-12-01

    Earth Science applications typically deal with large amounts of data and high throughput rates, if not also high transaction rates. While Open Source is frequently used for smaller scientific applications, large-scale, highly available systems frequently fall back to "enterprise" class solutions like Oracle RAC or commercial-grade JEE Application Servers. NASA's Earth Observing System Data and Information System (EOSDIS) provides end-to-end capabilities for managing NASA's Earth science data from multiple sources - satellites, aircraft, field measurements, and various other programs. A core capability of EOSDIS, the Earth Observing System (EOS) Clearinghouse (ECHO), is a highly available search and order clearinghouse of over 100 million pieces of science data that has evolved from its early R&D days to a fully operational system. Over the course of this maturity ECHO has largely transitioned from commercial frameworks, databases, and operating systems to Open Source solutions... and in some cases, back. In this talk we discuss the progression of our technological solutions and our lessons learned in the areas of: high-performance, large-scale searching solutions; geospatial search capabilities and dealing with multiple coordinate systems; search and storage of variable-format source (science) data; highly available deployment solutions; and scalable (elastic) solutions to visual searching and image handling. Throughout the evolution of the ECHO system we have had to evaluate solutions with respect to performance, cost, developer productivity, reliability, and maintainability in the context of supporting global science users. Open Source solutions have played a significant role in our architecture and development, but several critical commercial components remain (or have been reinserted) to meet our operational demands.

  19. Fast numerical methods for simulating large-scale integrate-and-fire neuronal networks.

    PubMed

    Rangan, Aaditya V; Cai, David

    2007-02-01

    We discuss numerical methods for simulating large-scale, integrate-and-fire (I&F) neuronal networks. Important elements in our numerical methods are (i) a neurophysiologically inspired integrating factor which casts the solution as a numerically tractable integral equation, and allows us to obtain stable and accurate individual neuronal trajectories (i.e., voltage and conductance time-courses) even when the I&F neuronal equations are stiff, such as in strongly fluctuating, high-conductance states; (ii) an iterated process of spike-spike corrections within groups of strongly coupled neurons to account for spike-spike interactions within a single large numerical time-step; and (iii) a clustering procedure of firing events in the network to take advantage of localized architectures, such as spatial scales of strong local interactions, which are often present in large-scale computational models - for example, those of the primary visual cortex. (We note that the spike-spike corrections in our methods are more involved than the correction of a single neuron's spike time via polynomial interpolation, as in the modified Runge-Kutta methods commonly used in simulations of I&F neuronal networks.) Our methods can evolve networks with relatively strong local interactions in an asymptotically optimal way such that each neuron fires approximately once in [Formula: see text] operations, where N is the number of neurons in the system. We note that quantifications used in computational modeling are often statistical, since measurements in a real experiment to characterize physiological systems are typically statistical, such as firing rate, interspike interval distributions, and spike-triggered voltage distributions. We emphasize that it takes much less computational effort to resolve statistical properties of certain I&F neuronal networks than to fully resolve trajectories of each and every neuron within the system. For networks operating in realistic dynamical regimes, such as strongly fluctuating, high-conductance states, our methods are designed to achieve statistical accuracy when very large time-steps are used. Moreover, our methods can also achieve trajectory-wise accuracy when small time-steps are used.
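
    As an illustration of the integrating-factor idea (a sketch with assumed parameters, not the authors' code), the conductance-based I&F equation dv/dt = -gL(v - VL) - gE(t)(v - VE) has, for gE held constant over a step, an exact update that remains stable for any step size:

    ```python
    import math

    # With total conductance G = gL + gE and instantaneous steady state
    # vs = (gL*VL + gE*VE)/G, the equation integrates exactly over dt to
    # v(t+dt) = vs + (v - vs)*exp(-G*dt) -- stable even when the system is stiff.
    def if_step(v, gE, dt, gL=0.05, VL=-70.0, VE=0.0, v_thresh=-55.0, v_reset=-70.0):
        G = gL + gE
        vs = (gL * VL + gE * VE) / G
        v_new = vs + (v - vs) * math.exp(-G * dt)
        if v_new >= v_thresh:            # threshold crossing: spike and reset
            return v_reset, True
        return v_new, False

    v, spikes = -70.0, 0
    for _ in range(1000):                          # 500 ms at dt = 0.5 ms
        v, spiked = if_step(v, gE=0.08, dt=0.5)    # strong constant excitation (assumed)
        spikes += spiked
    print(f"{spikes} spikes in 500 ms")
    ```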

  20. Spatial Modeling and Uncertainty Assessment of Fine Scale Surface Processes Based on Coarse Terrain Elevation Data

    NASA Astrophysics Data System (ADS)

    Rasera, L. G.; Mariethoz, G.; Lane, S. N.

    2017-12-01

    Frequent acquisition of high-resolution digital elevation models (HR-DEMs) over large areas is expensive and difficult. Satellite-derived low-resolution digital elevation models (LR-DEMs) provide extensive coverage of Earth's surface but at coarser spatial and temporal resolutions. Although useful for large scale problems, LR-DEMs are not suitable for modeling hydrologic and geomorphic processes at scales smaller than their spatial resolution. In this work, we present a multiple-point geostatistical approach for downscaling a target LR-DEM based on available high-resolution training data and recurrent high-resolution remote sensing images. The method aims at generating several equiprobable HR-DEMs conditioned to a given target LR-DEM by borrowing small scale topographic patterns from an analogue containing data at both coarse and fine scales. An application of the methodology is demonstrated by using an ensemble of simulated HR-DEMs as input to a flow-routing algorithm. The proposed framework enables a probabilistic assessment of the spatial structures generated by natural phenomena operating at scales finer than the available terrain elevation measurements. A case study in the Swiss Alps is provided to illustrate the methodology.

  1. Achieving online consent to participation in large-scale gene-environment studies: a tangible destination.

    PubMed

    Wood, Fiona; Kowalczuk, Jenny; Elwyn, Glyn; Mitchell, Clive; Gallacher, John

    2011-08-01

    Population based genetics studies are dependent on large numbers of individuals in the pursuit of small effect sizes. Recruiting and consenting a large number of participants is both costly and time consuming. We explored whether an online consent process for large-scale genetics studies is acceptable for prospective participants using an example online genetics study. We conducted semi-structured interviews with 42 members of the public stratified by age group, gender and newspaper readership (a measure of social status). Respondents were asked to use a website designed to recruit for a large-scale genetic study. After using the website a semi-structured interview was conducted to explore opinions and any issues they would have. Responses were analysed using thematic content analysis. The majority of respondents said they would take part in the research (32/42). Those who said they would decline to participate saw fewer benefits from the research, wanted more information and expressed a greater number of concerns about the study. Younger respondents had concerns over time commitment. Middle aged respondents were concerned about privacy and security. Older respondents were more altruistic in their motivation to participate. Common themes included trust in the authenticity of the website, security of personal data, curiosity about their own genetic profile, operational concerns and a desire for more information about the research. Online consent to large-scale genetic studies is likely to be acceptable to the public. The online consent process must establish trust quickly and effectively by asserting authenticity and credentials, and provide access to a range of information to suit different information preferences.

  2. Applications of the ram accelerator to hypervelocity aerothermodynamic testing

    NASA Technical Reports Server (NTRS)

    Bruckner, A. P.; Knowlen, C.; Hertzberg, A.

    1992-01-01

    A ram accelerator used as a hypervelocity launcher for large-scale aeroballistic range applications in hypersonics and aerodynamics research is presented. It is an in-bore ramjet device in which a projectile shaped like the centerbody of a supersonic ramjet is propelled down a stationary tube filled with a tailored combustible gas mixture. Ram accelerator operation has been demonstrated in 39 mm and 90 mm bores, supporting the proposition that this launcher concept can be scaled up to very large bore diameters of the order of 30-60 cm. It is concluded that high quality data obtained from the tube wall and projectile during the acceleration process itself are very useful for understanding aerothermodynamics of hypersonic flow in general, and for providing important CFD validation benchmarks.

  3. Large scale cryogenic fluid systems testing

    NASA Technical Reports Server (NTRS)

    1992-01-01

    NASA Lewis Research Center's Cryogenic Fluid Systems Branch (CFSB) within the Space Propulsion Technology Division (SPTD) has the ultimate goal of enabling the long term storage and in-space fueling/resupply operations for spacecraft and reusable vehicles in support of space exploration. Using analytical modeling, ground based testing, and on-orbit experimentation, the CFSB is studying three primary categories of fluid technology: storage, supply, and transfer. The CFSB is also investigating fluid handling, advanced instrumentation, and tank structures and materials. Ground based testing of large-scale systems is done using liquid hydrogen as a test fluid at the Cryogenic Propellant Tank Facility (K-site) at Lewis' Plum Brook Station in Sandusky, Ohio. A general overview of tests involving liquid transfer, thermal control, pressure control, and pressurization is given.

  4. Thermal Stress FE Analysis of Large-scale Gas Holder Under Sunshine Temperature Field

    NASA Astrophysics Data System (ADS)

    Li, Jingyu; Yang, Ranxia; Wang, Hehui

    2018-03-01

    The temperature field and thermal stress of a Man-type gas holder are simulated using the theory of sunshine temperature fields, based on the ASHRAE clear-sky model and the finite element method. The distribution of surface temperature and thermal stress of the gas holder under the given sunshine condition is obtained. The results show that sunshine-induced thermal stress can be identified as one of the important factors in the local cracking and oil leakage failures, which occur on the sunny side before the shady side. It is therefore of great importance to consider the sunshine thermal load in the stress analysis, design, and operation of large-scale steel structures such as gas holders.
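
    For intuition about the magnitudes involved, a fully constrained steel element under a sun-induced temperature rise develops a stress of roughly sigma = E·alpha·ΔT. A back-of-envelope sketch (material constants are standard values for steel; the temperature difference is an assumed illustrative value, not a result from the study):

    ```python
    # Fully constrained thermal stress estimate, sigma = E * alpha * dT.
    E = 206e9        # Young's modulus of steel, Pa (standard value)
    alpha = 1.2e-5   # thermal expansion coefficient of steel, 1/K (standard value)
    dT = 25.0        # sun-side vs. shade-side temperature difference, K (assumed)
    sigma = E * alpha * dT
    print(f"Constrained thermal stress ~ {sigma/1e6:.0f} MPa")   # ~62 MPa
    ```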

  5. Microbial advanced biofuels production: overcoming emulsification challenges for large-scale operation.

    PubMed

    Heeres, Arjan S; Picone, Carolina S F; van der Wielen, Luuk A M; Cunha, Rosiane L; Cuellar, Maria C

    2014-04-01

    Isoprenoids and alkanes produced and secreted by microorganisms are emerging as alternative biofuels for diesel and jet fuel replacements. As in other bioprocesses comprising an organic liquid phase, the presence of microorganisms, the medium composition, and the process conditions may result in emulsion formation during fermentation, hindering product recovery. At the same time, a low-cost production process overcoming this challenge is required to make these advanced biofuels a feasible alternative. We review the main mechanisms and causes of emulsion formation during fermentation, because a better understanding at the microscale can give insights into how to improve large-scale processes, and we review the process technology options that can address these challenges. Copyright © 2014 Elsevier Ltd. All rights reserved.

  6. Parallel Visualization of Large-Scale Aerodynamics Calculations: A Case Study on the Cray T3E

    NASA Technical Reports Server (NTRS)

    Ma, Kwan-Liu; Crockett, Thomas W.

    1999-01-01

    This paper reports the performance of a parallel volume rendering algorithm for visualizing a large-scale, unstructured-grid dataset produced by a three-dimensional aerodynamics simulation. This dataset, containing over 18 million tetrahedra, allows us to extend our performance results to a problem which is more than 30 times larger than the one we examined previously. This high resolution dataset also allows us to see fine, three-dimensional features in the flow field. All our tests were performed on the Silicon Graphics Inc. (SGI)/Cray T3E operated by NASA's Goddard Space Flight Center. Using 511 processors, a rendering rate of almost 9 million tetrahedra/second was achieved with a parallel overhead of 26%.

  7. Preparing Laboratory and Real-World EEG Data for Large-Scale Analysis: A Containerized Approach

    PubMed Central

    Bigdely-Shamlo, Nima; Makeig, Scott; Robbins, Kay A.

    2016-01-01

    Large-scale analysis of EEG and other physiological measures promises new insights into brain processes and more accurate and robust brain–computer interface models. However, the absence of standardized vocabularies for annotating events in a machine-understandable manner, the welter of collection-specific data organizations, the difficulty in moving data across processing platforms, and the unavailability of agreed-upon standards for preprocessing have prevented large-scale analyses of EEG. Here we describe a "containerized" approach and freely available tools we have developed to facilitate the process of annotating, packaging, and preprocessing EEG data collections to enable data sharing, archiving, large-scale machine learning/data mining and (meta-)analysis. The EEG Study Schema (ESS) comprises three data "Levels," each with its own XML-document schema and file/folder convention, plus a standardized (PREP) pipeline to move raw (Data Level 1) data to a basic preprocessed state (Data Level 2) suitable for application of a large class of EEG analysis methods. Researchers can ship a study as a single unit and operate on its data using a standardized interface. ESS does not require a central database and provides all the metadata necessary to execute a wide variety of EEG processing pipelines. The primary focus of ESS is automated in-depth analysis and meta-analysis of EEG studies. However, ESS can also encapsulate meta-information for other modalities, such as eye tracking, that are increasingly used in both laboratory and real-world neuroimaging. ESS schema and tools are freely available at www.eegstudy.org and a central catalog of over 850 GB of existing data in ESS format is available at studycatalog.org. These tools and resources are part of a larger effort to enable data sharing at sufficient scale for researchers to engage in truly large-scale EEG analysis and data mining (BigEEG.org). PMID:27014048

  8. Lightweight moving radiators for heat rejection in space

    NASA Technical Reports Server (NTRS)

    Knapp, K.

    1981-01-01

    Low temperature droplet stream radiators, using nonmetallic fluids, can be used to radiate large amounts of waste heat from large space facilities. Moving belt radiators are suitable for use on a smaller scale, radiating as few as 10 kW from shuttle related operations. If appropriate seal technology can be developed, moving belt radiators may prove to be important for high temperature systems as well. Droplet stream radiators suitable for operation at peak temperatures near 300 K and 1000 K were studied using both freezing and nonfreezing droplets. Moving belt radiators were also investigated for operation in both temperature ranges. The potential mass and performance characteristics of both concepts were estimated on the basis of parametric variations of analytical point designs. These analyses included consideration of all the equipment required to operate the moving radiator system and took into account the mass of fluid lost by evaporation during mission lifetimes. Preliminary results indicate that the low temperature droplet stream radiator appears to offer the greatest potential for improvement over conventional flat plate radiators.

  9. Design and Implementation of a Metadata-rich File System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ames, S; Gokhale, M B; Maltzahn, C

    2010-01-19

    Despite continual improvements in the performance and reliability of large scale file systems, the management of user-defined file system metadata has changed little in the past decade. The mismatch between the size and complexity of large scale data stores and their ability to organize and query their metadata has led to a de facto standard in which raw data is stored in traditional file systems, while related, application-specific metadata is stored in relational databases. This separation of data and semantic metadata requires considerable effort to maintain consistency and can result in complex, slow, and inflexible system operation. To address these problems, we have developed the Quasar File System (QFS), a metadata-rich file system in which files, user-defined attributes, and file relationships are all first class objects. In contrast to hierarchical file systems and relational databases, QFS defines a graph data model composed of files and their relationships. QFS incorporates Quasar, an XPATH-extended query language for searching the file system. Results from our QFS prototype show the effectiveness of this approach. Compared to the de facto standard, the QFS prototype shows superior ingest performance and comparable query performance on user metadata-intensive operations and superior performance on normal file metadata operations.
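
    A toy illustration of such a graph data model (plain Python; the file names, attributes, and the query helper are our own illustrative constructs, not the QFS/Quasar API):

    ```python
    # Files are nodes carrying user-defined attributes; directed, typed links
    # between files are first-class edges (illustrative sketch only).
    files = {
        "sim_output.h5": {"experiment": "run42", "format": "hdf5"},
        "paper.tex":     {"status": "draft"},
    }
    links = [("paper.tex", "derived-from", "sim_output.h5")]

    def query(attr, value):
        """Return files whose metadata matches, plus the edges touching them."""
        hits = {f for f, meta in files.items() if meta.get(attr) == value}
        related = [(s, rel, d) for (s, rel, d) in links if s in hits or d in hits]
        return hits, related

    print(query("experiment", "run42"))
    # ({'sim_output.h5'}, [('paper.tex', 'derived-from', 'sim_output.h5')])
    ```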

  10. Testing of the NASA Hypersonics Project Combined Cycle Engine Large Scale Inlet Mode Transition Experiment (CCE LIMX)

    NASA Technical Reports Server (NTRS)

    Saunders, J. D.; Stueber, T. J.; Thomas, S. R.; Suder, K. L.; Weir, L. J.; Sanders, B. W.

    2012-01-01

    Status on an effort to develop Turbine Based Combined Cycle (TBCC) propulsion is described. This propulsion technology can enable reliable and reusable space launch systems. TBCC propulsion offers improved performance and safety over rocket propulsion. The potential to realize aircraft-like operations and reduced maintenance are additional benefits. Among the most critical TBCC enabling technologies are: 1) mode transition from turbine to scramjet propulsion, 2) high Mach turbine engines and 3) TBCC integration. To address these TBCC challenges, the effort is centered on a propulsion mode transition experiment and includes analytical research. The test program, the Combined-Cycle Engine Large Scale Inlet Mode Transition Experiment (CCE LIMX), was conceived to integrate TBCC propulsion with proposed hypersonic vehicles. The goals address: (1) dual inlet operability and performance, (2) mode-transition sequences enabling a switch between turbine and scramjet flow paths, and (3) turbine engine transients during transition. Four test phases are planned, from which a database can be used to both validate design and analysis codes and characterize operability and integration issues for TBCC propulsion. In this paper we discuss the research objectives, features of the CCE hardware and test plans, and status of the parametric inlet characterization testing, which began in 2011. This effort is sponsored by the NASA Fundamental Aeronautics Hypersonics project.

  11. A large-scale dataset of solar event reports from automated feature recognition modules

    NASA Astrophysics Data System (ADS)

    Schuh, Michael A.; Angryk, Rafal A.; Martens, Petrus C.

    2016-05-01

    The massive repository of images of the Sun captured by the Solar Dynamics Observatory (SDO) mission has ushered in the era of Big Data for Solar Physics. In this work, we investigate the entire public collection of events reported to the Heliophysics Event Knowledgebase (HEK) from automated solar feature recognition modules operated by the SDO Feature Finding Team (FFT). With the SDO mission recently surpassing five years of operations, and over 280,000 event reports for seven types of solar phenomena, we present the broadest and most comprehensive large-scale dataset of the SDO FFT modules to date. We also present numerous statistics on these modules, providing valuable contextual information for better understanding and validating of the individual event reports and the entire dataset as a whole. After extensive data cleaning through exploratory data analysis, we highlight several opportunities for knowledge discovery from data (KDD). Through these important prerequisite analyses presented here, the results of KDD from Solar Big Data will be overall more reliable and better understood. As the SDO mission remains operational over the coming years, these datasets will continue to grow in size and value. Future versions of this dataset will be analyzed in the general framework established in this work and maintained publicly online for easy access by the community.
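
    The kind of contextual statistics described here reduces, in practice, to grouping and resampling a table of event reports. A minimal pandas sketch (the table, its column names, and module names are hypothetical stand-ins, not the HEK schema):

    ```python
    import pandas as pd

    # Hypothetical event-report table: one row per report from an FFT module.
    reports = pd.DataFrame({
        "event_type": ["AR", "FL", "CH", "FL", "AR", "FL"],
        "module":     ["SPoCA", "FlareDetective", "SPoCA",
                       "FlareDetective", "SPoCA", "SSW-LMSAL"],
        "start_time": pd.to_datetime(["2011-01-03", "2011-01-04", "2011-02-01",
                                      "2011-02-11", "2011-03-05", "2011-03-07"]),
    })

    # Report counts per module and event type, and per month -- the sort of
    # exploratory statistics that precede knowledge discovery from data.
    print(reports.groupby(["module", "event_type"]).size())
    print(reports.set_index("start_time").resample("MS").size())
    ```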

  12. Evaluation of Penalized and Nonpenalized Methods for Disease Prediction with Large-Scale Genetic Data.

    PubMed

    Won, Sungho; Choi, Hosik; Park, Suyeon; Lee, Juyoung; Park, Changyi; Kwon, Sunghoon

    2015-01-01

    Owing to recent improvements in genotyping technology, large-scale genetic data can be utilized to identify disease susceptibility loci, and these successful findings have substantially improved our understanding of complex diseases. In spite of these successes, however, most of the genetic effects for many complex diseases were found to be very small, which has been a major hurdle in building disease prediction models. Recently, many statistical methods based on penalized regression have been proposed to tackle the so-called "large P, small N" problem. Penalized regressions, including the least absolute shrinkage and selection operator (LASSO) and ridge regression, constrain the parameter space, and this constraint enables the estimation of effects for a very large number of SNPs. Various extensions have been suggested, and, in this report, we compare their accuracy by applying them to several complex diseases. Our results show that penalized regressions are usually robust and provide better accuracy than existing methods, at least for the diseases under consideration.
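    To make the "large P, small N" setting concrete, the sketch below fits L1- (LASSO-type) and L2- (ridge-type) penalized logistic regression on synthetic genotype-like data with scikit-learn. The sample sizes, feature counts, and penalty strengths are illustrative assumptions, not settings from the paper.

        # A minimal sketch of penalized classification in the "large P,
        # small N" regime (P=2000 features, N=200 samples); all settings
        # are illustrative and not taken from the paper.
        from sklearn.datasets import make_classification
        from sklearn.linear_model import LogisticRegression
        from sklearn.model_selection import cross_val_score

        X, y = make_classification(n_samples=200, n_features=2000,
                                   n_informative=20, random_state=0)

        # L1 penalty (LASSO-type): drives most coefficients exactly to zero.
        lasso = LogisticRegression(penalty='l1', solver='liblinear', C=0.1)
        # L2 penalty (ridge-type): shrinks all coefficients toward zero.
        ridge = LogisticRegression(penalty='l2', solver='liblinear', C=0.1)

        for name, model in [('L1', lasso), ('L2', ridge)]:
            acc = cross_val_score(model, X, y, cv=5).mean()
            print(name, 'mean CV accuracy: %.3f' % acc)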

  13. On the Path to SunShot. Emerging Issues and Challenges in Integrating Solar with the Distribution System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Palmintier, Bryan; Broderick, Robert; Mather, Barry

    2016-05-01

    This report analyzes distribution-integration challenges, solutions, and research needs in the context of distributed generation from PV (DGPV) deployment to date and the much higher levels of deployment expected with achievement of the U.S. Department of Energy's SunShot targets. Recent analyses have improved estimates of the DGPV hosting capacities of distribution systems. This report uses these results to statistically estimate the minimum DGPV hosting capacity of the contiguous United States at approximately 170 GW with traditional inverters and no distribution system modifications. This hosting capacity roughly doubles if advanced inverters are used to manage local voltage, and additional minor, low-cost changes could further increase these levels substantially. Key to achieving these deployment levels at minimum cost is siting DGPV based on local hosting capacities, suggesting opportunities for regulatory, incentive, and interconnection innovation. Already, pre-computed hosting capacity is beginning to expedite DGPV interconnection requests and installations in select regions; however, realizing SunShot-scale deployment will require further improvements to DGPV interconnection processes, standards and codes, and compensation mechanisms so they embrace the contributions of DGPV to system-wide operations. SunShot-scale DGPV deployment will also require unprecedented coordination of the distribution and transmission systems. This includes harnessing DGPV's ability to relieve congestion and reduce system losses by generating closer to loads; minimizing system operating costs and reserve deployments through improved DGPV visibility; developing communication and control architectures that incorporate DGPV into system operations; providing frequency response, transient stability, and synthesized inertia with DGPV in the event of large-scale system disturbances; and potentially managing reactive power requirements due to large-scale deployment of advanced inverter functions. Finally, additional local and system-level value could be provided by integrating DGPV with energy storage and "virtual storage," which exploits improved management of electric vehicle charging, building energy systems, and other large loads. Together, continued innovation across this rich distribution landscape can enable the very high deployment levels envisioned by SunShot.

  14. Large-Scale Science Observatories: Building on What We Have Learned from USArray

    NASA Astrophysics Data System (ADS)

    Woodward, R.; Busby, R.; Detrick, R. S.; Frassetto, A.

    2015-12-01

    With the NSF-sponsored EarthScope USArray observatory, the Earth science community has built the operational capability and experience to tackle scientific challenges at the largest scales, such as a Subduction Zone Observatory. In the first ten years of USArray, geophysical instruments were deployed across roughly 2% of the Earth's surface. The USArray operated a rolling deployment of seismic stations that occupied ~1,700 sites across the USA, made co-located atmospheric observations, occupied hundreds of sites with magnetotelluric sensors, expanded a backbone reference network of seismic stations, and provided instruments to PI-led teams that deployed thousands of additional seismic stations. USArray included a comprehensive outreach component that directly engaged hundreds of students at over 50 colleges and universities to locate station sites and provided Earth science exposure to roughly 1,000 landowners who hosted stations. The project also included a comprehensive data management capability that received, archived, and distributed data, metadata, and data products; data were acquired and distributed in real time. The USArray project was completed on time and under budget and developed a number of best practices that can inform other large-scale science initiatives that the Earth science community is contemplating. Key strategies employed by USArray included: using a survey, rather than hypothesis-driven, mode of observation to generate comprehensive, high-quality data on a large scale for exploration and discovery; making data freely and openly available to any investigator from the very outset of the project; and using proven, commercial, off-the-shelf systems to ensure a fast start and avoid delays due to over-reliance on unproven technology or concepts. Scope was set ambitiously, but managed carefully to avoid overextending. Configuration was controlled to ensure efficient operations while providing consistent, uniform observations. Finally, community governance structures were put in place to ensure a focus on science needs and goals, to provide an informed review of the project's results, and to carefully balance consistency of observations with technical evolution. We will summarize lessons learned from USArray and how these can be applied to future efforts such as SZO.

  15. Maximizing algebraic connectivity in air transportation networks

    NASA Astrophysics Data System (ADS)

    Wei, Peng

    In air transportation networks, robustness against node and link failures is a key design factor. An experiment based on a real air transportation network is performed to show that algebraic connectivity is a good measure of network robustness. Three optimization problems of algebraic connectivity maximization are then formulated in order to find the most robust network design under different constraints. The algebraic connectivity maximization problem with flight route addition or deletion is formulated first, and three methods to optimize and analyze the network algebraic connectivity are proposed. The Modified Greedy Perturbation Algorithm (MGP) provides a sub-optimal solution in a fast iterative manner. The Weighted Tabu Search (WTS) is designed to offer a near-optimal solution at the cost of longer running time. A relaxed semi-definite programming (SDP) formulation is used to set a performance upper bound, and three rounding techniques are discussed for recovering a feasible solution. The simulation results illustrate the trade-offs among the three methods, and case studies on the air transportation networks of Virgin America and Southwest Airlines show that the developed methods can be applied to real-world large-scale networks. The algebraic connectivity maximization problem is then extended by adding a leg-number constraint, which captures travelers' tolerance for the total number of connecting stops. The Binary Semi-Definite Programming (BSDP) approach with a cutting-plane method provides the optimal solution, while the tabu search and 2-opt search heuristics find the optimal solution in small-scale networks and near-optimal solutions in large-scale networks. The third algebraic connectivity maximization problem adds an operating cost constraint. When the total operating cost budget is given, the number of edges to be added is not fixed, and each edge weight must be calculated rather than pre-determined. It is shown that edge addition and weight assignment cannot be studied separately under the operating cost constraint; therefore a relaxed SDP method with golden section search is developed to solve both simultaneously, and cluster decomposition is utilized to handle large-scale networks.
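    For intuition, algebraic connectivity is the second-smallest eigenvalue of the graph Laplacian, and the objective can be illustrated by a greedy loop that adds the single edge yielding the largest connectivity gain. The sketch below is a toy illustration of that objective only; it is not the MGP, WTS, or SDP method developed in the work above, and the 5-node path graph is an illustrative assumption.

        # A minimal sketch: greedily pick the one new edge that most
        # increases algebraic connectivity (2nd-smallest Laplacian
        # eigenvalue); a toy illustration, not the thesis's algorithms.
        import numpy as np

        def algebraic_connectivity(adj):
            lap = np.diag(adj.sum(axis=1)) - adj   # graph Laplacian
            return np.linalg.eigvalsh(lap)[1]      # eigenvalues sorted ascending

        # Illustrative 5-node path graph (adjacency matrix).
        n = 5
        adj = np.zeros((n, n))
        for i in range(n - 1):
            adj[i, i + 1] = adj[i + 1, i] = 1.0

        base = algebraic_connectivity(adj)
        best_gain, best_edge = -np.inf, None
        for i in range(n):
            for j in range(i + 1, n):
                if adj[i, j] == 0:                 # candidate new route
                    adj[i, j] = adj[j, i] = 1.0
                    gain = algebraic_connectivity(adj) - base
                    if gain > best_gain:
                        best_gain, best_edge = gain, (i, j)
                    adj[i, j] = adj[j, i] = 0.0    # undo trial edge

        print('best edge to add:', best_edge, 'gain: %.4f' % best_gain)

    On the path graph the loop selects the edge closing the cycle, (0, 4), which matches the intuition that long chains are the least robust structures.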

  16. Assembling Large, Multi-Sensor Climate Datasets Using the SciFlo Grid Workflow System

    NASA Astrophysics Data System (ADS)

    Wilson, B. D.; Manipon, G.; Xing, Z.; Fetzer, E.

    2008-12-01

    NASA's Earth Observing System (EOS) is the world's most ambitious facility for studying global climate change. The mandate now is to combine measurements from the instruments on the A-Train platforms (AIRS, AMSR-E, MODIS, MISR, MLS, and CloudSat) and other Earth probes to enable large-scale studies of climate change over periods of years to decades. However, moving from predominantly single-instrument studies to a multi-sensor, measurement-based model for long-duration analysis of important climate variables presents serious challenges for large-scale data mining and data fusion. For example, one might want to compare temperature and water vapor retrievals from one instrument (AIRS) to another instrument (MODIS), and to a model (ECMWF), stratify the comparisons using a classification of the cloud scenes from CloudSat, and repeat the entire analysis over years of AIRS data. To perform such an analysis, one must discover & access multiple datasets from remote sites, find the space/time matchups between instrument swaths and model grids, understand the quality flags and uncertainties for retrieved physical variables, and assemble merged datasets for further scientific and statistical analysis. To meet these large-scale challenges, we are utilizing a Grid computing and dataflow framework, named SciFlo, in which we are deploying a set of versatile and reusable operators for data query, access, subsetting, co-registration, mining, fusion, and advanced statistical analysis. SciFlo is a semantically-enabled ("smart") Grid Workflow system that ties together a peer-to-peer network of computers into an efficient engine for distributed computation. The SciFlo workflow engine enables scientists to do multi-instrument Earth Science by assembling remotely-invokable Web Services (SOAP or http GET URLs), native executables, command-line scripts, and Python codes into a distributed computing flow. A scientist visually authors the graph of operations in the VizFlow GUI, or uses a text editor to modify the simple XML workflow documents. The SciFlo client & server engines optimize the execution of such distributed workflows and allow the user to transparently find and use datasets and operators without worrying about the actual location of the Grid resources. The engine transparently moves data to the operators, and moves operators to the data (on the dozen trusted SciFlo nodes). SciFlo also deploys a variety of Data Grid services to: query datasets in space and time, locate & retrieve on-line data granules, provide on-the-fly variable and spatial subsetting, and perform pairwise instrument matchups for A-Train datasets. These services are combined into efficient workflows to assemble the desired large-scale, merged climate datasets. SciFlo is currently being applied in several large climate studies: comparisons of aerosol optical depth between MODIS, MISR, the AERONET ground network, and U. Michigan's IMPACT aerosol transport model; characterization of long-term biases in microwave and infrared instruments (AIRS, MLS) by comparisons to GPS temperature retrievals accurate to 0.1 K; and construction of a decade-long, multi-sensor water vapor climatology stratified by classified cloud scene by bringing together datasets from AIRS/AMSU, AMSR-E, MLS, MODIS, and CloudSat (NASA MEASUREs grant, Fetzer PI). The presentation will discuss the SciFlo technologies, their application in these distributed workflows, and the many challenges encountered in assembling and analyzing these massive datasets.
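    To illustrate the kind of pairwise space/time matchup such workflows perform, the sketch below pairs observations from two instruments whenever they fall within chosen distance and time windows. The arrays, thresholds, and haversine helper are illustrative assumptions for a brute-force toy, not SciFlo's actual matchup operators.

        # A minimal sketch of a pairwise space/time matchup between two
        # instruments' observations; thresholds and data are illustrative
        # assumptions, not SciFlo's actual matchup service.
        import numpy as np

        def haversine_km(lat1, lon1, lat2, lon2):
            # Great-circle distance in kilometres.
            lat1, lon1, lat2, lon2 = map(np.radians, (lat1, lon1, lat2, lon2))
            a = (np.sin((lat2 - lat1) / 2) ** 2
                 + np.cos(lat1) * np.cos(lat2) * np.sin((lon2 - lon1) / 2) ** 2)
            return 2 * 6371.0 * np.arcsin(np.sqrt(a))

        def matchups(obs_a, obs_b, max_km=50.0, max_minutes=30.0):
            # obs_* rows are (time_in_minutes, lat, lon); brute-force O(N*M).
            pairs = []
            for i, (ta, lat_a, lon_a) in enumerate(obs_a):
                for j, (tb, lat_b, lon_b) in enumerate(obs_b):
                    if (abs(ta - tb) <= max_minutes and
                            haversine_km(lat_a, lon_a, lat_b, lon_b) <= max_km):
                        pairs.append((i, j))
            return pairs

        airs  = np.array([[0.0, 34.0, -118.0], [90.0, 35.0, -120.0]])
        modis = np.array([[10.0, 34.2, -118.3], [400.0, 10.0, 0.0]])
        print(matchups(airs, modis))   # -> [(0, 0)]

    Production matchup services replace the O(N*M) scan with spatial indexing, but the coincidence criterion (time window plus great-circle radius) is the same.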

  17. Assembling Large, Multi-Sensor Climate Datasets Using the SciFlo Grid Workflow System

    NASA Astrophysics Data System (ADS)

    Wilson, B.; Manipon, G.; Xing, Z.; Fetzer, E.

    2009-04-01

    NASA's Earth Observing System (EOS) is an ambitious facility for studying global climate change. The mandate now is to combine measurements from the instruments on the "A-Train" platforms (AIRS, AMSR-E, MODIS, MISR, MLS, and CloudSat) and other Earth probes to enable large-scale studies of climate change over periods of years to decades. However, moving from predominantly single-instrument studies to a multi-sensor, measurement-based model for long-duration analysis of important climate variables presents serious challenges for large-scale data mining and data fusion. For example, one might want to compare temperature and water vapor retrievals from one instrument (AIRS) to another instrument (MODIS), and to a model (ECMWF), stratify the comparisons using a classification of the "cloud scenes" from CloudSat, and repeat the entire analysis over years of AIRS data. To perform such an analysis, one must discover & access multiple datasets from remote sites, find the space/time "matchups" between instrument swaths and model grids, understand the quality flags and uncertainties for retrieved physical variables, assemble merged datasets, and compute fused products for further scientific and statistical analysis. To meet these large-scale challenges, we are utilizing a Grid computing and dataflow framework, named SciFlo, in which we are deploying a set of versatile and reusable operators for data query, access, subsetting, co-registration, mining, fusion, and advanced statistical analysis. SciFlo is a semantically-enabled ("smart") Grid Workflow system that ties together a peer-to-peer network of computers into an efficient engine for distributed computation. The SciFlo workflow engine enables scientists to do multi-instrument Earth Science by assembling remotely-invokable Web Services (SOAP or http GET URLs), native executables, command-line scripts, and Python codes into a distributed computing flow. A scientist visually authors the graph of operations in the VizFlow GUI, or uses a text editor to modify the simple XML workflow documents. The SciFlo client & server engines optimize the execution of such distributed workflows and allow the user to transparently find and use datasets and operators without worrying about the actual location of the Grid resources. The engine transparently moves data to the operators, and moves operators to the data (on the dozen trusted SciFlo nodes). SciFlo also deploys a variety of Data Grid services to: query datasets in space and time, locate & retrieve on-line data granules, provide on-the-fly variable and spatial subsetting, perform pairwise instrument matchups for A-Train datasets, and compute fused products. These services are combined into efficient workflows to assemble the desired large-scale, merged climate datasets. SciFlo is currently being applied in several large climate studies: comparisons of aerosol optical depth between MODIS, MISR, the AERONET ground network, and U. Michigan's IMPACT aerosol transport model; characterization of long-term biases in microwave and infrared instruments (AIRS, MLS) by comparisons to GPS temperature retrievals accurate to 0.1 K; and construction of a decade-long, multi-sensor water vapor climatology stratified by classified cloud scene by bringing together datasets from AIRS/AMSU, AMSR-E, MLS, MODIS, and CloudSat (NASA MEASUREs grant, Fetzer PI). The presentation will discuss the SciFlo technologies, their application in these distributed workflows, and the many challenges encountered in assembling and analyzing these massive datasets.

  18. Enterprise PACS and image distribution.

    PubMed

    Huang, H K

    2003-01-01

    Around the world, many large-scale healthcare enterprises have been formed in response to the need for improved operational efficiency and more cost-effective healthcare. Each of these enterprises groups hospitals, medical centers, and clinics together into one enterprise healthcare network. The management of these enterprises recognizes PACS and image distribution as key technologies for cost-effective healthcare delivery at the enterprise level. As a result, many large-scale enterprise-level PACS/image distribution pilot studies, as well as full designs and implementations, are under way. The purpose of this paper is to give readers an overall view of the current status of enterprise PACS and image distribution. It reviews three large-scale enterprise PACS/image distribution systems in the USA, Germany, and South Korea. The concept of enterprise-level PACS/image distribution, with its characteristics and ingredients, is then discussed. Business models for enterprise-level implementation offered by the private medical imaging and system integration industry are highlighted. One system currently under development for enterprise-level chest tuberculosis (TB) screening in Hong Kong is described in detail. Copyright 2002 Elsevier Science Ltd.

  19. Conversion of magnetic energy to runaway kinetic energy during the termination of runaway current on the J-TEXT tokamak

    NASA Astrophysics Data System (ADS)

    Dai, A. J.; Chen, Z. Y.; Huang, D. W.; Tong, R. H.; Zhang, J.; Wei, Y. N.; Ma, T. K.; Wang, X. L.; Yang, H. Y.; Gao, H. L.; Pan, Y.; the J-TEXT Team

    2018-05-01

    A large number of runaway electrons (REs) with energies as high as several tens of mega-electron-volts (MeV) may be generated during disruptions in a large-scale tokamak. The kinetic energy carried by REs is eventually deposited on the plasma-facing components, causing damage and posing a threat to the operation of the tokamak. The magnetic energy remaining after a thermal quench is significant in a large-scale tokamak, and its conversion to runaway kinetic energy increases the threat that runaway electrons pose to the first wall. The magnetic energy dissipated inside the vacuum vessel (VV) equals the decrease of the initial magnetic energy inside the VV plus the magnetic energy flowing into the VV during a disruption. Based on the estimated magnetic energy, the evolution of magnetic-to-kinetic energy conversion is analyzed through three periods of disruptions with a runaway current plateau.
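    The energy accounting stated above can be written compactly; the symbols below are illustrative notation, not the paper's:

        % Magnetic energy dissipated inside the vacuum vessel during the
        % disruption = drop in magnetic energy stored inside the VV
        % + magnetic energy flowing in through the VV boundary.
        W_{\mathrm{dis}}
          = \bigl( W_{\mathrm{VV}}^{\,t_0} - W_{\mathrm{VV}}^{\,t} \bigr)
          + W_{\mathrm{in}}

    The fraction of W_dis converted to runaway kinetic energy then sets the wall loading during the current plateau and final loss phase.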

  20. A novel heuristic algorithm for capacitated vehicle routing problem

    NASA Astrophysics Data System (ADS)

    Kır, Sena; Yazgan, Harun Reşit; Tüncel, Emre

    2017-09-01

    This paper considers the vehicle routing problem with capacity constraints. For large-scale problems it is quite difficult to achieve an optimal solution with traditional optimization methods because of the high computational complexity; consequently, new heuristic and metaheuristic approaches have been developed. In this paper, we construct a new heuristic algorithm based on tabu search and adaptive large neighborhood search (ALNS), with several specifically designed operators and features, to solve the capacitated vehicle routing problem (CVRP). The effectiveness of the proposed algorithm is illustrated on benchmark problems. The algorithm performs better on large-scale instances and gains an advantage in CPU time. In addition, we solved a real-life CVRP using the proposed algorithm and found encouraging results in comparison with the company's current practice.
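    To show the destroy-and-repair loop at the heart of ALNS, the sketch below repeatedly removes random customers from a toy CVRP solution and greedily reinserts them under the capacity constraint. The instance, the single destroy/repair operator pair, and the greedy acceptance rule are illustrative assumptions, far simpler than the operators designed in the paper.

        # A minimal ALNS-style destroy/repair sketch for a toy CVRP;
        # instance and operators are illustrative, not the paper's design.
        import random

        CAP = 10                                   # vehicle capacity
        demand = {1: 4, 2: 3, 3: 5, 4: 2, 5: 2}    # customer -> demand

        def dist(a, b):
            # Toy distance: nodes live on a line, depot is node 0.
            return abs(a - b)

        def cost(routes):
            total = 0
            for r in routes:
                tour = [0] + r + [0]
                total += sum(dist(tour[i], tour[i + 1])
                             for i in range(len(tour) - 1))
            return total

        def destroy(routes, k=2):
            # Remove k random customers from the solution.
            removed = random.sample([c for r in routes for c in r], k)
            return [[c for c in r if c not in removed] for r in routes], removed

        def repair(routes, removed):
            # Greedy best-insertion; assumes a feasible slot exists,
            # which holds for this toy instance.
            for c in removed:
                best = None
                for r in routes:
                    if sum(demand[x] for x in r) + demand[c] > CAP:
                        continue                   # capacity check
                    for pos in range(len(r) + 1):
                        cand = r[:pos] + [c] + r[pos:]
                        delta = cost([cand]) - cost([r])
                        if best is None or delta < best[0]:
                            best = (delta, r, pos)
                _, r, pos = best
                r.insert(pos, c)
            return routes

        random.seed(0)
        best_routes = [[1, 2], [3, 4, 5]]          # feasible start (loads 7, 9)
        best_cost = cost(best_routes)
        for _ in range(200):                       # ALNS-style iterations
            cand, removed = destroy([r[:] for r in best_routes])
            cand = repair(cand, removed)
            if cost(cand) < best_cost:             # greedy acceptance
                best_routes, best_cost = cand, cost(cand)
        print(best_routes, best_cost)

    A full ALNS maintains several destroy/repair operators and adaptively reweights them by past success, and typically accepts some worsening moves; the loop above keeps only the skeleton.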
